Comments by SriNivas IT (Top 17 by date)
SriNivas IT
26-Jul-17 3:21am
Thank you so much !!!
SriNivas IT
26-Jul-17 2:28am
Please look into the matter;
I was so disappointed :(
SriNivas IT
11-Apr-13 5:59am
OK, but the problem is that it also takes a long time just to fetch the first record (of many rows).
In the back end the query takes 10 seconds, but using the DataReader it takes more than 10 minutes even for the first record.
SriNivas IT
9-Apr-13 9:48am
Is the problem in the DataReader, or somewhere else?
Please help me.
SriNivas IT
5-Apr-13 5:07am
Please help me resolve this problem!
SriNivas IT
5-Apr-13 4:53am
OK, I have pasted the code here:
TransactionOptions options = new TransactionOptions();
options.Timeout = new TimeSpan(0, 10, 4);
options.IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted;
using (TransactionScope transactionScope = new TransactionScope(TransactionScopeOption.Required, options))
{
try
{
serviceDetail.ServiceRequestNumber = ServicerequestNo;
if (HttpContext.Current.Session[EnumSession.EntityID.ToString()] != "")
{
serviceDetail.EntityID = Convert.ToInt32(HttpContext.Current.Session[EnumSession.EntityID.ToString()]);
}
if (customerDetail.enableCustomerPanel == true)
{
dictParam = new Dictionary<string, ParameterItem>();
dictParam.Add("@FirstName", new ParameterItem(0, DbType.String, customerDetail.FirstName, ParameterDirection.Input));
dictParam.Add("@LastName", new ParameterItem(1, DbType.String, customerDetail.LastName, ParameterDirection.Input));
dictParam.Add("@Address", new ParameterItem(2, DbType.String, customerDetail.Address, ParameterDirection.Input));
dictParam.Add("@CityID", new ParameterItem(3, DbType.Int32, customerDetail.CityID, ParameterDirection.Input));
dictParam.Add("@PinCode", new ParameterItem(4, DbType.String, customerDetail.PinCode, ParameterDirection.Input));
dictParam.Add("@Mobile", new ParameterItem(5, DbType.String, customerDetail.Mobile, ParameterDirection.Input));
dictParam.Add("@Phone", new ParameterItem(6, DbType.String, customerDetail.Phone, ParameterDirection.Input));
dictParam.Add("@UserId", new ParameterItem(7, DbType.Int32, HttpContext.Current.Session[EnumSession.UserID.ToString()], ParameterDirection.Input));
this.Connection.AddParameters(dictParam);
customerDetail.CustomerID = Convert.ToInt32(this.Connection.ExecuteScalar(CommandType.StoredProcedure, Constants.USP_INSERT_CUSTOMER));
this.Connection.ClearParameters();
}
if (productDetail.enableProductPanel == true)
{
......
and three more methods also save data in the same way.
SriNivas IT
22-Mar-13 6:32am
There is a performance issue with the procedure; the proc takes more than 10 minutes,
so in some places I use
OUTER APPLY with a WHERE condition
in place of LEFT JOIN.
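To illustrate the rewrite being described, here is a minimal sketch with hypothetical tables Orders and OrderItems (names are assumptions, not from the original proc), replacing a LEFT JOIN with an OUTER APPLY whose condition sits inside the applied query:

```sql
-- LEFT JOIN version
SELECT o.OrderID, i.ItemName
FROM Orders o
LEFT JOIN OrderItems i ON i.OrderID = o.OrderID;

-- OUTER APPLY version: the correlated subquery runs per outer row,
-- and the WHERE condition (and TOP) limits the work done for each row
SELECT o.OrderID, i.ItemName
FROM Orders o
OUTER APPLY
(
    SELECT TOP 1 ItemName
    FROM OrderItems
    WHERE OrderID = o.OrderID
) i;
```

OUTER APPLY can help when only a limited correlated subset is needed per row, but whether it beats a LEFT JOIN depends on the indexes and the plan, so it should be verified against the actual execution plan.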
SriNivas IT
20-Mar-13 5:50am
OK, thanks a lot gvprabhu!
It works!!!
SriNivas IT
20-Mar-13 2:12am
OK gvprabu, but I actually want to create a non-unique index, for fast searching of the data.
With this solution the unique key constraint is violated and this error message appears:
Violation of UNIQUE KEY constraint 'UQ__#6D181FE__2C5D12DB71DCD509'
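For reference, a non-unique index looks like the sketch below, assuming a temp table #T with a search column Col1 (hypothetical names, since the real table is not shown):

```sql
-- A plain NONCLUSTERED index allows duplicate values, unlike a
-- UNIQUE constraint/index, which raises the
-- 'Violation of UNIQUE KEY constraint' error on duplicates.
CREATE NONCLUSTERED INDEX IX_T_Col1 ON #T (Col1);
```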
SriNivas IT
28-Feb-13 8:15am
I replaced UNION with UNION ALL,
and with the common table expression I used DISTINCT because of the duplicates,
like this:
with cte_result as
(
    select columns.... from A
    union all
    select columns.... from B
    union all
    select columns.... from C
    ....
)
select distinct * from cte_result
Can this improve performance at all?
SriNivas IT
27-Feb-13 5:17am
OK!
But as I said, my problem here is:
sp_1 returns 17736 rows,
sp_2 returns 17731 rows.
How can I find out which rows are the extra ones in this huge result?
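One common way to locate the extra rows is to capture both result sets and use EXCEPT; a sketch, assuming temp tables #r1 and #r2 (hypothetical names) already created with columns matching the SPs' output:

```sql
-- Capture each SP's result set into its temp table
INSERT INTO #r1 EXEC sp_1;
INSERT INTO #r2 EXEC sp_2;

-- Rows present in sp_1's output but missing from sp_2's
SELECT * FROM #r1
EXCEPT
SELECT * FROM #r2;
```

EXCEPT compares whole rows, so with 50+ columns it finds the extra rows without having to name each column.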
SriNivas IT
27-Feb-13 5:08am
Actually the SP has more than 50 columns and I just want to store this result for a very short time,
like: select * from (select * from Table_Name)
Mainly I want to compare the result sets of 2 SPs, which return 17736 and 17731 rows respectively; now I just want to intersect the results.
SriNivas IT
27-Feb-13 4:56am
But I cannot use the sp_configure proc to change the setting. Can I not do it like this?
select * from (Exec SP_Name 'p1', p2, 'p3')
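A SELECT cannot read directly FROM an EXEC like that, but the result can be captured first; a minimal sketch, assuming a temp table #result already created with columns matching the SP's result set (SP_Name and the parameters are the placeholders from the question):

```sql
-- #result's columns must match the SP's output exactly
INSERT INTO #result
EXEC SP_Name 'p1', @p2, 'p3';

SELECT * FROM #result;
```

This avoids the server-level setting changes that sp_configure-based approaches (such as ad hoc distributed queries for OPENROWSET) would require.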
SriNivas IT
25-Feb-13 4:05am
Yes, thanks
Rahul
SriNivas IT
23-Feb-13 6:22am
Actually Arun, you may understand it from the query situation below.
Let A be a table which has 100000 or more records, and I fire the query
select top 400 col1 from A
Then the affected-rows count is 400,
but I want to find the total scanned rows, which is definitely 100000.
Is that possible?
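One way to see how much of the table a query actually touches, rather than how many rows it returns, is SET STATISTICS IO (using table A and col1 from the example above):

```sql
SET STATISTICS IO ON;
SELECT TOP 400 col1 FROM A;
SET STATISTICS IO OFF;
-- The Messages tab then reports scan count and logical reads for A,
-- which reflect the pages scanned independently of the 400 rows returned.
```

Note that this reports pages read, not a literal row count; for a per-operator row count the actual execution plan shows "Actual Number of Rows" for each scan.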
SriNivas IT
22-Feb-13 1:38am
Hi GVPrabu,
I want 100000 to be returned (in the rows-affected message or any other medium), not 500 rows.
Thanks
Shreeniwas