|
myCmd.Parameters.Add("@startDate", sDate)
myCmd.Parameters.Add("@endDate", eDate)
Is it correct if I do it like this?
By 10 records I mean retrieving 10 or more rows of results, e.g.:
author date
a ....
b ...
and so on
|
|
|
|
|
sebastian yeok wrote: myCmd.Parameters.Add("@startDate", sDate)
myCmd.Parameters.Add("@endDate", eDate)
Is it correct if I do it like this?
Assuming your SQL statement now contains a WHERE clause that uses the parameters @startDate and @endDate, then I would assume it is correct.
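For illustration, such a statement might look something like this (the table and column names here are hypothetical):

```sql
-- Hypothetical query using the two date parameters
SELECT Author, PostDate
FROM Posts
WHERE PostDate BETWEEN @startDate AND @endDate
ORDER BY PostDate;
```

The two Parameters.Add calls then supply values for @startDate and @endDate at execution time.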
sebastian yeok wrote: 10 records means, retrieve 10 rows of records or more.
The context of this statement has been lost. What does this refer to? (You may find it useful to use the "Quote Selected Text" button to insert a quote from the previous post so that the context of statements is not lost.)
ColinMackay.net
Scottish Developers are looking for speakers for user group sessions over the next few months. Do you want to know more?
|
|
|
|
|
Hello,
I have searched but could not find how to correctly indent SQL statements.
For instance;
SELECT * FROM a WHERE att1 IN ( SELECT att8 FROM b WHERE att2 = 10 AND att3 = 20 AND att4 = 30 GROUP BY att8 HAVING count(*) > 3 )
How can this sample statement be indented correctly (in a standard way)?
Any opinion or article would be appreciated.
Kind Regards,
Sarp
|
|
|
|
|
I've not come across a standard way of doing this. My particular scheme is something like this:
SELECT Column1, Column2, Column3
FROM MyTable AS mt
INNER JOIN SomeOtherTable AS sot ON mt.pk = sot.fk
INNER JOIN (SELECT fk, ColumnA, ColumnB, ColumnC
            FROM InnerTable
            WHERE ColumnD = 10) AS it ON mt.pk = it.fk
WHERE mt.Column4 IS NOT NULL
ORDER BY Column2;
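Applied to the statement from the question above, that scheme would give something like this (a sketch; the column and table names come from the question):

```sql
SELECT *
FROM a
WHERE att1 IN (SELECT att8
               FROM b
               WHERE att2 = 10
                 AND att3 = 20
                 AND att4 = 30
               GROUP BY att8
               HAVING COUNT(*) > 3);
```

The main idea is that the major clauses (SELECT, FROM, WHERE, GROUP BY, HAVING) line up, and a subquery is indented as a unit inside its parentheses.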
ColinMackay.net
|
|
|
|
|
I write my SQL like this:
SELECT
    field1,
    field2
FROM
    table1,
    table2
WHERE
    condition1
AND
    condition2
ORDER BY
    field DESC
LIMIT
    1
Q:What does the derived class in C# tell to it's parent?
A:All your base are belong to us!
|
|
|
|
|
Colin Angus Mackay wrote: My particular scheme is something like
I do the same here. My old Oracle book from my undergraduate database course does it this way.
PJC
|
|
|
|
|
Thanks all for your opinions.
|
|
|
|
|
Hi, I have two questions here; hope someone can help me.
1) INSERT INTO table2 (col1, col2, col3....)
SELECT col1,col2, col3
FROM table1
WHERE something.....
The SELECT statement uses .ExecuteReader and the INSERT statement uses .ExecuteNonQuery. How do I combine the two for the statement in question (1)? Please guide me step by step.
2) Referring to question (1), how do I move 1000 or more records from table1 to table2? How do I store the 1000 records from table1? Is there any solution to this problem?
Any help appreciated! Thanks
|
|
|
|
|
sebastian yeok wrote: The SELECT statement uses .ExecuteReader and the INSERT statement uses .ExecuteNonQuery. How do I combine the two for the statement in question (1)?
A better way of looking at it is:
I expect results back so I use ExecuteReader()
I don't expect results back so I use ExecuteNonQuery()
The SQL you supplied is correct; you don't expect results back in the calling application, so you use ExecuteNonQuery().
sebastian yeok wrote: how do I move 1000 or more records from table1 to table2? How do I store the 1000 records from table1? Is there any solution to this problem?
I don't understand what the problem is that you are having. In (1) you already manage to copy the rows from one table to another. If you don't want the rows to continue to exist in the originating table, perform a DELETE operation with the same criteria in the WHERE clause.
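As a sketch, the whole move could look like this. The transaction wrapper and the filter condition are additions for illustration; the table and column names follow question (1):

```sql
-- Copy the matching rows, then delete them from the source,
-- inside one transaction so the move is all-or-nothing.
BEGIN TRANSACTION;

INSERT INTO table2 (col1, col2, col3)
SELECT col1, col2, col3
FROM table1
WHERE col3 < 100;   -- hypothetical criterion

DELETE FROM table1
WHERE col3 < 100;   -- same criterion as the INSERT...SELECT

COMMIT TRANSACTION;
```

Because the whole operation happens on the server, it makes no difference whether you move 10 rows or 1000; nothing needs to be stored in the application.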
ColinMackay.net
|
|
|
|
|
Hi, I have two identical tables in 2 separate databases. One is the "central" DB and its contents can change. Basically I want to write a piece of code to run periodically to ensure the contents of table 2 are up to date with those of table 1. I'd imagine this is a relatively common task, so I was wondering: are there any smart ways of doing it, or any existing code knocking around? Thanks in advance,
|
|
|
|
|
You could try something like this:
SELECT a.pk, b.pk
FROM FirstDatabase.dbo.TableName AS a
FULL OUTER JOIN SecondDatabase.dbo.TableName AS b ON a.pk = b.pk
WHERE a.pk IS NULL
   OR b.pk IS NULL
pk = primary key, if you have a compound key then you will need all the columns that make up the primary key.
The results of the query should (I haven't tested it) return any rows that exist in one database, but don't in the other.
If you want to return all rows that have differences then you might want to add to the WHERE clause:
OR a.column1 <> b.column1
OR a.column2 <> b.column2 ...and so on for each of the columns.
ColinMackay.net
|
|
|
|
|
Sorry, I should have mentioned that the two DBs are actually running on 2 different servers.
|
|
|
|
|
That's okay. The naming convention extends to servers. Just add in the server name like this:
ServerName.DatabaseName.SchemaName.TableName
SchemaName is dbo, unless you've set it up otherwise.
You will also have to link the two servers together. You might find this useful: MSDN: Configuring Linked Servers[^]
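For example, assuming a linked server named Server2 (all names here are hypothetical), the earlier comparison query becomes:

```sql
-- Compare a local table against the same table on a linked server
SELECT a.pk, b.pk
FROM MyDatabase.dbo.MyTable AS a
FULL OUTER JOIN Server2.MyDatabase.dbo.MyTable AS b ON a.pk = b.pk
WHERE a.pk IS NULL
   OR b.pk IS NULL;
```

The only change is the four-part name on the remote table; the join logic stays the same.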
ColinMackay.net
|
|
|
|
|
I'm having a performance issue with this ASP.NET app. In the SQL Server log there are MANY, MANY entries which resemble the following:
SQL Server log: Login succeeded for user 'APPL_ACCOUNT'. Connection: Non-Trusted.
I'm talking thousands in one morning, and I suspect that this is the cause of the performance degradation. I thought that when you call
conn.Open();
ADO.NET draws an existing connection from the ADO.NET connection pool and you don't "login" again? Is this assumption right? I ran a very, very simple test:
using System;
using System.Data;
using System.Data.SqlClient;

class PoolTest
{
    static void Main(string[] args)
    {
        String s_conn = "Data Source=127.0.0.1;Initial Catalog=pubs;User Id=sa;Password=secret;Max Pool Size=80;Min Pool Size=30;";
        IDbConnection oconn = new SqlConnection(s_conn);

        for (Int32 i = 0; i < 100; i++)
        {
            oconn.Open();   // should hit the pool after the first iteration
            oconn.Close();  // returns the connection to the pool
        }
    }
}
This did NOT generate a bunch of "Login succeeded" entries in my SQL Server log. But then, the surprise was, there was NOT even ONE "Login succeeded" registered.
Btw, I'm using NHibernate... and the way NHibernate opens an ISession (which corresponds to an IDbConnection) is as follows:

Dim _nhibernate_conn_factory As ISessionFactory = BuildFactory() 'This is time consuming so we only create it once...

Public Function GetWarehouseNHibernateSession() As ISession Implements IWarehouseConnectionManager.GetWarehouseNHibernateSession
    Dim conn As ISession
    Dim maxRetry, retryFreq As Int32

    Try
        conn = _nhibernate_conn_factory.OpenSession()
    Catch ex As System.Data.SqlClient.SqlException
        'handle the exception
    Catch ex As Exception
        'handle the exception
    End Try

    Return conn
End Function
My finding is, for each OpenSession there's a corresponding entry in SQL Server's log:
"SQL Server log: Login succeeded for user 'APPL_ACCOUNT'. Connection: Non-Trusted."
Is this normal? On one of my pages there are 35 OpenSession calls; it seems like this is what's slowing down the application. Advice? Thanks!
If you want to look deeper into NHibernate's code, look here
ADO.NET connection pooling REF: http://www.15seconds.com/issue/040830.htm
NHibernate REF: http://nhibernate.sourceforge.net/NHibernateEg/NHibernateEg.Tutorial1A.html
|
|
|
|
|
I just finished reading about NHibernate a couple of days ago (I've only played with it for a while, haven't used it myself).
From my understanding, you need to create and keep just one session object for each user (and each page) that is logged in. All the samples I saw use only one session; for updating, the sample either loads from the database before updating, or stores the session in the ASP.Net session variable for later use.
Apologies if it's a bit unclear. I'm a bit tired after a weekend full of wedding (not mine) and moving out.
Edbert
Sydney, Australia
"A day without sunshine is like, you know, night."
|
|
|
|
|
Hey thanks. Just one question first. SessionImpl implements ISerializable. Do you think that it will be compatible with NLB (Network Load Balancing)+StateServer if I cache NHibernate.ISession/SessImpl in HttpSession?
|
|
|
|
|
I reckon it can if you use SQL Server for the HttpSession state, but if you use in-proc session with multiple servers I'm not sure.
You don't actually need to cache the NHibernate.ISession itself.
I saw a different implementation which requeries the database for the previous state before updating with data from the user, and then commits to the database (it was a sample on how to use NHibernate with ASP.Net; I don't like the sample for having to requery the database only for that).
There is a good BugTracker[^] project using ASP.Net 2.0 and NHibernate that you might want to look at for more real-life sample of using NHibernate.
Edbert
Sydney, Australia
|
|
|
|
|
I was thinking of caching NHibernate.ISession/SessionImpl in HttpSession when the session starts, then closing it when the session ends. But I want to make sure SessionImpl is compatible with NLB+StateServer before I make the changes. Alternatively, I can cache it as a page member variable, though that's more messy/cluttered.
|
|
|
|
|
Try
conn.Open();
or
sess.Open();
If the connection/session comes from the ADO.NET connection pool, a "login" will be executed against your SQL Server instance. Now, depending on the audit level configured on your particular SQL Server instance (None, Failure only, Success only, All): if the audit level is "All", then every successful login will be logged in your SQL Server log with a message resembling "Login succeeded...".
|
|
|
|
|
We're having some performance problems with our database, where we have to scan tables in order to find data created by a particular user. "Users" are just stored in a table with name, password, etc. Some of our stored procedures require that we find data belonging to a particular user.
Our database guy suggested a possible fix: each user will use SQL authentication to connect to SQL Server, passing his own user name and password in the SQL connection string. The indexed view will then only look at the user's data; it will be a view on the user's data only. He says this can be done because the view shows data only for that user (identified in the connection string).
To me, this sounds like a hack, but not being a big database guy, I don't really know if this is feasible or a good solution. Any thoughts from you guys?
Tech, life, family, faith: Give me a visit.
I'm currently blogging about: Connor's Christmas Spectacular!
Judah Himango
|
|
|
|
|
The pro is that the view will return only the user's data because, I'm guessing, it will be defined as something like this:
CREATE VIEW UserData AS SELECT * FROM MyTable WHERE User = USER();
However, I don't see where the performance gain is in this, because it will still be doing the same work as before (it's just hidden).
Passing the user name and password in the connection string means that your application will lose some of its ability to use connection pooling. (If this is a thick client windows application with only one user at a time using it then this isn't really a problem, however if you have a web application with many users accessing it then losing the ability to effectively use connection pooling could introduce performance problems - However, I've never done any testing on that so I don't know. I'm flagging it as it may be a concern and something to look in to).
From your description of the problem I'd say that you have a problem with indexes rather than anything else. It is also the easiest solution because it won't require views to be created or code to be changed to have dynamic connection strings and so on.
Look at which columns are referenced most often in the WHERE clause (it sounds like it will be something like the UserId column) and index them. If you are using SQL Server 2000, there is an Index Tuning Wizard you may want to look at.
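A sketch of such an index (the table and column names are hypothetical, following the guess above):

```sql
-- Hypothetical index on the column the WHERE clauses filter by
CREATE INDEX IX_MyTable_UserId ON MyTable (UserId);
```

With this in place, a query filtering on UserId can seek on the index instead of scanning the whole table.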
Does this help?
ColinMackay.net
|
|
|
|
|
Yeah, I think it does help. I believe the performance gain comes from the fact that the view is created based on the current SQL user; something about how that works.
For indexing, there are certain tables we cannot index due to the tables having TEXT fields. I understand that SQL 2005 now has varchar(MAX), which is indexable and can store something like 2 billion characters... we will have to look into that more, though.
Thanks Colin.
Tech, life, family, faith: Give me a visit.
I'm currently blogging about: Connor's Christmas Spectacular!
Judah Himango
|
|
|
|
|
I doubt creating a per-user connection string will be of much help.
You might want to have a look at SQL 2005 partitioning[^] instead.
From what I have learned/been told (I haven't tested this myself, but there's a performance statistic in the article), partitioning segments your data for better performance; you can even store the data on different hard disks.
Edbert
Sydney, Australia
|
|
|
|
|
According to our DB guy, it isn't that the per-user connection itself gives good performance; rather, it allows you to create indexed views showing data only for that user...
Thanks for the partitioning pointer, I'll have a look at that.
|
|
|
|
|
There are two tables, HP_AccountName and T_Values. HP_AccountName includes all main accounts; T_Values includes all operations on accounts. Sample tables are as follows:
HP_AccountName............................|T_Values (An account can be written on different lines.)
------------------------------------------|-----------------------------
HP_Level..HP_No..HP_AccountName...........|....MFD_AccountNo..MFD_Total
1.........1......Main1Account.............|....1200254........10,000
2.........10.....Main1SubAccount1.........|....1002865........15,000
3.........100....SafeBoxes................|....1009431........20,000
2.........12.....Main1SubAccount2.........|....2005454.........0,500
3.........120....Customers................|....2000024.........5,000
1.........2......Main2Account.............|....1205471........35,000
2.........20.....Main2SubAccount1.........|....1205471.........0,600
3.........200....Renders..................|....1205471.........0,400
...............................................2006300........48,500
All I want is a result set like:
HP_No..HP_AccountName.........TOTAL
1......Main1Account...........81,000
10.......Main1SubAccount1.....35,000
100........SafeBoxes..........35,000
20.......Main1SubAccount2.....46,000
120........Customers..........46,000
2......Main2Account...........54,000
20.......Main2SubAccount1.....54,000
200........Renders............54,000
I wrote something like this, but I couldn't do exactly what I want:
SELECT SPACE(HP_Level*3) + HP_AccountName,
       (SELECT SUM(MFD_Total) FROM T_VALUES WHERE MFD_AccountNo LIKE HP_No + '%')
FROM T_ACCOUNTS
WHERE HP_Level <= 1
ORDER BY HP_No
-- modified at 7:22 Thursday 16th February, 2006
|
|
|
|