|
And how are we meant to optimise it without knowing anything of the underlying database structure?
Or were you hoping someone would be bothered to read all the code and suggest an alternative method?
Try examining the execution plan and look for the biggest gains.
Bob
Ashfield Consultants Ltd
|
|
|
|
|
If you are using SQL Server 2005 and SQL Server Management Studio, you can execute this stored procedure in the query window and choose "Display Estimated Execution Plan" from the "Query" menu.
This will show you where the table scans and index reads are. If you see a table scan, you might want to consider adding an index.
david
|
|
|
|
|
Hi,
I just want to know what the term I am looking for is called, so I can research it further.
Say I have a C# application that connects to a SQL Server instance somewhere on a network via a connection string. What is the communication layer/protocol in between? Presumably the machine name specified in the connection string is converted to an IP address first, so there is a DNS query happening beforehand. How does the server know that the application is allowed to access the database? Is it on behalf of the user logged in to the machine where the application is running?
Can anyone shed some light for me on this?
Thank you.
----------------------------------------------------------
"unzip; strip; touch; finger; mount; fsck; more; yes; unmount; sleep" - my daily unix command list
|
|
|
|
|
You are referring to a Data Access Layer (DAL). There are many different styles and types of DAL, including a pattern from Microsoft.
I usually create a standard SQL ID (user ID and password) for each application (database); this allows you to manage your users internally to the app and extend the credential functionality without having to involve Active Directory. This is a personal preference I have found useful.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Thanks for the reply. It's not the DAL I think I am referring to, but how the connection string is passed and how the application uses the network/transport protocol to communicate with the server. I am talking about the network/transport layer and how connections are authenticated. I hope I am asking the right question here.
----------------------------------------------------------
"unzip; strip; touch; finger; mount; fsck; more; yes; unmount; sleep" - my daily unix command list
|
|
|
|
|
swjam wrote: I hope I am asking the right question here
Nope - once you select either TCP/IP or named pipes (does anyone still use named pipes?), the transport layer is irrelevant. Authentication is a whole other issue, and there are many schools of thought on that.
We construct our connection string in the DAL, passing in the server and the encrypted password. The DAL does the decryption.
Never underestimate the power of human stupidity
RAH
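A toy sketch of the pattern described above: the DAL receives an obfuscated password, decodes it, and builds the connection string. Python is used just to show the flow; `build_connection_string` is a made-up helper, and base64 stands in for real encryption - it is NOT secure, just a runnable placeholder for whatever decryption your DAL actually does.

```python
# Illustrative only: base64 is NOT encryption, it just makes the
# decode step runnable. A real DAL would use a proper cipher.
import base64

def build_connection_string(server, database, user, encrypted_password):
    # The DAL is the only place the password exists in clear text.
    password = base64.b64decode(encrypted_password).decode("utf-8")
    return f"Server={server};Database={database};User ID={user};Password={password}"

enc = base64.b64encode(b"s3cret").decode("ascii")  # what the caller hands in
print(build_connection_string("DBHOST01", "Sales", "app_user", enc))
```

The point of the pattern is that callers never see the clear-text password; only the DAL does the decryption immediately before connecting.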
|
|
|
|
|
The connection string is passed down to the SQL Server ADO provider. So yes, the provider will pull out the server name and connect to it (performing the DNS query along the way).
The database server is responsible for verifying credentials.
In reality there's quite a bit happening under the hood. In the simplest terms, the ADO/OLEDB providers expose a common interface for querying a database server. The initialisation parameters are provided using a connection string, because they can vary a fair bit between providers. How the provider talks to the server is thus nicely abstracted. The SQL Server provider just translates the "friendly" ADO interface into SQL Server specific guff.
There'll likely be a fair bit more detail on MSDN. If you need more, search for ADO.NET related material.
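To make the "pull out the server name" step concrete, here is a rough sketch in Python (illustrative only - the real provider's parsing also handles quoting, keyword synonyms like "Data Source", and so on; `parse_connection_string` is a made-up helper, not part of any real API):

```python
# Sketch: how a provider might split a connection string into
# key/value pairs before resolving the server name via DNS.

def parse_connection_string(conn_str):
    """Split 'Key=Value;Key=Value' segments into a dict (keys lower-cased)."""
    parts = {}
    for segment in conn_str.split(";"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            parts[key.strip().lower()] = value.strip()
    return parts

conn = "Server=DBHOST01;Database=Sales;Integrated Security=SSPI"
parts = parse_connection_string(conn)
server = parts["server"]   # this is the name that gets handed to DNS
print(server)              # -> DBHOST01
```

With "Integrated Security=SSPI" the provider authenticates as the Windows user running the application; with an explicit "User ID"/"Password" pair it uses SQL authentication instead, and the server verifies those credentials.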
|
|
|
|
|
Hello,
Does anyone know how I can pass parameters to the auto-generated GetBy method of the Visual Studio 2005 auto-generated table adapter?
Thanks,
Berlus
|
|
|
|
|
Hello,
I am required to supply a programmatic solution to import/export the data and the structure (including foreign keys and views) of a table that resides on a SQL Server, using C#.
Is there an easy, auto-generated way to do so? I have already let Visual Studio 2005 generate the code for the adapters of the various tables in order to retrieve/update the data.
Thanks,
Berlus
|
|
|
|
|
Berlus wrote: I already let visual studio 2005 generate the code for the adapters
Which means you have no idea how your database is structured. These bloody auto-generate things are a disaster and should be taken out and shot.
Get into Query Analyzer and look into the system tables in the master database. Get to know the guts of your database and how it hangs together. 2005-8 has the INFORMATION_SCHEMA views, but they are based on the sys tables.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Mycroft Holmes wrote: Which means you have no idea how your database is structured. These bloody auto generate things are a disaster and should be taken out and shot.
Hurrah, someone that agrees with me.
Keep the auto-generate stuff for the GUI parts, learn what your app is actually doing to the database (and how), and most of your problems disappear.
Bob
Ashfield Consultants Ltd
|
|
|
|
|
I use a lot of auto-generated code, for the stored procs and the class to service the table - mind you, I wrote the generator, so I know what's under the hood. My greatest worry is that the boss is going to demand DB-agnostic code - there go all my stored procs.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Mycroft Holmes wrote: I use a lot of auto-generated code, for the stored procs and the class to service the table, mind you I wrote the generator so I know what's under the hood.
Snap. I don't consider that sort of code auto-generated (although I suppose it is really); it's the rubbish generated by VS that I object to.
Bob
Ashfield Consultants Ltd
|
|
|
|
|
Have you looked at DTS in SQL Server 2000? It's for this sort of thing...
|
|
|
|
|
I've never found a decent solution to this problem, and as it's coming up again I thought I'd throw it out to the wider community.
I'm dealing with a database (SQL 2005) which is:
1) Quite large and growing rapidly
2) High traffic 24/7
As is often the case, one of the requirements is reporting on various aspects of the data. However, as we know, any reads (especially for the purpose of reports) will adversely affect the performance of the live activity going on.
In the past I have solved this mainly one way: on a scheduled basis, take the delta changes in the tables I'm interested in and write that data to another source, then report only against this "archived" data. The problem with this is that you don't get "live" reports; you get reports on data older than x hours (where x is your scheduled interval).
I am in a situation where reporting on live data is pretty essential, but keeping the database as performant as possible is just as essential. To me these two requirements seem at loggerheads with each other. I obviously can't report on the live tables, as the overall application would slow to a halt. I also can't archive data on a timed basis, because then the reports are not live.
Any thoughts? Is there some new-fangled SQL tool that solves this problem? Have you faced a similar problem and found a suitable solution? Is there a gap in my SQL knowledge on this subject?
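For reference, the scheduled-delta approach I describe looks roughly like this (a Python sketch just to show the logic; the row shape and the `extract_delta` helper are made up for the example, not real code from my system):

```python
# Each scheduled run copies only rows modified since the last run into
# the reporting store, so the live tables are only touched briefly and
# reports lag by at most the schedule interval.

def extract_delta(live_rows, last_sync):
    """Return rows whose 'modified' timestamp is after last_sync."""
    return [r for r in live_rows if r["modified"] > last_sync]

live = [
    {"id": 1, "modified": 100},
    {"id": 2, "modified": 250},
    {"id": 3, "modified": 300},
]
reporting_store = []
last_sync = 200                      # watermark from the previous run

delta = extract_delta(live, last_sync)
reporting_store.extend(delta)        # report against this copy, not live
last_sync = max(r["modified"] for r in delta)

print([r["id"] for r in delta])      # -> [2, 3]
```

The watermark (`last_sync`) is exactly where the staleness comes from: anything modified after the last run is invisible to reports until the next run.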
|
|
|
|
|
There's no single tool to solve your problem that I'm aware of, but in large, high-traffic databases everything comes into play: indexes (clustered and otherwise), foreign keys, extra joins or operations, conditionals in the WHERE clause instead of as part of the join - all of these can affect the performance of a query or SP. Even the physical layout of the data on the hard drive(s) comes into play.
I recently had to insert 400 million records into a database that wasn't large enough, and it took forever waiting for the database to insert, grow, insert more, grow... Recreating the DB (not usually an option) at the correct size greatly improved the performance of the inserts, and of any query run against them since, because it was all contiguous data on the disk.
With large numbers of inserts and updates to a table, it's possible an index becomes fragmented and the DB has to reorganise it each time; you may also have the wrong kind of indexes, or too many of them, on a table. You could run your query (or parts of it) in Query Analyzer and look at the execution plan, or the estimated execution plan, to see what is taking up your time, and try to optimise that section. After a few weeks of trying and some changes, we can now search out one record in that 400 million and tell you various things about it in less than a second (and no, it's not just one big table - it joins to several other tables with 1M+ records in them, using a 7 or 10 character string).
By the way, we're using SQL Server 2000. It worked almost as fast on our old 2-processor server; now we have dual quad cores. You may have to upgrade your hardware - it may not solve the problem, but it may make life a little better. Without knowing your database or the data you are trying to report on, I doubt anyone would be able to give you more than general pointers. If you have specific questions about queries or how to optimise tables, feel free to ask.
Please remember to rate helpful or unhelpful answers, it lets us and people reading the forums know if our answers are any good.
|
|
|
|
|
Thanks for your answer, however it did not help with my question. I've worked with SQL Server for enough years to understand how indexes work and how to write performant queries.
|
|
|
|
|
A common solution is to use replication to maintain a reporting copy of the database. Obviously it's not quite real time, but it's usually near enough. It also allows for some reindexing/denormalisation to aid reporting performance.
Hope this is of some interest.
Bob
Ashfield Consultants Ltd
|
|
|
|
|
Thanks Bob, yes it very much is of some interest.
I know SQL Server very well indeed - except for replication, which I've honestly not come into contact with. I had a feeling it was relevant to this situation, so I started reading up last night. You have just confirmed that my hunch was at least along the right lines.
Thanks again.
|
|
|
|
|
Bob, any idea what a reasonable lag time is for the replicated system? (Presumably configurable, but are there guidelines?)
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Naturally, it depends on transaction volumes and how important it is, but normally for a reporting database I would expect between 1 and 5 minutes, as opposed to a hot standby where it should be as close as possible to real time.
This msdn article gives a good overview of the overheads etc.
Bob
Ashfield Consultants Ltd
|
|
|
|
|
As Bob said, replication could be one solution - especially transactional replication in your case.
Another option could be log shipping. With log shipping you may have to redesign your backup strategy, but on the other hand you gain a standby database, which can be used for reporting.
The need to optimize rises from a bad design
|
|
|
|
|
Guys, I want to make a program using VB6. This program should update a SQL Server 2000 database based on data taken from an MS Access 2000 database. The algorithm I applied is: first, the program reads the data from the Access database and saves it temporarily; next it opens the connection to SQL Server and compares the data between SQL Server and Access. If the data is different, the data in SQL Server is replaced with the data from Access. Unfortunately, I am stuck on how to connect these two databases together and compare the values.
Does the algorithm I applied contain mistakes, and can anyone suggest a better algorithm for this problem?
Could anyone help me with some code for the essential parts of the program?
Please. I need your help.
|
|
|
|
|
It's one way of doing it, but why not have two connections open, one to Access and one to SQL Server, and cut out the temporary save? All you need to do then is run a loop over the two recordsets (ensure you order the results the same way) and note the details for insert/update/delete. When both recordsets are exhausted, apply the changes.
There are probably several other ways too.
Bob
Ashfield Consultants Ltd
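The two-cursor loop described above can be sketched like this (Python rather than VB6, just to show the logic; `diff_recordsets` and the (key, value) row shape are illustrative, not real code):

```python
# Walk two recordsets ordered by the same key and classify each row:
# in source only -> insert into target; key matches but value differs
# -> update; in target only -> delete from target.

def diff_recordsets(source, target):
    """source/target must be sorted by key. Returns (inserts, updates, deletes)."""
    inserts, updates, deletes = [], [], []
    i = j = 0
    while i < len(source) and j < len(target):
        skey, sval = source[i]
        tkey, tval = target[j]
        if skey == tkey:
            if sval != tval:
                updates.append((skey, sval))
            i += 1
            j += 1
        elif skey < tkey:
            inserts.append((skey, sval))   # only in source
            i += 1
        else:
            deletes.append(tkey)           # only in target
            j += 1
    inserts.extend(source[i:])             # leftover source rows
    deletes.extend(k for k, _ in target[j:])  # leftover target rows
    return inserts, updates, deletes

access_rows = [(1, "a"), (2, "b"), (4, "d")]
sql_rows    = [(1, "a"), (2, "x"), (3, "c")]
print(diff_recordsets(access_rows, sql_rows))
# -> ([(4, 'd')], [(2, 'b')], [3])
```

Ordering both recordsets identically is what makes the single pass work; without it you would need a lookup per row instead.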
|
|
|
|
|
Hi
I want to run a query like this:
select * from sp_MyTable_Select
I want to make a query from the result of some stored procedures, like this:
Select * from sp_fKeys Where COLUMN_NAME='X'
I know it is wrong.
Can anyone help me find out how I can do this?
Thanks
|
|
|
|