|
I'm not certain what you're after, but if you want to undo modifications, that's done with transactions. However, a transaction should never be kept open while waiting for user input. Another mechanism is to log changes so that you can revert them afterwards.
This could be done by adding more tables and logic to your application, but the downside is that a mechanism like this can easily become very complex unless it's well restricted. For example, what if a record insertion is reverted after a few days, and other records that depend on this record have been added in the meantime?
|
|
|
|
|
I got a requirement for a business web application that will handle 100 concurrent user requests. We designed our data tier. We will use SQL Server 2005 and OLAP services. The question is: how can we determine the best physical hardware requirements for the database server? I mean, we need to know the minimum physical memory size and processing power for the server, which will run Windows Server 2003. The client wants a maximum page-load duration of 8 seconds at maximum load (100 concurrent users).
Note:
1. We tried setting up a server with 2 GB of RAM and a dual-core processor. With the first 30 concurrent users it was OK, but beyond 50 users, processor and memory utilization hit 100% and things got worse: over 60 seconds per page.
2. We used a load-testing tool that simulates 100 users requesting the application every second.
marcoryos
|
|
|
|
|
My suggestions are as follows:
1. Intel Xeon Quad Core2 (Two Processors i.e. 8 Cores)
2. RAM 4 GB
3. SATA 2 HDD
4. Windows 2003
With this setup we are able to handle nearly 500 concurrent users with CPU utilization at 10% (max).
|
|
|
|
|
Move OLAP services to another database server, separate from your OLTP database. Reporting cripples OLTP databases.
I can squeeze 100 users onto an ASP.NET/SQL Express business application on a shared VM with good design techniques, so if your dedicated server is not fast enough, consider looking at the system architecture. Good design scales a lot faster than hardware upgrades.
|
|
|
|
|
Most likely the problem is in the overhead that OLAP services cause. The main target of OLAP services is reporting, not transaction handling. Instead of OLAP, use the OLTP engine of SQL Server (in other words, the normal SQL Server engine) and design the relational model so that it supports continuous data manipulation and retrieval.
|
|
|
|
|
Thanks a lot for the replies. So far I've gathered that we should take a look at our design again, and try to separate the database engine and OLAP onto two servers instead of one. Am I correct?
marcoryos
|
|
|
|
|
I think that's a good way to go.
|
|
|
|
|
Dear all, I have a linked server on my database. My senior in my company set it up that way because, in the future, we will have several servers handling several applications in the company. But after we set it up and ran it, the transfer is so damn slow. Is it because of the linked server? Are there any better ways to access databases on multiple servers from one application? Using mirroring or something? If you have any reference for this, please help me. Thanks.
- No Signature Available -
|
|
|
|
|
Indra PR wrote: the transfer is so damn slow
Operations done over a linked server may be performed very differently from those executed locally. You can use the execution plan to investigate what's being done and why it's slow. So you should investigate each problem statement (especially multi-database joins are often problematic).
The idea in optimizing multi-server statements is to transfer a minimal amount of data over the linked server, so in many cases the SQL may have to be rewritten.
Indra PR wrote: Are there any better ways to access database from multi-server in one application
A linked server is intended for this. Of course there are other techniques, such as replication, but in those scenarios the data is typically not up to date, or you cannot update the data on the other end (stand-by solutions etc.).
|
|
|
|
|
Mika Wendelius wrote: especially multi database joins are often problematic
I've read something about this before; it is said that performance will be better if we first create a temporary table from the table in the other database, and then join the temporary table with the second table. Is that right?
Mika Wendelius wrote: Linked server is intended for this. Of course there are other techniques such as replication etc, but the data is not typically up to date in those scenarios or you cannot update the data on the other end (stand-by solution etc)
If so, then it's right for me to use a linked server, and I don't have any other choice; the only option is to optimize the query. Isn't it? Correct me if I'm wrong.
|
|
|
|
|
Indra PR wrote: I've read something about this before, it is said that the performance will be better if we create a temporary table from the table in other database first.
That can be part of the solution, but not the solution itself. The main point is how much data you transfer through the link. For example, if you have a query like
select ...
from linkedservertable
join localtable
where ...
SQL Server may have to transfer the whole linked-server table to this server before the join or row elimination can be done. If the same query can be written in the form
select ...
from localtable
(possibly joined to linkedservertable)
where joiningcolumn in (select keycolumns
from linkedservertable
where restrictive conditions...)
the performance may be very different, since the elimination may be done at the linked server. This is a very simple example, but hopefully it illustrates the idea. Temporary tables can be used for that exact purpose if the operation cannot be rewritten otherwise.
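To make the rewrite concrete, here is a small runnable sketch. It uses Python's sqlite3 purely as a stand-in (there is no real linked server here), and all table and column names are hypothetical; the point is only that the join form and the IN-subquery form return the same rows, so the rewrite is safe:

```python
import sqlite3

# Hypothetical stand-ins for the local and linked-server tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE localtable (joiningcolumn INTEGER, payload TEXT);
    CREATE TABLE linkedservertable (keycolumn INTEGER, status TEXT);
    INSERT INTO localtable VALUES (1, 'a'), (2, 'b'), (3, 'c');
    INSERT INTO linkedservertable VALUES (1, 'active'), (3, 'active'), (2, 'inactive');
""")

# Join form: conceptually, the whole remote table may be pulled over before filtering.
join_rows = conn.execute("""
    SELECT l.joiningcolumn, l.payload
    FROM localtable l
    JOIN linkedservertable r ON r.keycolumn = l.joiningcolumn
    WHERE r.status = 'active'
    ORDER BY l.joiningcolumn
""").fetchall()

# Rewritten form: the restrictive condition can be evaluated at the remote end,
# so only the matching keys need to cross the link.
in_rows = conn.execute("""
    SELECT joiningcolumn, payload
    FROM localtable
    WHERE joiningcolumn IN (SELECT keycolumn
                            FROM linkedservertable
                            WHERE status = 'active')
    ORDER BY joiningcolumn
""").fetchall()

print(join_rows == in_rows)  # both forms return the same result set
```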
Indra PR wrote: then it is right for me to use linked-server, and I don't have any other choice of this
Based on your description I would say that linked server is a good way to handle your situation.
One more thing: if you insert/update/delete data on both servers in the same transaction, you are forced to use distributed transactions. MS DTC takes care of this, but it's good to know that it adds extra overhead to transaction handling, which in turn makes the operation a bit slower. However, if you want the transaction to be ACID, that's the correct way to do it.
|
|
|
|
|
Dear all, I have some problems with SQL Server 2000 that I want to ask about. I am using the database over an OLE DB connection called from VB.NET. The application I'm making is ERP software.
1. Sometimes, and it happens often in the application, there is an error saying "Connection is busy".
2. Sometimes the application also shows a dialog box with a connection error saying "Cannot create a new transaction because capacity was exceeded".
I don't know how to solve them. I've tried to look for answers on Google, but the only answer I found is that I should update the service pack to the latest version. However, I've checked my database, and it's already on SP4. On the application side, some people said that maybe I forgot to close the connection, but I've made only one function to fill a DataTable, and I call Connection.Close() at the end after every transaction. Are there any better answers for these?
|
|
|
|
|
Check your connection pooling settings; you may be hitting the limit. Also check that you close connections properly with the Close method after you have used them, and that you end transactions gracefully.
|
|
|
|
|
I still don't know about connection pooling, Mika. Is it possible that the problem was caused by a lot of users of the application accessing the database at the same time? Can you give me a brief description of that, and what I should do to check it, or maybe a reference link? Thanks in advance.
|
|
|
|
|
Indra PR wrote: I still don't know about the connection pooling Mika
Connection pooling is quite well described here: OLE DB, ODBC, and Oracle Connection Pooling [^].
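In case a concrete picture helps: a connection pool simply keeps opened connections around for reuse instead of closing them, and refuses (or queues) requests once the configured limit is reached. The sketch below is a deliberately tiny, hypothetical Python illustration of the idea; it is not how OLE DB or ADO.NET pooling is actually implemented (those are configured via the connection string):

```python
import sqlite3
from collections import deque

class TinyPool:
    """Illustrative connection pool (hypothetical; for the concept only)."""
    def __init__(self, factory, max_size):
        self._factory = factory      # how to open a brand-new connection
        self._max_size = max_size    # hard cap on open connections
        self._idle = deque()         # connections waiting to be reused
        self._open_count = 0

    def acquire(self):
        if self._idle:
            return self._idle.popleft()   # reuse an existing connection: cheap
        if self._open_count >= self._max_size:
            # analogous to 'connection is busy' / capacity-exceeded errors
            raise RuntimeError("pool exhausted")
        self._open_count += 1
        return self._factory()            # open a new connection: expensive

    def release(self, conn):
        self._idle.append(conn)           # return it for reuse instead of closing

pool = TinyPool(lambda: sqlite3.connect(":memory:"), max_size=2)
c1 = pool.acquire()
pool.release(c1)
c2 = pool.acquire()
print(c1 is c2)  # True: the released connection was reused, not reopened
```

Forgetting to call release (i.e. Close in ADO/OLE DB) is exactly what exhausts a real pool under load.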
Indra PR wrote: Is it possible that the problem was caused by the fact that a lot of users of the application were accessing the database at the same time
If you have a lot of simultaneous users (thousands of them), or if you have limited the maximum number of connections, that's possible. You can check the maximum connection count using:
SELECT @@MAX_CONNECTIONS
You can find the maximum capacity specifications here: http://msdn.microsoft.com/en-us/library/aa933149.aspx[^]
|
|
|
|
|
Indra PR wrote: I've put the Connection.Close()
Although you have put a close-connection call after every command, does your program make sure that Connection.Close() is executed every time?
I mean: if there is an error during the transaction, is the Close() method still called?
Try structuring the code as:
try
{
    Connection.Open();
    // ... run your commands here ...
}
catch (Exception ex)
{
    // handle or at least log the error; don't swallow it silently
}
finally
{
    Connection.Close();  // always executed, even when an exception is thrown
}
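The point of the finally block is that Close() runs on both the success path and the error path. Here is a small runnable sketch of the same pattern (in Python with sqlite3, purely as an illustration; in .NET the idiomatic equivalent is a C# using block or a VB.NET Using statement):

```python
import sqlite3

def run_query(sql):
    """Open, use, and always close the connection, even if the query fails."""
    conn = sqlite3.connect(":memory:")
    try:
        return conn.execute(sql).fetchone()
    finally:
        conn.close()  # runs on success AND on error, so connections never leak

print(run_query("SELECT 1 + 1"))  # (2,)

# A broken statement raises an error, but finally still closes the connection.
try:
    run_query("SELEC oops")
except sqlite3.OperationalError:
    pass
```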
Two things you should also check:
1. When your application starts giving the reported error, how many users are connected to your server, and to that particular database, at that moment?
2. How many requests from the application are active in the database connection pool?
|
|
|
|
|
Yep, my code is exactly the same as yours. I still don't understand the pool, though. Earlier, Mika also said something about connection pooling. What is that? I've read the reference but still don't understand it. I only know that pooling is a feature in ADODB to minimize the cost of opening connections; do we have to set it up first, or what?
|
|
|
|
|
What is the edition of your SQL Server?
|
|
|
|
|
SQL Server 2000 Enterprise Edition
|
|
|
|
|
Actually, I have a database table with four fields: rnkid, rnkartid, rnkuserid and rnkmarks. I want to sum all the marks that have the same rnkartid. Please tell me how.
|
|
|
|
|
If I understood you correctly, your query could be something like:
select rnkartid, sum(rnkmarks)
from yourtable
group by rnkartid
See SUM[^] and GROUP BY[^] for more details.
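For illustration, here is that aggregation run against a small in-memory table (Python's sqlite3 as a stand-in for SQL Server; the sample data is made up):

```python
import sqlite3

# Small in-memory table matching the poster's columns (data invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ranks (rnkid INTEGER, rnkartid INTEGER,
                        rnkuserid INTEGER, rnkmarks INTEGER);
    INSERT INTO ranks VALUES (1, 10, 100, 5), (2, 10, 101, 3), (3, 20, 100, 4);
""")

# SUM aggregates the marks within each GROUP BY bucket (one row per rnkartid).
rows = conn.execute("""
    SELECT rnkartid, SUM(rnkmarks) AS total_marks
    FROM ranks
    GROUP BY rnkartid
    ORDER BY rnkartid
""").fetchall()
print(rows)  # [(10, 8), (20, 4)]
```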
|
|
|
|
|
Hi,
I'm new to T-SQL programming in SQL Server with a C++ application.
Please provide some examples.
With Regards
Mahesh
|
|
|
|
|
|
Hi,
I need to transfer data from Excel (.xls) sheets to SQL Server. The thing is, each source Excel file has several sheets, each containing data,
and I need to move all of the data (everything in every sheet of the file) into a single table in SQL Server.
Likewise, I have many .xls files (each with many sheets).
For example:
excel file 1:
-> sheet 1
-> sheet 2
-> sheet 3
excel file 2:
-> sheet 1
-> sheet 2
-> sheet 3
excel file 3:
-> sheet 1
-> sheet 2
-> sheet 3
Now I need to get the data from all of the files and insert it into a single SQL Server table using an SSIS package.
Please help me with a solution.
thanks
|
|
|
|
|
AFAIK you would have to do this sheet by sheet. For more info: Excel Source[^]
|
|
|
|