|
I would suggest you start by profiling the application so you can determine exactly what is taking the most time.
|
|
|
|
|
It seems to me there is a fundamental flaw in the process: the web services should be your bottleneck, yet for a database to be updating only 30k records an hour seems ludicrous.
If the cursor/loop is the problem, then look a little further afield.
Can you construct an XML dataset on the UI end and pass in a bulk set of updates?
How about doing a bunch of bulk inserts into another table and then using a proc to do the updates to your transaction table? Target tables could be created for each instance and truncated after the update, or you could cycle through two target tables per instance.
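As a rough sketch of that staging-table idea (all table and column names here are invented, since the original schema isn't shown):

```sql
-- Hypothetical staging-table approach; names are placeholders.
CREATE TABLE StagingUpdates
(
    TransactionId INT         NOT NULL,
    NewStatus     VARCHAR(20) NOT NULL
);

-- 1. Bulk-load the incoming batch into the staging table
--    (BULK INSERT, bcp, SqlBulkCopy, etc.).

-- 2. Apply the whole batch as one set-based UPDATE instead of a cursor:
UPDATE t
SET    t.Status = s.NewStatus
FROM   Transactions AS t
       INNER JOIN StagingUpdates AS s
               ON s.TransactionId = t.TransactionId;

-- 3. Truncate the staging table, ready for the next batch:
TRUNCATE TABLE StagingUpdates;
```

The point being that one set-based UPDATE over the whole batch lets the optimizer do the work once, instead of paying per-row overhead 30k times.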
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Hi,
Why does this statement return an error on MySQL?

SELECT schema_name, CAST(SUM(data_length) / 1024 AS UNSIGNED) AS db_size
FROM information_schema.schemata
INNER JOIN information_schema.tables
    ON information_schema.schemata.schema_name = information_schema.tables.table_schema
GROUP BY information_schema.schemata.schema_name
|
|
|
|
|
It worked perfectly for me - MySQL version 5.1.45.
|
|
|
|
|
jrahma wrote: why this statement returns error on mysql?!
It'd help if you'd post the error message.
I are Troll
|
|
|
|
|
My stored procedure reads XML file data with the help of OPENXML in SQL.
There is a slight problem with it.
<Name_Address>
<name>JCB SALES PVT</name>
<address>24, SALAROURIA ARENA ADUGODI</address>
<address>HOSUR MAIN ROAD, Honolulu</address>
<country>N</country>
</Name_Address>
and my SQL query is:
SELECT @address = CONVERT(VARCHAR(150), [text]) FROM OPENXML(@idoc, '/Name_Address/address', 0) WHERE [text] IS NOT NULL
In @address I am getting only the last address tag value, i.e.
HOSUR MAIN ROAD, Honolulu
But it should be
24, SALAROURIA ARENA ADUGODI, HOSUR MAIN ROAD, Honolulu
How can I achieve this?
Help me; guide me how to do this.
Regards
|
|
|
|
|
Hum Dum wrote: SELECT @address = CONVERT(VARCHAR(150),[TEXT]) FROM OPENXML(@idoc,'/Name_Address/address', 0) where [text] is not null
AFAIK that query returns two rows with one column, and when a multi-row result is assigned to a variable, SQL Server keeps only the value from the last row it processes.
But I can offer another solution: retrieve the rows first, then use COALESCE to concatenate them into the expected result.
Here's my work for you:
SELECT @address = COALESCE(@address + ', ', '') + [address]
FROM
(
    SELECT [address]
    FROM   OPENXML(@idoc, '/Name_Address/address', 3)
    WITH   ([address] VARCHAR(1000) '.')
) AS P

SELECT @address
It's working all right.
|
|
|
|
|
I'm looking into creating a custom database application that will access our company's database, which is located on a SQL server. Previously we were using Access as the front end; however, we are going to be implementing new third-party software that our servers will need to communicate with, and trying to program this from Access was going to be difficult. So now on to my question.
When my program loads, it runs a query against the database to load the records into a dataset. I have no problem with loading, updating, or creating records. However, I do have a problem if Person #1 is using my software and Person #2 updates/changes a record in the database after Person #1 has loaded my software. Person #1 will not see the changes Person #2 made unless they restart the program or I force the software to reload the entire database, and I have no way of knowing whether someone has changed a record so that I can update the dataset.
Is there a way to have my software automatically detect changes to a record and automatically update the dataset? Or will I have to have the program update the dataset every few minutes? The problem is that the database contains >10,000 records, and having it update every few minutes could severely slow down the program, especially if someone is working over VPN. Any ideas would be greatly appreciated.
BTW, I am working in VB.NET 3.5.
|
|
|
|
|
DisIsHoody wrote: load the records into a dataset
Yeah, don't do that.
Redesign your application; do it the right way.
|
|
|
|
|
DisIsHoody wrote: database contains >10,000 records
DisIsHoody wrote: Previously we were using Access as the front end
I suspect your database design is a complete disaster, probably evolved over a number of years by a power user (possibly yourself) and now you want to use it seriously. I make the STRONG suggestion to have a professional look at it.
Your database probably needs a complete redesign, and your existing data migrated to the new structure. PLEASE do not do this yourself, get a professional otherwise you will be back here in a year or two needing to do it all over again.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I know the pain of migrating an application from MS-Access to SQL server ... I'm in the middle of it now.
You might want to take a strategy like this ...
You have a grid full of records that a user may update.
Each record should have two additional fields: modified_datetime and modified_by.
Create a button (or something) which puts the record into "edit" mode; at this time, re-fetch the record so that you are sure you have the most current version.
During the "save" of the record, re-fetch the record and check that the modified_datetime has not changed; if there is no change, update the record with your new values and updated modified_datetime and modified_by values.
If someone has modified the record in the meantime, you can inform your user that it has already been modified and that he/she has to "re-edit" the record.
Something to think about.
I also agree with the other members that you might want to take this opportunity to redesign your database rather than doing a straight MS-Access to MS-SQL import.
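The "save" step above can be sketched roughly like this. Only modified_datetime and modified_by come from the description; the table, columns, and parameters are placeholder names:

```sql
-- Hypothetical optimistic-concurrency save; the UPDATE only succeeds
-- if nobody has changed the row since this user fetched it.
UPDATE Customers
SET    CustomerName      = @NewName,
       modified_datetime = GETDATE(),
       modified_by       = @CurrentUser
WHERE  CustomerId        = @CustomerId
  AND  modified_datetime = @FetchedModifiedDatetime;

IF @@ROWCOUNT = 0
    -- Someone else saved first: tell the user to re-edit the fresh copy.
    RAISERROR('Record was modified by another user; please re-edit.', 16, 1);
```

Checking the old modified_datetime in the WHERE clause makes the check and the update one atomic statement, so there is no window between "re-fetch" and "save" for a third user to sneak in.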
|
|
|
|
|
Not a very nice problem you are experiencing. I once came across this problem, but mine was different: I was having concurrency issues with 50,000 users. Luckily the system was written in such a manner that one user must not see another user's changes while he is busy, until he is done and has committed the data.
The problem here is that a DataSet is an ADO.NET feature, and it means you connect to the database, take the data, and disconnect; when you want to do some updates, you reconnect, do the update, and disconnect again.
So it will be a bit difficult to update the data on other machines unless they query again. This is more difficult in Windows applications. This is what I like about CodeProject: people come with real problems. I think this will lead me to write an article about it; I hope to write it soon, and will see what I can do tonight.
Good Luck
Vuyiswa Maseko,
Spoted in Daniweb-- Sorry to rant. I hate websites. They are just wierd. They don't behave like normal code.
C#/VB.NET/ASP.NET/SQL7/2000/2005/2008
http://www.vuyiswamaseko.com
vuyiswa@its.co.za
http://www.itsabacus.co.za/itsabacus/
|
|
|
|
|
Believe me... I know the database needs a whole rewrite. It was originally designed by the president of the company when he formed the business. We have been trying to get him to hire a professional to completely rewrite it; however, from his perspective it works right now, so why should we change it? So unfortunately I'm stuck working with the database in its current form. As for the idea of having a column to record the date/time the record was last updated: I think this will work pretty well for the time being. Thanks for the idea. I appreciate all the responses.
Dominick
|
|
|
|
|
Sounds to me like the program should be rewritten, not the database. Of course this may be a simplistic assessment because I am not familiar with your problem domain, but why would you have the program always load the entire database rather than just the record currently being worked on?
Most programs that work with a database have some sort of search function to allow a user to find the record they need (returning only minimal identifying data), then a detail form/page that is used to work on that record. This way you are never loading much more data than you need, and since the record is loaded just before being worked on and saved immediately after, it should always be relatively current. The only concurrency problem you have then is if two users are working on the same record at once.
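The search-then-detail pattern comes down to two queries along these lines (table, column, and parameter names are invented for illustration):

```sql
-- Hypothetical sketch of the search-then-detail pattern.

-- 1. Search: return only minimal identifying columns to fill the grid.
SELECT CustomerId, CustomerName, City
FROM   Customers
WHERE  CustomerName LIKE @SearchText + '%';

-- 2. Detail: load the full record only when the user opens it to edit.
SELECT *
FROM   Customers
WHERE  CustomerId = @SelectedId;
```

With this shape the client never holds more than a screenful of rows, which also makes the VPN case far less painful.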
Hope this helps.
Kevin Rucker, Application Programmer
QSS Group, Inc.
United States Coast Guard OSC
Kevin.D.Rucker@uscg.mil
"Programming is an art form that fights back." -- Chad Hower
|
|
|
|
|
Is there any way to determine a table's physical size (in bytes)?
|
|
|
|
|
Yes[^]
Would you want to try to Google the answers to the other questions yourself, or would you like me to post links to various sections of the manual?
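For what it's worth, the usual query looks something like this (the database name is a placeholder, and the figures are estimates for InnoDB tables):

```sql
-- Approximate on-disk size per table, in bytes.
SELECT table_name,
       data_length + index_length AS size_bytes
FROM   information_schema.tables
WHERE  table_schema = 'your_database'  -- placeholder name
ORDER  BY size_bytes DESC;
```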
I are Troll
|
|
|
|
|
That depends on the density of the disk.
|
|
|
|
|
Hi,
How can I read the database physical file size?
|
|
|
|
|
You need to query the information_schema database, for this and your other posts. Can I suggest you install a graphical front end for MySQL (I'd recommend HeidiSQL[^])? Apart from showing you all you need to know about this, you can view the SQL it uses to query the database in its output window.
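The sort of query you'd see it run looks roughly like this (a sketch, not any tool's exact SQL; InnoDB sizes are approximate):

```sql
-- Approximate total size of each database, in bytes.
SELECT table_schema,
       SUM(data_length + index_length) AS total_bytes
FROM   information_schema.tables
GROUP  BY table_schema;
```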
|
|
|
|
|
|
Hi,
How can I get the server properties in MySQL, for example, the physical size?
Regards,
|
|
|
|
|
Hi,
I have a website with a textbox. The user will write an SQL query in the textbox. How can I execute that query on MySQL?
Thanks,
|
|
|
|
|
With crossed fingers.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
|
|
|
|
I suggest you read up on SQL Injection
|
|
|
|
|
That's not good practice, but if you want to go ahead with it anyway, I can offer a small suggestion.
Just put your query text inside a command object such as SqlCommand and execute it.
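If you do go this route, on the MySQL side a minimal sketch using a prepared statement looks like the following. It executes whatever the user typed, so the SQL-injection warning above applies in full:

```sql
-- Hypothetical sketch: running user-supplied text as dynamic SQL in MySQL.
SET @sql = 'SELECT 1 + 1 AS result';  -- this string would come from the textbox
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
```

At the very least run it under a MySQL account with read-only privileges, since the user controls the entire statement.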
|
|
|
|