|
My memories are VERY dim, but I'm sure Mr G and Mr W know.
Jet versions before 4 use a page-locking model; Jet 4 uses record locking. I can't remember how you configure locking, though.
Now for something that belongs in Coding Horrors, except it doesn't really qualify as coding:
The weirdest db corruption I ever had to fix was a shared dBase IV database, accessed through CodeBase. In that setup the index files are "external", and some smartass self-appointed admin managed to have two of the six clients running against the shared db files on the server but with LOCAL C: drive index files.
Cheers,
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
Thanks Peter. After some Googling on record locks, I think MY CODE is the problem, not Access.
|
|
|
|
|
Log the exceptions - that is better than asking (and believing) the users.
With Access, you must make sure the parameters are added in the correct sequence, as they all share the same name, "?".
|
|
|
|
|
Thanks Bernard!
Very good suggestion.... I'll implement some sort of logging.
Thanks also - I did check that the parameters are added in the correct order. I have tested the code over and over on my laptop and I don't get errors, which is why I'm leaning towards some sort of concurrency issue. I also checked the locking settings on the database.... Oops - they were set to None!!!!
|
|
|
|
|
I wrote: maybe no locking at all, which would cause your problems.
Richard.Berry100 wrote: I also checked the locking settings on the Database.... Oops - they were set to None!!!!
No further comment required.
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
I have a T-SQL stored procedure that can run for a few minutes and return a few million records, and I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once, so I need to set up some kind of paging: every time the user selects another page, another portion of data is loaded from the DB. But I can't run the SP every time a new page is requested - it would take too much time.
What would be the best way to persist the data from the SP, so that when the user selects a new page, the next portion can be loaded with a simple SELECT ... WHERE from that temporary data store?
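Roughly, I'm imagining something like this - all the table, proc and parameter names below are made up, and the cache table would really need the SP's actual columns:

```sql
-- Per-session cache table (illustrative schema)
CREATE TABLE dbo.ReportCache
(
    SessionId uniqueidentifier NOT NULL,
    LastName  varchar(50),
    SSN       char(11),
    Amount    money
);

-- INSERT...EXEC needs matching column lists, so stage the SP output first:
CREATE TABLE #stage (LastName varchar(50), SSN char(11), Amount money);
INSERT INTO #stage
EXEC dbo.usp_BigReport @FromDate = '20110101', @ToDate = '20110331';

INSERT INTO dbo.ReportCache (SessionId, LastName, SSN, Amount)
SELECT @SessionId, LastName, SSN, Amount
FROM #stage;

-- Each page is then a cheap query against the cache (SQL 2005+):
WITH Numbered AS
(
    SELECT LastName, SSN, Amount,
           ROW_NUMBER() OVER (ORDER BY LastName) AS rn
    FROM dbo.ReportCache
    WHERE SessionId = @SessionId
)
SELECT LastName, SSN, Amount
FROM Numbered
WHERE rn BETWEEN (@PageNo - 1) * @PageSize + 1 AND @PageNo * @PageSize;
```

The expensive SP runs once per session; every page request after that only touches the cache table.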
|
|
|
|
|
|
Not exactly. Those methods assume the data is stored in a table. In my case an expensive SP is run to retrieve the data, and *after* that the user needs to do filtering/paging/sorting on it. These ad-hoc requests are made from ASP.NET code and need to be done pretty fast. Do I need to store the resultset in an actual table (#temp tables won't work in this scenario), or are there other ways?
|
|
|
|
|
Trekstuff wrote: What would be the best way to persist data from the SP
Put it in a (temporary?) table, and page from there
Bastard Programmer from Hell
|
|
|
|
|
I am coming to that conclusion myself, but normal #temp tables won't persist between connections, and if I use ##global or permanent tables I'd need to know when to drop them - and I'm not sure about that yet.
|
|
|
|
|
Trekstuff wrote: if I am to use ##global or permanent tables I'd need to know when to drop them - and I am not sure about that yet.
After a period of inactivity on the users' session? Keep a list of users and their temp-tables with a timestamp. Update the timestamp on each visit, delete the temp-tables with a timestamp older than two hours.
Anyone waiting for more than two hours before they go to the next page can wait the extra ten minutes that it takes to recreate the temp table.
It might be handy to have these "temp" tables in a different database; that way they won't pollute the backups of your production-database.
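As a rough sketch - assuming a bookkeeping table with one row per session whose LastTouched column gets updated on every page request; all names are made up - the cleanup could be a scheduled job like:

```sql
-- Drop cached rows for sessions idle longer than two hours
DELETE c
FROM dbo.ReportCache AS c
INNER JOIN dbo.CacheSessions AS s ON s.SessionId = c.SessionId
WHERE s.LastTouched < DATEADD(HOUR, -2, GETDATE());

-- ...then forget the stale sessions themselves
DELETE FROM dbo.CacheSessions
WHERE LastTouched < DATEADD(HOUR, -2, GETDATE());
```

Run it from a SQL Agent job every few minutes and the "when do I drop it" question answers itself.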
Disclaimer; I'm a winforms-programmer, and there may be easier ways to detect the end of a session.
Bastard Programmer from Hell
|
|
|
|
|
Yup, I think this is what it's shaping up to be: a dedicated DB that'd hold permanent "temp" tables with the cached data.
Thanks!
|
|
|
|
|
You're welcome
|
|
|
|
|
I'll add my support to this one; we have used it successfully. We found that only a very limited number of users performed this task, so we would flag a dataset with their ID and store the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a fresh run of the proc.
We then of course indexed the damned table to death, which killed the insert/delete but made queries acceptably fast.
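Something along these lines (illustrative names only - not our actual schema):

```sql
-- One shared cache table, flagged per user.
-- "Refresh" wipes and repopulates only that user's slice:
DELETE FROM dbo.ReportCache WHERE UserId = @UserId;

INSERT INTO dbo.ReportCache (UserId, LastName, SSN, Amount)
SELECT @UserId, LastName, SSN, Amount
FROM #stage;   -- #stage filled from the proc via INSERT...EXEC

-- Index whatever the grid filters and sorts on; it slows the
-- insert/delete above but makes the per-user queries fast:
CREATE INDEX IX_ReportCache_User_LastName
    ON dbo.ReportCache (UserId, LastName);
```

Leading every index with UserId keeps one user's refresh from scanning everyone else's rows.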
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Yup, looks like this is the way we're going to take as well. Thanks
|
|
|
|
|
Trekstuff wrote: and return a few million records, I need to display that data in an ASP.NET Grid
Your requirements are badly flawed.
If you have one million records and a user looks at each record for only 1 second, it would take them about 7 weeks (8-hour days, 5 days a week) to look at the list.
|
|
|
|
|
I apologize, I must not have made my intentions clear. The user won't need to see all those few million records at once. He/she would need to make ad-hoc requests against this dataset: filter/sort/search it on the fly and get results fast.
Thanks for the math lesson, though
|
|
|
|
|
Then the best way is to ask the user for the criteria and then build an appropriate query from that.
|
|
|
|
|
The user may have dozens of different criteria he/she would want to run ad hoc. Those criteria aren't known in advance and need to be applied to the data on the fly.
|
|
|
|
|
Trekstuff wrote: User may have dozens of different criteria he/she would want to run ad-hoc.
May and will are different.
And standard business users will not have those requirements.
For example, anyone dealing with customers only needs a couple of criteria to find the customer/order.
And if you really meant "ad hoc" then you have a power user, who should be proficient in SQL and have a SQL application, not a custom-built app.
|
|
|
|
|
Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away
|
|
|
|
|
Trekstuff wrote: Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away
If they want unlimited queries that is exactly what is needed and wrapping it in another app doesn't change that.
And that is exactly the situation at a bank I worked at. The VP in charge of merchant credit services often (more than once a week) ran custom queries directly against the production database.
|
|
|
|
|
jschell wrote: If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.
...assuming that they're looking at the records themselves and not some kind of visualization of them (like a chart) - and that's assuming they're looking at the data at all; for all I know they could be preparing an export to some Microsoft Access database.
Bastard Programmer from Hell
|
|
|
|
|
This seems like a bad idea; no one is ever going to look at a million-plus records - at best they'll look at a small subset. At the very least, get them to filter before going to the database, and only ever return as many records as they actually need to see, perhaps restricting it to a manageable number.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
|
|
|
|
|
That's the point: I don't know in advance what filters are going to be used. I can only restrict the data somewhat in the initial call to the SP, but I cannot call the SP for every filter request - that call is too expensive.
For example: the user needs to work with data for the past 3 months - that date filter I can pass to the SP, and the call can return several million records. Once the data is obtained, the user may want to get only rows whose "Name" field starts with "D", or whose SSN ends with 74, or group by LastName and get the count, etc. These requests limit the data to a much smaller subset, but they are unpredictable and are made against the original large resultset returned by the SP - not via the SP itself.
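To illustrate, those follow-up requests would all hit the cached resultset, never the SP - table and column names here are just for illustration:

```sql
-- Name starts with "D"
SELECT * FROM dbo.ReportCache
WHERE SessionId = @SessionId AND LastName LIKE 'D%';

-- SSN ends with 74
SELECT * FROM dbo.ReportCache
WHERE SessionId = @SessionId AND SSN LIKE '%74';

-- group by LastName and count
SELECT LastName, COUNT(*) AS RowCnt
FROM dbo.ReportCache
WHERE SessionId = @SessionId
GROUP BY LastName;
```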
|
|
|
|