I want the query, please help.
Write your own code!
Never underestimate the power of human stupidity
RAH
On the last project I was on, we used MS Access. While I was for the most part able to get around in Access, I found it performed very, very slowly for even simple tasks once the DB grew to a certain size, and I was not at all impressed with the report builder; I found it was almost always easier to export the data to Excel to generate my reports.
Fast forward a few months: I am about to start a new project, and I have been given quite a bit of say in how we set up our database. I have a vague idea of how I would implement it, but I am very curious to see others' opinions as well, to make sure I have all bases covered.
Here are the 'constraints', if you will, for how the database will need to perform.
1. There will be a few technologically savvy employees (approx 4-5) on site, and the remainder of the workforce will be fairly illiterate, technologically speaking (they will be able to surf the web, do data entry in spreadsheets, and type documents, but not much else). All employees will need to be able to access the data from reports easily and quickly, without having to memorize commands or know how to navigate a database, etc. My fellow techies have experience mostly with MS Access, and a couple know SQL. I am more familiar with NoSQL-type frameworks, but have a basic understanding of SQL and can get by in such a system.
2. Data will come from 4 sources.
a.) Primarily, most data will come via spreadsheets. These spreadsheets will often be fairly complex with multiple tables, hidden columns and rows, formulas, reference cells, etc.
b.) The next largest chunk of data will come from field reports handwritten by the foreman on site.
c.) The next largest chunk will come from simple verbal communication, or emails, word documents, or even typed letters.
3. Data will need to auto-generate reports. The more autonomous this can be made, the better. We might have as many as 10 different daily reports as well as hundreds of weekly, monthly, and yearly reports that we will have to generate. Ideally, this could be performed by non-techie users simply by providing date ranges or other simple constraints, and having templates scoop the required data from the database.
4. Data will need to be backed up often (at least once, maybe more, per day) to prevent data loss or erroneous operations, and preferably the backups will not interfere with its usage at all. It needs to scale well, be reliable, not be subject to data loss, and not easily let data fall between the cracks.
In terms of importance, I believe it should be arranged in the following hierarchy:
1. Reliability (no data loss, works as expected, no crashes, bugs, or unexplainable quirks)
2. Ease of Use (both in terms of non-techies accessing the data and power users overseeing the data and building report templates)
3. Performance (scales well, doesn't take minutes to perform simple queries)
In my head I see a few potential ways of allowing this to happen but invariably I see potential pitfalls in any of my strategies. I would be thrilled to see a high-level concept of how someone with more DB experience than myself would tackle this situation.
Sea_Sharp wrote: 1. All employees will need to be able to access the data from reports easily and quickly without having to memorize commands
That means you need a UI. That probably means you'd need to write one, generate one, or use Access as your front-end.
Sea_Sharp wrote: 2. Data will come from 4 sources. The database doesn't care where the data comes from. A is a simple import-operation. B and C cannot be automated, someone will have to enter the data. D is missing.
Sea_Sharp wrote: 3. Data will need to auto-generate reports. Data doesn't do things, it just sits there. Databases are also usually not responsible for generating a report; you will probably need to create a report template that gets filled dynamically, with a filter for each property. I'd put the database in SQL Server, link from Access to it, and exclaim it's "done". Access has a low learning curve, and everyone could click together the reports as they see fit. Excel could probably do the same.
Sea_Sharp wrote: 4. Data will need to be backed up often (at least once maybe more per day, to prevent data loss or erroneous operations Backups do not prevent errors; any error gets backed up along with the rest of the data. Any major database-server supports backup-operations.
Sea_Sharp wrote: In terms of importance I believe it should be arranged in the following hierarchy It doesn't work that way. There's a comparison of database-servers here[^] - take a look at the capabilities, compare them to your needs, pick one. And make sure it is SQL Server that you pick. SQL Server Express is free; you could design your db in Access (and have a working Access db as a backup if this experiment fails), and simply import it into SQL Server when you're ready (using the upsize wizard in Access).
"No bugs" is something that can never be guaranteed in complex systems; there's however a difference in patch-frequency. I know most people disable their auto-update, but I kinda like being updated regularly.
You can also safely assume that databases store data without loss, unless it's beta-software. And yes, all major database-servers return data on simple queries in nearly no time.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
Eddy Vluggen wrote: And make sure it is SQL Server that you pick
In other news, Kim Jong Un got 101% of the votes.
Eddy Vluggen: That means you need a UI. That probably means you'd need to write one, generate one, or use Access as your front-end.
My thoughts were to restrict non-techies completely from direct access to the database. They can enter data via spreadsheets, which we would subsequently import. And to get data, they can simply use the spreadsheets or PDFs that the reporting would generate. I found on the last project that having many people access the database at once resulted in many issues.
Eddy Vluggen: The database doesn't care where the data comes from. A is a simple import-operation. B and C cannot be automated, someone will have to enter the data. D is missing.
I actually merged D with C and forgot to change it to 3 sources after the edit. Agreed that the database doesn't care how it gets its info. This was, I suppose, an extension to the previous point: if we hire people to do data entry, it would probably be best in my opinion to have those workers enter the data into spreadsheets that would be imported to the database, rather than having them enter the data straight into it with forms. The fewer people clicking around in the DB not knowing exactly what they are doing, the better, IMO.
Eddy Vluggen: Data doesn't do things, it just sits there. Databases are also usually not responsible for generating a report; you will probably need to create a report-template that gets filled dynamically, with a filter for each property. I'd put the database in SQL Server, link from Access to it, and exclaim it's "done". Access has a low learning-curve and everyone could click together the reports as they see fit. Excell could probably do the same.
I guess my wording here was awkward. I know that data itself cannot do things. I meant that I would like to set up some process by which we can generate report blanks and pull the data from the database to auto-populate the reports. I know Excel can do it via exports, but then it becomes the issue of everyone wasting their time trying to make the reports look professional, consistent, and legible; not the end of the world, mind you, but it gets old quickly when you have to generate reports continuously. As mentioned earlier, I was completely unimpressed with the UI for Access's report builder, but looking around I have not seen any alternatives that seem any better. My mindset is based more around the NoSQL movement, so I would simply write a script to scrape the data off a document database and generate the report via HTML & CSS or XML & XSLT, but I am not sure how one might do that efficiently with a SQL database, especially considering how complex joins can end up being, nor how flexible some frameworks would be about this.
Eddy Vluggen: Backups do not prevent errors; any error gets backed up along with the rest of the data. Any major database-server supports backup-operations.
Once again I agree with you: backups do not prevent errors, but they do provide a point of return if something bad does happen. Losing even a couple of days' worth of data entry due to a corrupted database file would not fly with our executives. Connections to the database over a network especially I find to be very finicky and prone to small bugs. You should have seen how badly our MS Access DB got corrupted just by having 2-3 people working on it simultaneously. Typos are bound to happen; it is very difficult to prevent them all. But if we have several snapshots of the database at any given point and an issue occurs, we can restore to the most recent point that does not contain the errors and continue.
Eddy Vluggen: It doesn't work that way. There's a comparison of database-servers here[^] - take a look at the capabilities, compare them to your needs, pick one. And make sure it is SQL Server that you pick. SQL Server Express is free, you could design your db in Access (and have a working Access-db as a backup if this experiment fails), and simply import it in SQL Server when you're ready (using the upsize-wizard in Access)
I think in terms of frameworks there is always a little give and take between performance, stability, and ease of use. Ideally they would not conflict, but the world is far from ideal and resources are finite. There will always be a point where the effort you have to put in greatly exceeds the value you get out. In that sense, I put the hierarchy as a way of saying that if one thing HAD to be sacrificed, I would sacrifice speed and/or usability over stability. What good is speed or usability if the framework hiccups over network connections and loses input or corrupts files without error warnings?
Thanks for your responses. I think in a lot of ways you were actually agreeing with my mindset on how to proceed. I am still very undecided on the reporting process, but I hope to have that worked out soon. Access could be used in a pinch, but as I mentioned earlier, I think it would make a lot more sense to do it in HTML and CSS or XML and XSLT for professional-looking reports to the client, and unformatted spreadsheets for internal usage. Any advice on how I might implement that would be greatly appreciated.
Sea_Sharp wrote: My thoughts were to restrict non-techie access completely from the database. Ditto; the Access front-end would link purely to read-only views on their own version of the database: a simple restored backup of the production database (on SQL Server). Then again, most people would not offer Access to an end-user and claim that it's the UI they should work with.
Sea_Sharp wrote: I found on the last project, having many people access the database at once resulted in many issues.
Yes and no. Access does its job well, but it was never meant to store a lot of data and be accessed by multiple users simultaneously. A database server is meant for that job, and Access isn't a server application - "just" a desktop app.
Sea_Sharp wrote: I guess my wording here was awkward. Sorry for my tone/wording; it'll be equally (if not more) awkward (or hostile) at some points.
Sea_Sharp wrote: I know Excel can do it via exports but then it becomes the issue of everyone wasting their time trying to make the reports look professional, consistent, and legible
..aaah, yes, users are resourceful.
Sea_Sharp wrote: As mentioned earlier I was completely unimpressed with the UI for Access's report builder but looking around I have not seen any alternatives that seem any better. The Query Builder looks scary, but it is a very powerful tool; combine that with the reporting and you have a "simple" UI - although others will certainly disagree there. Add the free runtime version of Access, and you've got a UI that will be hard for the end-user to "break".
Sea_Sharp wrote: You should have seen how bad our MS Access db got corrupted just by having 2-3 people working on it simultaneously. Yes, especially when they're mucking in the same tables. When moving to SQL Server, be sure to read up on locking and transactions - though the concept does exist in Access, it doesn't help to improve multi-user access. Locking is what keeps the data consistent in a multi-user db, but it also introduces a bottleneck.
Sea_Sharp wrote: I mentioned earlier I think it would make a lot more sense to do it HTML and CSS or XML and XSLT for professional looking reports to the client, and unformatted spreadsheets for internal usage
You want an introduction to ASP.NET[^] or PHP[^]. You can easily read from the database and have your CSS applied. You can modify the data and output it as HTML or XML, with or without transformations. And you could introduce new things as you learn; start with a basic formatted list that simply shows the content of a table. Next, create one with an extra filter.
It may take a bit, but there's always the forum to help
--edit
Added hyperlinks
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
Eddy's answer is deep and clean, but I would add one more - don't do it alone. Not at the beginning at least. Hire some expert to help you with the design phase and put the project on the right tracks...
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
Like Eddy I am prejudiced to SQL Server, partially because I have been working with it for years but mainly because it has the most support resources on the interweb.
You are discussing multiple requirements. The primary one is to set up a database to store and manage your data; here I agree that you should get a professional in to design your data structure, as it is by far the most critical component.
Next is the user interface: you need to build at least one application for your non-tech users to access the data and processes. You also need to create an interface to get the existing Excel sheets into the database.
Lastly, you need a reporting strategy. I would recommend SQL Server Reporting Services (SSRS); this can be set up as either server-based or embedded in your application.
I believe you are in the position of a power user just dipping into becoming a developer, and your tool horizon is limited to Office products; you might want to expand it to include a proper development environment.
Never underestimate the power of human stupidity
RAH
Hi,
I am using the MySQL stored procedure below to add a new job order into a MySQL table.
I would like to make the following modification to the code:
I want to add an additional parameter called param_max_auto_weekday,
then check the weekday of param_job_order_date
and repeat the insert until reaching param_max_auto_weekday.
So if WeekDay(param_job_order_date) = Sunday,
it should INSERT INTO job_orders for every Sunday until param_max_auto_weekday.
Here is the code, with thanks:
CREATE DEFINER=`root`@`localhost` PROCEDURE `sp_add_new_job_order`(
    IN param_customer_id INT,
    IN param_cleaner_id INT,
    IN param_job_order_date DATE,
    IN param_job_order_date_to DATE,
    IN param_start_time TIME,
    IN param_end_time TIME,
    IN param_job_order_note TEXT,
    IN param_is_contract BIT,
    IN param_contract_id INT,
    IN param_total_hours DECIMAL(11, 6),
    IN param_created_user INT,
    OUT param_record_identity INT)
BEGIN
    INSERT INTO job_orders (customer_id, cleaner_id, job_order_date, job_order_date_to,
                            start_time, end_time, job_order_note, is_contract, contract_id,
                            total_hours, created_date, created_user)
    VALUES (param_customer_id, param_cleaner_id, param_job_order_date, param_job_order_date_to,
            param_start_time, param_end_time, param_job_order_note, param_is_contract,
            param_contract_id, param_total_hours, NOW(), param_created_user);

    SET param_record_identity = LAST_INSERT_ID();

    UPDATE customers SET allow_delete = FALSE WHERE customer_id = param_customer_id;

    IF (param_is_contract = TRUE) THEN
        UPDATE job_orders SET job_order_status = 6 WHERE job_order_id = param_record_identity;
        UPDATE contracts SET remaining_hours = remaining_hours - param_total_hours
            WHERE contract_id = param_contract_id;
    END IF;

    CALL sp_add_event_log("JOBORDER", param_record_identity, param_created_user, "Job order was created.");
    CALL sp_add_event_log("CUSTOMER", param_customer_id, param_created_user, "Job order was created.");
END
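For what it's worth, the repeat-per-Sunday behaviour I described could be sketched roughly like this inside the procedure (untested; it assumes param_max_auto_weekday DATE has been added as an IN parameter, and only a few of the columns are shown):

```sql
-- Sketch: walk from the first Sunday on/after param_job_order_date
-- up to param_max_auto_weekday, inserting one row per Sunday.
-- (DECLARE must appear at the top of the procedure's BEGIN block.)
DECLARE v_date DATE;

-- DAYOFWEEK(): Sunday = 1, so this offset moves to the next Sunday
-- (offset is 0 if the date is already a Sunday).
SET v_date = param_job_order_date
    + INTERVAL ((8 - DAYOFWEEK(param_job_order_date)) % 7) DAY;

WHILE v_date <= param_max_auto_weekday DO
    INSERT INTO job_orders (customer_id, cleaner_id, job_order_date, created_date, created_user)
    VALUES (param_customer_id, param_cleaner_id, v_date, NOW(), param_created_user);
    SET v_date = v_date + INTERVAL 7 DAY;
END WHILE;
```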
Technology News @ www.JassimRahma.com
That's not a question, that's work.
Did you run into any problems while adding the param? Why is the post here? Did you receive an error?
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
Yeah, but my question is how to start with it.
If today is 10th April 2014 and I want a list of every Sunday until 10th April 2020, do I need to loop? What's the best way in terms of performance?
Technology News @ www.JassimRahma.com
I wonder, does MySQL only support lower case? That is almost unreadable.
I would do that differently: create a query based on the number of days to be inserted, and insert them in one query rather than in a loop.
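That set-based approach could look something like this (a sketch, untested; it assumes a helper table numbers(n) holding 0, 1, 2, ... up to the largest number of weeks needed, and it shows only a couple of columns):

```sql
-- Sketch: one INSERT ... SELECT producing one row per Sunday in the range.
-- DAYOFWEEK(): Sunday = 1, so the offset lands on the next Sunday
-- (offset is 0 if param_job_order_date is already a Sunday).
INSERT INTO job_orders (customer_id, job_order_date)
SELECT param_customer_id,
       d.first_sunday + INTERVAL numbers.n WEEK
FROM (SELECT param_job_order_date
          + INTERVAL ((8 - DAYOFWEEK(param_job_order_date)) % 7) DAY AS first_sunday) AS d
JOIN numbers
  ON d.first_sunday + INTERVAL numbers.n WEEK <= param_max_auto_weekday;
```

The database can optimise a single set-based statement far better than a row-at-a-time loop, which is why this usually wins on performance.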
Never underestimate the power of human stupidity
RAH
Hi,
I am currently using stored procedures to do an action after my INSERT, UPDATE or DELETE.
I just wanted to ask for your advice:
Do you recommend continuing with stored procedures, or starting to use triggers?
My simple example is an event-log table: I insert a row into this table when a row is inserted into the employees table.
Thanks,
Jassim
Technology News @ www.JassimRahma.com
For logging purposes, a trigger is fine. Personally I prefer stored procedures, because you can't control a trigger.
And Trigger Disadvantages[^]
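For reference, the event-log scenario from the question, done as a trigger, would look roughly like this (a sketch; the table and column names are assumptions):

```sql
-- Sketch (names assumed): log each new employee row into event_log.
CREATE TRIGGER trg_employees_after_insert
AFTER INSERT ON employees
FOR EACH ROW
    INSERT INTO event_log (table_name, row_id, event_note, created_date)
    VALUES ('EMPLOYEES', NEW.employee_id, 'Employee row was inserted.', NOW());
```

Once created, the trigger fires automatically on every INSERT into employees, which is exactly the "no control" part: you can't skip it from the calling code the way you can simply not call a stored procedure.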
thatrajaCode converters | Education Needed
No thanks, I am all stocked up. - Luc Pattyn
When you're wrestling a gorilla, you don't stop when you're tired, you stop when the gorilla is - Henry Minute
Your link shows very important information. And from that information, I conclude that it may be a bad decision to use them for logging purposes: in a transaction rollback, also the trigger actions will be rolled back. I.e. when you log by inserting into a table, that will be rolled back, too!
Sorry for the delay. For beginners (yes, he is one), triggers are not advisable; we can't control them. As Eddy said, we can't debug them either. Personally I use stored procedures instead of triggers, though I do, rarely, use triggers for tasks such as an audit trail (during login/logout).
thatrajaCode converters | Education Needed
No thanks, I am all stocked up. - Luc Pattyn
When you're wrestling a gorilla, you don't stop when you're tired, you stop when the gorilla is - Henry Minute
Jassim Rahma wrote: my simple example is an event-log table... I insert a row in this table if a row was inserted in employees table. Can be done both ways. Triggers *could* be harder to debug though.
I cannot recommend anything, as that would depend on the requirements of the audit-trail. How does your "event log table" look, and what are you planning on doing with it?
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
Jassim Rahma wrote: or start using triggers?
As with anything it depends on the usage.
Jassim Rahma wrote: my simple example is an event-log table...
I might note that, in my experience, doing this is often driven by the need for audit logging, and it is seldom sufficient, as it ignores the user and the reason, and fails to allow deletions to be handled well.
So in this case you prefer to do it through SPs?
Technology News @ www.JassimRahma.com
Hi,
Is it possible to create a trigger ON SELECT in MySQL?
If not, any other way to do it?
Thanks,
Jassim
Technology News @ www.JassimRahma.com
Jassim Rahma wrote: Is is possible to create a trigger ON SELECT in MySQL?
Only 3 events INSERT, UPDATE & DELETE. CREATE TRIGGER Syntax[^] & its Syntax and Examples[^]
Jassim Rahma wrote: If not, any other way to do it?
What do you want? Explain clearly (ON SELECT).
thatrajaCode converters | Education Needed
No thanks, I am all stocked up. - Luc Pattyn
When you're wrestling a gorilla, you don't stop when you're tired, you stop when the gorilla is - Henry Minute
Jassim Rahma wrote: Is is possible to create a trigger ON SELECT in MySQL? No. That's why you added the second question, ain't it?
Jassim Rahma wrote: If not, any other way to do it? No. You can fetch from a SP and add that functionality yourself. Triggers aren't meant as an audit-trail, that's hardly secure.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
Dear all,
I'm currently trying to build a DB system from scratch which I can use as the backbone for my Qt application. Since I will store large amounts of data [photo RAW data] in there, I figured that a classic relational DB might not be the best solution. What I'm planning now is a classic relational DB [MariaDB] as persistent storage, cached by a NoSQL database [MongoDB].
In the end the whole system shall be built up like this.
Here's a little additional explanation:
Persistent Storage Access
The Persistent Storage Access handles the access to the persistent MariaDB database, which holds all the application data. It supplies the data used to build the Cache, serves update and creation queries, and also serves get queries when the Cache can't supply the requested item.
Cache Access
The Cache Access handles the access to the MongoDB database which is used to cache the access to the Persistent Storage. It contains a copy of the most recently added, updated or requested items.
Data Access API
The Data Access API coordinates access between the Cache and the Persistent Storage. If a requested item can't be found in the Cache, the DAA gets it from the Persistent Storage and also adds it to the Cache. If an item is added, the DAA saves it into the Persistent Storage and also adds it to the Cache. All this coordination is needed to keep complexity out of other parts, e.g. the business classes.
I'm in doubt whether MongoDB will really give me any performance boost, so could anyone share their thoughts on what I'm planning here?
I will never again mention that Dalek Dave was the poster of the One Millionth Lounge Post, nor that it was complete drivel.
How to ask a question