|
Not sure I understand. Also, can I use a PIVOT to create a table available through a GridView or would it be just for reporting?
Thanx again!
|
|
|
|
|
This is a simpler way to make a crosstab:
Select UserID
,Max(Case When RoleID = 'SecRole1' Then 'Y' Else 'N' End) as SecRole1
,Max(Case When RoleID = 'SecRole2' Then 'Y' Else 'N' End) as SecRole2
,Max(Case When RoleID = 'SecRole3' Then 'Y' Else 'N' End) as SecRole3
,Max(Case When RoleID = 'SecRole4' Then 'Y' Else 'N' End) as SecRole4
,...
From UserRoles
Group By UserID
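For comparison, the same crosstab written with T-SQL's PIVOT operator might look like this (a sketch only: the table and role names are taken from the query above, and PIVOT needs SQL Server 2005 or later):

```sql
-- Same result as the Max(Case...) query, using PIVOT.
-- Each RoleID value becomes a column; 'Y' when the user has the role.
Select UserID
      ,IsNull([SecRole1],'N') as SecRole1
      ,IsNull([SecRole2],'N') as SecRole2
      ,IsNull([SecRole3],'N') as SecRole3
      ,IsNull([SecRole4],'N') as SecRole4
From (Select UserID, RoleID, 'Y' as HasRole From UserRoles) as src
Pivot (Max(HasRole) For RoleID In ([SecRole1],[SecRole2],[SecRole3],[SecRole4])) as pvt
```

Either version returns an ordinary result set, so it can feed a GridView just as easily as a report.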
SQL is for data collection; it doesn't care whether you use the data for reporting or grids...
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
Not being able to get the @colname variable to work, here's what I've been working on instead. If it works, there will be 100+ CASE statements. Will what you recommend above work with an UPDATE? My code seems to go through the table being updated once, rather than through the CSEEmployeeRoles table, which has multiple records per user. I've done things like this in SQL reports before but never had this problem.
update table1 set
AuditView = case when table2.SecurityRole='Audit View' and AuditView='No' then 'Yes' else AuditView end,
BasicArchive = case when table2.SecurityRole='Basic Archive' then 'Yes' else BasicArchive end,
BasicModify = case when table2.SecurityRole='Basic Modify' then 'Yes' else BasicModify end,
BasicView = case when table2.SecurityRole='Basic View' then 'Yes' else BasicView end,
CentralScanOperations = case when table2.SecurityRole='Central Scan Operations' then 'Yes' else CentralScanOperations end,
CreateorMaintainParticipant = case when table2.SecurityRole='Create or Maintain Participant' then 'Yes' else CreateorMaintainParticipant end,
CreateRefundParticipant = case when table2.SecurityRole='Create Refund Participant' then 'Yes' else CreateRefundParticipant end
from EmergencyContact.dbo.CSEEmployeeRoles as table2 right join EmergencyContact.dbo.CSERolesRolledUp as table1
on (table2.EmployeeID = table1.EmployeeID)
Thanx again, Jörgen, for your input.
|
|
|
|
|
Both Piebald and Jorgen have essentially answered your question.
You MUST NOT store the data again (unless you are going to normalise the data as suggested by Piebald).
You should use a pivot query to present the data in the format the user requires. This article[^] may help (shameless plug).
I would suggest that you do both: restructure the data when you receive it from the State; just because someone else has a crappy format (or you are actually receiving denormalised data designed for output) does not mean you can't change it in your loading process.
Create a view (or at least a stored proc), based on the article, that supplies the data in the format the users require.
Oh and give Piebald and Jorgen an upvote for the valid answers.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Shameless plug? I think not. It's totally relevant to the OP.
If I hadn't forgotten that I already bookmarked it, I would have linked to it myself.
I guess that's just how bad your brain works when you're home with the flu.
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
You are right, both their responses were very helpful and I am now working on using a Pivot to do this. Not sure yet how the end result can be used by a GridView to display the data to the user.
Just opened your article and it seems to be really helpful. Thanx!
However, my original question had to do with using a variable to reference a column for updating. From what I've read, this should work ... as I understand it. Sadly, it does not for me. So, final question: is this doable, and if it is, what am I doing wrong? See the original question and code.
Thanx again and I will give upvotes to both.
modified on Tuesday, December 21, 2010 12:11 PM
|
|
|
|
|
The only way to do it is the one you have used: dynamic SQL.
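A minimal sketch of the dynamic-SQL approach (the variable values are hypothetical): the column name has to be concatenated into the string, but QUOTENAME guards the identifier, and sp_executesql lets the value travel as a real typed parameter instead of text.

```sql
Declare @colname sysname, @Eid int, @sql nvarchar(max)
Select @colname = 'AuditView', @Eid = 38   -- hypothetical values for illustration

-- Only the identifier is concatenated; QuoteName escapes it safely.
Set @sql = N'update EmergencyContact.dbo.CSERolesRolledUp set '
         + QuoteName(@colname) + N' = ''Yes'' where EmployeeID = @Eid'

-- The EmployeeID goes in as a parameter, not as concatenated text.
Exec sp_executesql @sql, N'@Eid int', @Eid = @Eid
```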
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
But it does not work and I am stumped. Thanx for your help.
I am working through your Pivot article to see if it applies to what I am trying to do. Good article!
|
|
|
|
|
JTRizos wrote: select @command= 'update EmergencyContact.dbo.CSERolesRolledUp set '+@colname+' = "Yes"where EmployeeID = '+@Eid
Print this variable and run it directly in SSMS to see if it updates; I think your problem is simply the formatting of the string!
Try changing "Yes" to 'Yes'.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Thanx for the suggestion.
Printing the variable @command displays the correct Update command:
update EmergencyContact.dbo.CSERolesRolledUp set AddRequesterInformation = "Yes" where EmployeeID = 38
But I get an "Invalid column name 'Yes'" error and no update is done.
Feeling a bit more reassured that this will work, but I need to figure out why the error occurs. At least you did not say this would not work.
Thanx again and Merry Christmas!
|
|
|
|
|
Change the quotes around Yes to single quotes (').
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I got it to work. I followed your advice from your previous reply and, after a few tries, bingo, it worked. Here's the code in case it can help others. I am leaving in the print statements I used for testing.
Declare @colname varchar(200), @Eid varchar(50),@message varchar(80),@command varchar(200)
Declare my_cursor CURSOR
For Select replace(replace(replace(SecurityRole,' ',''),'/',''),'-','') as SecRole,EmployeeID
from EmergencyContact.dbo.CSEEmployeeRoles
order by EmployeeID, SecRole
open my_cursor
fetch next from my_cursor into @colname,@Eid
while @@fetch_status = 0
begin
select @message = @colname+' '+@Eid
--print @message
select @command= 'update EmergencyContact.dbo.CSERolesRolledUp set '+@colname+' = ''Yes''
where EmployeeID = '+@Eid
exec (@command)
--print @colname
--print @command
fetch next from my_cursor into @colname,@Eid
--print @colname
end
close my_cursor
deallocate my_cursor
Thanx again for your help. Having never used cursors or dynamic SQL before, I just needed to know it could be done and that I was on the right track.
Happy Holidays!
|
|
|
|
|
Looking for some expertise with explain plans. I have a database procedure that is refusing to behave properly since it was migrated from Oracle 9i to Oracle 10g. It runs completely in under 3 minutes on my Oracle 9i database, but on 10g it takes either 32 minutes or 10 minutes to complete, depending upon whether the statistics are estimated or the table is analyzed. I've been using DBMS_PROFILER and have narrowed it down to a for loop involving a cursor. However, when I explain the cursor, not only do I get two different plans, 9i vs 10g, but the costs are completely different, with 10g showing the much reduced cost!
Oracle 9i EXPLAIN
Operation Options Object Optimizer
------------------------------ --------------- ------------------------- -------------
SELECT STATEMENT Cost = 1405 CHOOSE
FILTER
TABLE ACCESS FULL TRANSACTION_TABLE ANALYZED
FILTER
NESTED LOOPS OUTER
MERGE JOIN CARTESIAN
FILTER
NESTED LOOPS OUTER
INDEX UNIQUE SCAN XPKTRANSACTION ANALYZED
INDEX RANGE SCAN XPKTRANSACTIONLOT ANALYZED
BUFFER SORT
INDEX RANGE SCAN XPKTRANSACTION ANALYZED
INDEX RANGE SCAN XPKTRANSACTIONLOT ANALYZED
Oracle 10g EXPLAIN
Operation Options Object Optimizer
------------------------------ --------------- ------------------------- ---------
SELECT STATEMENT Cost = 374 ALL_ROWS
FILTER
TABLE ACCESS BY INDEX ROWID TRANSACTION_TABLE ANALYZED
INDEX RANGE SCAN IXTRANSACTION_FIFO ANALYZED
FILTER
HASH JOIN RIGHT OUTER
INDEX RANGE SCAN XPKTRANSACTIONLOT ANALYZED
NESTED LOOPS
FILTER
NESTED LOOPS OUTER
INDEX UNIQUE SCAN XPKTRANSACTION ANALYZED
INDEX RANGE SCAN XPKTRANSACTIONLOT ANALYZED
INDEX RANGE SCAN XPKTRANSACTION ANALYZED
Anybody with some good explain-fu who could suggest some things to try in order to speed this puppy up? Thanks.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
|
|
|
|
Not easy to say, as I don't know the size of the tables or what the query looks like.
I would also like to see the cost, cardinality and bytes for every level in the plan.
But, more often than one might believe, a full table scan might be the faster choice.
So try the query on the 10g with a FULL hint.
Or as Tom Kyte has said (http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:9422487749968): "full scans are NOT ALWAYS evil, indexes are NOT ALWAYS good"
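Forcing the full scan is just a matter of naming the table (or its alias) in the hint; for example, against the TRANSACTION_TABLE from your plan (the column and predicate here are hypothetical, only the hint syntax matters):

```sql
SELECT /*+ FULL(tt) */
       tt.*
FROM   transaction_table tt
WHERE  tt.status = 'OPEN'   -- hypothetical predicate; keep your real one
```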
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
Jörgen Andersson wrote: Or as Tom Kyte[^] has said: "full scans are NOT ALWAYS evil, indexes are NOT ALWAYS good"
For many years I tried to nail down the best practice for indexing and tuning a database (generally SQL Server). Every time I had a DBA cornered, I would quiz them on this subject, and every time I would get back the namby-pamby response "it depends". What sort of response is that? I want to lay down immutable rules that say this is how to set up the indexing; "it depends" just does not cut it.
Ah, that's better, I needed that. It's been a slow day here.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
A DBA that doesn't say "it depends", hasn't been presented a specific enough case.
But I have found one "rule" that hasn't failed me yet when it comes to aggregating data: it's all about minimizing the amount of data (in bytes, not necessarily rows) that the db has to handle.
Say, for example, that you want to join together several tables, some holding descriptive texts and others holding data that should be aggregated.
If you make a query that does the joins and the aggregation in one go, the db will join first and aggregate later, creating millions of rows of text that will just be grouped away when the aggregation happens. Creating all this in memory, or in the worst case on the HD, takes time.
Instead I break down all queries into smaller parts that I materialize using subquery factoring (CTEs for you SQL Server people), and then I join the results from them together.
So I make one or more queries that aggregate the data, and one or more separate queries for the descriptions, and then join the aggregated results together at the end.
This method is often orders of magnitude faster than a one-piece query.
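As a sketch of that pattern (hypothetical tables: fact rows in SALES, descriptive texts in PRODUCTS), the aggregation runs first in a factored subquery, and the small aggregated result is joined to the texts afterwards; Oracle's undocumented-but-widely-used /*+ MATERIALIZE */ hint nudges the optimizer to actually materialize the intermediate result:

```sql
WITH agg AS (
    SELECT /*+ MATERIALIZE */      -- materialize the aggregated result
           product_id,
           SUM(amount) AS total_amount,
           COUNT(*)    AS row_cnt
    FROM   sales
    GROUP  BY product_id           -- aggregate BEFORE touching the texts
)
SELECT p.product_name, a.total_amount, a.row_cnt
FROM   agg a
JOIN   products p ON p.product_id = a.product_id   -- join the small result last
```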
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
That's one of my "rules" too.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
|
|
|
|
I'm going to try different hints for the cursor to see what happens. Thanks.
Ultimately I expect to have to re-develop this whole procedure. It was originally written for Oracle 7 and then had several changes when migrated to 8. Since then it has not been touched, but I think I'll be heavily testing over the holidays.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
|
|
|
|
Hi
Recently I faced an issue with a SQL database.
The database is in production and holds the data for all the employees in an employee table.
Recently we found that values in the employee table (not all of them) got changed. Bdays and Status are the columns we identified, and we're not sure if more data got modified.
I need to find out what exactly happened: were there any transactions that took place on the table? Or is there any other way to find out what has happened on the database?
Thanks in advance
Naina
|
|
|
|
|
You could check the list of recently run queries in SQL Server.
See HERE[^].
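For instance, the plan cache can be queried through the DMVs along these lines (a sketch; note that this only shows statements whose plans are still cached, so it won't reliably reach far back in time):

```sql
-- Most recently executed statements still in the plan cache
SELECT TOP (50)
       qs.last_execution_time,
       qs.execution_count,
       st.text AS statement_text
FROM   sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.last_execution_time DESC;
```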
|
|
|
|
|
Hi
Thanks for the query, but it gives only today's transactions. Is there any way to get older transactions?
Naina
|
|
|
|
|
Hi all,
I am using Oracle transactions in C#.NET (.NET framework, 2008). The WCF web application gives the error "Can't serialize access for the transaction" when multiple users search for the same record in a table in Oracle 11g. Can anybody tell me how this can be resolved? I have increased the INITRANS value to 100 and it is not giving the error now, but what if the number of users goes above the 1000 limit? Will INITRANS still work?
Thanks in advance.
|
|
|
|
|
|
I am trying to load a DB up with about 3100 tables of data using C#.
Somewhere around table 1770, I get the following message:
"Microsoft SQL Native Client: Not enough storage is available to complete this operation."
Does anyone know how to fix this problem, any help would be greatly appreciated.
Thanks in advance,
Michael
|
|
|
|
|
Well, you are doing something wrong; we load 1000s of tables every day, peaking at about 2k per hour. There is not enough info to be able to help here.
How are you loading the data in: BCP, BulkCopy, one record at a time?
Is your client having the problem, or is it just relaying the error from the server?
Have you checked the transaction log size, the event log, and the database properties of the server?
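If memory pressure turns out to be the culprit, committing in batches often helps; with BULK INSERT, for example, BATCHSIZE keeps each transaction (and the resources it pins) small. A sketch, with a hypothetical file path and table name:

```sql
BULK INSERT dbo.StagingTable          -- hypothetical target table
FROM 'C:\load\table_1770.dat'         -- hypothetical data file
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000,          -- commit every 10,000 rows
    TABLOCK                           -- allows minimally logged load where possible
);
```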
Never underestimate the power of human stupidity
RAH
|
|
|
|