|
djj55 wrote: You would need a common key column as some SQL programs do not keep the order of the table.
I am not sure what you are saying, but I do have a column that is common to both tables.
|
|
|
|
|
You may need to explain the terminology here, because I have no idea what "cell one" means in your question.
If you use terms like "field" and "record", your question will be easier for others to understand.
As for your question, you can make a query on the union of the two tables to find the averages.
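To make the union-and-average suggestion concrete, here is a minimal sketch. The table and column names (Table1, Table2, KeyCol, Amount) are placeholders, since the actual schema isn't shown in the thread; UNION ALL is used rather than UNION so duplicate rows aren't dropped before averaging.

```sql
-- Stack both tables on the shared key column, then average per key.
-- Table1, Table2, KeyCol and Amount are hypothetical names.
SELECT KeyCol, AVG(Amount) AS AvgAmount
FROM (
    SELECT KeyCol, Amount FROM Table1
    UNION ALL
    SELECT KeyCol, Amount FROM Table2
) AS combined
GROUP BY KeyCol
```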
|
|
|
|
|
Sorry for the confusion. I should tell you that I am really a complete beginner with databases.
I will find out how to do a union and query it.
thanks.
|
|
|
|
|
Hello, I am looking for a way to find out which tables are used within a stored procedure. I have around 300 stored procedures, and I would like to know which tables, if any, each one uses.
I can script the stored procedures to a file that can be investigated.
Thank you for any leads
djj
|
|
|
|
|
djj55 wrote: I have around 300 stored procedures that I would like to know what if any tables are used.
If it's SQL 2005, then you could use the sp_depends procedure to list all the dependencies. It's replaced in SQL 2008 by sys.sql_expression_dependencies, but sp_depends would probably still work.
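For the "300 procedures at once" case, a sketch that avoids calling sp_depends per procedure is to query the SQL 2000-era system tables directly. Note that sysdepends can miss references in procedures created with deferred name resolution, so treat the output as a guide rather than a complete inventory.

```sql
-- For every stored procedure (xtype 'P'), list the user tables
-- (xtype 'U') it references, according to sysdepends.
SELECT p.name AS procedure_name, t.name AS table_name
FROM sysdepends d
JOIN sysobjects p ON p.id = d.id AND p.xtype = 'P'
JOIN sysobjects t ON t.id = d.depid AND t.xtype = 'U'
ORDER BY p.name, t.name
```

The result is already a normal result set, so it can be inserted into a table with a plain INSERT ... SELECT, sidestepping the sp_depends output-capture problem discussed below.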
Hope this helps
I are Troll
|
|
|
|
|
Thank you for the reply.
As I am currently working with 2000, sp_depends is the ticket. The problem is getting the output into a table.
Thank you again,
djj
|
|
|
|
|
You're welcome
There are no 'real' options to get the multiple result tables into a temp table under SQL 2000 that I'm aware of. In SQL 2005, you could try using a CLR sproc.
As a hack, perhaps you could execute a DOS-command from SQL? If you can execute isql.exe, then you can pipe the result of the command to a textfile. It would get real messy if you try to parse the resulting textfile using SQL, so it might be easier to execute a small console application that parses it and writes it to the database.
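The "DOS-command from SQL" idea can be sketched with xp_cmdshell driving the command-line query tool. MyDb, MyProc and the output path are placeholders; xp_cmdshell must be enabled on the server, and -E uses a trusted (Windows) connection.

```sql
-- Run sp_depends for one procedure via osql and pipe the result
-- to a text file for later parsing. All names and paths are
-- placeholders for illustration.
EXEC master..xp_cmdshell
    'osql -E -d MyDb -Q "EXEC sp_depends ''MyProc''" -o C:\temp\MyProc_deps.txt'
```

A small console application can then loop over the 300 procedure names, parse each text file, and write the results back to a table, as suggested above.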
I are Troll
|
|
|
|
|
Thank you, I will look into the "DOS" solution tomorrow. Sometime in the next three months we are upgrading to 2008.
|
|
|
|
|
Hi friends,
I am using Windows Server 2003 and Oracle 10g Express Edition.
Unlike the SQL Server service, the Oracle 10g service is not starting at system startup.
Can anybody suggest a solution for this?
Thanks & Regards,
Anil Chelasani
|
|
|
|
|
Can you check the path "ORACLE_BASE\ORACLE_HOME\database"? There could be an error-log for ORADIM there.
I are Troll
|
|
|
|
|
Hi!
I have a table that contains a Date and Minutes. I want to create a query that groups the entries by Date, so that instead of seeing multiple entries for a day, you see one entry per date with the total minutes.
Can anyone shed some light on this issue?
Thank you!!
Illegal Operation
|
|
|
|
|
Try this
Sample Data
declare @t table(dt date, tm int)
insert into @t
select '2010-01-05',15 union all
select '2010-01-05',16 union all
select '2010-01-05',18 union all
select '2010-01-05',36 union all
select '2010-01-05',59 union all
select '2010-01-06',01 union all
select '2010-01-06',20 union all
select '2010-01-07',30 union all
select '2009-12-21',34 union all
select '2009-12-22',12
Query
select dt,CntMins = COUNT(tm),TotalMins = SUM(tm) from @t
group by dt
--order by SUM(tm) desc --[Incase you want to see the result in Descending Order]
Output
dt CntMins TotalMins
2009-12-21 1 34
2009-12-22 1 12
2010-01-05 5 144
2010-01-06 2 21
2010-01-07 1 30
Niladri Biswas
|
|
|
|
|
Group by only the date part of the datetime, modified being the datetime field
GROUP BY CONVERT(DATETIME,Modified,103)
[edit] The above does not work. If you are using 2008, the following will work using the new DATE data type.
SELECT COUNT(*)
FROM EquityCounter
GROUP BY CONVERT(DATE,Modified)
If you are using 2005, I would suggest chopping the datetime up as a string like so:
GROUP BY CONVERT(DATETIME,LEFT(CONVERT(VARCHAR(50),Modified),11))
[/edit]
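On 2000/2005, a common alternative to the string-chopping approach is the DATEADD/DATEDIFF round trip, which floors a datetime to midnight using only date arithmetic. A sketch against the EquityCounter table from the example above:

```sql
-- DATEDIFF counts whole days between day 0 and Modified; DATEADD adds
-- them back to day 0, which discards the time component entirely.
SELECT DATEADD(DAY, DATEDIFF(DAY, 0, Modified), 0) AS DayOnly,
       COUNT(*) AS Cnt
FROM EquityCounter
GROUP BY DATEADD(DAY, DATEDIFF(DAY, 0, Modified), 0)
```

This avoids the VARCHAR conversion and is usually cheaper than string manipulation.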
Still gotta be better than a temp table!
Never underestimate the power of human stupidity
RAH
modified on Tuesday, January 5, 2010 1:16 AM
|
|
|
|
|
Mycroft Holmes wrote: Still gotta be better than a temp table!
The temp table in the answer above was only to provide example data methinks!
|
|
|
|
|
Me didn't look at the details of the answer, just looked at the complexity and shook the head. I don't think Niladri answered the question, which was to group by the date only in a datetime field that has a time component in it.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
How bad would a stored proc have to be for it to time out before returning more than 3000 records, regardless of how big your timeout value was? We're talking SQL Server 2005 running on an Itanium server...
.45 ACP - because shooting twice is just silly ----- "Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
|
|
|
|
|
John Simmons / outlaw programmer wrote: regardless of how big your timeout value was?
Even with a demanding query, you'd have a hard time timing out "regardless of how big the value" is. My money would be on an endless loop in a cursor.
Can you count the cursors, temporary tables and UDFs?
I are Troll
|
|
|
|
|
Well, it does eventually come back (that's how they know about the 3000 record limit). An endless loop wouldn't come back at all...
.45 ACP - because shooting twice is just silly ----- "Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
|
|
|
|
|
The latest SQL Management Studio has a cool query-analyzer that even proposes indexes. It might help in tracking down the worst offenders - that way you don't have to plow through all of them.
What's it doing? "Just" some TSQL, or is it calling a webservice and writing the results to a network drive?
--edit--
What MyCroft said, and I'm still curious to what it's doing
I are Troll
|
|
|
|
|
I dunno - the boss says it times out, shrugs his shoulders, and we all laugh a little at the problem.
.45 ACP - because shooting twice is just silly ----- "Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
|
|
|
|
|
John Simmons / outlaw programmer wrote: How bad would a stored proc have to be
Really crappy. I've seen cases where SSMS performs the query in seconds and the UI takes ages - is this the case?
Have you looked at the execution plan? It usually suggests indexes if there are glaring requirements. Otherwise, just look for the most expensive operation and try to optimise it.
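One way to get at the plan without running the procedure is a text showplan. This is only a sketch, assuming SQL 2005; dbo.MySlowProc is a placeholder name for the offending procedure.

```sql
-- With SHOWPLAN_TEXT ON, the batch returns the estimated execution
-- plan as text instead of actually executing the statements.
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.MySlowProc   -- placeholder: the slow procedure under investigation
GO
SET SHOWPLAN_TEXT OFF
GO
```

Scan the plan output for table or clustered index scans on large tables; those are usually where the missing-index suggestions come from.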
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Ok so is this the same proc you are bitching about in the Lounge - and you need to ask the question.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
No (at least I don't think so). I'm not bitching about it either. Here, I asked a question. In the Lounge, I merely stated my amazement at something regarding the job.
.45 ACP - because shooting twice is just silly ----- "Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
|
|
|
|
|
So how did the execution plan investigation go?
I just noted the coincidence of the enquiry here and the Lounge entry, and wondered if the very slow proc was 1500 lines of unformatted TSQL, which would probably account for the slow response.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I'm not entirely sure what sort of response you're looking for here. How bad would a stored proc have to be? Well, I guess that depends on what it does. If it's a simple select, then it would have to be pretty bad. If it's a massive stored proc that does a zillion different things and only gets run once a year, then it might not be badly written; it might just be slow and time-consuming, and you'll either have to live with it or find a different approach (don't do it all in one stored proc; break it down into manageable chunks, something like that).
I don't know that anyone will really be able to give you much advice beyond what you've already got from the others, certainly not without knowing more detail about the stored proc.
|
|
|
|