|
Colin Angus Mackay wrote: I tend to write them explicitly so that anyone maintaining the code knows what I was doing.
Mmm. More used to that. All for unambiguity...
Colin Angus Mackay wrote: Why the guy put "NAME NAME"? No idea, but it just adds to the confusion, doesn't it?
I like to give the benefit of the doubt that it was a simple copy paste error.
|
|
|
|
|
Paddy Boyd wrote: second NAME becomes an alias for the first NAME, d'oh!
Just noticed your message modification. Oh well... I hope it helps others too.
|
|
|
|
|
Hi all,
I have a table with a trigger. The trigger raises an error when certain validation is not met, but I'm not able to catch this error in a stored procedure when I try to insert a record into this table. Need your help. Any ideas on this?
Thanks in advance!
|
|
|
|
|
@@ERROR
--EricDV Sig---------
Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them.
- Laurence J. Peter
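A minimal sketch of the pattern, with a hypothetical table, trigger, and procedure. Two things commonly trip this up in SQL Server 2000: @@ERROR must be tested on the very next statement after the INSERT (any intervening statement resets it), and if the trigger itself issues ROLLBACK TRANSACTION, the rest of the calling batch is skipped, so the check never runs. The sketch therefore raises the error in the trigger and leaves the rollback to the caller:

```sql
-- Hypothetical table and validation trigger for illustration
CREATE TABLE Patients (PatientID int PRIMARY KEY, Age int)
GO
CREATE TRIGGER trg_Patients_CheckAge ON Patients FOR INSERT AS
IF EXISTS (SELECT * FROM inserted WHERE Age < 0)
    RAISERROR ('Age must not be negative.', 16, 1)
    -- no ROLLBACK here: a ROLLBACK inside the trigger would abort the
    -- whole batch and skip the @@ERROR check in the calling procedure
GO
CREATE PROCEDURE usp_InsertPatient @PatientID int, @Age int AS
BEGIN TRANSACTION
INSERT INTO Patients (PatientID, Age) VALUES (@PatientID, @Age)
-- check @@ERROR on the very next statement, or it is reset to 0
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN 1
END
COMMIT TRANSACTION
RETURN 0
GO
```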
|
|
|
|
|
Folks,
I have a DTS package which loads a text file into a table on a SQL Server 2000 database. The source column is numbers but occasionally has "N.A."
The target column is FLOAT and nullable. How do I get DTS to translate "N.A." to NULL?
The code that does the transfer is:
Public Sub CustomTask_Translate_BidPrice(ByVal oCustomTask1 As Object, ByVal ColumnPosition As Integer)
    Dim oTransformation As DTS.Transformation2
    Dim oTransProps As DTS.Properties
    Dim oColumn As DTS.Column

    ' Straight copy transformation
    Set oTransformation = oCustomTask1.Transformations.New("DTSPump.DataPumpTransformCopy")
    oTransformation.Name = "DTSTransformation__Bidprice"
    oTransformation.TransformFlags = 63
    oTransformation.ForceSourceBlobsBuffered = 0
    oTransformation.ForceBlobsInMemory = False
    oTransformation.InMemoryBlobSize = 1048576
    oTransformation.TransformPhases = 4

    ' Source column (text file field)
    Set oColumn = oTransformation.SourceColumns.New("PX_BID", 1)
    oColumn.Name = "PX_BID"
    oColumn.Ordinal = ColumnPosition
    oColumn.Flags = 32
    oColumn.Size = 255
    oColumn.DataType = 129
    oColumn.Precision = 0
    oColumn.NumericScale = 0
    oColumn.Nullable = True
    oTransformation.SourceColumns.Add oColumn
    Set oColumn = Nothing

    ' Destination column (SQL Server table)
    Set oColumn = oTransformation.DestinationColumns.New("BidPrice", 1)
    oColumn.Name = "BidPrice"
    oColumn.Ordinal = 4
    oColumn.Flags = 8
    oColumn.Size = 50
    oColumn.DataType = DBDataType.DBTYPE_NUMERIC
    oColumn.Precision = 0
    oColumn.NumericScale = 0
    oColumn.Nullable = True
    oTransformation.DestinationColumns.Add oColumn
    Set oColumn = Nothing

    Set oTransProps = oTransformation.TransformServerProperties
    Set oTransProps = Nothing

    oCustomTask1.Transformations.Add oTransformation
    Set oTransformation = Nothing
End Sub
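One common way to handle this is to swap the copy transformation for an ActiveX Script transformation and map the bad token to NULL in script. A hedged sketch only: the TransformServerProperties names below ("Language", "FunctionEntry", "Text") are assumptions from memory, so verify them by saving a working package as a VB file before relying on this:

```vb
' Sketch: use a script transform instead of DTSPump.DataPumpTransformCopy
' so "N.A." in the source becomes NULL in the destination.
Set oTransformation = oCustomTask1.Transformations.New("DTSPump.DataPumpTransformScript")
oTransformation.Name = "DTSTransformation__Bidprice"
' ...set up SourceColumns and DestinationColumns exactly as in the copy version...

Set oTransProps = oTransformation.TransformServerProperties
oTransProps("Language").Value = "VBScript"
oTransProps("FunctionEntry").Value = "Main"   ' property name assumed - verify
oTransProps("Text").Value = _
    "Function Main()" & vbCrLf & _
    "    If Trim(DTSSource(""PX_BID"")) = ""N.A."" Then" & vbCrLf & _
    "        DTSDestination(""BidPrice"") = Null" & vbCrLf & _
    "    Else" & vbCrLf & _
    "        DTSDestination(""BidPrice"") = CDbl(DTSSource(""PX_BID""))" & vbCrLf & _
    "    End If" & vbCrLf & _
    "    Main = DTSTransformStat_OK" & vbCrLf & _
    "End Function"
Set oTransProps = Nothing

oCustomTask1.Transformations.Add oTransformation
```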
|
|
|
|
|
Hi friends, this is Selvaraj K. My requirement is that a set of pending records should be sent automatically to the concerned person from SQL Server 2000. The set of records comes from one stored procedure. I hope you catch my point. If anyone knows about this, please suggest, advise, or send an example as soon as possible.
Thanks in advance
Selvaraj.K
|
|
|
|
|
I have a single client application that physically connects to a number of instruments in a lab. These instruments transmit large amounts of data (via RS232 & TCP/IP) which are all stored on our SQL server.
My first question is, performance-wise, will it be better to create a separate "Instrument##MessagesIn" table for every instrument, or to have a common table for all instruments and distinguish between them using an InstrumentID column?
My second question is, what are the advantages of using a separate DB for certain data on a single server? The amount of history data recorded by this system is literally millions of records in tblHistory. Will a second database (on the same server) dedicated to history tables show any performance gain, or perhaps relieve overhead, memory, or CPU on my main database?
Regards
you can't forget something you never knew...
|
|
|
|
|
It's always a bad idea to dynamically create tables. It invariably means creating dynamic SQL to access and modify data in the database, which will cause many more query plans to be cached. Also, it's difficult to know when to drop a given table.
Keeping your history data in a separate database is not really necessary, but you should consider keeping it on a separate disk (or disk array) from the transactional data, so that overhead of long running queries on the history does not impact the I/O performance of the immediate data. Also, you can get locking problems if you don't separate historical and live data: if your history queries tend to read a lot of rows, that can cause updates or inserts to block until the history query's transaction is committed or rolled back. This can also impact any OLTP work which uses table scans (this should be avoided if at all possible) since the cost of the table scan to find a given row is on average the cost of reading half the number of rows in the table.
To do this in a single database requires understanding additional data files and filegroups so that you can place a specific table in a specific file or group of files which will be on a particular disk. It's often simpler to keep it in a completely separate database. This will also generally be easier to migrate to a separate server later, if required.
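For the single-database route, the filegroup placement looks roughly like this. A sketch only: the database name, file path, and history columns are hypothetical; tblHistory is the table named in the question:

```sql
-- Sketch: put history on its own filegroup backed by a file on a separate disk
ALTER DATABASE LabData ADD FILEGROUP HistoryFG
ALTER DATABASE LabData ADD FILE
    (NAME = LabData_History,
     FILENAME = 'E:\Data\LabData_History.ndf',
     SIZE = 500MB)
TO FILEGROUP HistoryFG
GO
-- Create the history table on that filegroup (columns are illustrative)
CREATE TABLE tblHistory (
    HistoryID    int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    InstrumentID int NOT NULL,
    Reading      float NULL,
    ReadingTime  datetime NOT NULL
) ON HistoryFG
```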
|
|
|
|
|
If you are just inserting new records, then it will make little difference whether you have 1 table or many. There are other issues with splitting the data out to separate tables, namely that you will need additional code to determine which table to insert into and new code to handle the insertion. In addition, your history table will need to be populated by new code.
Moving the history table to another server is not worthwhile. If performance does ever begin to drop, you could look at whether having a history table for a time period that can be periodically archived off to the final history table would help.
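The periodic-archive idea could be sketched like this, assuming hypothetical table and column names and a 90-day window. The cutoff is captured once so the INSERT and DELETE agree on exactly which rows move:

```sql
-- Sketch: move rows older than 90 days from the working table into the
-- final history table, inside one transaction so no rows are lost
DECLARE @cutoff datetime
SET @cutoff = DATEADD(day, -90, GETDATE())

BEGIN TRANSACTION

INSERT INTO tblHistory (InstrumentID, Reading, ReadingTime)
SELECT InstrumentID, Reading, ReadingTime
FROM tblRecentHistory
WHERE ReadingTime < @cutoff

DELETE FROM tblRecentHistory
WHERE ReadingTime < @cutoff

COMMIT TRANSACTION
```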
Ian
|
|
|
|
|
Hi,
To improve the performance of operations on a table I want to partition it. Can someone let me know how to do this in SQL Server 2000?
Regards,
Uma
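SQL Server 2000 has no built-in table partitioning syntax (that arrived with SQL Server 2005); the usual 2000 technique is a partitioned view: one member table per range, each with a CHECK constraint on the partitioning column, combined with UNION ALL. A sketch with hypothetical tables partitioned by year:

```sql
-- Member tables: the CHECK constraints must not overlap, and the
-- partitioning column must be part of the primary key for the view
-- to be updatable
CREATE TABLE Orders2005 (
    OrderID   int      NOT NULL,
    OrderDate datetime NOT NULL
        CHECK (OrderDate >= '20050101' AND OrderDate < '20060101'),
    Amount    money    NULL,
    CONSTRAINT PK_Orders2005 PRIMARY KEY (OrderID, OrderDate)
)
CREATE TABLE Orders2006 (
    OrderID   int      NOT NULL,
    OrderDate datetime NOT NULL
        CHECK (OrderDate >= '20060101' AND OrderDate < '20070101'),
    Amount    money    NULL,
    CONSTRAINT PK_Orders2006 PRIMARY KEY (OrderID, OrderDate)
)
GO
CREATE VIEW Orders AS
SELECT OrderID, OrderDate, Amount FROM Orders2005
UNION ALL
SELECT OrderID, OrderDate, Amount FROM Orders2006
```

Queries against the view that filter on OrderDate let the optimizer touch only the member tables whose CHECK constraints can match.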
|
|
|
|
|
|
Hi
I want to create a partitioned table. Can you give me the syntax for it?
Regards,
Uma
|
|
|
|
|
|
I would like to take the field values of a row in a DataTable and copy them to a new row in the same DataTable. Below is the code I used to do this.
Dim newrow As DataRow = Me.MyDataSet.MyDataTable.Rows(0)
Me.MyDataSet.MyDataTable.Rows.Add(newrow)
An exception is thrown saying this row already belongs to this table.
How can I accomplish this?
Thanks
|
|
|
|
|
Why on earth would you want to?
It is already there!
Steve
|
|
|
|
|
I have no idea why you would want to do this, but it can be accomplished with the ImportRow method (among other ways).
Me.MyDataSet.MyDataTable.ImportRow(newrow)
Why do you want to duplicate a row in the same datatable?
|
|
|
|
|
The data table contains patient information. One row, however, contains default values for any new patients. The user can edit any existing patient in the data table. When the user clicks on New the values from the default row are used to populate the new row with default values.
The primary key field cannot be included in the ImportRow. I think an error will occur when updating back to the database. Can this field be excluded?
|
|
|
|
|
Dim defaultRow As DataRow = Me.MyDataSet.MyDataTable.Rows(0)
Dim newRow As DataRow = Me.MyDataSet.MyDataTable.NewRow()
'Set newRow values for all necessary columns
newRow.Item("MyColumn") = defaultRow.Item("MyColumn")
Me.MyDataSet.MyDataTable.Rows.Add(newRow)
I haven't tested this, but it should work. I would think, however, that it would be easier to simply specify default values for your columns in the dataset, or even at the database level.
|
|
|
|
|
There are hundreds of fields that are used for default values, so your last idea would be too lengthy.
ImportRow does not work because the table has a primary key field and that field cannot be imported. I tried unsuccessfully to remove that field from the DataRow.
Any other ideas?
|
|
|
|
|
You could try passing the values from the defaultRow to the newRow using the ItemArray property, and then change the value of the primary key with the Item property.
newRow.ItemArray = defaultRow.ItemArray
newRow.Item("PrimaryKey") = NewPrimaryKeyValue
|
|
|
|
|
Since the field is a unique primary key, VB might not allow you to assign a value to that field. But I will try this. Thanks.
|
|
|
|
|
I'm not sure what you mean. A primary key field needs to be assigned a value just like any other field.
|
|
|
|
|
The error I get is this:
Column 'MyTableid' is constrained to be unique
No matter what value I assign to the primary key field, even if the value I assign is unique to the table, this error is thrown.
|
|
|
|
|
Why don't you just set the columns in each row to have default values?
When you create your table, you can set the columns to have default values!
Then, if you do not change or provide a value for a particular field, the default value is inserted automatically. No need to copy a default row at all...
Steve
-- modified at 13:12 Tuesday 22nd August, 2006
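In a DataTable this suggestion maps to the DataColumn.DefaultValue property. A sketch only, reusing the names from the earlier posts: since there are hundreds of fields, the defaults can be copied from the existing default row in a loop rather than coded one by one:

```vb
' Sketch: give columns default values so NewRow() picks them up
Dim table As DataTable = Me.MyDataSet.MyDataTable

' Copy the defaults from the existing "default" row (row 0) into each
' column's DefaultValue, skipping the unique primary key column
Dim defaultRow As DataRow = table.Rows(0)
For Each col As DataColumn In table.Columns
    If Not col.Unique Then
        col.DefaultValue = defaultRow(col)
    End If
Next

' New rows now start out populated with the defaults; the primary key
' still needs a unique value before the row is added
Dim newRow As DataRow = table.NewRow()
newRow.Item("MyTableid") = NewPrimaryKeyValue
table.Rows.Add(newRow)
```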
|
|
|
|
|
That sounds good. But the table has a few hundred fields. Is there any way to set the columns to have default values without a lot of coding?
|
|
|
|