|
While using BULK INSERT to import a CSV file into my SQL Server database, I am running into a problem: the file has a money column with values such as 1,111,111.00.
Since I set FIELDTERMINATOR = ',', the value 1,111,111.00 gets split across several columns, because it contains commas.
Any suggestion to solve this problem?
Thank You.
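For context, BULK INSERT in SQL Server 2000/2005 has no notion of quoted fields, so one common workaround is to re-export the file with a delimiter that never occurs in the data (a pipe, for instance) and point FIELDTERMINATOR at that instead. A sketch, with a hypothetical table name and file path:

```sql
-- Hypothetical example: the file was re-exported as pipe-delimited,
-- so the commas inside 1,111,111.00 no longer act as separators.
BULK INSERT dbo.Payments
FROM 'C:\import\payments.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n'
);
```

Another option is to load the money column into a varchar staging column and strip the commas with REPLACE before converting.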
|
|
|
|
|
I have installed and configured MS SQL 2005 Express. Both the SQL Server and SQL Server Browser services log on as "Local System"; Shared Memory, Named Pipes and all the rest are set correctly.
If I want to connect to database created in the SQL Server Management Studio, everything is OK, no User ID or Password needed, because of the trusted connection.
Then I've added a database in my project and set Windows Authentication in the "Modify Connection...". After running the program, I've got exception number 4060:
Cannot open database "AddressBook" requested by the login. The login failed. Login failed for user '<my system user name>'.
This is the connection string I've used:
SqlConnection myConnection = new SqlConnection(
"server=.\\SQLEXPRESS;" +
"Trusted_Connection=yes;" +
"database=AddressBook;");
The connection test in VS is also OK; only accessing the database (.mdf) programmatically using SqlConnection goes wrong.
Maybe it's something easy to fix, but I tried books, Google and MSDN without any progress, so I'm feeling desperate about this.
|
|
|
|
|
It sounds like you're trying to work with a detached file, one that isn't a permanently configured database. If that's what you're trying to do, you need to specify the filename in the connection string, using the AttachDBFilename keyword.
For more information, see the documentation for SqlConnection.ConnectionString.
|
|
|
|
|
I'm just looking for a bit of education here. I was reviewing an old piece of code and found something like the following:
SELECT NAME NAME
FROM TABLE
Where NAME would be a field name.
This doesn't throw an error and returns a single column of data (just NAME).
I'm just wondering why? And why doesn't it fall over? I had a quick search in Books Online, but (as you can probably see from the thread title) I'm not sure how to phrase that search.
-- modified at 9:16 Tuesday 22nd August, 2006
Just figured it out - second NAME becomes an alias for the first NAME, d'oh!
|
|
|
|
|
Paddy Boyd wrote: I'm just wondering why?
You can create aliases for column names. I tend to write them explicitly so that anyone maintaining the code knows what I was doing. e.g.
SELECT RealColumnName AS AliasName
FROM TableName
The "AS" is optional and can be left out. As you can probably tell, though, the code is much easier to read if the "AS" is left in, because it reduces the ambiguity, especially for someone who was not aware of aliases, or that they can be declared without the "AS".
Why did the guy put "NAME NAME"? No idea, but it just adds to the confusion, doesn't it? Perhaps he wanted to waste the time of those that came after him. Perhaps he was just an idiot. That is something I guess we'll never know.
|
|
|
|
|
Colin Angus Mackay wrote: I tend to write them explicitly so that anyone maintaining the code knows what I was doing.
Mmm. More used to that. All for unambiguity...
Colin Angus Mackay wrote: Why the guy put "NAME NAME"? No idea, but it just adds to the confusion, doesn't it?
I like to give the benefit of the doubt that it was a simple copy paste error.
|
|
|
|
|
Paddy Boyd wrote: second NAME becomes an alias for the first NAME, d'oh!
Just noticed your message modification. Oh, well... I hope it helps others too.
|
|
|
|
|
Hi all,
I have a table with a trigger. This trigger raises an error when certain validation is not met, but I'm not able to catch this error in a stored procedure when I try to insert a record into this table. Need your help. Any ideas on this?
Thanks in advance!
|
|
|
|
|
@@ERROR[^]
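In SQL Server 2000 the usual pattern is to capture @@ERROR on the very next statement after the INSERT, since any successful statement resets it. A sketch with hypothetical table and column names:

```sql
-- Hypothetical sketch, inside the stored procedure:
DECLARE @err int

INSERT INTO dbo.MyTable (SomeColumn)
VALUES ('some value')

-- Capture @@ERROR immediately: any later statement resets it to 0.
SET @err = @@ERROR
IF @err <> 0
BEGIN
    -- The trigger's RAISERROR lands here; handle or re-raise as needed.
    RAISERROR ('Insert rejected by trigger validation.', 16, 1)
    RETURN @err
END
```

(On SQL Server 2005 and later, TRY...CATCH is the cleaner way to catch a trigger's RAISERROR.)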
--EricDV Sig---------
Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them.
- Laurence J. Peters
|
|
|
|
|
Folks,
I have a DTS package which loads a text file into a table in a SQL Server 2000 database. The source column is numeric but occasionally contains "N.A."
The target column is FLOAT and nullable. How do I get DTS to translate "N.A." to NULL?
Code that does the transfer is:-
Public Sub CustomTask_Translate_BidPrice(ByVal oCustomTask1 As Object, ByVal ColumnPosition As Integer)
Dim oTransformation As DTS.Transformation2
Dim oTransProps As DTS.Properties
Dim oColumn As DTS.Column
Set oTransformation = oCustomTask1.Transformations.New("DTSPump.DataPumpTransformCopy")
oTransformation.Name = "DTSTransformation__Bidprice"
oTransformation.TransformFlags = 63
oTransformation.ForceSourceBlobsBuffered = 0
oTransformation.ForceBlobsInMemory = False
oTransformation.InMemoryBlobSize = 1048576
oTransformation.TransformPhases = 4
Set oColumn = oTransformation.SourceColumns.New("PX_BID", 1)
oColumn.Name = "PX_BID"
oColumn.Ordinal = ColumnPosition
oColumn.Flags = 32
oColumn.Size = 255
oColumn.DataType = 129
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True
oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing
Set oColumn = oTransformation.DestinationColumns.New("BidPrice", 1)
oColumn.Name = "BidPrice"
oColumn.Ordinal = 4
oColumn.Flags = 8
oColumn.Size = 50
oColumn.DataType = DBDataType.DBTYPE_NUMERIC
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True
oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing
'oTransformation.
Set oTransProps = oTransformation.TransformServerProperties
Set oTransProps = Nothing
oCustomTask1.Transformations.Add oTransformation
Set oTransformation = Nothing
End Sub
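If reworking the copy transformation is awkward, one alternative is to let DTS land PX_BID as-is in a varchar staging column and do the N.A.-to-NULL translation in T-SQL after the load. A sketch with hypothetical staging and target table names:

```sql
-- Hypothetical staging approach: PX_BID arrives as raw text, and
-- NULLIF maps 'N.A.' to NULL during the move into the real table.
INSERT INTO dbo.Prices (BidPrice)
SELECT CAST(NULLIF(s.PX_BID, 'N.A.') AS float)
FROM dbo.PricesStaging AS s;
```

CAST of a NULL simply yields NULL, so the nullable FLOAT column ends up exactly as wanted.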
|
|
|
|
|
Hi friends. This is Selvaraj K. My requirement is that a set of pending records should be sent automatically to the person concerned, from SQL Server 2000. The set of records comes from one stored procedure. I hope you catch my point. If anyone knows anything about this, please suggest, advise, or send an example as soon as possible.
Thanks in advance
Selvaraj.K
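If the requirement is to e-mail the output of a stored procedure on a schedule, SQL Server 2000 can do this with xp_sendmail (which requires SQL Mail to be configured), driven by a SQL Server Agent job. A sketch with hypothetical names:

```sql
-- Hypothetical sketch: assumes SQL Mail is configured on the server.
-- Put this in a scheduled SQL Server Agent job step.
EXEC master.dbo.xp_sendmail
    @recipients = 'person@example.com',
    @subject    = 'Pending records',
    @query      = 'EXEC MyDatabase.dbo.usp_GetPendingRecords';
```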
|
|
|
|
|
I have a single client application that physically connects to a number of instruments in a lab. These instruments transmit large amounts of data (via RS232 & TCP/IP) which are all stored on our SQL server.
My first question is: performance-wise, will it be better to create a separate "Instrument##MessagesIn" table for every instrument, or to have a common table for all instruments and distinguish between them using an InstrumentID column?
My second question is: what are the advantages of using a separate DB for certain data on a single server? The amount of history data recorded by this system is literally millions of records in tblHistory. Will a second database (on the same server) dedicated to history tables show any performance gain, or maybe relieve overhead, memory or CPU pressure on my main database?
Regards
you can't forget something you never knew...
|
|
|
|
|
It's always a bad idea to dynamically create tables. It invariably means creating dynamic SQL to access and modify data in the database, which will cause many more query plans to be cached. Also, it's difficult to know when to drop a given table.
Keeping your history data in a separate database is not really necessary, but you should consider keeping it on a separate disk (or disk array) from the transactional data, so that overhead of long running queries on the history does not impact the I/O performance of the immediate data. Also, you can get locking problems if you don't separate historical and live data: if your history queries tend to read a lot of rows, that can cause updates or inserts to block until the history query's transaction is committed or rolled back. This can also impact any OLTP work which uses table scans (this should be avoided if at all possible) since the cost of the table scan to find a given row is on average the cost of reading half the number of rows in the table.
To do this in a single database requires understanding additional data files and filegroups so that you can place a specific table in a specific file or group of files which will be on a particular disk. It's often simpler to keep it in a completely separate database. This will also generally be easier to migrate to a separate server later, if required.
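The single-database filegroup approach looks roughly like this (a sketch with hypothetical names, paths and columns; the .ndf file would live on the separate disk):

```sql
-- Hypothetical sketch: add a filegroup backed by a file on another disk,
-- then create the history table on that filegroup.
ALTER DATABASE MyDb ADD FILEGROUP HistoryFG;

ALTER DATABASE MyDb
ADD FILE (
    NAME = 'MyDb_History',
    FILENAME = 'E:\SQLData\MyDb_History.ndf',
    SIZE = 100MB,
    FILEGROWTH = 50MB
) TO FILEGROUP HistoryFG;

CREATE TABLE dbo.tblHistory (
    HistoryID    int IDENTITY(1,1) PRIMARY KEY,
    InstrumentID int NOT NULL,
    Reading      varchar(255) NOT NULL,
    RecordedAt   datetime NOT NULL
) ON HistoryFG;
```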
|
|
|
|
|
If you are just inserting new records then it will make little difference whether you have one table or many. There are other issues with splitting the data out into separate tables, namely that you will need additional code to determine which table to insert into, and new code to handle the insertion. In addition, your history table will need to be populated by new code.
Moving the history table to another server is not worthwhile. If performance does ever begin to drop, you could look at whether keeping a history table for a recent time period, periodically archived off to the final history table, would help.
Ian
|
|
|
|
|
Hi,
To improve the performance of operations on a table I want to partition it. Can someone let me know how to do this in SQL Server 2000?
Regards,
Uma
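For what it's worth, SQL Server 2000 has no built-in table partitioning (that arrived in 2005); the usual 2000-era technique is a partitioned view: several member tables with CHECK constraints on the partitioning column, unioned together. A sketch with hypothetical tables:

```sql
-- Hypothetical sketch of a local partitioned view in SQL Server 2000.
-- Each member table holds one range, enforced by a CHECK constraint.
CREATE TABLE dbo.Orders2005 (
    OrderID   int NOT NULL,
    OrderYear int NOT NULL CHECK (OrderYear = 2005),
    PRIMARY KEY (OrderID, OrderYear)
);
CREATE TABLE dbo.Orders2006 (
    OrderID   int NOT NULL,
    OrderYear int NOT NULL CHECK (OrderYear = 2006),
    PRIMARY KEY (OrderID, OrderYear)
);
GO
CREATE VIEW dbo.Orders AS
SELECT OrderID, OrderYear FROM dbo.Orders2005
UNION ALL
SELECT OrderID, OrderYear FROM dbo.Orders2006;
```

Queries against dbo.Orders that filter on OrderYear let the optimizer skip the member tables whose CHECK constraints rule them out.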
|
|
|
|
|
|
Hi
I want to create a partitioned table. Can you give me the syntax for it?
Regards,
Uma
|
|
|
|
|
|
I would like to take the field values of a row in a DataTable and copy them to a new row in the same DataTable. Below is the code I used to do this.
Dim newrow As DataRow = Me.MyDataSet.MyDataTable.Rows(0)
Me.MyDataSet.MyDataTable.Rows.Add(newrow)
An exception is thrown saying this row already belongs to this table.
How can I accomplish this?
Thanks
|
|
|
|
|
Why on earth would you want to?
It is already there!
Steve
|
|
|
|
|
I have no idea why you would want to do this, but it can be accomplished with the ImportRow method (among other ways).
Me.MyDataSet.MyDataTable.ImportRow(newrow)
Why do you want to duplicate a row in the same datatable?
|
|
|
|
|
The data table contains patient information. One row, however, contains default values for any new patients. The user can edit any existing patient in the data table. When the user clicks on New the values from the default row are used to populate the new row with default values.
The primary key field cannot be included in the ImportRow. I think an error will occur when updating back to the database. Can this field be excluded?
|
|
|
|
|
Dim defaultRow As DataRow = Me.MyDataSet.MyDataTable.Rows(0)
Dim newRow As DataRow = Me.MyDataSet.MyDataTable.NewRow()
'Set newRow values for all necessary columns
newRow.Item("MyColumn") = defaultRow.Item("MyColumn")
Me.MyDataSet.MyDataTable.Rows.Add(newRow)
I haven't tested this, but it should work. I would think, however, that it would be easier to simply specify default values for your columns in the dataset, or even at the database level.
|
|
|
|
|
There are hundreds of fields that are used for default values, so your last idea would be too lengthy.
ImportRow does not work because the table has a primary key field and that field cannot be imported. I tried unsuccessfully to remove that field from the DataRow.
Any other ideas?
|
|
|
|
|
You could try passing the values from defaultRow to newRow using the ItemArray property, and then change the value of the primary key with the Item property.
newRow.ItemArray = defaultRow.ItemArray
newRow.Item("PrimaryKey") = NewPrimaryKeyValue
|
|
|
|