Introduction
Working with databases can often be boring. Opening the connection, checking that it is valid, preparing the command, executing it, checking, retrieving and managing the result, and closing the connection are all dull operations that I didn't want to repeat any more. So I built a class that executes them all automatically. And it does much, much more.
Note:
The attached source code contains a solution with two projects. The first project (Database) contains the class I am talking about. The second one (Uso Database) contains the Main routine of a console application showing how to use the class in the first project. In that routine, I use several different connections to show how the Database class works. For this purpose, I browse through a SqlServer database and through a MySql database, both loaded with the standard NorthWind content.
Here you can download the scripts that generate the Northwind content:
- SqlServer Northwind - On this page there is a link to an MSI installer that will install the SQL script under the C:\Sql Server 2000 Sample Database folder.
- MySql Northwind - On this page there is a link to the SQL script.
The SqlServer script is intended for SqlServer 2000, but it also runs well with SqlExpress and with more recent versions (I have SqlExpress 2008, and it runs fine).
If you have a different version of SqlServer, or if you have disabled the Windows security access system, you may have to change the constructor parameters.
In the third part of DatabaseExample.vb you will find access to a Sql Server Compact Edition database engine, which doesn't require any installation, is able to create the required schema without being asked, and manages it with no effort. If you have neither a SqlServer machine nor a MySql machine, you can test this engine directly, stepping over the other two.
In this case, you can see the result both in the console and in the debugger output, as captured from my machine in the attached OutputBuffers.zip file.
Another note: I wrote this class in VB.Net. I prefer VB.Net to C# only because the error reporting is more immediate. If someone wants to translate it into C#, I will appreciate it, and I can verify the translation. If you want me to translate it myself, you will have to wait a long time :-)
Using the code
You can create a connection with the constructor of a derived class:
Using DB As DataBase = New DataBase.SQLServerDatabase(".\SqlExpress", "NorthWind")
Using DB As DataBase = New DataBase.MySqlDatabase("127.0.0.1", "NorthWind", "root", "adminpwd")
No need to open the connection, no need to define anything about the command text or anything else. You can forget all the rest. Closing the connection is also no longer any of your worries: the connection will be closed automatically after a timeout without any command.
The very next statement can be a query request:
Using Result As DataBase.SQLResult = DB.Execute("Select * From Employees")
    For Each DR As DataRow In Result.DataTable.Rows
        [...]
    Next DR
End Using
This will check whether the connection exists. If it does not, it will be created. Then it will be opened. Next, the command text, a new data adapter and a new dataset are created, and the dataset is finally filled. The result is put into a SQLResult object and an auto-close timer starts. The SQLResult object knows the dataset of the last select command, as well as the datatable of that dataset - in case the dataset contains only one table or the command is a schema request. If the command specifies that the result must be a datareader, the SQLResult object will contain that datareader. It also knows the count of rows involved by the last operation (when a datareader is not used), as well as the potential error.
Furthermore, there is a function that returns a string for debugging (AsString()), and a method that shows a messagebox in case of errors. This method has optional parameters to choose the buttons to be shown, a title, and a user-defined message for the messagebox, which can also include the SQL that raised the error and/or the stack location.
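To make this concrete, here is how those helpers might be called; the optional parameter names (Buttons, Title, ShowSql) are assumptions based on the description above, not the exact signatures:

Using Result As DataBase.SQLResult = DB.Execute("Select * From Employees")
    Debug.Print(Result.AsString())          ' dump the result as a string for debugging
    If Result.HasError Then
        ' hypothetical optional arguments: buttons, title and whether to include the SQL
        Result.ShowMsg(Buttons:=MsgBoxStyle.OkOnly, Title:="Query failed", ShowSql:=True)
    End If
End Using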
DONE. Nothing else to do. Easy, right?
The error catching is also very simple:
Using Result As DataBase.SQLResult = DB.Execute("Select * From Employeees")
    If Result.HasError Then
        Result.ShowMsg()
    Else
        For Each DR As DataRow In Result.DataTable.Rows
            [...]
        Next DR
    End If
End Using
Or, in an even simpler way:
Using Result As DataBase.SQLResult = DB.Execute(True, "Select * From Employeees")
    If Not Result.HasError Then
        For Each DR As DataRow In Result.DataTable.Rows
            [...]
        Next DR
    End If
End Using
Here the True parameter (ShowError) causes the same MessageBox to be shown, including the SQL command that raised the error.
Another very simple way to catch errors is to check whether any occurred and warn the user at the end of the statements. In the following example, I simulate the insertion of an order row without a product ID (which raises an error because that column must not be null):
Dim Done As Boolean = True
DB.BeginTransaction()
Done = Done AndAlso Not DB.Execute("Insert Into Orders ({0}) Values ({1})",
New Dictionary(Of String, Object) From {{"OrderID", 11078},
{"CustomerID", "WARTH"},
{"EmployeeID", 5},
{"OrderDate", #7/4/1999#},
{"RequiredDate", #9/6/1999#},
{"ShipName", "Wartian Herkku"},
{"ShipAddress", "Torikatu 38"},
{"ShipCity", "Oulu"},
{"ShipPostalCode", "90110"},
{"ShipCountry", "Finland"}}).HasError
For Each W In {New With {.ID = 11078, .Prod = New Integer?(28),
.Price = 45.6, .Qty = 4, .Scount = 20},
New With {.ID = 11078, .Prod = New Integer?(12),
.Price = 38.0, .Qty = 6, .Scount = 20},
New With {.ID = 11078, .Prod = New Integer?(),
.Price = 32.45, .Qty = 1, .Scount = 20},
New With {.ID = 11078, .Prod = New Integer?(49),
.Price = 20.0, .Qty = 25, .Scount = 20}}
Done = Done AndAlso Not DB.Execute("Insert Into `Order details` ({0}) Values ({1})",
New Dictionary(Of String, Object) From {{"OrderID", W.ID},
{"ProductID", W.Prod},
{"UnitPrice", W.Price},
{"Quantity", W.Qty},
{"Discount", W.Scount}}).HasError
Next W
If Done Then
    DB.Commit()
Else
    DB.RollBack()
    MsgBox("Unable to insert order" & vbCrLf & vbCrLf &
           DB.LastException.Message & vbCrLf & vbCrLf & DB.LastSqlOnException)
End If
The Done AndAlso forces the program to execute each SQL command only if there were no errors in the previous commands. The third order row, lacking the product ID (which is mandatory in the Northwind database), will cause an exception. The following row will not be inserted! At the end of the loop, the connection will roll back the order insert commands, and the message will show the cause of the exception and the SQL that raised the error.
If the constructor is called with the parameter Debugging:=True, every operation will be traced in the debugger window. For example, the statement X.Execute("Select * From Employees") will print the following lines in the debugger window:
17:41:04.832
Initialization
17:41:05.753
Connection Regenerated and opened
17:41:05.767
* * * Execute SelectString * * *
Command text:
Select * From dbo.Employees
17:41:05.888 (0,121)
Table (9 Rows)
-----------------------------------------------------------------------------------------------
EmployeeID|LastName |FirstName|Title |TitleOfCourtesy|BirthDate
Int32 |String |String |String |String |DateTime
-----------------------------------------------------------------------------------------------
1|Davolio |Nancy |Sales Representative |Ms. |12/08/1948 00:00:00.000
2|Fuller |Andrew |Vice President, Sales |Dr. |02/04/1952 00:00:00.000
3|Leverling|Janet |Sales Representative |Ms. |08/07/1963 00:00:00.000
4|Peacock |Margaret |Sales Representative |Mrs. |09/11/1937 00:00:00.000
5|Buchanan |Steven |Sales Manager |Mr. |03/04/1955 00:00:00.000
6|Suyama |Michael |Sales Representative |Mr. |07/02/1963 00:00:00.000
7|King |Robert |Sales Representative |Mr. |05/03/1960 00:00:00.000
8|Callahan |Laura |Inside Sales Coordinator|Ms. |01/09/1958 00:00:00.000
9|Dodsworth|Anne |Sales Representative |Ms. |10/01/1966 00:00:00.000
Connection closed
You can perform all kinds of queries with the same method (.Execute). It will generally recognize the kind of call from the first word of the CommandText. For example, if the command string starts with Select, the result will be put into a DataSet. If it starts with one of the following: Create, Insert, Update, Delete, Drop, Alter, Use or Set, it will be considered an ExecutionNoQuery, and the returned result is the number of rows affected by the command - if the connection supports that; I found some DB engines that return -1 in all cases :-(
If the command string contains a space, it will be considered a stored procedure; in all other cases, it will be considered the name of a table, and a command like Select * From TheMentionedTable will be executed. In case of ambiguity, you can specify how the method should behave in the Execute call:
DB.Execute(DataBase.CallType.Schema, "Tables", {Nothing, "dbo"})
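For illustration, here are a few calls showing how the first word drives the behaviour (a sketch only; the table and stored procedure names are the standard Northwind ones, and whether arguments can be passed inline to a stored procedure this way is an assumption):

DB.Execute("Update Employees Set Title = 'Sales Rep' Where EmployeeID = 1") ' keyword: treated as a non-query, returns rows affected
DB.Execute("Employees")              ' no space: treated as a table name, i.e. Select * From Employees
DB.Execute("CustOrderHist 'ALFKI'")  ' contains a space: treated as a stored procedure call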
Besides the constructors, there is another way to create an instance of the derived classes. For example, you can parametrize both the DB access data and the engine to use. With the Build method of the base class, you can choose the engine through a parameter.
For Each DBParameter In {New With {.DBType = DataBase.ConnectorTypeEnum.MySql,
.Server = "127.0.0.1", .DBName = "northwind",
.UID = "root", .Pwd = "adminpwd"},
New With {.DBType = DataBase.ConnectorTypeEnum.SqlServer,
.Server = ".\SqlExpress", .DBName = "NorthWind",
.UID = CStr(Nothing), .Pwd = CStr(Nothing)},
New With {.DBType = DataBase.ConnectorTypeEnum.SqlCompact,
.Server = MyFolder & "\DB.SclCEDB", .DBName = "",
.UID = "", .Pwd = "MyDataBasePwd"}}
    Using DB = DataBase.Build(DBParameter.DBType, DBParameter.Server,
                              DBParameter.DBName, DBParameter.UID, DBParameter.Pwd, Debugging:=True)
        DB.Execute("Select Count(*) from {0}",
                   DB.NamedColumn(If(DBParameter.DBType = DataBase.ConnectorTypeEnum.SqlCompact, "Users", "order details")))
    End Using
Next DBParameter
Implementing a different Connector
You can implement all kinds of new connectors by inheriting the main class. The only things to do are to inherit the class, write code like the examples for the connectors I already provide, add the new class description to the ConnectorTypeEnum, and manage the configuration to enable and disable that new class (and the old ones) in the project where you will use the DataBase class itself (either with or without the new connector).
Because the Build method uses reflection, there is a mandatory rule for the new inherited class implementation: the ConnectorTypeEnum has to get a new enum value, and the new inherited class MUST have the name combining that new value and the suffix «Database» (for example, for the MySql connector, I created the enum value ConnectorTypeEnum.MySql and named the implemented class MySqlDatabase).
I put different connectors into different files, so the same files can be used in projects that use different connectors, by enabling or disabling the references to those files.
In order to implement different connectors, you will have to inherit the Database class in a new file and implement all the MustOverride methods, with particular attention to the constructors - where the connection string and all variables used to manage the parameters are set - and the AddParameters method: remember to use the correct parameter management for the command object. You can use the included derived classes as examples. I put five different implementations into the uploaded project: SqlCompactEdition, SqlServer, MySql, Oracle, and Odbc. The last one is the most versatile, in that it provides some properties that can be used with all DB engines. You will appreciate that if you have to manage something in a non-standard way: the prefix and postfix of parameters and the strict object naming become read/write instead of read-only, and there is a way to implement a different escape-string management. The SqlCompactEdition class shows a very different way of managing the database: its RegenerateConnection method tries to open the database and, if it fails, the file with the database content is created or regenerated (depending on the kind of error). The MySql class also shows a database management that differs from the standard one: its RegenerateConnection method sets the net_write_timeout and net_read_timeout parameters, which should be set in order to avoid some strange fatal errors.
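As a rough guide, a new connector could look like the skeleton below. This is only a sketch: the exact MustOverride signatures are defined by the DataBase base class in the attached source, so the member signatures here are assumptions that merely follow the description above.

' Hypothetical PostgresDatabase connector: the class name combines the new
' ConnectorTypeEnum value (Postgres) with the suffix «Database», as Build() expects.
Public Class PostgresDatabase
    Inherits DataBase

    Public Sub New(Server As String, DBName As String, UID As String, Pwd As String)
        ' Build the provider-specific connection string here and store the
        ' variables used for parameter management.
    End Sub

    Protected Overrides Sub AddParameters(Command As System.Data.IDbCommand,
                                          Parameters As Dictionary(Of String, Object))
        ' Add the parameters using this provider's own parameter handling.
    End Sub

    Protected Overrides Sub RegenerateConnection()
        ' Open (or re-create) the provider-specific connection,
        ' as the included connectors do.
    End Sub
End Class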
After that, you will have to create a new configuration through the Edit Configuration window in the Compile menu of Visual Studio, selecting the configuration called <New...>, and manually edit the project file to allow the inclusion or exclusion of that file and to enable or disable the references to the connector DLLs when a different configuration is selected.
This is an example of two different configurations, the first for the Sql and MySql connectors and the second for Oracle. You can find these rows in the project file.
<Choose>
  <When Condition="'$(Configuration)' == 'SqlAndMySQL'">
    <ItemGroup>
      <Reference Include="MySql.Data, Version=6.6.5.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d, processorArchitecture=MSIL">
        <SpecificVersion>False</SpecificVersion>
        <HintPath>C:\Program Files (x86)\MySQL\MySQL Connector Net 6.6.5\Assemblies\v4.0\MySql.Data.dll</HintPath>
      </Reference>
    </ItemGroup>
    <ItemGroup>
      <Compile Include="ODBCDataBase.vb" />
      <Compile Include="MySqlDatabase.vb" />
      <Compile Include="SqlServerDatabase.vb" />
    </ItemGroup>
  </When>
  <When Condition="'$(Configuration)' == 'SqlAndOracle'">
    <ItemGroup>
      <Reference Include="Oracle.DataAccess, Version=2.112.1.0, Culture=neutral, PublicKeyToken=89b483f429c47342, processorArchitecture=x86">
        <SpecificVersion>False</SpecificVersion>
        <HintPath>C:\Program Files\oracle\32\Oracle.DataAccess.dll</HintPath>
      </Reference>
    </ItemGroup>
    <ItemGroup>
      <Compile Include="OracleDatabase.vb" />
      <Compile Include="SqlServerDatabase.vb" />
    </ItemGroup>
  </When>
</Choose>
Background
It all started a long time ago, when I had two different database engines (Informix and SqlServer) that held almost the same data (though saved in tables and columns with different names). I started to write a class to open a connection without knowing the connection type itself. My goal was to have a class that would use the right connector, the right command and the right dataadapter/datareader simply by receiving the connection type as a parameter.
Therefore, I arranged to save the access data into fields and open the connection only when needed. Then a timer starts and, in case of another request, the already opened connection performs the new request. Otherwise, if there isn't any request for more than the specified time, the connection is closed and disposed.
After that, I started to compose the select strings like String.Format does. In my programs I began storing the table names and column names in dictionaries and writing select strings like this:
"Select * From {0} Where {1} = 'Code'", TableDictionary!TableName, ColumnDictionary!Code
A second step was to format all values as the connector wants:
"Select * From {0} Where {1} = '{2}'", TableDictionary!TableName, ColumnDictionary!Code, MyValue
where MyValue could be a string containing any character. SqlServer needs to escape only quote characters by doubling them, while Informix puts an escape character ('\') before them, and doubles any backslash already present in the string.
The next step was to write a PrepareString method that checks all parameters and, when it finds one that is a string, escapes it as the connector wants.
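Just to give an idea, here is a simplified sketch of per-connector escaping (EscapeString is a hypothetical helper, not the actual PrepareString, which handles many more cases):

Private Function EscapeString(Value As String, DoubleQuotes As Boolean) As String
    If DoubleQuotes Then
        ' SqlServer style: double every single quote
        Return Value.Replace("'", "''")
    Else
        ' Informix style: backslash-escape quotes, doubling existing backslashes first
        Return Value.Replace("\", "\\").Replace("'", "\'")
    End If
End Function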
The PrepareString method was not as easy as I thought it would be. If a parameter is a DateTime, different DB engines have different literal representations. I started to put in switches to check the datatypes for those representations. Moreover, I live in Italy, and there is quite a difference in DateTime representations, whether I want to pass the value as a datetime value, or there is a string column whose value has to be set from a datetime (where the format is dd/MM/yyyy instead of MM/dd/yyyy). I started to write a lot of switches (which in due time were cut out). Consider these two statements:
"Insert Into Table (Code, DateTime, Description)
Values (10, '1/3/2001', 'The character ''\'' is a backslash')"
"Insert Into Table (Code, DateTime, Description)
Values (10, '2001-01-03-00-00-00', 'The character \'\\\' is a backslash')"
They are different representations of the same SQL, depending on the connector. I started to write SQL commands in this way:
"Insert Into Table ({0}, {1}, {2}) Values ({3}, '{4}', '{5}')",
"Code", "DateTime", "Description", 10, #1/3/2001#, "The character '\' is a backslash"
That's quite a bit better. The next step was to use parameters, and I found a nice way to use them. Consider this:
Dim Dk As New Dictionary(Of String, Object) From {{"Code", 10},
{"DateTime", #1/3/2001#},
{"Description", "The character '\' is a backslash"}}
Execute("Insert Into Table ({0}) Values ({1})", Dk)
which could also be used in this way:
Dim Dk As New Dictionary(Of String, Object) From {{"Code", 10},
{"DateTime", #1/3/2001#},
{"Description", "The character '\' is a backslash"}}
If Execute("Update Table set {0} Where Code = {1}", Dk, NamedParameter("Code")).RowNumber = 0 Then
Execute("Insert Into Table ({0}) Values ({1})", Dk)
End If
Slowly, all the uses became more and more elegant. But parameters are handled quite differently by each connector, which convinced me to stop using switches and start using different classes.
In addition, I put a few small, uncommon features into the dictionary interpreter. The first one regards nulls: Null values, empty nullable values and DbNull.Value are all interpreted as DbNull values. The second one regards strings: you can put the length of the field in the key of the dictionary; this way, the parser will cut the value down to that length. Thus you can avoid errors or warnings when you try to put strings that are too long into values:
Dim Dk As New Dictionary(Of String, Object) From {{"Code", 10},
{"DateTime", #1/3/2001#},
{"Description(20)", "The character '\' is a backslash"}}
If Execute("Update Table set {0} Where Code = {1}", Dk, NamedParameter("Code")).RowNumber = 0 Then
Execute("Insert Into Table ({0}) Values ({1})", Dk)
End If
The parser cuts the length specification (20) off the key, leaving only Description as the parameter name. The parameter value will be The character '\' is.
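A simplified sketch of the kind of parsing involved (SplitLengthSuffix is a hypothetical helper, not part of the class):

Private Sub SplitLengthSuffix(Key As String, ByRef Name As String, ByRef MaxLength As Integer)
    Dim M = System.Text.RegularExpressions.Regex.Match(Key, "^(?<name>.+?)\((?<len>\d+)\)$")
    If M.Success Then
        Name = M.Groups("name").Value            ' e.g. "Description"
        MaxLength = CInt(M.Groups("len").Value)  ' e.g. 20, used to truncate the value
    Else
        Name = Key
        MaxLength = -1                           ' no truncation requested
    End If
End Sub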
Another feature allows bulk insert and bulk select/delete. If there is a - character in the key name, it is replaced by a _ character in the parameter naming, and the parser, by checking what follows the - character, knows that the bulk operation refers to another row/select. Consider this:
Dim MyDataArray() = {New With {.Code = "One", .Art = "First",
.Descr = "Some description", .DT = Now},
New With {.Code = "Two", .Art = "Another",
.Descr = "Another too much long description that have to be cutted", .DT = Now.AddMonths(3)},
New With {.Code = "Three", .Art = "Last",
.Descr = "Last description", .DT = #3/5/1985 11:37:55 AM#}}
Dim DK As New Dictionary(Of String, Object)
Dim I As Integer = -1, Prefix As String = ""
For Each W In MyDataArray
DK.Add("Cod" & Prefix, W.Code)
DK.Add("Art" & Prefix, W.Art)
DK.Add("Descr(20)" & Prefix, W.Descr)
DK.Add("DT" & Prefix, W.DT)
I += 1
Prefix = I.ToString("\-00")
Next W
DB.Execute("Insert Into MyTable({0}) Values ({1})", DK)
DB.Execute("Delete From MyTable Where {0}", Dk)
which are translated into:
Insert Into MyTable (Cod, Art, Descr, DT)
Values (@Cod, @Art, @Descr, @DT),
(@Cod_00, @Art_00, @Descr_00, @DT_00),
(@Cod_01, @Art_01, @Descr_01, @DT_01)
Delete From MyTable Where ((Cod = @Cod And Art = @Art And Descr = @Descr And DT = @DT) Or
(Cod = @Cod_00 And Art = @Art_00 And Descr = @Descr_00 And DT = @DT_00) Or
(Cod = @Cod_01 And Art = @Art_01 And Descr = @Descr_01 And DT = @DT_01))
i.e. (with the literal dates rendered by the connector):
Insert Into MyTable (Cod, Art, Descr, Dt)
Values ('One', 'First', 'Some description', Now),
('Two', 'Another', 'Another too much lon', Now + 3 Months),
('Three', 'Last', 'Last description', #3/5/1985#)
Delete From MyTable Where ((Cod = 'One' And Art = 'First' And Descr = 'Some description' And DT = Now) Or
(Cod = 'Two' And Art = 'Another' And Descr = 'Another too much lon' And DT = Now + 3 Months) Or
(Cod = 'Three' And Art = 'Last' And Descr = 'Last description' And DT = #3/5/1985#))
Pretty neat.
Later I also started using MySql, Oracle and SqlCompactEdition. It was VERY easy to implement those.
Another feature of the class is a background worker. When there is a deferred request to be executed in a background process, the ExecuteInBackground method stores all the requests in a queue, then checks whether the BackgroundWorker is busy and, if not, feeds the BackgroundWorker with the request at the top of the queue. When that request ends, the process returns to the foreground thread and invokes the method to be called, if any. After that, it checks whether the queue is empty or not, and restarts the cycle.
I used a BackgroundWorker for two reasons. The first one is that I didn't know which was the best way to operate. At first, when a SqlCommand completed, instead of exiting the background thread, a ProgressChanged event was raised, in which the method to call could be invoked on the foreground thread; at the same time, the background thread continued performing the next SqlCommand in the queue. Later (as the BackgroundWorker is designed), when a SqlCommand completes, the background thread exits, the process returns to the foreground thread, the method to be called is invoked, and the queue is checked again.
In the end I opted for the latter design. In this way, the external method can check the database to examine the results of the performed actions. Furthermore, I can add whatever I want to the request queue at any moment (the other way around, the queue had to be locked, because it was analyzed in a background thread while a foreground thread could be feeding it). It is less elegant than using Await, but in this way the background feeding can be more efficient.
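The queuing pattern is roughly the following (a simplified sketch of class members, not the actual implementation; PendingRequests and InvokeCallback are illustrative names):

Private PendingRequests As New Queue(Of String)
Private WithEvents Worker As New System.ComponentModel.BackgroundWorker

Public Sub ExecuteInBackGround(Sql As String)
    PendingRequests.Enqueue(Sql)
    If Not Worker.IsBusy Then Worker.RunWorkerAsync(PendingRequests.Dequeue())
End Sub

Private Sub Worker_DoWork(sender As Object, e As System.ComponentModel.DoWorkEventArgs) Handles Worker.DoWork
    e.Result = Execute(CStr(e.Argument))     ' run the SQL on the background thread
End Sub

Private Sub Worker_Completed(sender As Object, e As System.ComponentModel.RunWorkerCompletedEventArgs) Handles Worker.RunWorkerCompleted
    InvokeCallback(e.Result)                 ' back on the foreground thread: invoke the callback, if any
    If PendingRequests.Count > 0 Then Worker.RunWorkerAsync(PendingRequests.Dequeue())
End Sub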
Consider this example:
DB.BeginWork()
DB.ExecuteInBackGround(ContinueCallBack, ArgumentList, "Create Temp Table MyTable ([...])")
DB.ExecuteInBackGround(ContinueCallBack, ArgumentList, "Insert Into MyTable ([...]) Select [...]")
DB.ExecuteInBackGround(ContinueCallBack, ArgumentList, "Update MyTable Set [...] Where [...]")
DB.ExecuteInBackGround(ContinueCallBack, ArgumentList, "Execute Procedure RegenAll()")
DB.ExecuteInBackGround(ContinueCallBack, ArgumentList, "Update OriginalTable Set [...] Where [...]")
where all the statements could be slow (like an accounting regeneration). The ContinueCallBack method could check for a click of a Cancel button on the UI:
Private Sub CmdCancel_Click(sender As Object, e As EventArgs) Handles CmdCancel.Click
    CancelClicked = True
    DB.BreakCommand()
End Sub

Private Sub ContinueCallBackFunction(SqlResult As Database.SqlResult, Args As Object())
    If CancelClicked Then
        DB.BreakCommand(True)
        DB.RollBack()
        Exit Sub
    End If
    If SqlResult.HasError Then
        [...]
    End If
    UpdateProgressBar()
End Sub
The derived classes also have to implement a Clone method, which is very useful when you have to manage background operations. Every connector can perform only a single operation at a time, so you CANNOT request a background operation and, during that background operation, execute another statement. To avoid that, you can use the Clone method:
Dim DB As DataBase = New SqlServerDatabase(ServerName, DBName, User, Password)
Dim DB1 = DB.Clone()
DB1.ExecuteInBackground([...])
DB.Execute([...])
In this way there are two different connections, one for each instance, and the requests run concurrently. If you use the same connection, one of the two commands will throw an exception, i.e. its SqlResult will contain the exception.
The Clone method is also useful when working with a DataReader. By default, queries are returned in a DataSet, but you can also retrieve data with a DataReader. This can be useful in case of a slow statement (when there are a lot of rows and no usable indexes) and you have to do a lot of work for each row. As a matter of fact, filling a DataSet performs the same operations as using a DataReader, but control returns only when the last row has been retrieved and the DataSet contains all the rows. The DataReader returns control when the first row is retrieved, so if fetching the next row is slow, and processing each row is also slow, you can process the first row while the database server searches for the other rows. In this way performance is optimized. If one, or both, of the operations is quick, the DataSet lets you manage the entire retrieved data at once, for example using LINQ, going back to the first row, or accessing the rows non-sequentially - with a DataReader you would have to perform the query again to read the first row a second time.
Another limitation of the DataReader is that you cannot perform any other operation on the connection until the DataReader is closed. For that purpose you can use the Clone method to perform more operations at the same time:
Using DB As DataBase = New SqlServerDatabase(ServerName, DBName, User, Password)
    Using Result As SqlResult = DB.Execute(CallType.SelectToReader, "Select [...]")
        Using DB1 As DataBase = DB.Clone()
            While Result.DataReader.Read()
                DB1.Execute([...])
            End While
        End Using
    End Using
End Using
There is also a series of debugging tools. When the solution is in debug mode, the DebugMode property makes all requests be printed in the debug window. There is also the possibility of writing a log file, where you will find all the requests after a flush - which is performed every time there is a close request, or when the closingConnectionTimer ticks. While DebugMode is not shared among all connections - you can set it in the constructors, or set it for a specific Database instance - the DebugFileLocation, which sets the StreamWriter of that logging, is shared among all instances to avoid conflicts.
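For example (a sketch only; the member names follow the description above, so treat them as assumptions rather than the exact signatures):

DataBase.DebugFileLocation = "C:\Temp\DatabaseLog.txt"    ' shared among all instances
Using DB As DataBase = New DataBase.SQLServerDatabase(".\SqlExpress", "NorthWind", Debugging:=True)
    DB.Execute("Select * From Employees")                 ' traced in the debug window and flushed to the log
End Using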
Moreover, the Data2String functions are very useful for debugging, and are used by it: you can see the content of a DataReader, of a DataTable or, if you are filtering data with a DataView or with the Select method of the DataTable, of any enumerable of DataRow or DataRowView objects. If you call Data2String on a single row or rowview, it will optionally search the entire rowset of that row, to show the current row index along with the rest of the table. In that way, if you are using a loop to scan the rows of a datatable, you can see the progress of that scan.
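A short usage sketch (whether Data2String is exposed as an instance member of the connection is an assumption here):

Using Result As DataBase.SQLResult = DB.Execute("Select * From Employees")
    Debug.Print(DB.Data2String(Result.DataTable))   ' dump the whole table
    For Each DR As DataRow In Result.DataTable.Rows
        Debug.Print(DB.Data2String(DR))             ' single row, optionally with its index in the table
    Next DR
End Using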