This article showcases my AMS.ADO class library, which contains a set of classes for executing database commands without the typical connection management code. The classes are implemented in two separate assemblies -- one for .NET 2.0 (to take advantage of generics), the other for .NET 1.1 -- and are available for the four main providers: SQL Server, OLE DB, ODBC, and Oracle. Enjoy!
When you want to run a database query or stored procedure, you typically follow the same set of steps every time:
- Create a command object and pass it the SQL for the query as well as a database connection object.
- If it's a stored procedure, set the CommandType to StoredProcedure.
- Set any parameters required for the query.
- Open the connection to the database.
- Execute the query and retrieve the records, if necessary.
- Close the database connection.
The code in C# typically looks like this:
using System.Data.SqlClient;
...
using (SqlConnection conn = new SqlConnection(connectionString))
{
SqlCommand command = new SqlCommand("spUpdateDescription", conn);
command.CommandType = CommandType.StoredProcedure;
command.Parameters.Add("@ID", id);
command.Parameters.Add("@Description", description);
conn.Open();
command.ExecuteNonQuery();
}
or in VB.NET 1.1:
Imports System.Data.SqlClient
...
Dim conn As New SqlConnection(connectionString)
Dim command As New SqlCommand("spUpdateDescription", conn)
command.CommandType = CommandType.StoredProcedure
command.Parameters.Add("@ID", id)
command.Parameters.Add("@Description", description)
Try
conn.Open()
command.ExecuteNonQuery()
Finally
conn.Dispose()
End Try
This logic needs to be repeated pretty much everywhere a query or stored procedure is executed. The reason is that the connection needs to be carefully managed so that it's opened and used only for the time required to access the database. Since this is a manual process, it's possible to inadvertently forget to close a connection after it's been used, which may cause the pool of connections to eventually reach its limit.
My solution to these issues is to wrap the four main IDbCommand classes (SqlCommand, OleDbCommand, OdbcCommand, and OracleCommand) into two sets of classes, which handle the connection management details behind the scenes for me.
The two classes are called SQL and StoredProcedure. They reduce the above six steps down to three:
- Create the SQL object and pass it the query (or stored procedure) as well as a database connection string or object.
- Set any parameters required for the query.
- Execute the query and retrieve the records, if necessary.
So, the above code looks like this in C#:
using AMS.ADO.SqlClient;
...
StoredProcedure sp = new
StoredProcedure("spUpdateDescription", connectionString);
sp.Parameters.Add("@ID", id);
sp.Parameters.Add("@Description", description);
sp.ExecuteNonQuery();
or in VB.NET:
Imports AMS.ADO.SqlClient
...
Dim sp As New StoredProcedure("spUpdateDescription", connectionString)
sp.Parameters.Add("@ID", id)
sp.Parameters.Add("@Description", description)
sp.ExecuteNonQuery()
The same logic is now in five clear and simple lines of code, instead of the ten (or twelve for VB.NET 1.1) that mixed database access and connection management code together. What a difference!
And it's not just the reduction in code, it's also not having to worry about leaving a connection open inadvertently. As you probably guessed, my ExecuteNonQuery creates and opens the connection, calls the real ExecuteNonQuery, and then closes the connection before returning the results. So, all that repetitive code is now where it should be: hidden away.
The above example is a simple (but realistic) case of writing to the database via a stored procedure. Since I pass a connection string to the constructor, the class creates and maintains the connection internally. If I had passed a connection object instead, it would have left it in the same open/closed state as it found it before the call to ExecuteNonQuery.
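For illustration, here's a minimal sketch of that second case, assuming a constructor overload that accepts an existing SqlConnection (as the behavior described above implies):
using AMS.ADO.SqlClient;
using System.Data.SqlClient;
...
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    // The connection object is supplied by the caller, so the class
    // leaves it in whatever open/closed state it found it in.
    StoredProcedure sp = new StoredProcedure("spUpdateDescription", conn);
    sp.Parameters.Add("@ID", id);
    sp.Parameters.Add("@Description", description);
    sp.ExecuteNonQuery();

    // conn is still open here, just as it was before ExecuteNonQuery
}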
The StoredProcedure class derives most of its functionality from the SQL class. It's designed to eliminate the extra call to set the CommandType to StoredProcedure and to make it easy to search for stored procedure calls inside the code. The SQL class is designed for executing general SQL statements or queries (CRUD) against the database.
Let's look at another example, this time using the SQL class to execute a SELECT query in C#:
using AMS.ADO.SqlClient;
using System.Data.SqlClient;
...
using (SQL sql = new SQL("SELECT Description FROM" +
" SomeTable WHERE ID = @ID", connectionString))
{
sql.Parameters.Add("@ID", id);
for (SqlDataReader reader = sql.ExecuteReader(); reader.Read(); )
{
string description = reader.GetString(0);
...
}
}
or in VB.NET 1.1:
Imports AMS.ADO.SqlClient
Imports System.Data.SqlClient
...
Dim sql As New SQL("SELECT Description FROM" & _
" SomeTable WHERE ID = @ID", connectionString)
sql.Parameters.Add("@ID", id)
Try
Dim reader As SqlDataReader = sql.ExecuteReader()
While reader.Read()
Dim description As String = reader.GetString(0)
...
End While
Finally
sql.Dispose()
End Try
Did you notice something missing? That's right, the connection object is nowhere to be seen! You only see what you care about: running the query and retrieving the results.
In this case, ExecuteReader creates and opens the connection before calling the real ExecuteReader. Then, when I dispose of the SQL object (or close the reader), the connection gets closed automatically.
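Since ICommand derives from System.Data.IDbCommand (more on that below), the other execute methods are there as well. Here's a minimal sketch using ExecuteScalar, assuming it follows the same open-execute-close pattern:
using AMS.ADO.SqlClient;
...
using (SQL sql = new SQL("SELECT COUNT(*) FROM SomeTable", connectionString))
{
    // The connection is opened just for this call and closed again afterwards
    int count = (int)sql.ExecuteScalar();
    ...
}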
Both classes, SQL and StoredProcedure, are well documented in the downloadable help file (AMS.ADO.chm, zipped), which I created based on the XML comments inside the code (with the NDoc tool).
As I mentioned before, I created a separate set of these classes for the four main data providers available in .NET today. The classes are named the same (SQL and StoredProcedure) but they're distinguished by the namespace they belong to:
- AMS.ADO.SqlClient
- AMS.ADO.OleDb
- AMS.ADO.Odbc
- AMS.ADO.OracleClient
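Since only the namespace differs, switching providers should mostly be a matter of changing the import. A quick sketch, assuming the constructors mirror the SqlClient versions shown earlier (note that the OLE DB and ODBC providers have their own parameter-marker conventions for plain SQL statements):
using AMS.ADO.OleDb;   // instead of AMS.ADO.SqlClient
...
StoredProcedure sp = new StoredProcedure("spUpdateDescription", connectionString);
sp.Parameters.Add("@ID", id);
sp.Parameters.Add("@Description", description);
sp.ExecuteNonQuery();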
.NET 2.0
I initially wrote the code in C# 2.0 so that I could take advantage of generics. The idea was to have a single generic base class for all data providers, since the code would be the same except for the type names. I named my class "Command" (inside the AMS.ADO namespace) and declared it like this:
public class Command<ConnectionClass, TransactionClass, CommandClass,
ParameterClass, ParameterCollectionClass, DataReaderClass,
DataAdapterClass> : ICommand
{
...
}
So now, the same class serves as the base for all four data providers -- SQL Server, OLE DB, ODBC, and Oracle -- and additional providers (such as SQL Server CE) can easily be added in the future. Generics rock!
After I had written the code, I wanted to use something like a typedef that would allow me to define the corresponding classes for all providers in one line of code, sort of like this:
public typedef AMS.ADO.Command<SqlConnection, SqlTransaction,
SqlCommand, SqlParameter, SqlParameterCollection,
SqlDataReader, SqlDataAdapter> SQL;
Unfortunately, C# doesn't have a real typedef equivalent -- the using directive is not the same -- so I was forced to define each class explicitly, along with all the required constructors, since they're not inherited. In short, I had to do this:
public class SQL : AMS.ADO.Command<SqlConnection, SqlTransaction,
SqlCommand, SqlParameter, SqlParameterCollection,
SqlDataReader, SqlDataAdapter>
{
public SQL()
{
}
public SQL(string sql)
{
...
}
public SQL(string sql, string connectionString)
{
...
}
...
}
It wasn't much of a problem, but it's a clear example of how the absence of a language feature can make a significant difference. (In retrospect, I could have written this in C++, but I'm too attached to C#.)
After I had written the code, I discovered Visual Studio 2005's cool new "View Class Diagram" feature (by right-clicking on the project), and decided to generate it for my classes. Here's what ClassDiagram.cd looks like:
As you can see, my ICommand interface derives from System.Data.IDbCommand. I simply added a couple of extra properties and methods that I thought would be nice to have. Other than that, the Command-derived classes look very much like their .NET counterparts, so they're easy to pick up.
.NET 1.1
After I had written and documented the code, I decided to generate the help file from the comments in the code. The only tool I know of that does this is NDoc, which as of this writing does not support generic types in .NET 2.0. I decided to shelve this project to see if a working version of NDoc would be released... but it never came. So, I finally decided to create a version of these classes for .NET 1.1. This would also allow those users who still haven't moved to .NET 2.0 to use these classes. Of course, the downside was that I had to create four new sets of classes and duplicate the code from the Command class directly inside each one. Not a pretty sight, but it worked.
I created a separate Visual Studio .NET 2003 solution inside the NET 1.1 folder and copied the files into it. I kept the same names across the board, for both classes and namespaces. The idea is that when you switch to .NET 2.0, you'll just need to reference the .NET 2.0 assembly and rebuild your project(s). No code changes will be required.
So the help file is based on the .NET 1.1 version, but it's applicable to both assemblies since the names are the same.
Testing
I tested my code using the popular NUnit tool -- csUnit was still not available for .NET 2.0.
I created a "Test" folder where I added "fixtures" for the SqlClient, OleDb, and Odbc classes. They all work with the local SQL Server database using Windows authentication, and automatically create a small database ("testAMSADO") along with a couple of tables and stored procedures.
Since the test source files depend on the nunit.framework assembly, I decided to exclude them from the solutions to eliminate the unnecessary dependency. The files are still there in case anyone's interested. Here's how I had it set up for the two assemblies:
- The .NET 2.0 project built the test files right into the assembly, so I would load AMS.ADO.dll directly into NUnit.
- The .NET 1.1 solution had a separate project (AMS.ADO.Test.csproj) for the test files inside the same Test folder, so I would load AMS.ADO.Test.dll into NUnit.
The tests were great in helping me verify that the code worked as designed. I highly recommend testing low-level code with tools like NUnit.
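For anyone curious, here's a minimal sketch of the general shape of such a fixture; the class name, query, and connection string below are placeholders only, since the real fixtures create the testAMSADO database and its objects themselves:
using NUnit.Framework;
using AMS.ADO.SqlClient;

[TestFixture]
public class SqlClientSketch
{
    // Placeholder connection string: local SQL Server, Windows authentication
    const string ConnectionString =
        "Data Source=(local);Initial Catalog=master;Integrated Security=SSPI";

    [Test]
    public void ExecuteScalarReturnsValue()
    {
        using (SQL sql = new SQL("SELECT 1", ConnectionString))
        {
            // The SQL object opens and closes its own connection
            Assert.AreEqual(1, sql.ExecuteScalar());
        }
    }
}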
The downloadable zip file contains both the .NET 1.1 and 2.0 versions of AMS.ADO.dll (named the same, under different folders), so be sure to use the one appropriate for your project.
- Open your own project inside Visual Studio .NET.
- Inside the Solution Explorer toolbox, right-click on the References folder and select "Add Reference".
- Click on the Browse button and select the AMS.ADO.dll file. The .NET 1.1 version is inside the NET1.1 folder.
- Click OK. You may now use the AMS.ADO namespace inside your project.
If you want to minimize the size of the assembly (although it's only 24K), you can open the corresponding solution inside Visual Studio and exclude the source files that you don't need. For example, if you won't need Oracle or ODBC access, you can right-click on OracleClient.cs and Odbc.cs and select "Exclude From Project". Then, you can remove the reference to System.Data.OracleClient. As an alternative, you may wish to copy the individual .cs files to your own project to avoid adding yet another assembly to your distribution.
- Version 1.0 - Dec 22, 2005