Abstract
In this article, we will drill down into the design of an n-tier architecture, focusing on the data access tier (DAT) (Figure 0). Our goal is to design and implement an appropriate data access tier (DAT) for a scalable, reliable and interoperable web application. I have divided this article into two major parts. In the first part, we will discuss architectural goals, typed DataSets, Event Logs and HttpModules, which are helpful for understanding the second part. In the second part, we will build a small (but fine) n-tier application, with our focus on the design and implementation of the data access tier. You need SQL Server (database: Northwind) and VS.NET to follow this article.
Contents:
1. Designing Goals of a data access tier (DAT)
1.1 Architectural Goals
1.2 Installation of the prototype
1.3 Typed DataSets
1.4 Event Logs
1.5 HttpModules
2.0 Implementing a versatile data access tier (DAT)
2.1 Implementing a super class for the data access tier
2.2 A performance test to evaluate SqlDataReader and DataSet objects in the business tier
2.3 How can I extend this data access tier (DAT) to access multiple databases?
2.4 How can I apply database transactions with different isolation levels?
2.5 How can I insert and retrieve images?
Figure 0 shows a typical 3-tier application that is dissected into three major disjoint layers: the data tier (DT), the middle tier and the presentation tier (PT). If you are not familiar with n-tier architecture, I recommend the following articles.
1. Designing an appropriate data access tier (DAT)
1.1 Architectural Goals
The core requirements of an enterprise application can be expressed through the following buzzwords:
- Interoperability and Extensibility
- Scalability and Performance
- Reliability and Manageability
- Security
Let us look at some of these buzzwords more closely: what do they really mean?
Interoperability deals with the ability of a system to communicate with other internal or external systems in a heterogeneous environment. This is a very important issue, because we need information just in time to meet the needs of our customers and business partners. To achieve this goal, our system must be able to offer and consume web services.
To be agile in your business, you sometimes need to extend your application. If we couple a system too tightly to other systems, a change may trigger a "modify avalanche". For example, say you pass data to a peer application via email, representing data tables as comma-separated columns. The recipient opens the email and parses it to extract the information. If we want to add or change a column, we have to modify the depending systems too, because the order of the columns has changed. In order to avoid such modify avalanches, we must couple systems loosely using technologies such as web services (more precisely: XML-based technologies).
Scalability is a measure that describes the ability of a system to increase performance by adding additional system resources. A system applies the following strategies to be scalable:
- It must be able to allocate new resources to satisfy increasing user requests.
- It must use algorithms and data structures that do not occupy resources for a long time.
Scalability is a very important issue, because you can invest in a system with the confidence that the system will grow with the business.
Reliability of a software system can be defined as the probability that the system will perform its task without failure. To achieve higher reliability, a system must provide utilities to track possible errors.
Figure 1 shows a partial ER diagram of the Northwind database.
So, now we have great goals. In order to achieve them, we have to lay a proper groundwork. Our data access tier (DAT) must be designed to accomplish the following tasks:
- It must enable us to interact with a set of data sources and to coordinate transactions among them.
- It must be flexible enough to use different data access technologies (SqlDataReader and DataSet), because this issue affects the scalability and performance of your system.
- It must support the presentation tier (PT) in displaying coherent data tables. Most interrelated tables of a relational database are modeled according to a master-detail (parent-child) relationship. In our example database (see Fig 1), you can find some of these pairs: Customers-Orders and Orders-Order Details. There is always a need in the presentation tier (PT) to drill down from the master table to expose the details. Figure 2 displays the master-detail relationship Customer-Order.
- It must provide an infrastructure to trace and analyze errors.
Figure 2 shows a screen shot of the demo web application DAPrototype
1.2 Installation of the prototype
Before we move further, I recommend that you install the sample application DAPrototype, so that you can refer to it from time to time. The solution DAPrototype.sln consists of the following class libraries (see Fig 3).
Assembly | Description
BLT | Assembly for the business logic tier
DAT | Assembly for the data access tier
HttpTraceModule | Contains an HttpModule to trace errors in the presentation layer
TDS | Contains typed DataSets which serve as a vehicle to transport data between the tiers
Figure 3
Please follow these instructions to install the sample application:
- Download the file DAPrototype.zip.
- Create a new ASP.NET C# web application DAPrototype and replace its files with the contents of the downloaded archive.
- Open and execute the SQL script DAPrototype/SqlScript/DAPrototype.sql in order to create the stored procedures on the Northwind database, which are part of our data tier (DT).
- In order to create the Windows Event Logs for the presentation, business and data access tiers, open the C# project DAProtype/ELCreator/ELCreator.csproj separately (see 1.4). Please adjust the machine name there and execute it. For security reasons, we can't integrate this creational functionality into our web application: the Windows user ASPNET does not have sufficient rights to create Windows resources.
- Adjust the configuration values for the keys in DAPrototype/Web.config, especially the keys LocalConnection, daMachine, blMachine and plMachine (see Fig 4).
<appSettings>
  <add key="LocalConnection"
       value="server=localhost;database=Northwind;uid=sa;pwd=moses;pooling=true;Max Pool Size=100;"/>
  <add key="daMachine" value="ABRAHAM"/>
  <add key="daLog" value="DALog"/>
  <add key="daSource" value="DASource"/>
  <add key="blMachine" value="ABRAHAM"/>
  <add key="blLog" value="BLLog"/>
  <add key="blSource" value="BLSource"/>
  <add key="plMachine" value="ABRAHAM"/>
  <add key="plLog" value="PLLog"/>
  <add key="plSource" value="PLSource"/>
  <add key="consumeTaxRate" value="0.16"/>
</appSettings>
Figure 4
- Set the page DAPrototype/SerchCustomOrders.aspx as the start page.
I hope that you can now run the application.
1.3 Typed DataSets
A DataSet is an in-memory database that relies on the relational data model. DataSets are predestined for cross-tier communication. Like every relational data model, a DataSet has data tables, foreign key constraints and relations between tables. DataSets are self-sufficient, because they must be independent of the feeding data sources. Furthermore, DataSets are also able to deal with hierarchical data structures such as an XML DOM by treating XML elements as tables and XML attributes as table attributes. Let us look at an example (see Fig 5). The XML file below lists some states of Germany and their cities.
<?xml version="1.0" encoding="utf-8"?>
<Germany>
  <State Name="Bayern">
    <City Name="Munich"/>
    <City Name="Nürnberg"/>
  </State>
  <State Name="NRW">
    <City Name="Düsseldorf"/>
    <City Name="Bonn"/>
  </State>
</Germany>
XML File
XSD file for Corresponding DataSet
Figure 5
In this case, the DataSet "Germany" needs two tables, State and City, and a data relation to represent this XML document. We don't want to dig further here, because it is beyond our focus.
Let us look at a typical data access scenario. Say you are using one of the overloaded SqlDataAdapter.Fill methods to populate an (untyped) DataSet from a database. The SqlDataAdapter.Fill method takes a DataSet and the name of a DataTable as parameters. DataTables and DataColumns are only created in the DataSet if they don't already exist there; in that case, the adapter retrieves the required schema from the data source (SqlDataAdapter.FillSchema can also be used to load the schema explicitly), after which DataTables and DataColumns are created to fit the loaded data, as the sketch below illustrates.
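As a minimal sketch of this untyped pattern (assuming the LocalConnection key from Figure 4 and the stored procedure SPSelCustomers from Figure 6; the namespaces System.Data, System.Data.SqlClient and System.Configuration are required):
// Minimal untyped-DataSet sketch; connection string and SPSelCustomers
// are assumptions taken from Figures 4 and 6.
SqlConnection dbConnection = new SqlConnection(ConfigurationSettings.AppSettings["LocalConnection"]);
SqlCommand dbCommand = new SqlCommand("SPSelCustomers", dbConnection);
dbCommand.CommandType = CommandType.StoredProcedure;
dbCommand.Parameters.Add("@CustomerID", SqlDbType.VarChar, 5).Value = "A";
SqlDataAdapter dbAdapter = new SqlDataAdapter(dbCommand);
DataSet dsCustomers = new DataSet();
// Fill() opens and closes the connection implicitly and creates the
// table "Customers" with its columns on the fly.
dbAdapter.Fill(dsCustomers, "Customers");
// Untyped access: a typo in a table or column name is only caught at runtime.
foreach(DataRow drCustomer in dsCustomers.Tables["Customers"].Rows)
{
    string strCompany = (string) drCustomer["CompanyName"];
    // ... use strCompany ...
}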
Untyped DataSets offer a great deal of plasticity at runtime, but they are error-prone, because developers must anticipate the tables and their attributes while they code. In the worst case, developers must trace back from the business layer (BL) to the stored procedures to find out the content of a DataSet. Additionally, it is not always possible to trace back; such a case arises when you receive a DataSet from a third party via a web service and it is not well documented. Most errors concerning DataSets (e.g. type mismatch errors) can only be trapped at runtime. You can wipe out this sort of problem using typed DataSets.
Typed DataSets are subclasses of the DataSet class with built-in, hard-coded DataTables and DataRows that represent a specific relational data model. Metaphorically speaking, untyped DataSets are like T-shirts, while a typed DataSet can be compared to a made-to-measure suit.
Compared to untyped DataSets, typed DataSets have the following advantages:
- It is easy to read and write source code, because the hard-coded DataTables and DataRows expose their attributes and types at coding time. If you use an untyped DataSet, you have to search for them explicitly in the DataTable and DataRow collections and you must know their types.
- Type mismatch errors can be discovered at compile time.
- Access to the DataTables and DataRows of a typed DataSet is slightly faster, because the access is determined at compile time and you don't need to search the DataTable and DataRow collections at runtime.
Now we would like to show you how to create a typed DataSet with the help of Visual Studio. We will create a typed DataSet DSCustomer, in which we can store data about customers and their orders. We will use the stored procedures SPSelCustomers and SPSelOrders (see Fig 6) to fetch data from the master-detail tables Customers and Orders.
CREATE PROCEDURE SPSelCustomers
(
@CustomerID varchar(5)
)
AS
SELECT
C.CustomerID,C.CompanyName,C.Address,C.City
FROM Customers C
WHERE C.CustomerID LIKE @CustomerID + '%'
GO
CREATE PROCEDURE SPSelOrders
(
@CustomerID nchar(5)
)
AS
SELECT O.OrderID,O.CustomerID, O.OrderDate
FROM Orders O
WHERE O.CustomerID = @CustomerID
ORDER BY O.OrderDate DESC
GO
Figure 6
In order to create the typed DataSet DSCustomer, we must first design a blueprint. An XSD schema serves this purpose (see Fig 7). Follow these steps to create such a schema (you can refer to the file DAProtype/TDS/DSCustomer.xsd):
- Create a new DataSet file DSCustomer.xsd. (Open the Solution Explorer -> right-click the project -> choose Add -> Add New Item -> select DataSet and name it DSCustomer.xsd.)
- Create the XML elements that represent the built-in DataTables. (Launch the Server Explorer and navigate: Servers -> SqlServer -> Northwind -> Stored Procedures, then drag SPSelCustomers and SPSelOrders and drop them on DSCustomer.xsd.)
- Create a DataRelation (master-detail) between the tables SPSelCustomers and SPSelOrders. (Right-click the table SPSelCustomers -> Add -> New Relation. See Fig 8.)
- Give friendly names to the DataTables and DataRows without changing the underlying schema. (Switch to the XML view and add the following codegen annotations to the file DSCustomer.xsd:
<xs:schema id="DSCustomer" xmlns:codegen="urn:schemas-microsoft-com:xml-msprop" ...>
<xs:element name="SPSelCustomers" codegen:typedName="Customer" codegen:typedPlural="Customers" ...>
<xs:element name="SPSelOrders" codegen:typedName="Order" codegen:typedPlural="Orders" ...>
Now we have renamed the tables SPSelCustomers and SPSelOrders as Customers and Orders and their DataRows as Customer and Order.)
- Generate the class DSCustomer. (Switch back to the DataSet view, right-click -> Generate DataSet.)
Note: If you want to use typed DataSets in a web service, it is important to keep in mind that the XML annotations used to create friendly names won't work with WSDL (Web Service Description Language).
Figure 7: XSD schema for the typed DataSet "DSCustomer"
Figure 8
The following code snippet demonstrates how comfortable it is to write and read source code with the typed DataSet "DSCustomer", because it exposes its built-in DataTables and DataRows as properties.
foreach(DSCustomer.Customer drCustomer in dsCustomer.Customers)
{
    if(drCustomer.CustomerID == "ANTON")
    {
        DSCustomer.Order[] oField = drCustomer.GetOrders();
        foreach(DSCustomer.Order drOrder in oField)
        {
            // process the orders of the customer "ANTON" here
        }
    }
}
1.4 Event Logs
Event Logs are centralized Windows resources for recording important software and hardware events, and they are indispensable for locating error sources. You can easily connect to the event logs of remote computers from your local computer; consequently, event logs are well suited to analyzing an n-tier application which is deployed on more than one machine.
You can use the EventLog class (namespace: System.Diagnostics) to work with event logs. It allows you to write different types of entries (e.g. information, warning, error) that express the severity of the events that occurred. The following code snippet shows the basics of the EventLog class.
string strMachine = "abraham";
string strLogBL = "BLLog";
string strSourceBL = "BLSource";
if(!EventLog.SourceExists(strSourceBL,strMachine))
{
    EventLog.CreateEventSource(strSourceBL,strLogBL,strMachine);
    Console.WriteLine("New log and a source for BL is created");
}
else
{
    Console.WriteLine("Source for BL exists already");
}
EventLog elBL = new EventLog(strLogBL,strMachine,strSourceBL);
elBL.WriteEntry("Test Error",EventLogEntryType.Error);
elBL.WriteEntry("Test Information",EventLogEntryType.Information);
elBL.WriteEntry("Test Warning",EventLogEntryType.Warning);
Of course, you can delete an existing event log using the EventLog.Delete method. Writing to event logs consumes resources (CPU time and disk space), so it is recommended to write only important events.
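As a small counterpart sketch, assuming the names used above, a test source and its log could be removed like this:
// Sketch: remove the test source and log created above
// (static members of System.Diagnostics.EventLog).
if(EventLog.SourceExists("BLSource","abraham"))
{
    EventLog.DeleteEventSource("BLSource","abraham");
}
if(EventLog.Exists("BLLog","abraham"))
{
    EventLog.Delete("BLLog","abraham");
}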
1.5 HttpModules
An HttpModule is a useful instrument for customizing ASP.NET applications. In other words, when the CLR processes a web request from a client, you can use HttpModules to influence every stage of the HTML rendering process. We will see later in detail that request processing in ASP.NET relies on a pipeline model, which enables parallel processing (web garden) and enhances the extensibility and manageability of a web application.
Let us look for a moment at where we can apply HttpModules. In commercial applications, there is always a need to collect data for customer profiling, product marketing, forecasting business trends and so on. You can use HttpModules to collect such data in an elegant way without bloating the application code. They are also applicable to security issues (e.g. modules to prevent web attacks, or for authorization). Furthermore, HttpModules are very flexible (manageable), because you can switch them on or off in the web.config file. We are using an HttpModule in our sample application to record the errors that occur in the presentation layer. We must first understand the ASP.NET pipeline processing model in order to write HttpModules.
Figure 9 sketches the ASP.NET pipeline.
Let us look at what happens on the server when it processes a request. Say a client submits a request for the page Test.aspx.
Step 1:
IIS extracts the suffix of the page name, in our example ".aspx", in order to load an appropriate ISAPI extension. IIS looks in the metabase and loads the extension aspnet_isapi.dll for the suffix ".aspx". (You can manage the extension-loading process by editing the Application Configuration dialog (see Fig 10). You can open this dialog by launching Internet Services Manager -> right-click Default Web Site -> select "Properties" -> select the "Home Directory" tab -> push the "Configuration" button.)
Figure 10
Afterwards, the HttpExtensionProc function, which is provided by aspnet_isapi.dll, is invoked; subsequently, the client request Test.aspx is handed over to the aspnet_wp.exe process. This process serves as the unmanaged host for the common language runtime (CLR), which is why you attach the VS debugger to the aspnet_wp.exe process when you debug ASP.NET applications.
Step 2
Now our request is passed to a System.Web.HttpRuntime object, which marks the entry point of the process pipeline and the frontier of managed code. The HttpRuntime class is responsible for processing the requests, and it funnels incoming requests through the pipeline for further processing. The client request is wrapped into an instance of the HttpContext class and delivered to an appropriate instance of the HttpApplication class. ASP.NET treats every virtual directory as a separate application: the HttpRuntime object examines the client request and uses an instance of the HttpApplicationFactory class to create or find a proper HttpApplication instance.
The HttpContext class serves as the transport medium in the pipeline; therefore, it is accessible at every stage of the pipeline. The HttpContext class has instances of the classes HttpRequest, HttpResponse, HttpSessionState, HttpServerUtility and so forth as intrinsic components. This makes the object self-sufficient on its journey along the process pipeline: it carries information about the request, the user and the session, and it collects the produced HTML code.
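As a small illustration (not taken from the sample application), the intrinsic objects can be reached through HttpContext.Current at almost any stage of the pipeline:
// Illustrative sketch: accessing the intrinsic objects of the current
// request via HttpContext.Current (namespace System.Web).
HttpContext oContext = HttpContext.Current;
string strUrl = oContext.Request.RawUrl;               // the requested URL
string strClient = oContext.Request.UserHostAddress;   // client address
oContext.Response.Write("Processing " + strUrl + " for " + strClient);
if(oContext.Session != null)
{
    oContext.Session["lastUrl"] = strUrl;  // session state travels with the context
}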
Step 3
The HttpApplication class is responsible for managing the entire life cycle of the request execution process; therefore, it raises events to mark the pipeline stages (e.g. BeginRequest, PreRequestHandlerExecute, PostRequestHandlerExecute, EndRequest, Error etc.). You can write handlers that subscribe to the events of the HttpApplication class in order to customize the execution process. There are two major ways to implement this.
You can use the Global class directly, which inherits from the HttpApplication class (implemented in the file Global.asax.cs), but if you hard-code the handlers in the Global class (as in the sketch below), you reduce the manageability and flexibility of the application. HttpModules are designed to solve this problem in an elegant manner.
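For contrast, such a hard-coded handler in Global.asax.cs might look like the following sketch (the trace output is only an example):
// Sketch: subscribing to a pipeline event directly in Global.asax.cs.
// The wiring is fixed at compile time, so it cannot simply be switched
// off in web.config like an HttpModule.
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // ASP.NET binds this handler by naming convention to HttpApplication.BeginRequest.
        System.Diagnostics.Trace.WriteLine("Request for " + Request.RawUrl);
    }
}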
Spoken in the language of design patterns, HttpModules can be considered observers of the subject HttpApplication, because they are designed to intercept the events of the HttpApplication class. They implement the interface IHttpModule and are compiled into a separate assembly. We will show you later, in a concrete example, how to write an HttpModule. Let us move to the next step.
Note: An HttpApplication object can be used to process many requests, but it can process only one request at a time, and ASP.NET doesn't treat an HttpApplication object as a singleton, particularly in a web garden. So this class is not appropriate for holding properties that concern the whole application, such as the number of current users.
Step 4.
Now we come to the climax of request processing. In this phase, the contents of the server-side controls are transformed into HTML code and saved into the instance of the HttpContext class; this scenario can be described by the abstract factory pattern. Let us look at how this procedure is realized.
At first, the instance of the HttpApplication class selects an appropriate HttpHandlerFactory object, which implements the interface IHttpHandlerFactory. In our Test.aspx example, the HttpApplication object looks in the file machine.config (C:\WINNT\Microsoft.NET\Framework\v1.0.3705\CONFIG) under the section <httpHandlers> and, because of the suffix ".aspx", chooses the handler factory System.Web.UI.PageHandlerFactory.
In order to get an appropriate instance of the System.Web.UI.Page class, which implements the interface IHttpHandler, the HttpApplication object invokes the method PageHandlerFactory.GetHandler. The PageHandlerFactory first looks for the requested URL Test.aspx in the file web.config under the section <httpHandlers> to find out whether an HttpHandler (page) is specified there. If it finds a specification, it creates the specified HttpHandler and gives it to the HttpApplication object. Otherwise, PageHandlerFactory presumes that an HttpHandler (page) with the name Test exists, creates the Test page and returns it to the HttpApplication object.
Finally, the HttpRuntime object invokes the method IHttpHandler.ProcessRequest(HttpContext context), in our example Test.ProcessRequest(HttpContext context), to load the content of the server-side controls into the instance of the HttpContext class; afterwards the request travels back through the HttpModules, if they implement any handlers for the post-request execution events.
Now let us look at how to implement an HttpModule. As mentioned before, HttpModules implement the interface IHttpModule. An instance of the HttpApplication class (more precisely: the class Global) is responsible for instantiating HttpModules. The IHttpModule interface exposes two contract methods.
interface IHttpModule
{
void Init(HttpApplication httpApplication);
void Dispose();
}
The Init method is normally used to subscribe to events of the HttpApplication, and the Dispose method is the place to release unmanaged resources.
We are using an HttpModule, HttpTraceModule (see Fig 11), in our prototype DAPrototype to record errors that occur in the presentation layer (PL). For that purpose we intercept the HttpApplication.EndRequest event and retrieve the initial error of the presentation layer (PL) via the HttpContext.Error property. To use this facility, you must add the thrown exception to the current HttpContext using the AddError(Exception e) method.
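For example, a page in the presentation layer could hand an exception over to the module like this (a sketch; the button handler and its logic are hypothetical):
// Sketch: code-behind of a page that reports an exception to HttpTraceModule.
private void btnSearch_Click(object sender, System.EventArgs e)
{
    try
    {
        // ... call into the business logic tier here ...
    }
    catch(Exception oException)
    {
        // HttpContext.Error now returns this exception, so HttpTraceModule
        // can log it in its Application_EndRequest handler.
        Context.AddError(oException);
    }
}
The module that picks this exception up is listed in Figure 11: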
using System;
using System.Web;
using System.Diagnostics;
using System.Configuration;
using System.Text;
namespace HttpTraceModule
{
public class HttpTraceModule : IHttpModule
{
EventLog elPL;
public HttpTraceModule()
{
string strLog = ConfigurationSettings.AppSettings["plLog"];
string strSource = ConfigurationSettings.AppSettings["plSource"];
string strMachine = ConfigurationSettings.AppSettings["plMachine"];
if(EventLog.SourceExists(strSource,strMachine))
{
elPL = new EventLog(strLog,strMachine,strSource);
}
}
public void Init(HttpApplication httpApplication)
{
httpApplication.BeginRequest += new EventHandler(Application_BeginRequest);
httpApplication.EndRequest += new EventHandler(Application_EndRequest);
}
public void Dispose()
{
}
private void WriteError(HttpContext oContext,Exception oException)
{
StringBuilder oBuilder = new StringBuilder();
oBuilder.Append("An Error took place, while attempting to request the URL: ");
oBuilder.Append(oContext.Request.RawUrl);
oBuilder.Append("**");
oBuilder.Append("Client Address:");
oBuilder.Append(oContext.Request.UserHostAddress);
oBuilder.Append("**");
if( oException != null)
{
oBuilder.Append( oException.Message);
}
if(elPL != null)
{
elPL.WriteEntry(oBuilder.ToString(),EventLogEntryType.Error);
}
}
private void Application_EndRequest(object source, EventArgs e)
{
HttpApplication hApplication = (HttpApplication) source;
HttpContext oContext = hApplication.Context;
Exception oException = oContext.Error;
if(oException != null)
{
this.WriteError(oContext,oException);
}
}
private void Application_BeginRequest(object source, EventArgs e)
{
HttpApplication hApplication = (HttpApplication) source;
HttpContext oContext = hApplication.Context;
}
}
}
Figure 11
We have compiled this module into the assembly HttpTraceModule.dll. You must add the following entry to the web.config file to activate this module.
<httpModules>
  <add name="HttpTraceModule" type="HttpTraceModule.HttpTraceModule,HttpTraceModule"/>
</httpModules>
2.0 Implementing a versatile data access tier (DAT)
Now we are going to implement a data access tier that satisfies all the expectations we outlined in section 1.1. First, we will design a super class.
2.1 Implementing a super class for the data access tier (DAT)
We now want to design a super class for our data access tier (DAT) which is capable of taking over the regular data access routines from its derived classes. First we must analyze the diverse data access contexts, which can be categorized as:
- Selecting data rows using DataSets
- Retrieving data using a SqlDataReader
- Updating and deleting tables directly using SqlCommand (ExecuteNonQuery)
Let us analyze these data access processes one after another.
Selecting data rows using a DataSet
Step 1:
Create a SqlConnection object for the database connection and a SqlCommand object for the stored procedure "strSP".
SqlConnection dbConnection_L = new SqlConnection(strLocalServer);
SqlCommand dbCommand_L = new SqlCommand (strSP,dbConnection_L);
dbCommand_L.CommandType = CommandType.StoredProcedure;
Step 2:
Create an instance of the SqlDataAdapter class and associate it with the SqlCommand object dbCommand_L.
SqlDataAdapter dbAdapter_L = new SqlDataAdapter();
dbAdapter_L.SelectCommand = dbCommand_L;
Step 3:
Add the query parameters to the
SqlCommand object:
SqlParameter oParameterIn = new SqlParameter("@ParaIn",SqlDbType.Int,4);
oParameterIn.Direction = ParameterDirection.Input;
oParameterIn.Value = 7;
dbCommand_L.Parameters.Add(oParameterIn);
Step 4:
Open the database connection: dbConnection_L.Open(); We don't actually need to open the connection explicitly here, because the dbAdapter_L object opens it implicitly in step 5.
Step 5:
Fill the DataSet:
DataSet dsOut = new DataSet();
string strTable = "Customer";
dbAdapter_L.Fill(dsOut,strTable);
Step 6:
Close the database connection: dbConnection_L.Close();
Retrieving data using a SqlDataReader
This type of data access differs only a little from the previous one. We must leave out step 2 and modify step 5.
Step 1:
Create a connection object for the database and an instance of SqlCommand for the stored procedure "strSP".
SqlConnection dbConnection_L = new SqlConnection(strLocalServer);
SqlCommand dbCommand_L = new SqlCommand (strSP,dbConnection_L);
dbCommand_L.CommandType = CommandType.StoredProcedure;
Step 2:
We don't need this step here.
Step 3:
Add the query parameters to the
SqlCommand object:
SqlParameter oParameterIn = new SqlParameter("@ParaIn",SqlDbType.Int,4);
oParameterIn.Direction = ParameterDirection.Input;
oParameterIn.Value = 7;
dbCommand_L.Parameters.Add(oParameterIn);
Step 4:
Open the database connection : dbConnection_L.Open();
Step 5:
Obtain the reader by executing the SqlCommand.ExecuteReader method:
SqlDataReader oReader = dbCommand_L.ExecuteReader(CommandBehavior.CloseConnection);
The enumeration value CommandBehavior.CloseConnection is passed to the dbCommand_L.ExecuteReader method so that the database connection is closed automatically when the SqlDataReader is closed.
Step 6:
Because of CommandBehavior.CloseConnection, we don't need to close the connection explicitly here.
Updating and deleting tables (executing non-queries)
This type of data access differs slightly from the previous processes. Let us look at it step by step.
Step 1:
Create a connection object for the database and an instance of the SqlCommand class for the stored procedure "strSP".
SqlConnection dbConnection_L = new SqlConnection(strLocalServer);
SqlCommand dbCommand_L = new SqlCommand (strSP,dbConnection_L);
dbCommand_L.CommandType = CommandType.StoredProcedure;
Step 2:
We don't require this step here, because we don't need an instance of the SqlDataAdapter class in this process.
Step 3:
Add the input and output parameters to the SqlCommand object:
SqlParameter oParameterIn = new SqlParameter("@ParaIn",SqlDbType.Int,4);
oParameterIn.Direction = ParameterDirection.Input;
oParameterIn.Value = 7;
dbCommand_L.Parameters.Add(oParameterIn);
SqlParameter oParameterOut = new SqlParameter("@ParaOut",SqlDbType.Int,4);
oParameterOut.Direction = ParameterDirection.Output;
dbCommand_L.Parameters.Add(oParameterOut);
Step 4:
Open the database connection explicitly: dbConnection_L.Open();
Step 5:
Execute the non-query and retrieve the output parameter value:
int nAffected = dbCommand_L.ExecuteNonQuery();
int nOut = (int) oParameterOut.Value;
Step 6:
Close the database connection: dbConnection_L.Close();
As you can see, we have ideal conditions here for designing a super class. First of all, we encapsulate the important data access objects as protected members in the base class DABasis (please refer in parallel to Figure 12 and the source code in the file DAPrototype/DAT/DABaisis.cs).
protected SqlConnection dbConnection_L;
protected SqlCommand dbCommand_L;
protected SqlDataAdapter dbAdapter_L;
protected SqlTransaction dbTransaction_L;
We use the suffix "_L" here to express that these objects deal with the local SQL Server. This notation enhances clarity if our DAT deals with many databases.
We are now going to implement the functionality of step 1 in the DABasis.Prepair_L(string strSP) method. It creates a connection to the local database and an instance of the SqlCommand class for a given stored procedure with the name strSP.
protected virtual void Prepair_L(string strSP)
{
try
{
dbConnection_L = new SqlConnection(strLocalServer);
dbCommand_L = new SqlCommand (strSP,dbConnection_L);
dbCommand_L.CommandType = CommandType.StoredProcedure;
}
catch(Exception oException)
{
string strMessage = "Occurred in Prepair_L()";
ErrorLog(strMessage,oException);
}
}
Figure 12 shows the class diagram of the data access tier (DAT).
We don't implement the functionality of step 2 in the super class, because this step is not common to all data access methods. The functionality of step 3 is implemented in the derived classes, but the super class supports this step with a parameter factory.
The parameter factory is designed according to the abstract factory pattern and supplies us with several types of SqlParameters, in Input and Output variations, which are used to set and retrieve parameter values. Furthermore, the super class DABasis supplies a virtual void AddParameter_L(object oParameter) method to add an instance of the SqlParameter class to the dbCommand_L object. This makes your code compact and neat; a sketch of both follows below.
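The concrete factory classes live in the DAT assembly; the following is only a sketch of how such an input-parameter factory and AddParameter_L could look. The names pmFactory_In, GetPMInt4 and GetPMChar5 are taken from the calling code shown later; the class name PMFactoryIn and the access modifiers are assumptions.
// Sketch of an input-parameter factory; further factory methods
// (GetPMChar15, GetPMImage, an output-parameter factory) follow the same pattern.
public class PMFactoryIn
{
    public SqlParameter GetPMInt4(string strName)
    {
        SqlParameter oParameter = new SqlParameter(strName, SqlDbType.Int, 4);
        oParameter.Direction = ParameterDirection.Input;
        return oParameter;
    }

    public SqlParameter GetPMChar5(string strName)
    {
        SqlParameter oParameter = new SqlParameter(strName, SqlDbType.Char, 5);
        oParameter.Direction = ParameterDirection.Input;
        return oParameter;
    }
}

// In DABasis, AddParameter_L simply forwards the parameter to dbCommand_L.
protected virtual void AddParameter_L(object oParameter)
{
    dbCommand_L.Parameters.Add((SqlParameter) oParameter);
}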
The super class DABasis supports step 4 with the void Open_L() method, which opens the connection to the local server, and the complementary step 6 is covered by the void Close_L() method.
The functionality of step 5 is implemented in the methods void GetDataSet_L(DataSet dsOut,string strTable) and int ExcecuteNonQuery_L() (a sketch of these helpers is shown below). Now it is time to look at how we can apply these methods in different database access contexts.
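Based on the step-by-step patterns above, these helpers could be sketched as follows (a sketch only; the real implementations are in DAPrototype/DAT/DABaisis.cs and may differ in detail):
// Sketch: helper methods of DABasis, built on the protected members
// dbConnection_L, dbCommand_L and dbAdapter_L shown above.
protected virtual void GetDataSet_L(DataSet dsOut, string strTable)
{
    // The derived class has already attached dbCommand_L to dbAdapter_L
    // (see Figure 14); Fill() opens and closes the connection implicitly.
    dbAdapter_L.Fill(dsOut, strTable);
}

protected virtual int ExcecuteNonQuery_L()
{
    // The caller is expected to have opened the connection via Open_L().
    return dbCommand_L.ExecuteNonQuery();
}

protected virtual void Open_L()
{
    if(dbConnection_L.State != ConnectionState.Open)
    {
        dbConnection_L.Open();
    }
}

protected virtual void Close_L()
{
    if(dbConnection_L != null && dbConnection_L.State != ConnectionState.Closed)
    {
        dbConnection_L.Close();
    }
}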
In the first practical example, we want to retrieve the order details for an order using a DataSet object. The workflow for the implementation can be sketched through the following steps:
- Write a stored procedure "SPSelOrderDetail" to access the data in the database (see Figure 13).
- Create a typed DataSet "DSOrderDetail" for the stored procedure (see how in 1.3).
- Write a data access method in the data access tier (DAT): void DAOrder.GetOrderDetail(int nOrderID,out DSOrderDetail dsOrderDetail) (see Figure 14).
CREATE PROCEDURE SPSelOrderDetail
(
@OrderID int
)
AS
SELECT OD.ProductID, OD.UnitPrice,OD.Quantity, P.ProductName
FROM [Order Details] OD INNER JOIN Products P
ON OD.ProductID = P.ProductID
WHERE OD.OrderID =@OrderID
GO
Figure 13
public void GetOrderDetail(int nOrderID,out DSOrderDetail dsOrderDetail)
{
dsOrderDetail = new DSOrderDetail();
try
{
Monitor.Enter(this);
Prepair_L("SPSelOrderDetail");
dbAdapter_L = new SqlDataAdapter();
dbAdapter_L.SelectCommand = dbCommand_L;
SqlParameter pmOrderID = pmFactory_In.GetPMInt4("@OrderID");
pmOrderID.Value = nOrderID;
AddParameter_L(pmOrderID);
string strTable = dsOrderDetail.OrderDetails.TableName;
GetDataSet_L(dsOrderDetail,strTable);
}
catch(Exception oException)
{
string strError;
strError = "An Error Occured in DAOrder:GetOrderDetailOrders";
this.ErrorLog(strError,oException);
}
finally
{
this.Close_L();
Monitor.Exit(this);
}
}
Figure 14
Note: In order to protect the intrinsic data access objects from concurrent threads, we use the Monitor.Enter(this) method and its complement Monitor.Exit(this) in the public void GetOrderDetail(int nOrderID,out DSOrderDetail dsOrderDetail) method. Moreover, we always use a try-catch-finally block in the data access tier (DAT): we perform the kernel operation in the try block, we write possible errors to the event logs in the catch block, and we release the resources used in the try block in the finally block.
In the second practical example, we want to demonstrate the int DABasis.ExcecuteNonQuery method, which can be used to update and insert into the database directly. If you implement the presentation layer (PL) with ASP.NET, it is more efficient to update directly against the database than to use one of the SqlDataAdapter.Update methods on a DataSet object, because you would have to store the instance of the particular DataSet in session memory across the page round trip; consequently, that would adversely affect scalability and performance. In this example, we will update the Quantity attribute of the Order Details table (see Figure 15).
CREATE PROCEDURE SPUpOrderDetail
(
@OrderID int,
@ProductID int,
@Quantity int
)
AS
UPDATE [Order Details]
SET Quantity = @Quantity
WHERE ProductID=@ProductID AND OrderID=@OrderId
Figure 15
public int UpdateOrderDetail(int nOrderID,int nProductID,int nQuantity)
{
int nAffected = 0;
try
{
Monitor.Enter(this);
Prepair_L("SPUpOrderDetail");
SqlParameter pmOrderID = pmFactory_In.GetPMInt4("@OrderID");
pmOrderID.Value = nOrderID;
AddParameter_L(pmOrderID);
SqlParameter pmProductID = pmFactory_In.GetPMInt4("@ProductID");
pmProductID.Value = nProductID;
AddParameter_L(pmProductID);
SqlParameter pmQuantity = pmFactory_In.GetPMInt4("@Quantity");
pmQuantity.Value = nQuantity;
AddParameter_L(pmQuantity);
this.Open_L();
nAffected = this.ExcecuteNonQuery();
}
catch(Exception oException)
{
string strError;
strError = "An Error Occured in DAOrder:UpdateOrderDetail";
this.ErrorLog(strError,oException);
}
finally
{
this.Close_L();
Monitor.Exit(this);
}
return nAffected;
}
Figure 16
The stored procedure "SPUpOrderDetail" acts in the data tier (DT) to accomplish the update task. The int DAOrder.UpdateOrderDetail method (see Fig 16) receives the values from the business logic tier (BLT) and creates parameters for them; the created parameters are then added to the SqlCommand object with the help of the inherited DABasis.AddParameter_L method. Afterwards, we use the DABasis.ExcecuteNonQuery() method to update the database directly.
In the third example, we will use a SqlDataReader object to fetch data, which is an alternative to using DataSet objects. The functionality of this example is the same as in example 1. The method public SqlDataReader DAOrder.GetOrderDetail(int nOrderID) retrieves the order details of an order and uses the stored procedure "SPSelOrderDetail" (Figure 13) in the data tier (DT).
public SqlDataReader GetOrderDetail(int nOrderID)
{
SqlDataReader oReader = null;
try
{
Monitor.Enter(this);
Prepair_L("SPSelOrderDetail");
SqlParameter pmOrderID = pmFactory_In.GetPMInt4("@OrderID");
pmOrderID.Value = nOrderID;
AddParameter_L(pmOrderID);
this.Open_L();
oReader = dbCommand_L.ExecuteReader(CommandBehavior.CloseConnection);
}
catch(Exception oException)
{
this.Close_L();
Monitor.Exit(this);
string strError;
strError = "An Error Occured in DAOrder:GetOrderDetailOrders(DR)";
this.ErrorLog(strError,oException);
}
return oReader;
}
Figure 17
Unlike a DataSet, a SqlDataReader holds the database connection open and retrieves the data rows one after another from the database into the application; afterwards, you must close the database connection explicitly. You can use a SqlDataReader when you need to read a large amount of data quickly and there is no need for an in-memory database. In multi-tier applications I prefer DataSets to SqlDataReaders, because there is usually a need to calculate some business metrics in the middle tier before the retrieved data is visualized in the presentation layer (PL). In that case, SqlDataReaders are not faster than DataSets. Let me confirm the previous statement with a performance test.
2.2 A performance test to evaluate SqlDataReader and DataSet objects
in the business tier.
In this experiment, we are going to implement a typical data retrieval scenario of a business application using both a SqlDataReader and a DataSet, and then put both implementations under the same well-defined stress conditions. We can simulate multi-user requests using the Microsoft Web Application Stress Tool and measure metrics such as hits and requests per second.
I think it is appropriate to consider the following data access scenario as an average case in a commercial web application: we want to display the invoice list for an order. For the sake of simplicity we leave bonuses, product-category-dependent tax rates and so forth out of the calculation. We will calculate the invoice total of an order using the simplified formula

Total = ConsumeTaxRate * Sum(UnitPrice * Quantity)

where UnitPrice and Quantity come from the Order Details table (see Figure 1) and ConsumeTaxRate is the configured factor from web.config (0.16).
2.2.1 DataSet Implementation
We will use the DAOrder.GetOrderDetail(int nOrderID,out DSOrderDetail dsOrderDetail) method (see Figure 14) to retrieve the data from the database. In order to calculate the total sum, we extend the OrderDetails table of the DSOrderDetail DataSet with a decimal attribute Price (see the source file TDS/DSOrderDetail.xsd).
public void GetOrderDetail(int nOrderID,out DSOrderDetail dsOrderDetail, out decimal dTotal)
{
dsOrderDetail = null;
dTotal = 0;
try
{
DAOrder daOrder = new DAOrder();
daOrder.GetOrderDetail(nOrderID,out dsOrderDetail);
dsOrderDetail.OrderDetails.Columns["Price"].Expression = "Quantity*UnitPrice";
object oSum = dsOrderDetail.OrderDetails.Compute("Sum(Price)","Quantity>0");
decimal dSum = Convert.ToDecimal(oSum);
dTotal = dSum*dConsumeTaxRate;
}
catch(Exception oException)
{
string strMessage = "Error in BLOrderDetail:GetOrderDetail";
ErrorLog(strMessage,oException);
}
}
Figure 18 (method BLOrderDetail.GetOrderDetail)
In the BLOrderDetail.GetOrderDetail method (Figure 18), we first retrieve the typed DataSet "DSOrderDetail" from the database via the data access tier; afterwards we use the expression "Quantity*UnitPrice" to calculate the Price column. Now we can determine the total of the invoice list by summing the Price column and multiplying this sum by the factor dConsumeTaxRate. We then expose our invoice list on the DAPrototype/DSTest.aspx page, as sketched below.
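The page itself is not listed in this article; a minimal sketch of how DSTest.aspx could display the result (a DataGrid named dgOrderDetails, a Label named lbTotal, a parameterless BLOrderDetail constructor and the order id 10248 are assumptions) might be:
// Sketch: code-behind of DSTest.aspx binding the typed DataSet to a grid.
private void Page_Load(object sender, System.EventArgs e)
{
    BLOrderDetail blOrderDetail = new BLOrderDetail();
    DSOrderDetail dsOrderDetail;
    decimal dTotal;
    blOrderDetail.GetOrderDetail(10248, out dsOrderDetail, out dTotal);

    dgOrderDetails.DataSource = dsOrderDetail.OrderDetails;
    dgOrderDetails.DataBind();
    lbTotal.Text = dTotal.ToString("F2");
}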
2.2.2 SqlDataReader Implementation
We will use the stored procedure SPSelOrderDetail (Figure 13) and the public SqlDataReader DAOrder.GetOrderDetail(int nOrderID) method (Figure 17) to retrieve the necessary data from the database into the business logic tier (BLT). As you can see in the public SqlDataReader BLOrder.GetOrderDetail(int nOrderID,out decimal dTotal) method (Figure 19), we must retrieve the data from the database twice: the first retrieval is used to calculate the business metric "total", and the second retrieval is passed to the presentation layer to expose the invoice list (DAPrototype/DRTest.aspx).
public SqlDataReader GetOrderDetail(int nOrderID,out decimal dTotal)
{
dTotal = 0;
SqlDataReader oReader = null;
try
{
DAOrder daOrder = new DAOrder();
oReader = daOrder.GetOrderDetail(nOrderID);
while(oReader.Read())
{
decimal dUnitPrice = Convert.ToDecimal(oReader["UnitPrice"]);
int nQuantity = Convert.ToInt16(oReader["Quantity"]);
dTotal += dUnitPrice * nQuantity;
}
dTotal*= dConsumeTaxRate;
oReader = daOrder.GetOrderDetail(nOrderID);
}
catch(Exception oException)
{
string strMessage = "Error in BLOrderDetail:GetOrderDetail(DR)";
ErrorLog(strMessage,oException);
}
return oReader;
}
Figure 19
2.2.3 Performance test results
We can now start the performance test. I am using my home computer for this test: it runs Windows 2000 and has a single 800 MHz processor and 130 MB of RAM. SQL Server 2000, the .NET Framework and the Web Application Stress Tool are installed on the same machine. We will keep the request stress level constant during the test at 200 simultaneous browser connections. The Web Application Stress Tool will send requests to the DataSet implementation (DAPrototype/DSTest.aspx) and the SqlDataReader implementation (DAPrototype/DRTest.aspx) for five minutes each. Here are the results:
DataSet implementation
Total number of requests: 42,093
Average requests per second: 140.31
Average database connections per second: 31.90
SqlDataReader implementation
Total number of requests: 35,663
Average requests per second: 118.88
Average database connections per second: 173.27
Conclusion:
The DataSet implementation is not only slightly faster than the SqlDataReader implementation, it also uses database connections more efficiently. Furthermore, DataSets enable us to implement business logic more conveniently than DataReaders.
Now let us discuss the remaining aspects in question-and-answer form.
2.3 How can I extend this data access tier (DAT) to access multiple databases?
It is very easy to extend it. Say you want to extend the DAT so that it can work with a database X. The extension workflow is as follows:
- Create protected attributes in the super class DABasis to represent the data access classes from the namespace System.Data.SqlClient:
protected SqlConnection dbConnection_X;
protected SqlCommand dbCommand_X;
protected SqlDataAdapter dbAdapter_X;
protected SqlTransaction dbTransaction_X;
- Create methods which wrap these intrinsic data access objects: Prepair_X, Open_X, BeginTransaction_X, ExcecuteScalar_X, ExcecuteNonQuery_X, GetDataSet_X, ReuseCommand_X (a sketch of Prepair_X follows below). In order to establish the database connection, you must add a key to the web.config file under the section <appSettings>, then read it and assign it to a protected variable in the super class DABasis. That's all; now our data access tier is ready to work with database X.
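Under these assumptions (a new appSettings key, here called connectionX, and the member names above), Prepair_X could be sketched like this:
// Sketch: counterpart of Prepair_L for database X. The key name
// "connectionX" is an assumption; use whatever key you add to web.config.
protected string strXServer = ConfigurationSettings.AppSettings["connectionX"];

protected virtual void Prepair_X(string strSP)
{
    try
    {
        dbConnection_X = new SqlConnection(strXServer);
        dbCommand_X = new SqlCommand(strSP, dbConnection_X);
        dbCommand_X.CommandType = CommandType.StoredProcedure;
    }
    catch(Exception oException)
    {
        string strMessage = "Occurred in Prepair_X()";
        ErrorLog(strMessage, oException);
    }
}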
2.4 How can I apply database transactions with different isolation levels?
The DABasis class supports database transactions with the methods protected void BeginTransaction_L(IsolationLevel iLevel), protected void Rollback_L() and protected void Commit_L() (Figure 20). The BeginTransaction_L(IsolationLevel iLevel) method is the starting point of a transaction, and you can start transactions with different isolation levels. The ANSI SQL standard defines four levels of transaction isolation and SQL Server supports all of them.
Let us look briefly at all four isolation levels:
- Read uncommitted (dirty read): lets a transaction read uncommitted as well as committed data from other transactions.
- Read committed: lets a transaction read only committed data from other transactions; SQL Server uses this isolation level by default.
- Repeatable read: ensures that another transaction won't update the data read by your transaction until you have committed or rolled back. But you can't prevent phantoms with this isolation level (another transaction can still add rows that match your transaction's query).
- Serializable: this is the highest isolation level; it ensures that another transaction won't update, delete or insert rows affecting your transaction's data.
protected void BeginTransaction_L(IsolationLevel iLevel)
{
try
{
this.dbTransaction_L = dbConnection_L.BeginTransaction(iLevel);
this.dbCommand_L.Transaction = this.dbTransaction_L;
}
catch(Exception oException)
{
string strMessage = "Occured in BeginTransaction_L()";
ErrorLog(strMessage,oException);
}
}
protected void Rollback_L()
{
try
{
this.dbTransaction_L.Rollback();
}
catch(Exception oException)
{
string strMessage = "Occured in RollBack_L()";
ErrorLog(strMessage,oException);
}
}
protected void Commit_L()
{
try
{
this.dbTransaction_L.Commit();
}
catch(Exception oException)
{
string strMessage = "Occured in Commit_L()";
ErrorLog(strMessage,oException);
}
}
Figure 20
Let us explain how these methods work together through an example. In this example we are going to delete an order and its details. Because of the master-detail relationship constraints, we must first delete the order details and then the order itself. The most efficient way is to use a single stored procedure "SPDelOrders" (see Figure 21) and invoke it with an ExecuteNonQuery call.
CREATE PROCEDURE SPDelOrders
(
@OrderID int
)
AS
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRANSACTION
DELETE [Order Details] WHERE OrderID = @OrderID
DELETE Orders WHERE OrderID = @OrderID
IF @@error > 0
ROLLBACK TRANSACTION
ELSE
COMMIT TRANSACTION
GO
Figure 21
But for the sake of demonstration, we delete an order and its order details in the data access tier. We can achieve this goal by modifying the ExecuteNonQuery pattern. We will use the stored procedures "SPDelODDemo" and "SPDelOrdersDemo" to delete the order details and the order respectively (Figure 22).
CREATE PROCEDURE SPDelODDemo
(
@OrderID int
)
AS
DELETE [Order Details] WHERE OrderID = @OrderID
GO
CREATE PROCEDURE SPDelOrdersDemo
(
@OrderID int
)
AS
DELETE Orders WHERE OrderID = @OrderID
Method of the DAOrder class to delete orders:
public bool DeleteOrder(int nOrderID)
{
bool bSuccess=false;
try
{
Monitor.Enter(this);
Prepair_L("SPDelODDemo");
SqlParameter pmOrderID = pmFactory_In.GetPMInt4("@OrderID");
pmOrderID.Value = nOrderID;
AddParameter_L(pmOrderID);
this.Open_L();
this.BeginTransaction_L(System.Data.IsolationLevel.Serializable);
int nOD = ExcecuteNonQuery_L();
StringBuilder strBuilder = new StringBuilder();
strBuilder.Append(nOD);
strBuilder.Append(" Rows are deleted in [Order Details] with the OrderID ");
strBuilder.Append(nOrderID);
this.WarningLog(strBuilder.ToString());
ReuseCommand_L("SPDelOrdersDemo");
AddParameter_L(pmOrderID);
nOD = ExcecuteNonQuery_L();
strBuilder = new StringBuilder();
strBuilder.Append("an order was deleted with ID ");
strBuilder.Append(nOrderID);
WarningLog(strBuilder.ToString());
this.Commit_L();
bSuccess=true;
}
catch(Exception oException)
{
bSuccess=false;
this.Rollback_L();
string strMessage = "An error occured in DAOrderDetail:DeleteOrder" ;
ErrorLog(strMessage,oException);
}
finally
{
Close_L();
Monitor.Exit(this);
}
return bSuccess;
}
Figure 22
Let us analyze the method public bool DAOrder.DeleteOrder(int nOrderID) step by step.
Step 1.
We initialize the database connection and a SqlCommand object with the stored procedure that deletes the order details. Afterwards we add the query parameters to the SqlCommand instance.
Monitor.Enter(this);
Prepair_L("SPDelODDemo");
SqlParameter pmOrderID = pmFactory_In.GetPMInt4("@OrderID");
pmOrderID.Value = nOrderID;
AddParameter_L(pmOrderID);
Step 2.
Open the database connection, start the transaction using the BeginTransaction_L(System.Data.IsolationLevel.Serializable) method and execute the stored procedure "SPDelODDemo".
this.Open_L();
this.BeginTransaction_L(System.Data.IsolationLevel.Serializable);
int nOD = ExcecuteNonQuery_L();
StringBuilder strBuilder = new StringBuilder();
strBuilder.Append(nOD);
strBuilder.Append(" Rows are deleted in [Order Details] with the OrderID ");
strBuilder.Append(nOrderID);
this.WarningLog(strBuilder.ToString());
Step 3
Reinitialize the SqlCommand object by clearing the previous query parameters, set the stored procedure "SPDelOrdersDemo" and execute it (a sketch of ReuseCommand_L follows after this code).
ReuseCommand_L("SPDelOrdersDemo");
AddParameter_L(pmOrderID);
nOD = ExcecuteNonQuery_L();
strBuilder = new StringBuilder();
strBuilder.Append("an order was deleted with ID ");
strBuilder.Append(nOrderID);
WarningLog(strBuilder.ToString());
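ReuseCommand_L itself is not listed in this article; presumably it clears the parameters and switches the command to the next stored procedure while keeping the open connection and transaction. A sketch:
// Sketch of ReuseCommand_L: reuse dbCommand_L and the open connection
// for another stored procedure inside the same transaction.
protected virtual void ReuseCommand_L(string strSP)
{
    dbCommand_L.Parameters.Clear();
    dbCommand_L.CommandText = strSP;
    dbCommand_L.CommandType = CommandType.StoredProcedure;
    // Keep the current transaction attached, if one has been started.
    if(dbTransaction_L != null)
    {
        dbCommand_L.Transaction = dbTransaction_L;
    }
}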
Step 4
If no exception occurs in the try block, the transaction is committed; otherwise it is rolled back in the catch block.
this.Commit_L();
bSuccess=true;
}
catch(Exception oException)
{
bSuccess=false;
this.Rollback_L();
string strMessage = "An error occured in DAOrderDetail:DeleteOrder";
ErrorLog(strMessage,oException);
}
finally
{
Close_L();
Monitor.Exit(this);
}
2.5 How can I insert
and retrieve images?
Say we have a table "Images" with the following attributes:
PId char(5)
Picture image
We can now use the stored procedures "SPInImage" and "SPSelImage" to insert and retrieve images from the database.
CREATE PROCEDURE SPInImage
(
@PId char(5),
@Picture image
)
AS
IF NOT EXISTS(SELECT * From Images WHERE PId=@PId)
BEGIN
INSERT INTO Images
(PId,
Picture
)
VALUES
(
@PId,
@Picture
)
END
GO
CREATE PROCEDURE SPSelImage
(
@PId char(15)
)
AS
SELECT Picture FROM Images WHERE PId = @PId
GO
Figure 23
Now we can use the methods bool InsertImage(string strPId,byte[] imPicture) and byte[] GetImage(string strPId) to invoke these stored procedures (see Figure 24).
public bool InsertImage(string strPId,byte[] imPicture)
{
bool bInsert = false;
try
{
Monitor.Enter(this);
this.Prepair_L("SPInImage");
SqlParameter pmPId = pmFactory_In.GetPMChar5("@PId");
pmPId.Value = strPId;
this.AddParameter_L(pmPId);
SqlParameter pmPicture = pmFactory_In.GetPMImage("@Picture");
pmPicture.Value = imPicture;
this.AddParameter_L(pmPicture);
this.Open_L();
int nExcequte = ExcecuteNonQuery_L();
if(nExcequte>0)
{
bInsert = true;
}
}
catch(Exception oException)
{
string strError = "An Error Occured in :InsertImage";
ErrorLog(strError,oException);
}
finally
{
this.Close_L();
Monitor.Exit(this);
}
return bInsert;
}
public byte[] GetImage(string strPId)
{
byte[] imPicture = null;
try
{
Monitor.Enter(this);
this.Prepair_L("SPSelImage");
SqlParameter pmPId = pmFactory_In.GetPMChar15("@PId");
pmPId.Value = strPId;
this.AddParameter_L(pmPId);
this.Open_L();
object oScalar = this.ExcecuteScalar_L();
if(oScalar!=null)
{
imPicture = (byte[]) oScalar;
}
}
catch(Exception oException)
{
string strError = "An Error Occured in :GetImage";
ErrorLog(strError,oException);
}
finally
{
this.Close_L();
Monitor.Exit(this);
}
return imPicture;
}
Figure 24
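To round this off, here is a small usage sketch from the presentation layer; the class name DAImage hosting the two methods, the file path and the product id are assumptions:
// Sketch: store an image file and read it back through the methods of Figure 24.
DAImage daImage = new DAImage();

// Read the picture into a byte array (namespace System.IO).
byte[] imPicture;
using(FileStream oStream = new FileStream(@"C:\temp\logo.gif", FileMode.Open, FileAccess.Read))
{
    imPicture = new byte[oStream.Length];
    oStream.Read(imPicture, 0, imPicture.Length);
}
bool bInserted = daImage.InsertImage("P0001", imPicture);

// Retrieve it again and write it to the HTTP response, e.g. in an ASPX page.
byte[] imStored = daImage.GetImage("P0001");
if(imStored != null)
{
    Response.ContentType = "image/gif";
    Response.BinaryWrite(imStored);
}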