Introduction
Three-tier is a client-server architecture in which the user interface, functional process logic ("business rules"), and data storage and access are developed and maintained as independent modules, most often on separate platforms. In this article, I want to show one of many possible designs for building a robust middle tier that is easy to maintain and easy to modify without affecting the other tiers.
Object-Relational Persistence
An object-relational persistence framework provides basic CRUD (create, retrieve, update, and delete) data services to your business objects. A persistence framework is really not responsible for doing any more than this. If your application has business logic (and I can't imagine a business application development project that doesn't have requirements for business logic), then the persistence framework is not the part of the architecture that handles it.
Coordinating business logic with persistence is the responsibility of a yet higher-order framework; this is what the Microsoft Business Framework is responsible for. The persistence framework's only job is to perform operations on business objects that correspond to the SQL insert, select, update, and delete commands.
A good persistence framework should also coordinate persistence messages between related business objects, coordinate basic transactions, manage optimistic concurrency, and propagate database-generated keys back to the middle tier. Persistence frameworks also provide mechanisms, usually in the form of some sort of metadata, that allow business objects to be mapped to database tables, and business object properties to be mapped to table columns.
A persistence framework has the smarts to understand the structure of your business objects and can infer the SQL command text needed to perform CRUD operations on the underlying durable data store.
Using the code
This article develops a sample object repository that leverages the XML extensions of SQL Server to transform between XML and relational data structures. In the example below, I use SQLXML 3.0 and the Jobs entity of the SQL Server 2000 Pubs database.
First, we prepare the CRUD (create, retrieve, update, delete) stored procedures:
CREATE PROCEDURE dbo.SelectJob
@job_id smallint
AS
SELECT 1 as Tag,
null as Parent,
job_id as [Job!1!JobId!element],
job_desc as [Job!1!JobDesc!element],
min_lvl as [Job!1!MinLvl!element],
max_lvl as [Job!1!MaxLvl!element]
FROM jobs
WHERE job_id = @job_id
FOR XML EXPLICIT
GO
CREATE PROCEDURE dbo.SelectJobs
AS
SELECT 1 as Tag,
null as Parent,
null as [JobCollection!1!],
null as [Job!2!JobId!element],
null as [Job!2!JobDesc!element],
null as [Job!2!MinLvl!element],
null as [Job!2!MaxLvl!element]
UNION ALL
SELECT 2,
1,
null,
job_id,
job_desc,
min_lvl,
max_lvl
FROM jobs
FOR XML EXPLICIT
GO
CREATE PROCEDURE dbo.UpdateJob
@job ntext
AS
BEGIN
DECLARE @idoc int
DECLARE @job_id smallint
EXEC sp_xml_preparedocument @idoc OUTPUT, @job
SELECT @job_id = job_id
FROM OPENXML(@idoc, '/Job')
WITH (JobId smallint './JobId')
UPDATE jobs
SET jobs.job_desc = xml.JobDesc,
jobs.max_lvl = xml.MaxLvl,
jobs.min_lvl = xml.MinLvl
FROM OPENXML (@idoc, '/Job')
WITH (JobDesc varchar(50) './JobDesc',
MaxLvl tinyint './MaxLvl',
MinLvl tinyint './MinLvl') xml
WHERE jobs.job_id = @job_id
EXEC sp_xml_removedocument @idoc
END
GO
CREATE PROCEDURE dbo.InsertJob
@job ntext
AS
BEGIN
DECLARE @idoc int
EXEC sp_xml_preparedocument @idoc OUTPUT, @job
INSERT INTO jobs (job_desc, min_lvl, max_lvl)
SELECT JobDesc, MinLvl, MaxLvl
FROM OPENXML(@idoc, '/Job')
WITH (JobDesc varchar(50) './JobDesc',
MaxLvl tinyint './MaxLvl',
MinLvl tinyint './MinLvl')
EXEC sp_xml_removedocument @idoc
SELECT @@IDENTITY AS id FOR XML RAW
END
GO
CREATE PROCEDURE dbo.DeleteJob
@job_id smallint
AS
DELETE FROM jobs WHERE job_id = @job_id
GO
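For reference, the UpdateJob and InsertJob procedures expect a single serialized Job document whose element names match the WITH clauses of the OPENXML calls above. A call might look like this (the job_id value 14 is just a sample; any existing key would do):

```sql
-- Sample call: pass the serialized Job as an XML document.
-- Element names must match the WITH clause of the OPENXML call.
EXEC dbo.UpdateJob N'<Job>
    <JobId>14</JobId>
    <JobDesc>Designer</JobDesc>
    <MinLvl>25</MinLvl>
    <MaxLvl>100</MaxLvl>
</Job>'
```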
Now, we can create business objects based on the Jobs table of the Pubs database:
using System;
using System.Collections;
namespace Database.Pubs
{
public interface IJob
{
short JobId {get; set;}
string JobDesc {get; set;}
byte MaxLvl {get; set;}
byte MinLvl {get; set;}
EmployeeCollection Employees {get;}
}
[Serializable]
public class Job : IJob
{
private short job_id = -1;
private string job_desc;
private byte min_lvl;
private byte max_lvl;
private EmployeeCollection employees;
public Job() {}
public short JobId
{
get
{
return job_id;
}
set
{
job_id = value;
}
}
[Required]
[Length(50)]
public string JobDesc
{
get
{
return job_desc;
}
set
{
job_desc = value;
}
}
[Eval("[this]>=10")]
[Required]
public byte MinLvl
{
get
{
return min_lvl;
}
set
{
min_lvl = value;
}
}
[Eval("[this]<=250")]
[Required]
public byte MaxLvl
{
get
{
return max_lvl;
}
set
{
max_lvl = value;
}
}
public EmployeeCollection Employees
{
get
{
if (employees == null)
{
EmployeeBiz biz = new EmployeeBiz();
employees = biz.GetEmployees(this.JobId);
}
return employees;
}
}
}
[Serializable]
public class JobCollection : ObjectCollection
{
public JobCollection()
{
}
public void Add(Job job)
{
base.Add(job);
}
public void Delete(int index)
{
base.RemoveAt(index);
}
public Job this[int index]
{
get
{
return (Job)base.List[index];
}
set
{
base.List[index] = value;
}
}
}
}
We have Job and JobCollection classes which are mapped to the Jobs table. I used custom attributes (derived from System.Attribute) to specify validation for each property; these validate a Job instance before we persist it (Add, Update) into the database. You can create your own custom attributes to further describe a class property. In this example I used:
RequiredAttribute - validates that data is present (whether nulls are allowed or not)
EvalAttribute - validates data against a specific mathematical expression, based on the CHECK constraint of the table column
LengthAttribute - validates the allowable length of data
RegexpAttribute - validates data against a specific regular expression, based on the CHECK constraint of the table column
If the data is valid, we continue to update or insert it into the database; otherwise, we throw an error. Validating data entry before you send it to the database is a very important step, and it must be done at the business logic layer. Some applications use client-side validation only and have no server-side validation, which can create a problem. Let me explain why we need validation inside the middle tier: our middle tier does not depend on the client application, whether it is a web application, a web service, or a Windows application. Some clients might not be under our control and will not implement the same validation as our business layer does; for example, web services that you expose to third parties. In this case, to prevent bad data from being sent to our database, we must implement validation in our middle tier.
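The article's download contains the full attribute implementations; here is a minimal sketch of what two of them, together with the EntityValidationRule class used later, might look like. This is an assumption-laden illustration of the reflection technique, not the actual shipped code:

```csharp
using System;
using System.Reflection;

// Hypothetical minimal versions of the validation attributes.
[AttributeUsage(AttributeTargets.Property)]
public class RequiredAttribute : Attribute {}

[AttributeUsage(AttributeTargets.Property)]
public class LengthAttribute : Attribute
{
    private int length;
    public LengthAttribute(int length) { this.length = length; }
    public int Length { get { return length; } }
}

public class EntityValidationRule
{
    // Walks the public properties of an entity and applies each
    // validation attribute it finds; returns false on the first failure.
    public bool Validate(object entity, out string message)
    {
        foreach (PropertyInfo prop in entity.GetType().GetProperties())
        {
            object value = prop.GetValue(entity, null);
            if (prop.IsDefined(typeof(RequiredAttribute), true) && value == null)
            {
                message = prop.Name + " is required";
                return false;
            }
            object[] lengths = prop.GetCustomAttributes(typeof(LengthAttribute), true);
            if (lengths.Length > 0 && value is string &&
                ((string)value).Length > ((LengthAttribute)lengths[0]).Length)
            {
                message = prop.Name + " exceeds the allowed length";
                return false;
            }
        }
        message = null;
        return true;
    }
}
```

EvalAttribute and RegexpAttribute would follow the same pattern, evaluating an expression or regular expression against the property value.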
Also, you can change the class serialization any way you want by specifying the XmlElement attribute; for example, [XmlElement(ElementName = "Job_Id")]. In this case, instead of a JobId element in your serialized XML, you will have a Job_Id element.
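Applied to the JobId property of the Job class, that looks like this:

```csharp
// Serializes the property as <Job_Id> instead of <JobId>.
[XmlElement(ElementName = "Job_Id")]
public short JobId
{
    get { return job_id; }
    set { job_id = value; }
}
```

Note that if you rename elements this way, the element names in the stored procedures' OPENXML WITH clauses must be changed to match.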
OK, now it is time to build the Job data access layer and, later, on top of it, the Job business layer, where I demonstrate how to validate class property values:
using System;
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
using Microsoft.Data.SqlXml;
namespace Database.Pubs
{
public class DbHelper
{
public static readonly string PUBS =
"provider=SQLOLEDB;Database=pubs;" +
"Server=localhost; Integrated Security=SSPI";
public static object DeserializeObject(string root,
System.Type type, XmlReader reader)
{
XmlRootAttribute xmlRoot = new XmlRootAttribute();
xmlRoot.ElementName = root;
XmlSerializer serializer = new XmlSerializer(type, xmlRoot);
return serializer.Deserialize(reader);
}
public static string SerializeObject(object obj, System.Type type)
{
XmlSerializer serializer = new XmlSerializer(type);
StringBuilder sb = new StringBuilder();
TextWriter writer = new StringWriter(sb);
serializer.Serialize(writer, obj);
writer.Close();
return sb.ToString();
}
}
public class JobDB
{
public JobDB() {}
public XmlReader GetJob(short jobId)
{
SqlXmlCommand command = new SqlXmlCommand(DbHelper.PUBS);
command.CommandType = SqlXmlCommandType.Sql;
command.CommandText = "exec SelectJob ?";
command.CreateParameter().Value = jobId;
return command.ExecuteXmlReader();
}
public XmlReader GetJobs()
{
SqlXmlCommand command = new SqlXmlCommand(DbHelper.PUBS);
command.CommandType = SqlXmlCommandType.Sql;
command.CommandText = "exec SelectJobs";
return command.ExecuteXmlReader();
}
public short AddJob(string serializedJob)
{
short jobId = -1;
SqlXmlCommand command = new SqlXmlCommand(DbHelper.PUBS);
command.CommandType = SqlXmlCommandType.Sql;
command.CommandText = "exec InsertJob ?";
command.CreateParameter().Value = serializedJob;
XmlReader reader = command.ExecuteXmlReader();
if (reader.Read())
{
reader.MoveToAttribute("id");
jobId = Convert.ToInt16(reader.Value);
}
if (jobId <= 0)
throw new Exception("Insert operation failed");
return jobId;
}
public void UpdateJob(string serializedJob)
{
SqlXmlCommand command = new SqlXmlCommand(DbHelper.PUBS);
command.CommandType = SqlXmlCommandType.Sql;
command.CommandText = "exec UpdateJob ?";
command.CreateParameter().Value = serializedJob;
command.ExecuteNonQuery();
}
public void DeleteJob(short jobId)
{
SqlXmlCommand command = new SqlXmlCommand(DbHelper.PUBS);
command.CommandType = SqlXmlCommandType.Sql;
command.CommandText = "exec DeleteJob ?";
command.CreateParameter().Value = jobId;
command.ExecuteNonQuery();
}
}
public class JobBiz
{
private JobDB db;
public JobBiz()
{
db = new JobDB();
}
public Job GetJob(short jobId)
{
XmlReader reader = db.GetJob(jobId);
if (!reader.Read())
return null;
Job job = (Job)DbHelper.DeserializeObject("Job", typeof(Job), reader);
reader.Close();
return job;
}
public JobCollection GetJobs()
{
XmlReader reader = db.GetJobs();
JobCollection coll =
(JobCollection)DbHelper.DeserializeObject("JobCollection",
typeof(JobCollection), reader);
reader.Close();
return coll;
}
public short AddJob(Job job)
{
string message;
EntityValidationRule rule = new EntityValidationRule();
if (!rule.Validate(job, out message))
throw new Exception (message);
string serializedJob = DbHelper.SerializeObject(job, typeof(Job));
return db.AddJob(serializedJob);
}
public void UpdateJob(Job job)
{
string message;
EntityValidationRule rule = new EntityValidationRule();
if (!rule.Validate(job, out message))
throw new Exception (message);
string serializedJob = DbHelper.SerializeObject(job, typeof(Job));
db.UpdateJob(serializedJob);
}
public void DeleteJob(short jobId)
{
db.DeleteJob(jobId);
}
}
}
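With the data access and business layers in place, client code can work with Job objects without writing any SQL. A usage sketch (the property values are just samples):

```csharp
JobBiz biz = new JobBiz();

// Create and persist a new job; AddJob validates the object first
// and returns the database-generated key.
Job job = new Job();
job.JobDesc = "Technical Editor";
job.MinLvl = 25;
job.MaxLvl = 100;
job.JobId = biz.AddJob(job);

// Fetch it back, change it, and save the change.
Job saved = biz.GetJob(job.JobId);
saved.MaxLvl = 150;
biz.UpdateJob(saved);

// Clean up.
biz.DeleteJob(saved.JobId);
```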
Let's update our JobCollection class to add some persistence methods. This step is up to you, depending on whether you want to persist data into the database when you add, update, or delete objects in the object collection. I add three methods, AddPersistent, UpdatePersistent, and DeletePersistent, to the JobCollection class:
public class JobCollection : ObjectCollection
{
private JobBiz biz;
public JobCollection()
{
biz = new JobBiz();
}
...
public void DeletePersistent(short jobId)
{
biz.DeleteJob(jobId);
for(int i = 0; i < this.Count; i++)
{
Job _job = this[i];
if (_job.JobId == jobId)
{
base.RemoveAt(i);
break;
}
}
}
public void AddPersistent(Job job)
{
job.JobId = biz.AddJob(job);
base.Add(job);
}
public void UpdatePersistent(Job job)
{
biz.UpdateJob(job);
for(int i = 0; i < this.Count; i++)
{
Job _job = this[i];
if (_job.JobId == job.JobId)
{
this[i] = job;
break;
}
}
}
...
}
Extending the Business Objects
So, shortly after releasing version 1.0 of the product, work on the second version of the application begins, and the customer is waiting with a list of new requirements for the Job objects. To handle these additions, we add a new class, Job2, which is a subclass of the original Job class:
public class Job2 : Job
{
private string additionalInfo;
public Job2() : base() {}
public string AdditionalInfo
{
get
{
return additionalInfo;
}
set
{
additionalInfo = value;
}
}
}
In traditional object-to-relational database access code, these changes would have meant changing the structure of the relational storage, perhaps adding a couple of fields to the Jobs table. (And everyone knows that with some DBAs, this can be like pulling teeth.) However, because XML is used as the transport to the relational database, and the database designer planned ahead by including an overflow field in the initial Jobs table, the object repository can support the new business objects without changing the relational database structures, stored procedures, or object repository code. For example, the code persisting Job2 objects now looks like this:
Job2 job = new Job2();
job.JobDesc = "job desc";
job.AdditionalInfo = "add info";
...
JobCollection coll = new JobCollection();
coll.AddPersistent(job);
This is basically the same code that is used for the original business objects; the only difference is that Job2 objects are explicitly created. In general, the object repository has proven to be a very extensible design. In fact, unless the shape of the XML document generated from the business object hierarchy changes dramatically, the business object developers have the freedom to extend or tweak the objects as much as they like. This tweaking could even be as drastic as removing fields from the Job objects, something that would break most code; in that case, the OpenXML code will simply not find the element in the serialized format and will insert a NULL value instead.
There are a couple of technical details worth noting here. First of all, when Job2 was added as a subclass of the Job class, it was necessary to add the XmlInclude attribute to the Job class:
[XmlInclude(typeof(Job2))]
public class Job
{
...
}
By including this attribute and specifying the new Job2 type, the XmlSerializer can recognize both the base and derived types when serializing and deserializing. For serializing, this is not complicated, because the XmlSerializer can dynamically figure out the type of the instance being serialized. When deserializing, however, the XmlSerializer must create Job2 instances when fetching the XML of persisted Job2 objects from the database. How does the XmlSerializer figure this out? The key lies in examining the overflow content in the database.
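Concretely, when the XmlSerializer serializes a Job2 instance through the base Job type, it records the concrete type in an xsi:type attribute, and that attribute is what lets deserialization recreate a Job2 rather than a Job. Assuming default serializer behavior and the sample values used earlier, the stored XML looks roughly like this:

```xml
<Job xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:type="Job2">
  <JobId>14</JobId>
  <JobDesc>job desc</JobDesc>
  <MinLvl>25</MinLvl>
  <MaxLvl>100</MaxLvl>
  <AdditionalInfo>add info</AdditionalInfo>
</Job>
```

The AdditionalInfo element, which no OPENXML WITH clause knows about, is what ends up in the table's overflow field.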
Conclusion
As with any design, there are definitely some tradeoffs. XML as a transport is clearly not the most performant mechanism for database access; there is a lot of markup being transported over the wire. But as shown, by using XML as the transport to and from the database and OpenXML as the persistence mechanism, one can significantly decouple the database layer from the business logic layer and thereby develop a very flexible and extensible design. In the end, it comes down to the particular requirements of a given application.
You may create your own JobDB and JobBiz classes and change the way they connect to the database.
History