
.NET Data Persistence: SQL Server vs. Matisse vs. FastObjects

10 May 2004
An article with source code examining development speed and performance across SQL Server, Matisse and FastObjects as .NET data stores

Introduction

The source code contains four C# projects: FrontHelpers, TestMatisse, TestSQL and TestVersant. For TestMatisse you must download the Matisse database and .NET bindings. For TestSQL you must download and reference the DAAB version 2. For TestVersant you must download the FastObjects .NET Trial release, compile with Enhancing switched on, and then enable the project (instructions are in the body of the article). The TestVersant project does not ship with the FastObjects.cfg file, the database and dictionary files, or the ObjectScopeProvider class; you will have to generate these yourself according to the instructions below.

Almost every enterprise development project I have done or seen begins with the design and definition of the database. Even when the system was developed using pure OO methodologies, coding began with the database. Recently, whenever I have told my developers to build the enterprise system database-agnostic, they have been flabbergasted. I have literally had to remove the database from their systems to force them to develop without database dependency. If you develop bottom-up, i.e. beginning with the database, major decisions in the Business Layer are always affected. Instead of a system that can port to any data-persistence mechanism, you end up with a system that is dependent on a proprietary data-persistence layer.

Especially in the Microsoft world (but not only there), database-centric development is a symptom of vendors pushing their products. Like it or not, a Microsoft SQL Server bias is built into the .NET Framework. The tools, techniques, data-binding mechanics and serialization abilities of .NET all point toward SQL Server. To my mind, this is a lie embedded in the heart of .NET. The soul of .NET is OO. To state the obvious, SQL Server is predicated upon the relational paradigm. OO and relational are from two different conceptual and implementation planets. I don't know about you, but most of my development time is taken up by coding elaborate Object-Relational Mapping (ORM) code (reams and uninspiring, boring and tedious reams of it). I take comfort from the fact that the largest portion of my Bible (Martin Fowler's Patterns of Enterprise Application Architecture) is dedicated to ORM patterns.

This issue is all the more topical for two main reasons. Firstly, Microsoft is about to launch its latest version of SQL Server, code-named "Yukon". From what I have seen, "Yukon" is very different from SQL Server 2000 (and 7); it will require a significant learning curve to master. Most importantly, if .NET is SQL Server biased, "Yukon" will be .NET biased. In fact, the "Yukon" IDE looks the same as the Visual Studio .NET IDE - the lines between .NET and SQL Server seem to be blurring. The other important reason is that there are serious challengers to SQL Server as the .NET data-repository of choice.

In this article, I want to briefly introduce two of those alternatives: Matisse (which calls itself post-relational) and the newly released FastObjects for .NET from Versant (a pure object database). I will do this by means of a simple example emulating a layered, disconnected enterprise scenario. There will be three projects, each coping with the same problem using a different data-storage technology. I have added simple metrics to allow you to conduct your own performance testing. I admit that the example does not represent a comprehensive performance and stress-testing scenario; my emphasis is a purely developer's perspective, and I don't want to pre-judge the outcome. Essentially, I want you to judge for yourself, as a developer, which data-storage technology is the quickest, easiest and most efficient to use.

The Example

I will be dealing with a Person object, which has just three attributes:

  • Firstname - text
  • Lastname - text
  • Birthdate - date

The application must do the following:

  1. Create - Register as many Person objects as indicated on the UI.
  2. Get - Retrieve all the Person objects in the data-storage and bind them to a ListBox on the UI.
  3. Update - Update all the Person objects bound to the ListBox and persist those updates to the data-storage.

Each of the above functions will be measured in seconds, milliseconds and nano-seconds. The results of the measurements will be written to a text file.

Each example will be built in the same sequence:

  1. Create the Person business object - the Person object will be called PersonBE. PersonBE must have three read/write properties: Firstname, Lastname and BirthDate. PersonBE's ToString() method is overridden as follows (a complete sketch of the class follows this list):
    public class PersonBE
    {
       public override string ToString()
       {
          return this.Firstname + "," + this.Lastname;
       }
    }
    
  2. Create the Data-Storage.
  3. Code the Data Access - all data access code will be in the DataAccess class.
  4. Hook up the UI events.
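For reference, here is a minimal sketch of the plain PersonBE along those lines. The property names are taken from the snippets later in this article; the exact layout in the download may differ slightly.

    using System;

    public class PersonBE
    {
       private string first, last;
       private DateTime birth;

       public string Firstname
       {
          get{return first;}
          set{first = value;}
       }
       public string Lastname
       {
          get{return last;}
          set{last = value;}
       }
       public DateTime BirthDate
       {
          get{return birth;}
          set{birth = value;}
       }

       // Overridden so that data-bound controls such as the ListBox
       // display something meaningful for each Person.
       public override string ToString()
       {
          return this.Firstname + "," + this.Lastname;
       }
    }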

The application has a standard UI.

The UI has the following controls:

  • CREATE Button - generates as many Person objects as indicated in the NumericUpDown control and persists those new objects to the data-store.
  • NumericUpDown - indicates how many new Person objects to generate for persistence. The minimum value is set to 30 and the maximum to 30,000.
  • GET Button - retrieves all the Person objects in the data storage and binds them to the ListBox.
  • UPDATE Button - traverses all the Person objects bound to the ListBox, modifies the Firstname and Lastname attributes and persists the changes to the data-storage.
  • ListBox - displays a list of all Person objects retrieved by clicking the GET button.
  • RichTextBox - displays the metrics file measuring the specific action taken.
  • CheckBox - if checked, indicates a new file must be created; otherwise an existing file is used. On initialization, the CheckBox is checked.

The UI has three event handlers. In all versions the event handlers are more or less the same; any variations will be pointed out when I deal with each example on its own merits:

  • btnCreate_Click - triggers the Create functionality. A number of Person objects are created corresponding to the value of the NumericUpDown control. The Person objects are added to a container object. The timer is then started and the list of Person objects is passed to the Data Access layer for persistence. When the Data Access layer has persisted all the objects to the data store, the timer is stopped and the file content (the timer test results) is written to the file.
     private void btnCreate_Click(object sender, EventArgs e)
     {
        list = new ArrayList();
        for(int i = 0; i < ud.Value; i++)
        {
           PersonBE p = new PersonBE();
           p.Firstname = "FN" + i.ToString();
           p.Lastname  = "LN" + i.ToString();
           p.BirthDate = DateTime.Now;
           list.Add(p);
        }
        counter.Start();
        DataAccess.insPersons(list);
        counter.Stop();

        s = FileHelper.preCreate(DB, (int)ud.Value);
        file();
     }
    
  • btnGet_Click - triggers the Get functionality. A call is made to the Data Access layer, which returns a list of Person objects. These objects must be bound to the ListBox. I am using data binding, so I have to check whether a list of objects is already bound to the ListBox. If there is an existing binding, I must suspend it, bind the new list and then resume the binding; otherwise I just bind the new list. I start the counter before the call to the Data Access layer and stop it once the list is bound to the control. I then call the file functions.
     private void btnGet_Click(object sender, EventArgs e)
     {
        counter.Start();
        if(lb.Items.Count == 0)
        {
           list = DataAccess.getPersonsList();
           lb.DataSource = list;
        }
        else
        {
           BindingManagerBase manager = this.BindingContext[list];
           manager.SuspendBinding();
           list = DataAccess.getPersonsList();
           lb.DataSource = list;
           manager.ResumeBinding();
        }
        counter.Stop();

        s = FileHelper.preGet(DB, list.Count);
        file();
     }
    
  • btnUp_Click - triggers the Update functionality. The method iterates through the list of Person objects bound to the ListBox, changing the Firstname and Lastname property of each Person object. At this point the timer is started and a call is made to the Data Access layer to persist the changes to the data-storage. The timer is then stopped and the test results are written to file and displayed:
     private void btnUp_Click(object sender, EventArgs e)
     {
        foreach(PersonBE p in list)
        {
           p.Firstname = p.Firstname + "up";
           p.Lastname  = p.Lastname + "up";
        }

        counter.Start();
        DataAccess.upPersons(list);
        counter.Stop();

        s = FileHelper.preUpdate(DB, list.Count);
        file();
     }
    

The UI has a private method file() which centralizes UI access to and use of the FileHelper (see below; a sketch follows the list of class-level variables).

Each UI form has three class-level variables:

  • Counter - a reference to the Counter helper class (see below).
  • DB - a string constant reflecting the type of data-storage being used.
  • FILE - a string constant indicating the name of the text file associated with the application.
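Pieced together from the snippets in this article, the form's members and the file() helper look roughly like this. This is a sketch only: the control names (lb, ud, rtb, chkNew), the metrics file name, and the assumption that Counter's Result is a method are mine, not from the download.

    using System;
    using System.Collections;
    using System.Windows.Forms;

    public class Form1 : Form
    {
       // Controls wired up in the designer (names assumed from the snippets).
       private ListBox lb;
       private NumericUpDown ud;
       private RichTextBox rtb;
       private CheckBox chkNew;

       private Counter counter = new Counter();    // metrics helper (see below)
       private const string DB = "SQL";            // data-storage label for this version
       private const string FILE = "sqltest.txt";  // hypothetical metrics file name
       private ArrayList list;                     // Person objects bound to the ListBox
       private string s;                           // formatted header from the FileHelper

       // Centralizes UI access to the FileHelper: create or append the metrics
       // file with the header plus the counter results, then redisplay it.
       private void file()
       {
          FileHelper.WriteFile(FILE, s + counter.Result(), chkNew.Checked);
          rtb.Text = FileHelper.ReadFile(FILE);
       }
    }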

The Helper Project

This project is shared by all three applications. You will find it in the download source code as FrontHelpers.proj. It performs two main functions: File IO and Metrics.

Metrics

There are two classes providing metrics:

  • NanoCounter - I have taken this class directly from the latest (April 2004) Patterns and Practices guideline Improving .NET Application Performance and Scalability, in the How To entitled "Time Managed Code Using QueryPerformanceCounter and QueryPerformanceFrequency". The class times the execution of managed code in nano-seconds, i.e. accurate to one-billionth of a second. The only modification I have made is to add a non-parameterized Duration method (a condensed sketch of the class follows this list). The class has the following API:
    • Start - starts the performance counter.
    • Stop - stops the performance counter.
    • Duration (int iteration) - returns a double indicating the duration for each iteration of a code block.
    • Duration - returns a double indicating the total duration, in nano-seconds, from start to stop of the code-block execution.
  • Counter - this class initializes the timer. It contains a reference to the NanoCounter to measure nano-second performance. The class has the following API:
    • Start - initializes the timer and the nano-counter.
    • Stop - stops the timer and nano-counter.
    • Result - returns a formatted string with the time duration in seconds, milliseconds and nano-seconds, each measurement on a separate line.
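The heart of the NanoCounter, as in the Patterns and Practices How To, is a pair of Win32 calls. Here is a condensed sketch of the idea; the guideline class is slightly more elaborate.

    using System;
    using System.Runtime.InteropServices;

    public class NanoCounter
    {
       [DllImport("Kernel32.dll")]
       private static extern bool QueryPerformanceCounter(out long count);

       [DllImport("Kernel32.dll")]
       private static extern bool QueryPerformanceFrequency(out long frequency);

       private long start, stop, frequency;

       public NanoCounter()
       {
          // Ticks per second of the high-resolution counter.
          QueryPerformanceFrequency(out frequency);
       }

       public void Start()
       {
          QueryPerformanceCounter(out start);
       }

       public void Stop()
       {
          QueryPerformanceCounter(out stop);
       }

       // Total duration from Start to Stop in nano-seconds.
       public double Duration()
       {
          return (stop - start) * 1.0e9 / (double)frequency;
       }

       // Duration per iteration for a code block run 'iterations' times.
       public double Duration(int iterations)
       {
          return Duration() / iterations;
       }
    }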

File IO

Common File IO functionality is encapsulated in the FileHelper class. This class has two main types of functions: File Creation, Reading and Writing; and formatting of File content. All methods of the class are static.

The File IO API is as follows:

  • WriteFile - a method that creates a new file or appends to an existing file and saves the file to disk. The method has three parameters:
    • FileName (string) - the name of the file to create and write to disk.
    • FileContent (string) - the content of the file to be created and saved.
    • NewFile (bool) - if the file is new, the FileMode will be Create; otherwise the FileMode is Append.
  • ReadFile - this method reads the contents of a file of a given name and returns those contents as a string. There is only one parameter, i.e. the name of the file to be read.

The file-formatting API consists of methods which format informative text before execution of the functionality the application is going to measure. Each method takes as parameters the name of the data-storage being measured and the number of records the action will touch. There are three public methods corresponding to the three measurable actions (a sketch of the FileHelper follows this list):

  • preCreate
  • preGet
  • preUpdate
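Assembled from the API described above, the core of the FileHelper might look like the sketch below. The method bodies are assumptions built on the documented parameters; preGet and preUpdate follow the same pattern as preCreate, differing only in the action name they print.

    using System;
    using System.IO;

    public class FileHelper
    {
       // Creates a new file or appends to an existing one, depending on NewFile.
       public static void WriteFile(string FileName, string FileContent, bool NewFile)
       {
          FileMode mode = NewFile ? FileMode.Create : FileMode.Append;
          using(FileStream fs = new FileStream(FileName, mode, FileAccess.Write))
          using(StreamWriter writer = new StreamWriter(fs))
          {
             writer.Write(FileContent);
          }
       }

       // Reads the whole file back as a single string.
       public static string ReadFile(string FileName)
       {
          using(StreamReader reader = new StreamReader(FileName))
          {
             return reader.ReadToEnd();
          }
       }

       // Formats the informative header written before each measured action.
       public static string preCreate(string db, int records)
       {
          return Environment.NewLine + db + " CREATE of " + records.ToString() +
                 " records:" + Environment.NewLine;
       }
    }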

SQL Server

The first application uses SQL Server 2000 as the data-storage. I have used best practices recommended by Microsoft and more specifically the DAAB (Data Access Application Block). The source code for the SQL Server version can be found in the TestSQL.proj.

  1. Create PersonBE Object - in the SQL version an extra property must be added, i.e. ID. This enables the application to identify a specific Person object (functionality I am not providing in this particular demonstration, but nonetheless essential in any ordinary scenario).
  2. Create Data Storage (see sqltest.sql script in the source):

    For SQL Server there are two stages to setting up the database:

    • Setting up the database tables:
      CREATE DATABASE SQLTest
      go
      
      use SQLTest
      go
      
      CREATE TABLE Person
      (
          ID        int IDENTITY(1, 1) NOT NULL,
          Firstname    varchar(30) NOT NULL,
          Surname        varchar(30) NOT NULL,
          BirthDate    smalldatetime NOT NULL,
          CONSTRAINT Person_PK PRIMARY KEY (ID)
      )
      go
      
    • Coding the stored procedures (three in this simple instance):
      CREATE PROC getPersons
      AS
      SELECT ID, Firstname, Surname, Birthdate
      FROM   Person
      go
      
      CREATE PROC insPerson
      @first varchar(30),
      @last varchar(30),
      @birth smalldatetime
      AS
      INSERT INTO Person
      VALUES(@first, @last, @birth)
      go
      
      CREATE PROC upPerson
      @id int,
      @first varchar(30),
      @last varchar(30),
      @birth smalldatetime
      AS
      UPDATE Person
      SET    Firstname = @first,
             Surname = @last,
             Birthdate = @birth
      WHERE  ID = @id
      GO
      
  3. Code the Data Access.

    The SQL Data Access references the DAAB and uses the System.Data and System.Data.SqlClient namespaces. The class has to get a handle to a SQL Server connection string. In this instance I have hard-coded the connection string using SQL authentication and the 'sa' user. (When using this example, replace the '1234' with your own SQL 'sa' password.)

    There are four methods, all static: three internal and one private. The API is as follows:

    • insPersons - this method takes an ArrayList of persons as a parameter. It iterates through the list, creating an ADO.NET SqlParameter for each property of each PersonBE corresponding to the insPerson stored procedure parameters. For each PersonBE in the ArrayList a call is made to the ExecuteNonQuery method of the DAAB (both this method and upPersons are sketched after this list).
    • getPersons - this method returns an ArrayList of all the PersonBEs in the database. A call is first made to the getPersonsDS private method, which calls the ExecuteDataSet method of the DAAB with the getPersons stored procedure and returns a DataSet. The calling method then iterates through each row in the DataSet, creating a new PersonBE object per row and matching the row's columns to the PersonBE's properties. Each PersonBE is added to an ArrayList, which is returned. (This is ORM code):
      internal static ArrayList getPersons()
      {
         ArrayList list = new ArrayList();
         DataSet ds = getPersonsDS();
         foreach(DataRow row in ds.Tables[0].Rows)
         {
            PersonBE p = new PersonBE();
            p.ID = (int)row["ID"];
            p.Firstname = (string)row["Firstname"];
            p.Lastname = (string)row["Surname"];
            p.BirthDate = (DateTime)row["BirthDate"];
            list.Add(p);
         }
         return list;
      }
      
    • upPersons - this method takes an ArrayList of PersonBE objects as a parameter. It iterates through the list, mapping each object to SqlParameters and calling ExecuteNonQuery of the DAAB for each iteration using the upPerson stored proc. I am assuming that every PersonBE in the list has been modified. This is an assumption you can't make in the real world: I would need some mechanism to test whether a Person object is dirty, i.e. whether a property really changed from what was extracted from the database and therefore warrants an update. Using SQL, and not using DataSets to transport my data, I would have to make this test myself.
  4. Hook up the UI - in the SQL UI I use an ArrayList to contain my downloaded and new Person objects.
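Pulling the pieces together, the insert and update sides of the SQL DataAccess look roughly like this. Treat it as a hedged sketch: the method names, the stored procedures and the DAAB's SqlHelper.ExecuteNonQuery call come from the article, while the connection string and the exact parameter plumbing are my assumptions.

    using System.Collections;
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.ApplicationBlocks.Data;   // DAAB version 2

    internal class DataAccess
    {
       // Replace "1234" with your own 'sa' password, as noted above.
       private const string CONN =
          "server=(local);database=SQLTest;uid=sa;pwd=1234";

       // One ExecuteNonQuery call per PersonBE - this is exactly the kind
       // of hand-rolled ORM plumbing the introduction complains about.
       internal static void insPersons(ArrayList persons)
       {
          foreach(PersonBE p in persons)
          {
             SqlHelper.ExecuteNonQuery(CONN, CommandType.StoredProcedure,
                "insPerson",
                new SqlParameter("@first", p.Firstname),
                new SqlParameter("@last",  p.Lastname),
                new SqlParameter("@birth", p.BirthDate));
          }
       }

       internal static void upPersons(ArrayList persons)
       {
          foreach(PersonBE p in persons)
          {
             SqlHelper.ExecuteNonQuery(CONN, CommandType.StoredProcedure,
                "upPerson",
                new SqlParameter("@id",    p.ID),
                new SqlParameter("@first", p.Firstname),
                new SqlParameter("@last",  p.Lastname),
                new SqlParameter("@birth", p.BirthDate));
          }
       }
    }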

MATISSE

To run the Matisse example you will have to download the Matisse database and the Matisse .NET binding, both of which you can get from the Matisse web site. [In this brief example I can't do justice to the wonders of Matisse and its .NET binding. I strongly recommend downloading the documentation and reading it at your leisure. For those of you new to Matisse, there is a good, instructive introductory tutorial in a series of 5 articles by John Sasak.] In this Matisse example I would like to demonstrate a basic framework for using Matisse in a disconnected, layered environment. The Matisse project is in the source code as TestMatisse.proj.

  1. Create PersonBE Object - the Matisse PersonBE is the simplest of the lot. There is no need for an ID property as in the SQL example.
  2. Create Data Storage - the Matisse database can be set up in a number of ways: Rational Rose; Matisse SQL (DDL); Object Definition Language (ODL). I find the ODL most intuitive for working with .NET. The ODL for the example is simple (see MatTest.odl in the source):
    interface Person: persistent
    {
        attribute String Firstname;
        attribute String Lastname;
        attribute Date Birthdate;
    };
    

    Fire up the Matisse Enterprise Manager. Click File --> New Database, type "MTest" in the dialogue and click OK.

    A new Matisse database called "MTest" is created. In the Enterprise Manager, navigate to MTest and expand it; right-click the Schema node and select "Import ODL Schema...". Locate MTest.odl on your disk and import it. The database schema should generate with a single class, "Person".

    The next step is to generate stub-classes based on the database schema. In the Matisse world you must distinguish between connected classes and disconnected classes: the former (generated by Matisse) work with a live connection to the Matisse database; the latter transport Matisse class data to a layer not connected to the database. Matisse can generate disconnected classes for you, but I like to control my disconnected classes, so I define my own - in this case the PersonBE which we have already coded.

    Generate the Matisse .NET classes under the TestMatisse namespace by firing up the command console and calling mt_stbgen in the directory of your project:

    c:\<directory of project>\mt_stbgen <name of your machine> MTest -p "TestMatisse"

    Matisse will write a Person.cs to your directory. Import this class into your project. (I suggest you replace the one in my source code. Notice that I have formatted the source code into regions so it is easier to understand. I am not going into the details of the Matisse-generated class in this article.)

    The next stage is to code some methods on the Person class which allow it to synchronize with the PersonBE. Remember, Person communicates with the Matisse database over a live database connection. The actual data sent to the Presentation Layer is contained in the PersonBE. I need a mechanism to pass Person properties to the PersonBE properties when I retrieve data from the database; in the opposite direction, I need to convert PersonBE properties back to Person properties when I want to update the database with the changes made in the front end. I achieve this synchronization with three methods on the Matisse-generated class.

    • The update method synchronizes the Person object from the PersonBE coming in from the UI. The method takes a PersonBE and transfers its properties, one by one, to the corresponding Person properties:
      public void update(PersonBE obj)
      {
         this.Firstname = obj.Firstname;
         this.Lastname  = obj.Lastname;
         this.Birthdate = obj.BirthDate;
      }
      
    • Converting a Person object to a PersonBE is achieved by overriding Person's ToDataObject and CopyAttributesToDataObj methods. ToDataObject creates a new PersonBE object and passes it to the overridden CopyAttributesToDataObj method, which copies all of the Person's property values to the corresponding properties of the PersonBE. This object is then sent to the UI layer.
      public override object ToDataObject()
      {
         PersonBE person = new PersonBE();
         CopyAttributesToDataObj(person);
         return person;
      }

      public override void CopyAttributesToDataObj(object dataObj)
      {
         base.CopyAttributesToDataObj(dataObj);
         ((PersonBE)dataObj).Firstname = this.Firstname;
         ((PersonBE)dataObj).Lastname  = this.Lastname;
         ((PersonBE)dataObj).BirthDate = this.Birthdate;
      }
      
  3. Code the Data Access - I have done a little extra work in the Matisse Data Access Layer. I have created two classes: BaseMatisse and DataAccess. BaseMatisse is an abstract class which acts as a kind of DAAB for Matisse. I have used this class in more sophisticated and intricate Matisse programming and it serves nearly all of my needs. Essentially, it is a façade over Matisse data-connectivity functionality. Specific Data Access classes simply inherit the BaseMatisse functionality.

    In this version of BaseMatisse I have hard-coded the Matisse version of a connection object as a static MtDatabase object called db. The class then has a number of static methods which open and close a Matisse database, begin and commit a Matisse transaction, and open and close a Matisse version read. (See the Matisse literature for an explanation of version and transactional reads and writes.) You will have to change "PDEV1" to the name of your own server in the source code!

     using System;
     using com.matisse.reflect;
     using com.matisse.Data;
     using com.matisse;

     namespace TestMatisse
     {
        public abstract class BaseMatisse
        {
           protected static MtDatabase db = new MtDatabase("PDEV1",
              "MTest", new MtPackageObjectFactory("TestMatisse,TestMatisse"));

           protected static void open()
           {
              if(!db.IsConnectionOpen())
                 db.Open();
           }

           protected static void close()
           {
              if(db.IsConnectionOpen())
                 db.Close();
           }

           protected static void beginTrans()
           {
              if(!db.IsConnectionOpen())
                 open();
              if(!db.IsTransactionInProgress())
                 db.BeginTransaction();
           }

           protected static void commitTrans(bool off)
           {
              if(db.IsTransactionInProgress())
                 db.Commit();
              if(off)
                 close();
           }

           protected static void openVersion()
           {
              if(!db.IsConnectionOpen())
                 open();
              if(!db.IsVersionAccessInProgress())
                 db.StartVersionAccess();
           }

           protected static void closeVersion(bool off)
           {
              if(db.IsVersionAccessInProgress())
                 db.EndVersionAccess();
              if(off)
                 close();
           }
        }
     }
    

    It's now easy and intuitive to inherit from BaseMatisse and code the Data Access logic for the application. This is done in the DataAccess class, which has two short, self-explanatory methods:

     internal class DataAccess : BaseMatisse
     {
        internal static ArrayList getPersons()
        {
           ArrayList list = new ArrayList();
           openVersion();
           foreach(Person person in Person.InstanceEnumerator(db))
           {
              PersonBE pbe = (PersonBE)person.ToDataObject();
              list.Add(pbe);
           }
           closeVersion(true);
           return list;
        }

        internal static void insPersons(ArrayList persons)
        {
           beginTrans();
           foreach(PersonBE pbe in persons)
           {
              Person person = new Person(db);
              person.update(pbe);
           }
           commitTrans(true);
        }
     }
    

    Unlike the SQL data-access logic, there is no reliance here on SQL stored procedures. (Matisse does have an equivalent to stored procedures, but you can get by without them.) Also, Matisse uses the same method, insPersons, to both insert and update data. Further, note the mapping from Person to PersonBE in getPersons, and the converse mapping from PersonBE to Person in insPersons.

  4. Hook up the UI - the UI is exactly the same as the SQL UI.

FASTOBJECTS

When I discovered Matisse about eight months ago, it blew my socks off. I realized then and there that it was worthwhile to make the effort to learn this data-storage technology and try to push it to my customers. It took a good, intensive three months to master the product. Two weeks ago I discovered FastObjects, and all I can say is that now I am in love. Don't hesitate: go to www.FastObjects.com NOW and download it. The documentation is thorough and easy; the technology is simple yet powerful. [Unfortunately, this article is not a detailed tutorial on FastObjects. For that, maybe another time.] I'll take you through the steps for doing what we did with Matisse and SQL Server, and you can judge for yourself.

Download and install the FastObjects .NET Trial edition (it expires at the end of June 2004).

The FastObjects source-code is in the TestVersant.proj.

  1. Create PersonBE Object - code the PersonBE object with its properties. Reference FastObjects.t7.Runtime (you'll find it in the Program Files/FastObjects_.NET_Trial/bin directory). Use the FastObjects namespace. Mark your PersonBE class with the [Persistent(Verifiable=true)] attribute. (The Verifiable qualifier is needed for a disconnected scenario.)
     using System;
     using FastObjects;

     namespace TestVersant
     {
        [Persistent(Verifiable=true)]
        public class PersonBE
        {
           private string first, last;
           private DateTime birth;

           public PersonBE(){}

           public string Firstname
           {
              get{return first;}
              set{first = value;}
           }
           public string Lastname
           {
              get{return last;}
              set{last = value;}
           }
           public DateTime BirthDate
           {
              get{return birth;}
              set{birth = value;}
           }

           // Overridden so the data-bound ListBox displays each Person.
           public override string ToString()
           {
              return this.Firstname + ", " + this.Lastname;
           }
        }
     }
    

    Select the TestVersant project in the Solution Explorer and press F4 to get the project properties. You will see the following properties that FastObjects generates: ConfigFile; Database; Enhancing; Policy File; Project File; Project Folder; Schema.

    FastObjects works on the principle of generating a schema file and a data file for your data-repository. The data file contains your data and is defined by the Database property. The schema file is termed a dictionary in the FastObjects world and is defined by the Schema property.

    The magic is in the Enhancing property. If you set the Enhancing property to True, then when you compile your project FastObjects inspects all classes marked with the [Persistent] attribute and generates a FastObjects database (dictionary and database) automatically. No SQL script, no ODL, no DDL, no going into Enterprise Managers - just set Enhancing to True and compile. (How FastObjects does this must be left for another occasion.)

    Go ahead and set the property values as follows:

    • Database = "TestVersantBase"
    • Enhancing = "True"
    • Schema = "TestVersantDict"

    Compile!

    That's basically all there is to it. But FastObjects can do more work for you. You should notice a FastObjects menu on your Visual Studio top menu. Select the Enable Project option and just follow the wizard. FastObjects now generates two items in your project: a FastObjects.cfg file and an ObjectScopeProvider1.cs class.

  2. Create Data Storage - see above!
  3. Code the Data Access

    The essential concepts of FastObjects data access are simple to grasp. Access to the data-store is gained via a Database object. However, to work with the database, an ObjectScope object is required; ObjectScope implements the IObjectScope interface. The FastObjects equivalent of a connection string is the URL of your data file. In this example:

    fastobjects://LOCAL/TestVersantBase

    In two simple lines you get a handle to the IObjectScope interface:

    FastObjects.Database db = FastObjects.Database.Get(
      "fastobjects://LOCAL/TestVersantBase" );
    IObjectScope  scope = db.GetObjectScope();
    

    The ObjectScopeProvider1 class generated by FastObjects codes all of this for us.
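    Conceptually, the generated provider is little more than a cached Database handle plus a scope factory. The sketch below captures the idea only; the class the wizard actually generates is more elaborate.

    internal class ObjectScopeProvider1
    {
       private static FastObjects.Database db;
       private static IObjectScope scope;

       // Returns a (cached) scope onto the TestVersantBase database.
       internal static IObjectScope ObjectScope()
       {
          if(db == null)
             db = FastObjects.Database.Get("fastobjects://LOCAL/TestVersantBase");
          if(scope == null)
             scope = db.GetObjectScope();
          return scope;
       }
    }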

    To start working with a FastObjects database you must begin a transaction by calling the Begin method of the scope's Transaction property. A transaction must eventually be committed.

     IObjectScope scope = ObjectScopeProvider1.ObjectScope();
     scope.Transaction.Begin();
     // ... work with the database ...
     scope.Transaction.Commit();

    Once you have opened the scope's transaction, there is a range of operations you can carry out on the database (you now have a live connection). One of those options is to traverse the extent of objects of a specific class in the database. You can get an enumerator for the extent and traverse each instance in it.

    IDBObjectEnumerator en = scope.GetExtent(
      typeof(PersonBE)).GetEnumerator();

    Because the ObjectScope needs a connection to the database, it cannot be sent to other layers of the application. As in Matisse, there must be a mechanism to transport its data in a disconnected manner. This is where the FastObjects ObjectContainer comes into play. I can't begin to sing the praises of the ObjectContainer; it deserves a series of articles all its own. The ObjectContainer is a container for ObjectScope data you want to disconnect from the database and send to the UI or Business Layer. Its data can be serialized and converted to XML. But more than this: the ObjectContainer monitors changes made to the objects it contains. When the ObjectContainer is sent back from the UI to persist changes, FastObjects knows automatically which objects are dirty and must be updated, and which objects are new and must be inserted. There is so much more, but all of this for another occasion.

    The FastObjects DataAccess class, like the Matisse DataAccess class, has only two methods: getPersons and insPersons. In Matisse, getPersons returns an ArrayList; in FastObjects it returns an ObjectContainer. In Matisse, insPersons takes an ArrayList as a parameter; in FastObjects the parameter is the magical ObjectContainer.

    The insPersons method gets a handle to the IObjectScope interface and begins a transaction. Then, in one line, the objects in the ObjectContainer are copied to the ObjectScope; the Verify.All and Verify.Lock parameters perform the FastObjects concurrency checks (again, unfortunately, this is not the place to explain the magic).

    container.CopyTo(scope, Verify.All | Verify.Lock);

    The transaction is then committed and that's it. All new objects are inserted; all changed objects are updated.

    internal static void insPersons(ObjectContainer container)
    {
      IObjectScope scope = ObjectScopeProvider1.ObjectScope();
      scope.Transaction.Begin();
      container.CopyTo(scope, Verify.All | Verify.Lock);
      scope.Transaction.Commit();
    }
      

    The getPersons method instantiates a new ObjectContainer and then gets a handle to the IObjectScope interface. A transaction is opened and an enumerator of the PersonBE extent is obtained. We copy the scope and its enumerator into the ObjectContainer, which essentially puts all the PersonBE objects into the ObjectContainer.

    container.CopyFrom(scope, en);

    Dispose of the enumerator, commit the transaction and send off the ObjectContainer to perform its wonders.

    internal static ObjectContainer getPersons()
    {
      ObjectContainer container = new ObjectContainer();
      IObjectScope scope = ObjectScopeProvider1.ObjectScope();
      scope.Transaction.Begin();
      IDBObjectEnumerator en = scope.GetExtent(
        typeof(PersonBE)).GetEnumerator();
      container.CopyFrom(scope, en);
      en.Dispose();
      scope.Transaction.Commit();
      return container;            
    }
    
  4. Hook up the UI - the FastObjects UI uses the ObjectContainer rather than an ArrayList. Therefore, the UI needs a reference to FastObjects and must use the FastObjects namespace. The ArrayList object is replaced by an ObjectContainer object.
     using FastObjects;

     public class Form1 : System.Windows.Forms.Form
     {
        private ObjectContainer container;
        // ...
     }

     // The ObjectContainer has a GetList() method which returns an IList
     // interface over the internal collection of objects in the
     // ObjectContainer. This feature allows me to use data binding
     // directly on the ObjectContainer.

     private void btnGet_Click(object sender, EventArgs e)
     {
        counter.Start();
        container = DataAccess.getPersons();

        if(lb.Items.Count == 0)
        {
           lb.DataSource = container.GetList();
        }
        else
        {
           BindingManagerBase manager = this.BindingContext[container.GetList()];
           manager.SuspendBinding();
           lb.DataSource = container.GetList();
           manager.ResumeBinding();
        }

        counter.Stop();

        s = FileHelper.preGet(DB, container.GetList().Count);
        file();
     }
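    For completeness, the FastObjects update handler changes in the same spirit. The sketch below is inferred from the other versions and from the fact, noted above, that insPersons both inserts new objects and updates dirty ones.

     private void btnUp_Click(object sender, EventArgs e)
     {
        // Modify the objects inside the container; the container itself
        // tracks which of them are now dirty.
        foreach(PersonBE p in container.GetList())
        {
           p.Firstname = p.Firstname + "up";
           p.Lastname  = p.Lastname + "up";
        }

        counter.Start();
        // CopyTo inside insPersons updates dirty objects and inserts new ones.
        DataAccess.insPersons(container);
        counter.Stop();

        s = FileHelper.preUpdate(DB, container.GetList().Count);
        file();
     }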
    

Test Results

I have conducted a number of tests using all three databases. Here I will give the test results for inserting records. The test is simple: insert 30, then 300, then 3000 Person objects into each database; note the time in milliseconds for completing the operation; and perform five runs of each size (5 x 30, 5 x 300, 5 x 3000).

These were the results I obtained for the purposes of this article:

                      1          2          3          4          5          AVERAGE (milliseconds)
SQL           30      15.625     15.625     15.625     15.625     15.625     15.625
              300     296.875    250        203.125    187.5      234.375    234.375
              3000    3046.875   2953.125   2765.625   2718.75    4203.125   3137.5
Matisse       30      15.625     31.25      15.625     15.615     46.875     24.998
              300     62.5       62.5       62.5       62.5       62.5       62.5
              3000    453.125    453.125    453.125    437.5      437.5      446.875
FastObjects   30      62.5       78.125     15.625     16         62.5       46.8
              300     93.75      109.375    93.75      109.357    62.5       93.746
              3000    578.125    437.5      453.125    453.125    484.375    481.25

The summary can be laid out as follows:

        SQL        MATISSE    FastObjects
30      15.625     24.998     46.875
300     234.375    62.5       93.746
3000    3137.5     446.875    481.25

From the above the following can be observed:

  • For a small number of inserts, SQL Server is the better performer; Matisse performs better than FastObjects.
  • For 300 inserts, both FastObjects and Matisse significantly outperform SQL Server; Matisse is about 30% faster than FastObjects.
  • For 3000 inserts, both Matisse and FastObjects perform roughly seven times (!) better than SQL Server; Matisse is about 7% faster than FastObjects.

Overall, the other tests I conducted confirm the above findings: Matisse is generally the better performer, while for small tasks (i.e. a limited number of records) SQL Server comes out on top.

I still have not conducted heavy-duty stress testing with multiple simultaneous reads and writes (e.g. 30 simultaneous writes of 30, 300 and 3000 records; 3000 writes of 30, 300 and 3000 records, and so on).

But there is an essential element to add to the testing: development time. The SQL application took the longest to develop, followed by Matisse. The FastObjects application I developed in a fraction of the time of Matisse and SQL. If you add objects (i.e. tables) and object relations (i.e. join tables), SQL development time will increase exponentially; Matisse will increase incrementally, while FastObjects development time will remain minimal.

Taking time to market, complexity of data model, maintainability of code and performance into the equation (with the proviso that no heavy-duty stress testing has been conducted), FastObjects is my choice by a long, long, long, long way.

Finally

I would greatly appreciate any feedback on heavy-duty stress testing you might conduct on any of these products. FastObjects has released a Trial edition, but I don't think there will be much change when the final version is released. Matisse is already in an incremental release of its .NET binding. I have used Matisse for quite a long time, and so far it has proved robust, stable and capable of taking all I can throw at it. There is one further point I would like to make: the footprint of both Matisse and FastObjects is minuscule compared to SQL Server (not to mention "Yukon").

I am still battling to change my profile on this site. But please register at my site www.pdev.co.za and I will keep you posted on both Matisse and FastObjects.

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.
