Introduction
This article explains how to use data-driven unit testing for a DotNetNuke Private Assembly (PA) Module with Visual Studio 2005 and SQL Server CE.
My previous article covered how to create a unit test for a DotNetNuke Private Assembly Module. This is an extension to that article, and covers hooking up data-driven unit testing. Data-driven unit testing involves using an external data source to feed test data into your Unit Tests, so that many different data values and combinations can be run through a single piece of unit test code.
Data-driven unit tests provide much better test coverage than 'hard coded' single unit tests. This is particularly useful for checking data bounds on storage types, such as string lengths, numeric types, and other vital system parameters.
This article specifically covers using the unit testing class I set up to do DotNetNuke module testing, but most of the information will apply to any type of Visual Studio data-driven unit testing. So even if you're not developing in DotNetNuke, there's probably something here for you - after all, a DotNetNuke module is just an Assembly full of types and data access methods.
Specifying an External Data Source
The first question to answer is: in what format should you store your test data? The obvious answer is SQL Server - that's where the database is, after all. It stores the data in the correct formats, the access methods are well known, and, if you have Management Studio, it comes with a good UI.
But SQL Server comes with a significant drawback - it's not portable. Even in SQL Server Express form, where you can move around a .mdf file, you still need to have the full application installed on every machine that is going to use it. Developer machines don't always have SQL Server on them, and even if they do, it might be the wrong version, or the permissions might be configured differently. And then, if you're trying to introduce some quality control to your development environment (which is why you are implementing unit testing in the first place, right?), you probably want a separate, clean, build machine on which to build and deploy your finished code.
All this leaves the SQL Server solution a little washed up, because it would mean keeping several SQL Server installs across several machines, or making sure that all machines can access a single server. And the killer blow is that it's very difficult to keep the changes in a SQL Server database synchronised with your source control system.
Another alternative is to put your data in an Excel spreadsheet and reference that as your test data. While I think this is great as it is a lightweight, portable way to store test data that can be included in source code control, there can be some problems with data formatting and making sure your test data comes across in the right way. If you're just dealing with some text strings and a few numbers, it's probably not going to affect you. But start dealing with financial/scientific data which needs precision carefully controlled, and you might get into some trouble.
Microsoft Access is a good solution; my only personal gripe with it is bloat. Even the simplest Access database can get nasty after a few opens/closes. But if you want to store and access your test data, integrate it with source code control, and even create a simple interface so that non-technical users can enter test data (say, a quality assurance team - wouldn't one of those be nice!), it's a good solution.
My personal favourite, though, is SQL Server CE. This many-named technology has been called, amongst other things, SQL Server Mobile, SQL Server Everywhere Edition, and finally I think Microsoft has settled on SQL Server Compact Edition (CE). It's been roughly the same product all the way, but now they have added Desktop machine support, so it's not just for PDAs. It is very lightweight, does not require an install on any machine, and the databases typically stay pretty small. You can modify the tables directly in Visual Studio, and the data is stored in the same types as they are in SQL Server.
The rest of this article concentrates on doing unit testing with SQL Server CE as the test data source.
As usual, whenever I first set out on a new path, I look for the trailblazers who have gone before us: a quick web search turns up Using SQL Server Files for Unit Testing in VS2005 TS, which is a good simple guide on hooking up SQL Server CE to your unit tests.
How to create a data driven DotNetNuke Module Unit Test
Step 0: Read the first article and download the code
The previous article goes into how to setup a unit test for a DotNetNuke module. Don't even start trying to hook it up to be data driven until that part is working, and you can insert/delete/update/select from the DotNetNuke database.
The download for this article contains all the libraries that were in the first one, plus a few more.
Step 1: Download the SQL Server CE libraries
SQL Server CE is a free download - you can get all the details, and the download itself, from Microsoft.
It isn't the friendliest of installs, but it should go OK. You're just installing some simple files onto your system - it's not a major install like SQL Server Enterprise Edition. Besides, once you've used it once, you'll find all sorts of uses for SQL Server CE - it's a handy little lightweight database.
Step 2: Create a Unit Test project
Follow the instructions in the first article to create your unit test project. In short: create a unit test project, reference the iFinity.Dnn.Utilities DLL, and then get your Unit Test class to inherit from DnnUnitTest. If any of that is unclear, you could always read the previous article!
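If you want to check the wiring before going any further, a minimal test class looks something like the sketch below. The namespace in the second using line is an assumption on my part - use whatever namespace the iFinity.Dnn.Utilities assembly actually exposes for DnnUnitTest.
using Microsoft.VisualStudio.TestTools.UnitTesting;
using iFinity.Dnn.Utilities; // assumed namespace - adjust to match the assembly
[TestClass]
public class WidgetDataTests : DnnUnitTest // DnnUnitTest is the base class from the first article
{
    // The TestContext property used later (TestContext.DataRow) is exposed by the
    // base class, so nothing more is needed here yet.
    [TestMethod]
    public void SanityCheck()
    {
        // Confirms the project compiles and the test runner picks it up,
        // before any data-driven plumbing is added.
        Assert.IsTrue(true);
    }
}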
Step 3: Create a new SQL Server CE database
You can create a SQL Server CE database from your Visual Studio menu. Open your 'Server Explorer', and click on the 'Connect to Database' button. This is the one with the small green '+' sign over a yellow cylinder.
The 'Add Connection' dialog should come up. You will need to change the data source, so click on 'Change...' This will bring up the 'Change Data Source' dialog. You should have a listing for 'Microsoft SQL Server 2005 Compact Edition'. If you don't have this, you're stuck. Go back to step 1 and repeat until you do have this in the listing. (Note: if you were doing your test data in a different technology, you could select it here.)
Selecting SQL Server CE as the data source
When you select 'Microsoft SQL Server 2005 Compact Edition', it should change the 'Data Provider' drop down to say '.NET Framework Data Provider for SQL Server Compact Edition'. Again, if you don't have this, go back to step 1 and repeat until you do. Click OK and you will be taken back to the 'Add Connection' screen, only this time it will have changed. Click on the 'Create' button to create a new SQL Server CE database, or click on the 'Browse' button if you already have one created that will do the job.
Creating a new SQL Server CE Database
I normally create my test database in a location relative to my test project (usually in a subfolder called /db). This means that you can easily include your database in source control and keep the version of the database in sync with the version of the Unit Test code which uses it. It also means that when extracting the code from source control on a different machine, you don't have to fiddle with the app.config file to specify a new path. This is especially useful if you are using a build machine.
I don't ever put a password in for test data, because it just makes things more complicated, and it's just test data. But if you really want to, you can create a password. Just remember to write it down for later. Click 'OK' and your database is created.
In your 'Server Explorer' window in Visual Studio, you should now have a new database listed in the 'Data Connections' section.
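As an aside, if you ever want to create the .sdf file from code rather than through the Server Explorer (on a build machine, say), the SQL Server CE runtime can do it for you. A rough sketch, assuming the System.Data.SqlServerCe assembly is referenced and the path points at your own /db folder:
using System.IO;
using System.Data.SqlServerCe;
// Creates an empty SQL Server CE database file if it doesn't already exist.
private static void EnsureTestDatabase(string dbFile)
{
    if (!File.Exists(dbFile))
    {
        SqlCeEngine engine = new SqlCeEngine("Data Source=" + dbFile);
        engine.CreateDatabase();
    }
}
// e.g. EnsureTestDatabase(@"db\MyTestData.sdf");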
Step 4: Create Test Data Tables
Now that you have a database, you can get cracking on creating test data. I just create tables using the Server Explorer (note that you can open SQL Server CE databases in SQL Server Management Studio, if you have it). My tables are generally a denormalised version of the test data I want to use. For example, if I have a table structure which is one-to-many in my DotNetNuke database, I will just create one test data table to test that entire set of data, and I enter the test data in denormalised form. If you don't understand anything in that last sentence, don't concern yourself - you're probably already producing denormalised data anyway.
The moral of this paragraph is to create one table per Unit Test. So if you have a Unit Test class which tests the insert/update/delete of Widgets, then you want one Widget table in your test data database.
When creating the test data tables, I like to add in some extra columns which describe the test being undertaken. An important one is 'TestOutcome'. You need to be able to enter test data that will fail as well as test data that will pass. Depending on what you are testing, this could be anything from a true/false value to the catching of an exception. I normally just create it as a varchar field so I can put whatever I like in there and code against it.
Creating the columns for a table
Another field I like to put in is 'DeleteOnComplete' - a boolean indicator saying whether or not to delete the test data on completion. Sometimes you want to delete your test data from the database when finished, sometimes you don't; putting a column into the table lets you control that for each row once the test has completed. It can also be a varchar field, where you define a range of values - such as 'OnSuccess' and 'OnFail' - to determine when to delete the test data.
The rest of the columns should pretty much match what your destination DotNetNuke module database table contains. It's easiest to match names and types, unless you are creating a specific column to test a particular function; otherwise, you essentially want to replicate your destination table.
A nice trick is to go to your destination table and generate the create script for it (I use SQL Server Management Studio for this). You can then cut-and-paste 'Create table dnn_Widget (WidgetId int...' into a SQL Server CE query window, change the table name, remove the 'proper' SQL Server syntax (such as collation type, indexes, etc.), and run it - this will create a copy of your table in your test database. Just add your control columns to the create table definition and you're away. You can even save the script in a text file along with the database for easy drop/recreate in the future.
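To give a rough idea, an adapted script for the Widget example might come out like the sketch below, run here through the SQL Server CE ADO.NET provider. The column names and types are illustrative only - yours should mirror your real Widget table - and note that SQL Server CE only has Unicode character types, so any varchar becomes nvarchar. I've made the two control columns bit flags so they line up with the sample test code later in the article; make them nvarchar instead if you want richer outcome values, as described above.
using System.Data.SqlServerCe;
// Creates the denormalised test data table, including the control columns.
private static void CreateWidgetTestTable(string connectionString)
{
    string sql =
        "CREATE TABLE WidgetTest (" +
        " WidgetId int, " +              // written back by the test after a successful save
        " WidgetName nvarchar(100), " +
        " WidgetPrice float, " +
        " WidgetCategoryId int, " +
        " TestOutcome bit, " +           // expected result of the save
        " DeleteOnComplete bit)";        // whether to clean up this row's data afterwards
    using (SqlCeConnection conn = new SqlCeConnection(connectionString))
    {
        conn.Open();
        new SqlCeCommand(sql, conn).ExecuteNonQuery();
    }
}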
Now that your test table is created, go and fill it with data. This can easily be done within Visual Studio 2005 by opening the table and entering data in the datasheet-style editor.
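If you'd rather script the data as well (handy if a build machine needs to recreate the database from scratch), a couple of INSERT statements will do it. The rows below are purely illustrative - the second one uses a category ID that presumably doesn't exist, so it is flagged with TestOutcome = 0 as an expected failure:
using System.Data.SqlServerCe;
// Seeds the WidgetTest table with one row expected to pass and one expected to fail.
// SQL Server CE executes one statement per command, hence the loop.
private static void SeedWidgetTestData(string connectionString)
{
    string[] rows = new string[]
    {
        "INSERT INTO WidgetTest (WidgetName, WidgetPrice, WidgetCategoryId, TestOutcome, DeleteOnComplete) " +
        "VALUES ('Test Widget', 2.44, 4, 1, 1)",
        "INSERT INTO WidgetTest (WidgetName, WidgetPrice, WidgetCategoryId, TestOutcome, DeleteOnComplete) " +
        "VALUES ('Bad Category Widget', 2.44, -1, 0, 1)"
    };
    using (SqlCeConnection conn = new SqlCeConnection(connectionString))
    {
        conn.Open();
        foreach (string sql in rows)
        {
            new SqlCeCommand(sql, conn).ExecuteNonQuery();
        }
    }
}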
Step 5: Modify the App.Config file to include the test database
In order to get your data-driven unit testing to work with your newly created test data, you need to specify it all in your app.config file. This is done in two sections, the Connection Strings and the Data Sources. The download has an example in it.
- Connection Strings
Add in the connection string for the test database you just created. This is done in the <connectionStrings> section in your app.config:
<add name="TestData" providerName="System.Data.OleDb"
connectionString=
"Provider=Microsoft.SQLSERVER.MOBILE.OLEDB.3.0;Data Source=MyTestData.sdf"/>
Note there is no path information on the database filename - more on that in the next step.
- DataSources
The <dataSources> section is specified in the <microsoft.visualstudio.testtools> section. You may have to add this into your app.config. Then, for each table created in your test data database, add in a data source definition (this is why I recommend a 1-to-1 mapping between your unit tests and the tables in your database). Your app.config should contain something like this (although it will contain other Section Groups, sections, connection strings, and other vital settings - don't clear it out and just put in this example!).
<configSections>
<section name="microsoft.visualstudio.testtools"
type="Microsoft.VisualStudio.TestTools.UnitTesting.TestConfigurationSection,
Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=8.0.0.0,
Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
</configSections>
<microsoft.visualstudio.testtools>
<dataSources>
<add name="WidgetInfoTest" connectionString="TestData" dataTableName="WidgetTest"
dataAccessMethod="Sequential" /></dataSources>
</microsoft.visualstudio.testtools>
Step 6: Include the test database into the Test Configuration Deployment
When you compile and run a Unit Test under Visual Studio, it doesn't run from the \bin\debug directory underneath the project, like you might think. Instead, it gets copied to a directory named 'TestResults\<Username_ComputerName YYYY-MM-DD HH_MM_SS>\Out'. Now, because you want to make sure that you have the latest version of the test data, you need to make sure that the test database is right there, next to the binaries, ready to be opened.
To do this, you include the database in the 'localtestrun.testrunconfig' file. This file gets created automatically when you create a new testing project, and it is stored at the solution level, under 'Solution Items' in VS.
The steps are:
- Open the localtestrun.testrunconfig file by right clicking on it and selecting 'Open'.
- Click on the 'Deployment' section in the list, and click on 'Add File'.
- Select the database file you created in Step 3. It should appear in the list.
Changing the testrunconfig file
Click Apply and Close. Now every time a test is run, a copy of the test database will be placed alongside the code that ran with it.
If the fact that you will be spawning a little .sdf file every time you run a test worries you, I suggest that you:
- purchase more disk space,
- clean up regularly, or
- go to a centralised database.
For a centralised database, you would specify the full path in the app.config file and not include it in the testrunconfig deployment. However, you lose the ability to have the unit test run on any machine (unless you set up each machine with the same path) - whether or not this is a problem depends on your individual situation.
Step 7: Hitch your Unit Test wagon to the data driven horses
A unit test is specified as data driven in its declaration. All you need to do is specify the Data Source (from the app.config file) that you wish to use in the test. See the definition below. Just for fun, I also added in a Description declaration.
[TestMethod, DataSource("WidgetInfoTest"), Description("Test out Widget Save")]
public void MyWidgetSaveTest()
{
}
VB.NET
<TestMethod(), DataSource("WidgetInfoTest"), Description("Test out Widget Save")> _
Public Sub MyWidgetSaveTest()
End Sub
All the data driven test does is open up a cursor as specified in the DataSource, and then iterate the unit test for each row in the cursor. Accordingly, all you need to do is access the current cursor row, like you would anywhere else. In this case, it's kept in the TestContext.DataRow property (the TestContext property is defined in the underlying DnnUnitTest class from the first article).
Here is the 'before' unit test method, using hardcoded values to test saving Widgets. Note that I have made the assumption that a 'fail' in the database save is returned as a boolean rather than an exception - depending on how the code is written, a fail may be a caught exception instead. Additionally, the assumption here is that when a widget is saved, the WidgetId value is set by an output parameter from the Stored Procedure and returned with the object (this is how I personally do it).
[TestMethod]
public void MyWidgetSaveTest()
{
WidgetInfo myWidget = new WidgetInfo();
myWidget.WidgetName = "Test Widget";
myWidget.WidgetPrice = 2.44;
myWidget.WidgetCategoryId = 4;
SqlDataProvider sqlProvider = new SqlDataProvider();
bool retVal = sqlProvider.Save(myWidget);
Assert.IsTrue(retVal);
Assert.AreNotEqual(myWidget.WidgetId, 0);
}
VB.NET
<TestMethod()> _
Public Sub MyWidgetSaveTest()
Dim myWidget As WidgetInfo = New WidgetInfo()
myWidget.WidgetName = "Test Widget"
myWidget.WidgetPrice = 2.44
myWidget.WidgetCategoryId = 4
Dim sqlProvider As SqlDataProvider = New SqlDataProvider()
Dim retVal As Boolean = sqlProvider.Save(myWidget)
Assert.IsTrue(retVal)
Assert.AreNotEqual(myWidget.WidgetId, 0)
End Sub
And this is the unit test code, after conversion to being data driven:
[TestMethod, DataSource("WidgetInfoTest"), Description("Test out Widget Save")]
public void MyWidgetSaveTest()
{
WidgetInfo myWidget = new WidgetInfo();
myWidget.WidgetName = (string)TestContext.DataRow["WidgetName"];
myWidget.WidgetPrice = (double)TestContext.DataRow["WidgetPrice"];
myWidget.WidgetCategoryId = (int)TestContext.DataRow["WidgetCategoryId"];
SqlDataProvider sqlProvider = new SqlDataProvider();
bool retVal = sqlProvider.Save(myWidget);
bool expected = (bool)TestContext.DataRow["TestOutcome"];
Assert.AreEqual(retVal, expected);
Assert.AreNotEqual(myWidget.WidgetId, 0);
TestContext.DataRow["WidgetId"] = myWidget.WidgetId;
TestContext.DataRow.AcceptChanges();
}
VB.NET
<TestMethod(), DataSource("WidgetInfoTest"), Description("Test out Widget Save")> _
Public Sub MyWidgetSaveTest()
Dim myWidget As WidgetInfo = New WidgetInfo()
myWidget.WidgetName = Convert.ToString(TestContext.DataRow("WidgetName"))
myWidget.WidgetPrice = Convert.ToDouble(TestContext.DataRow("WidgetPrice"))
myWidget.WidgetCategoryId = Convert.ToInt32(TestContext.DataRow("WidgetCategoryId"))
Dim sqlProvider As SqlDataProvider = New SqlDataProvider()
Dim retVal As Boolean = sqlProvider.Save(myWidget)
Dim expected As Boolean = Convert.ToBoolean(TestContext.DataRow("TestOutcome"))
Assert.AreEqual(retVal, expected)
Assert.AreNotEqual(myWidget.WidgetId, 0)
TestContext.DataRow("WidgetId") = myWidget.WidgetId
TestContext.DataRow.AcceptChanges()
End Sub
As you can see, the test now gets its data from the TestContext.DataRow cursor. And, because the WidgetId is generated by the save process, I am storing it back into the DataRow - which in turn stores it in the SQL Server CE table - allowing it to be used elsewhere. Additionally, the expected pass/fail status of the unit test is stored with the test data, so the same test is checked for both positive and negative cases - this exercises the failure handling in the same code as the success case, which is exactly what will happen when the code is used 'properly' in your DotNetNuke module.
Step 8: Clean up your mess
One thing that can be a major problem with unit testing is that it can clog up your database with test data. This may or may not be a problem, depending on the type of application you are writing. What I tend to do is store a flag with the test data which says what to do with the test data once your test has run (either successfully, in a failed case, or the 'no matter what' case).
This is in my test table as the 'DeleteOnComplete' column. In reality, you can define whatever column suits your testing needs - whether it is a boolean, a 'magic number', or a text string - it doesn't really matter.
You utilise this by putting in a 'TestCleanup' method and reading back that flag to determine what to do. You can either access the methods in your SqlDataProvider, or, if you don't have any suitable delete methods in there, you can just submit some raw SQL straight to your DotNetNuke database. Either way works. Here I've just submitted some raw SQL back into the DotNetNuke database, as specified by sqlProvider.ConnectionString, using the Microsoft Application Blocks SqlHelper class.
[TestCleanup, Description("Cleanup for test"), DataSource("WidgetInfoTest")]
public void CleanupTestData()
{
bool delete = (bool)TestContext.DataRow["DeleteOnComplete"];
if (delete)
{
int widgetId = (int)TestContext.DataRow["WidgetId"];
SqlDataProvider sqlProvider = new SqlDataProvider();
Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteNonQuery(sqlProvider.ConnectionString,
System.Data.CommandType.Text, "DELETE FROM dnn_Widget where WidgetId = " + widgetId.ToString());
}
}
VB.NET
<TestCleanup(), Description("Cleanup for test"), DataSource("WidgetInfoTest")> _
Public Sub CleanupTestData()
Dim delete As Boolean = Convert.ToBoolean(TestContext.DataRow("DeleteOnComplete"))
If delete Then
Dim widgetId As Int32 = Convert.ToInt32(TestContext.DataRow("WidgetId"))
Dim sqlProvider As SqlDataProvider = New SqlDataProvider()
Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteNonQuery(sqlProvider.ConnectionString, _
System.Data.CommandType.Text, "DELETE FROM dnn_Widget where WidgetId = " & widgetId.ToString())
End If
End Sub
This code will run each time a unit test completes, regardless of whether it fails or passes. By using this, your test data will go in and out of your database and not pollute the data with repetitive values (unless you want it to!).
Using a DotNetNuke database table as a test data source
Of course, there are many different types of testing. Let's say, for instance, you need to test out a piece of code that works for each user in the dnn_User table. Instead of putting all those users into a test data database table, you can just set the dnn_user table of your DotNetNuke database as the data source for your data driven test.
All you would need to do is this:
- Add in a new DataSource to the <microsoft.visualstudio.testtools> section which looks like the following:
<add name="dnnUsers" connectionString="SiteSqlServer"
dataTableName="dnn_Users" dataAccessMethod="Sequential"/>
This will open a sequential-forward only cursor on the dnn_Users table, as specified in your SiteSqlServer connection string.
- Specify 'dnnUsers' as a data source in your [TestMethod, DataSource("dnnUsers")] declaration, and you'll have the entire set of users to iterate over and test your code - or whichever table you so choose (see the sketch below).
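A sketch of such a test follows. UserID and Username are the standard columns in the DotNetNuke Users table, but check them against your own schema; the assertion is just a placeholder for whatever per-user logic you actually want to exercise.
[TestMethod, DataSource("dnnUsers"), Description("Runs once for every row in dnn_Users")]
public void EveryUserTest()
{
    // Pull the current user's values from the cursor row supplied by the data source.
    int userId = (int)TestContext.DataRow["UserID"];
    string userName = (string)TestContext.DataRow["Username"];
    // Placeholder check - replace with a call into your own module code for this user.
    Assert.IsFalse(string.IsNullOrEmpty(userName), "User " + userId + " has an empty username");
}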
You can't use a combination of both (test data and 'real' data), because each data source works off a single connection string, but there's nothing to stop you from writing some code in the unit test to get records from the 'real' database to further refine your test inputs. An example would be selecting a particular Tab or TabModule record to work against. This can either be done with a SQL statement or by using the built-in DotNetNuke DataProvider methods.
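For the raw SQL route, something along these lines inside the test body would do - a sketch only, assuming your installation uses the default 'dnn_' object qualifier and that the SqlDataProvider exposes its connection string as in the earlier examples:
// Grab one real TabID from the DotNetNuke database to use as a test input.
// ExecuteScalar returns the first column of the first row of the result set.
SqlDataProvider sqlProvider = new SqlDataProvider(); // as in the earlier examples
object result = Microsoft.ApplicationBlocks.Data.SqlHelper.ExecuteScalar(
    sqlProvider.ConnectionString,
    System.Data.CommandType.Text,
    "SELECT TOP 1 TabID FROM dnn_Tabs ORDER BY TabID");
int tabId = Convert.ToInt32(result);
// ... pass tabId into the module code under test ...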
Thoughts on data driven unit testing
I find data-driven unit testing is always worth the slight amount of extra work. The fact is that when you have users adding data to your site, you will never know just what types of things they will stick in there. By having a data-driven set of unit tests, you can always add some more data to the test data table and see how the code handles it. It's much easier to generate far greater sets of test data when using a table-driven approach. And the more data tested, the better the overall result.
On the same train of thought, I always spend time to clean up properly, as having large sets of generic data in a database makes it harder to spot errors. How would you know if your search results were returning duplicate records when you have 500 records with 'Test Data' as the headline? I carefully think about what should be kept after the test is complete, and what should be deleted.
Additionally, if you are running a 'build' machine and a product like TFS or NAnt, you should seriously think about automating your unit tests to run after each build. It's how the pros do it, and while it's a judgment call over how much time you spend on it, the reward is nearly always worth the effort, as you will catch bugs earlier and make them more traceable. You should check your SQL Server CE database into your source code control system so that the code, the unit test code, and the unit test data are all stored as a single atomic changeset. Otherwise your test data and your unit test code can quickly get out of sync, and you'll probably abandon the whole approach once that happens.
Did it work for you?
As with the first article, the code started out as very messy experimentation, which I continually refined to work the way I wanted, and this article started out as part of the same 'cheat sheet'. I'm still refining my approach to DotNetNuke unit testing, but based on the feedback from the first article, I'd say others out there are thinking along the same lines. Data driven tests are, in my opinion, superior to static tests, and only marginally harder to write.
If you have downloaded the code, used it, and found problems or come up with better ways to do things, please use the comments field in this article to let me and others know how you got on.
History
- 18 June 07 - Initial version.