Introduction
Let's get this out of the way right up front. I don't use ORMs. I abhor the idea of them, because they try to be (and need to be) a one-size-fits-all solution, and assume that you should never stray from whatever "one true path" they tend to advocate. Furthermore, I don't know who decided it was a bad idea to use stored procedures, but ORMs appear to adhere to that paradigm, and I much prefer stored procedures over in-code queries due to the much higher inherent security stored procedures afford. Be it known that I'm not the slightest bit interested in arguing the point, nor even rationally discussing it, so don't waste your time (or mine) filling the comments section with such drivel.
Furthermore, this code uses SQL Server (2014 if you're interested), and should be compatible with everything back to 2008R2, and everything up to whatever version is currently available. Once again, I'm not interested in the typical fanboy claptrap regarding MySql, YourSql, or WhateverSQL you might be using, and I won't be willing to help you modify the code to do so. You're all programmers, so work the problem, and don't bother me about it.
I understand that the girls out there that use .NET Core might also be able to use this code, but since I don't, I won't be providing code that supports it, or even suggestions about how to convert it to do so. If you're even semi-self-aware, you'll be able to easily identify the parts of the code that require modification to support your own swirling vortex of insanity.
Now that we have the pleasantries out of the way...
General
This article reflects a technique I implemented in an environment that supports two networks (classified and unclassified), each having five environments (on different virtual machines), supporting 20 applications, and accessing over 30 databases. In the interest of brevity, I removed everything not directly related to this code (network, environment, and applications) because a) it's proprietary, b) it just cluttered up the code, and c) it didn't apply to the desired aim of the article. If anyone is doing the math, this scheme supported the possibility of having 6000 possible connection strings (although realistically, each of our apps would only require 4-5 to be available).
This article demonstrates a reasonably generic approach to using ADO.NET that supports a large number of connection strings. Not only that, but all connection strings are built on demand, which absolves the programmer of maintaining config files.
The Code
The following are nice-to-know facts regarding this code:
- Used Visual Studio 2017 (should be able to use 2013 or higher without significant gymnastics on your part)
- .NET 4.6.2 (should be able to use 4.5 or greater)
- SQL Server 2014 (should be able to use 2008R2 or higher)
- Windows 7 VM (yes, I'm a barbarian, but the code should work in whatever bug-ridden abomination you might be using)
- Reflection is used to achieve the generic nature of the code. I realize reflection is generally frowned upon because it's slow, but its performance cost is the price of admission for this level of genericity.
The sample solution architecture is implemented in such a way as to minimize the code/assembly footprint. Feel free to release your inner programmer and remake it in your own image. I retained the comments in the code in an effort to reduce the amount of article narrative.
Connection Strings
As stated above, the AppConnectionString class presented here was developed due to the extreme breadth of operating environments we have to support. In our code, we have a ConnectionStringList class that implements proprietary code to support our requirements, but it wasn't necessary for illustrating the actual reason we're here. However, it might be useful if you want to obfuscate your own connection strings, or ease your eventual quagmire of web.config files in a multi-environment situation such as ours.
First, we have the expected connection string components. These properties support the most commonly used attributes in a connection string, and are unremarkable.
protected string Server { get; set; }
protected string Database { get; set; }
protected string UserID { get; set; }
protected string Password { get; set; }
// Name and EncryptTraffic are referenced by the constructor and the
// WithEncryptedTraffic property shown later.
public string Name { get; set; }
protected bool EncryptTraffic { get; set; }
Next, we have a simple sanity check property. Its purpose is to perform rudimentary validation to make sure a minimally valid connection string is possible:
private bool IsValid
{
get
{
bool hasServer = !string.IsNullOrEmpty(this.Server);
bool hasDB = !string.IsNullOrEmpty(this.Database);
bool hasUid = !string.IsNullOrEmpty(this.UserID);
bool hasPwd = !string.IsNullOrEmpty(this.Password);
bool isValid = (hasServer && hasDB);
isValid &= ((!hasUid && !hasPwd) || (hasUid && hasPwd));
return isValid;
}
}
And then come the more interesting properties, which are used to actually build the connection string when requested by the calling method. How the connection string is constructed depends on the previously discussed properties. As you can see, credentials and encrypted traffic (note that the encrypted traffic property also sets TrustServerCertificate=true) are supported. As a layer of obfuscation (base64 is encoding, not encryption), the returned connection string is base64-encoded. While this code always encodes the connection string, you can easily refactor this class to either not do that at all, or to react to an accompanying property that dictates the desired functionality.
private string Credentials
{
get
{
string value = "Integrated Security=";
if (string.IsNullOrEmpty(this.Password) && string.IsNullOrEmpty(this.UserID))
{
value = string.Format("{0}true;", value);
}
else
{
value = string.Format("{0}false; user id={1}; password={2};",
value, this.UserID, this.Password);
}
return value;
}
}
private string WithEncryptedTraffic
{
get
{
string value = string.Empty;
if (this.EncryptTraffic)
{
value = "Encrypt=true; TrustServerCertificate=true;";
}
return value;
}
}
public string ConnectionString
{
get
{
string value = string.Empty;
if (this.IsValid)
{
value = string.Format("data source={0}; initial catalog={1}; {2} {3}",
this.Server, this.Database, this.Credentials, this.WithEncryptedTraffic);
}
else
{
throw new InvalidOperationException
("One or more required connection string parameters were not specified.");
}
return value.Base64Encode();
}
}
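The Base64Encode() and Base64Decode() string extension methods used above (and in the DAL later) aren't shown in the article; a minimal sketch, assuming UTF-8 text, might look like this:

```csharp
using System;
using System.Text;

public static class StringExtensions
{
    // Encode a string as Base64 over its UTF-8 bytes.
    public static string Base64Encode(this string text)
    {
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(text));
    }

    // Decode a Base64 string back to text. If the input isn't valid
    // Base64 (i.e., it was never encoded), return it unchanged -- this
    // tolerant behavior is what lets GetData/SetData accept un-encoded
    // connection strings as described below.
    public static string Base64Decode(this string text)
    {
        try
        {
            return Encoding.UTF8.GetString(Convert.FromBase64String(text));
        }
        catch (FormatException)
        {
            return text;
        }
    }
}
```

One caveat of the tolerant decode: a plain string that happens to be valid Base64 would be "decoded" into garbage, so don't mix encoded and un-encoded strings in the same deployment.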
Finally, we come to the constructor and the obligatory ToString() override. There's really nothing to discuss here because it's just standard C# stuff.
public AppConnectionString(string name, string server,
string database, string uid, string pwd, bool encryptTraffic=false)
{
this.Name = name;
this.Server = server;
this.Database = database;
this.UserID = uid;
this.Password = pwd;
this.EncryptTraffic = encryptTraffic;
}
public override string ToString()
{
return this.ConnectionString;
}
Entities and Attributes
I decided to use the word "entity" to soothe everyone that's been seduced by the promises of their favorite ORM.
EntityBase
The EntityBase class provides code that returns an inherited object's properties as a SqlParameter array. This means all of your entity-specific code can stay in the entity object without cluttering up the business logic. To add a modicum of control over which properties are returned as SqlParameter objects, you can decorate those properties with the CanDbInsert attribute. This means you can mix properties into your class that don't necessarily need to be (or qualify for) insertion into the database.
public partial class EntityBase
{
private bool onlyDecorated = true;
public bool OnlyDecorated
{
get { return this.onlyDecorated; }
set { this.onlyDecorated = value; }
}
The following property simplifies the generation of an appropriate array of SqlParameter objects. It simply calls the GetEntityProperties method to retrieve all decorated properties to be converted to SqlParameter objects.
public virtual SqlParameter[] AsSqlParams
{
get
{
PropertyInfo[] properties = this.GetEntityProperties();
SqlParameter[] parameters = this.MakeSqlParameters(properties);
return parameters;
}
}
The following methods are used by the AsSqlParams property, and use reflection to retrieve - and optionally filter - the properties that will be returned as SqlParameter objects.
public virtual PropertyInfo[] GetEntityProperties()
{
return GetEntityProperties(this.OnlyDecorated);
}
public virtual PropertyInfo[] GetEntityProperties(bool onlyDecorated)
{
PropertyInfo[] properties = (onlyDecorated)
? this.GetType().GetProperties().Where
(x => x.IsDefined(typeof(CanDbInsertAttribute))).ToArray()
: this.GetType().GetProperties();
return properties;
}
Finally, we have a method that creates the SqlParameter array from the provided properties.
protected virtual SqlParameter[] MakeSqlParameters(PropertyInfo[] properties)
{
List<SqlParameter> parameters = new List<SqlParameter>();
foreach(PropertyInfo property in properties)
{
parameters.Add(new SqlParameter(string.Format("@{0}",property.Name),
property.GetValue(this)));
}
return ((parameters.Count > 0) ? parameters.ToArray() : null);
}
}
The CanDbInsertAttribute Object
This object represents a nothing-fancy decorator attribute used by the EntityBase class to weed out non-insertable properties in your EntityBase-derived entities. If you need more attributes, you can use this one as an example.
[AttributeUsage(AttributeTargets.Property, AllowMultiple = false, Inherited = true)]
public class CanDbInsertAttribute : Attribute
{
public string Name { get; set; }
public string Argument { get; set; }
}
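To see the attribute filter in action without pulling in SqlClient, here's a stripped-down, self-contained illustration of the same reflection logic (the UserInfo type here is hypothetical; the real sample entity appears in the next section):

```csharp
using System;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Property, AllowMultiple = false, Inherited = true)]
public class CanDbInsertAttribute : Attribute { }

public class UserInfo
{
    public int UserID { get; set; }                    // not decorated -> filtered out
    [CanDbInsert] public string UserName { get; set; }
    [CanDbInsert] public string Gender { get; set; }
}

public static class Program
{
    public static void Main()
    {
        // The same filter GetEntityProperties applies when onlyDecorated is true.
        // Sorted because reflection doesn't guarantee property order.
        string[] names = typeof(UserInfo).GetProperties()
            .Where(p => p.IsDefined(typeof(CanDbInsertAttribute)))
            .Select(p => "@" + p.Name)
            .OrderBy(n => n)
            .ToArray();
        Console.WriteLine(string.Join(", ", names)); // @Gender, @UserName
    }
}
```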
The Sample Entity
The sample entity used in the sample application exercises the EntityBase class and the CanDbInsertAttribute attribute.
namespace BLLSample.Entities
{
public class EntityUserInfo : EntityBase
{
public int UserID { get; set; }
[CanDbInsert]
public string UserName { get; set; }
[CanDbInsert]
public string Description { get; set; }
[CanDbInsert]
public string Gender { get; set; }
public SqlParameter[] AsSqlParametersForUpdate
{
get
{
PropertyInfo[] properties = this.GetEntityProperties(false);
SqlParameter[] parameters = this.MakeSqlParameters(properties);
return parameters;
}
}
}
}
The DAL (Data Access Layer)
Finally, the reason we're all actually here. The DAL object is what makes this whole thing work in a generic way. The need to manually identify properties for inserting/updating into the database has already been adequately mitigated, but what about retrieving the data? The DAL object uses reflection rather heavily to achieve this functionality. However, it does not absolve us of creating the requisite ADO code.
First, we establish some necessary properties, and initialize them in the constructor. Take note of the FailOnMismatch property. This property allows you to tell the DAL whether or not an exception should be thrown in the event that the returned dataset has more/fewer columns than the receiving entity has matching properties. The default value for this property is false.
public bool FailOnMismatch { get; set; }
public int TimeoutSecs { get; set; }
public string ConnectionString { get; protected set; }
public bool AddReturnParamIfMissing { get; set; }
public int BulkInsertBatchSize { get; set; }
public int BulkCopyTimeout { get; set; }
public DAL(string connStr)
{
if (string.IsNullOrEmpty(connStr))
{
throw new ArgumentNullException(nameof(connStr));
}
this.ConnectionString = connStr;
this.TimeoutSecs = 300;
this.FailOnMismatch = false;
this.AddReturnParamIfMissing = true;
this.BulkInsertBatchSize = 100;
this.BulkCopyTimeout = 600;
}
Next, we establish two methods for getting and setting data in the database. Notice that these two methods take the afore-described connection string, and base64-decode it for use. If you choose NOT to base64-encode your connection strings, the decode method will still return the correct (un-decoded) string value.
protected DataTable GetData(string cmdText, SqlParameter[] parameters=null,
CommandType cmdType = CommandType.StoredProcedure)
{
SqlConnection conn = null;
SqlCommand cmd = null;
SqlDataReader reader = null;
DataTable data = null;
using (conn = new SqlConnection(this.ConnectionString.Base64Decode()))
{
conn.Open();
using (cmd = new SqlCommand(cmdText, conn)
{ CommandTimeout = this.TimeoutSecs, CommandType = cmdType } )
{
if (parameters != null)
{
cmd.Parameters.AddRange(parameters);
}
using (reader = cmd.ExecuteReader())
{
data = new DataTable();
data.Load(reader);
}
}
}
return data;
}
protected int SetData(string cmdText, SqlParameter[] parameters,
CommandType cmdType = CommandType.StoredProcedure)
{
int result = 0;
SqlConnection conn = null;
SqlCommand cmd = null;
using (conn = new SqlConnection(this.ConnectionString.Base64Decode()))
{
conn.Open();
using (cmd = new SqlCommand(cmdText, conn)
{ CommandTimeout = this.TimeoutSecs, CommandType = cmdType } )
{
SqlParameter rowsAffected = null;
if (parameters != null)
{
cmd.Parameters.AddRange(parameters);
if (cmdType == CommandType.StoredProcedure && this.AddReturnParamIfMissing)
{
rowsAffected = parameters.FirstOrDefault
(x=>x.Direction == ParameterDirection.ReturnValue);
if (rowsAffected == null)
{
rowsAffected = cmd.Parameters.Add(new SqlParameter
("@rowsAffected", SqlDbType.Int)
{ Direction = ParameterDirection.ReturnValue } );
}
}
}
result = cmd.ExecuteNonQuery();
result = (rowsAffected != null) ? (int)rowsAffected.Value : result;
}
}
return result;
}
Next, we see the helper methods that actually use the magic of reflection to determine how to move data from the provided DataTable into the specified entity object. In order for this to work, the property names MUST be identical to the column names returned by the query (DataTable column lookups are case-insensitive by default, but matching the casing exactly is still the safe bet).
protected static T ConvertFromDBValue<T>(object obj, T defaultValue)
{
// Use the caller-supplied default rather than default(T) so the
// defaultValue parameter actually participates.
T result = (obj == null || obj == DBNull.Value) ? defaultValue : (T)obj;
return result;
}
protected List<T> MakeEntityFromDataTable<T>(DataTable data)
{
Type objType = typeof(T);
List<T> collection = new List<T>();
if (data != null && data.Rows.Count > 0)
{
PropertyInfo[] properties = objType.GetProperties();
foreach(DataRow row in data.Rows)
{
// Reset per row so the mismatch check below evaluates one row at a time.
int matched = 0;
T item = (T)Activator.CreateInstance(objType);
foreach (PropertyInfo property in properties)
{
if (data.Columns.Contains(property.Name))
{
Type pType = property.PropertyType;
var defaultValue = pType.GetDefaultValue();
var value = row[property.Name];
value = ConvertFromDBValue(value, defaultValue );
property.SetValue(item, value);
matched++;
}
}
if (matched != data.Columns.Count && this.FailOnMismatch)
{
throw new Exception("Data retrieved does not match specified model.");
}
collection.Add(item);
}
}
return collection;
}
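The GetDefaultValue() extension on Type used above is another helper not shown in the article; a plausible implementation, assuming standard CLR default-value semantics, is:

```csharp
using System;

public static class TypeExtensions
{
    // Reference types default to null; value types default to a
    // zero-initialized instance (0, false, DateTime.MinValue, etc.).
    public static object GetDefaultValue(this Type type)
    {
        return type.IsValueType ? Activator.CreateInstance(type) : null;
    }
}
```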
Next, we see the actual methods that are called by the business layer object. There are two pairs of overloaded methods. Each pair contains what amounts to a "getter" and a "setter" overload. The code from which this article was taken throws an exception when the developer tries to call ExecuteQuery, because all of our database interaction is performed via stored procedures (it's just more reliable that way).
public List<T> ExecuteStoredProc<T>(string storedProc, params SqlParameter[] parameters)
{
if (string.IsNullOrEmpty(storedProc))
{
throw new ArgumentNullException(nameof(storedProc));
}
DataTable data = this.GetData(storedProc, parameters, CommandType.StoredProcedure);
List<T> collection = this.MakeEntityFromDataTable<T>(data);
return collection;
}
public int ExecuteStoredProc(string storedProc, params SqlParameter[] parameters)
{
if (string.IsNullOrEmpty(storedProc))
{
throw new ArgumentNullException(nameof(storedProc));
}
int result = this.SetData(storedProc, parameters, CommandType.StoredProcedure);
return result;
}
public List<T> ExecuteQuery<T>(string query, params SqlParameter[] parameters)
{
if (string.IsNullOrEmpty(query))
{
throw new ArgumentNullException("query");
}
DataTable data = this.GetData(query, parameters, CommandType.Text);
List<T> collection = this.MakeEntityFromDataTable<T>(data);
return collection;
}
public int ExecuteQuery(string query, params SqlParameter[] parameters)
{
if (string.IsNullOrEmpty(query))
{
throw new ArgumentNullException("query");
}
int result = this.SetData(query, parameters, CommandType.Text);
return result;
}
And finally, we have a set of methods that allow bulk inserting into the database via the ADO.NET SqlBulkCopy object.
public int DoBulkInsert(DataTable dataTable)
{
int recordsAffected = 0;
SqlConnection conn = null;
SqlBulkCopy bulk = null;
using (conn = new SqlConnection(this.ConnectionString.Base64Decode()))
{
conn.Open();
using (bulk = new SqlBulkCopy(conn)
{
BatchSize = this.BulkInsertBatchSize
,BulkCopyTimeout = this.BulkCopyTimeout
,DestinationTableName = dataTable.TableName
})
{
bulk.WriteToServer(dataTable);
// SqlBulkCopy doesn't report rows affected; report the rows we sent
// (WriteToServer either writes them all or throws).
recordsAffected = dataTable.Rows.Count;
}
}
return recordsAffected;
}
public int DoBulkInsert<T>(IEnumerable<T> data,
string tableName, bool byDBInsertAttribute=false)
{
int result = 0;
DataTable dataTable = null;
if (data.Count() > 0)
{
using (dataTable = new DataTable(){TableName = tableName })
{
Type type = typeof(T);
MethodInfo method = type.GetMethod("GetEntityProperties");
// GetEntityProperties is an instance method of the entity, so it must
// be invoked on an entity instance, not on the DAL ("this").
PropertyInfo[] properties = (method == null)
? type.GetProperties().Where(prop =>
Attribute.IsDefined(prop, typeof(CanDbInsertAttribute))).ToArray()
: (PropertyInfo[])method.Invoke(data.First(), null);
foreach (PropertyInfo property in properties)
{
dataTable.Columns.Add(new DataColumn(property.Name, property.PropertyType));
}
foreach (T entity in data)
{
DataRow row = dataTable.NewRow();
foreach (PropertyInfo property in properties)
{
row[property.Name] = property.GetValue(entity);
}
dataTable.Rows.Add(row);
}
}
result = this.DoBulkInsert(dataTable);
}
return result;
}
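The entity-to-DataTable conversion at the heart of DoBulkInsert<T> can be exercised without touching a database. This self-contained sketch (the Widget type and table name are made up) mirrors the column and row construction above:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public class Widget
{
    public string Name { get; set; }
    public int Quantity { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var data = new List<Widget>
        {
            new Widget { Name = "bolt", Quantity = 10 },
            new Widget { Name = "nut",  Quantity = 25 },
        };

        // Mirror DoBulkInsert<T>: one DataColumn per property, one DataRow per entity.
        var table = new DataTable("dbo.Widgets");
        PropertyInfo[] properties = typeof(Widget).GetProperties();
        foreach (PropertyInfo p in properties)
        {
            table.Columns.Add(new DataColumn(p.Name, p.PropertyType));
        }
        foreach (Widget w in data)
        {
            DataRow row = table.NewRow();
            foreach (PropertyInfo p in properties)
            {
                row[p.Name] = p.GetValue(w);
            }
            table.Rows.Add(row);
        }

        Console.WriteLine($"{table.Rows.Count} rows, {table.Columns.Count} columns");
        // 2 rows, 2 columns
    }
}
```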
The BLL (Business Logic Layer)
Because of all the work done behind the scenes, implementing a BLL is laughably simple, and the methods implemented can be as simple or as complex as necessary. Essentially, the entities and the business layer are the only things you really have to code yourself. Your BLL object must inherit the DAL object, and at that point, Bob's your uncle. As you can see, your BLL methods are usually very simple (part of my belief that the further out you get from the core code, the easier it should be on the programmer).
namespace BLLSample
{
public partial class BLL : DAL
{
public BLL(string connectionString) : base (connectionString)
{
}
public List<EntityUserInfo> GetUsers()
{
List<EntityUserInfo> users = this.ExecuteStoredProc<EntityUserInfo>
("dbo.spGetUsers", null);
return users;
}
public EntityUserInfo GetUserInfo(int id)
{
SqlParameter[] parameters = new SqlParameter[]
{
new SqlParameter("@userID", id),
};
EntityUserInfo user = this.ExecuteStoredProc<EntityUserInfo>
("dbo.spGetUser", parameters).FirstOrDefault();
return user;
}
public int SaveUser(EntityUserInfo model)
{
int result = this.ExecuteStoredProc("dbo.spAddUser", model.AsSqlParams);
return result;
}
}
}
Usage
Before you compile and run the app, open the CreateDALSampleDB.sql file (in the DALSample project), and follow the instructions at the top of the file.
using DataModels.DAL;
using BLLSample;
using BLLSample.Entities;
namespace DALSample
{
class Program
{
static void Main(string[] args)
{
AppConnectionString connectionString =
new AppConnectionString("Sample", "localhost", "DALSample", "", "", false);
BLL bll = new BLL(connectionString.ConnectionString);
List<EntityUserInfo> data = bll.GetUsers();
}
}
}
Late Breaking Code Correction
I noticed that the ConnectionString property in the AppConnectionString class did not call the Base64Encode() method before returning the constructed connection string. I fixed this in the article code snippet, but NOT in the associated source code ZIP file (my work environment prevents me from downloading anything, so I can't fix it from work). The absence of the Base64Encode call won't hurt/break anything, but it may be important to you, so act on this in any way you see fit.
In Closing
I make no guarantees that this code follows anybody's idea of "best practice", and I will vehemently dissuade everyone from using an ORM of any kind. I also cannot guarantee compatibility with any past, current, or future version of any of the tools cited. In other words, if the code is somehow borked beyond all recognition, be a programmer, and fix it yourself. If you find problems, and let me know, I'll try to fix the article ASAP.
Spelling mistakes may be present in copious quantities - I typed this article on my phone - in the bathroom.
No liquid nitrogen was used or harmed, intentionally or otherwise, during the creation of this article.
Bacon - because 'Murrica.
History
- 2019.07.23 - Added some narrative, and fixed some code in one of the connection string code blocks (refer to the section labelled Late Breaking Code Correction).
- 2019.07.22 - Fixed PRE blocks
- 2019.07.21 - Initial publication