
SSRS Multi-Data Source DPE (Data Processing Extension)

23 Oct 2011 · CPOL · 6 min read
A way to combine data from two different sources into one datasource for SSRS reporting

Introduction

SSRS is a wonderful tool for quickly retrieving data from many different data sources and presenting it to the user in a format decided at run time. One area where SSRS often falls short is when the underlying data needs to come from several different sources. Perhaps we want to retrieve data from the General Ledger, which is in Oracle, and join it against employee records stored in SQL Server to display in a table. Many times, when this happens, we end up with two disconnected data sets and are unable to join them without the use of a linked server.

The option shown here is a custom Data Processing Extension that will collect the data from the two different sources and merge them into one table.

Background

Before attempting a custom DPE (Data Processing Extension) for this problem, remember that alternatives such as Web Services and SSIS data migration might be easier ways for you to stage your data into one data set for reporting purposes. I propose this as one more tool for your tool belt when developing reports against enterprise-size systems.

I strongly recommend also reading the following CodeProject tutorial to get a stronger understanding of what can be done with an SSRS DPE. I have written this article because that tutorial implements significantly more functionality than a basic DPE needs, and I wanted to trim it down to the essentials. That DPE allows XML files to serve as a data source for SSRS reporting - I highly recommend it to anybody interested in more information on DPEs.

I also highly recommend reading on MSDN the overview of Data Processing Extensions to understand their role within the SSRS lifecycle: Data Processing Extensions Overview.

Using the Code

After downloading the DBDPE.zip (Data Base Data Processing Extension) and loading it into Visual Studio 2010, you will notice a dependency on Microsoft.ReportingServices.Interfaces. When you install the SQL Server Reporting Services Client on your machine, you should notice a folder:

[SQL Server Path]\MSRS[Version]\ReportServer\bin\ 

On my Denali instance, that reads:

D:\Program Files\Microsoft SQL Server\MSRS11.DENALI\ReportServer\bin\

This is where Microsoft.ReportingServices.Interfaces.dll resides.

It is also important to address .NET versions: SQL Server 2005 and 2008 were built against the 2.0 Framework, while 2008 R2 and Denali (SQL Server 2012) are built against the 3.5 assemblies (which run on the 2.0 runtime with a handful of extra libraries). For this reason, we cannot build our DPE as a .NET 4.0 assembly: depending on the version of SQL Server, we must target a framework between 2.0 and 3.5. Attempting to use a 4.0 build will cause some very interesting errors.
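
As a sketch, the target framework can be pinned in the DPE's project file; the fragment below assumes a Visual Studio 2010-style .csproj:

```xml
<!-- In the DPE's .csproj: target 3.5 (or 2.0 for SSRS 2005/2008),
     never 4.0, so the Report Server can load the assembly -->
<PropertyGroup>
  <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
</PropertyGroup>
```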

The two most complex custom objects in the project are DBConnection and DBCommand. The two are designed to utilize two separate data contexts, each with its own connection and command text attributes. To merge the two together so that we expose only one command and connection to SSRS, we adopt an XML syntax for the properties, which we will see more closely when we look at the RDL file.

The DBConnection object parses the connection-string XML into its two fields as shown below:

C#
        /// <summary>
        /// Gets/sets the connection string.
        /// </summary>
        /// <remarks>
        /// The string must be an XML element with sqlConn1 and
        /// sqlConn2 attributes.
        /// </remarks>
        public String ConnectionString
        {
            get
            {
                return string.Format("SQL: {0} ORA: {1}",
                    m_sqlConnection1, m_sqlConnection2);
            }
            set
            {
                Debug.WriteLine("Setting IDBConnection.ConnectionString to '" + value + "'");

                XmlDocument xmlDoc = new XmlDocument();
                xmlDoc.LoadXml(value);

                if (xmlDoc["xml"].Attributes["sqlConn1"] == null)
                {
                    throw (new ArgumentException(
                        "Connection sqlConn1 attribute is missing in the connection xml", value));
                }
                m_sqlConnection1 = xmlDoc["xml"].Attributes["sqlConn1"].Value;

                if (xmlDoc["xml"].Attributes["sqlConn2"] == null)
                {
                    throw (new ArgumentException(
                        "Connection sqlConn2 attribute is missing in the connection xml", value));
                }
                m_sqlConnection2 = xmlDoc["xml"].Attributes["sqlConn2"].Value;
            }
        }
...
...

The goal of the above code is to merge two potential connection strings into one exposed property. Any exceptions thrown will be handled by the reporting environment (either BI Developer Studio's report designer, or the SQL Server Report Server). These exceptions can then be resolved by the person developing the report.
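
For illustration, a connection string that this setter would accept might look like the following (the server and database names are hypothetical):

```xml
<xml sqlConn1="Data Source=SQLSRV01;Initial Catalog=HR;Integrated Security=SSPI"
     sqlConn2="Data Source=SQLSRV02;Initial Catalog=GL;Integrated Security=SSPI" />
```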

Once we have the two different connections, we want to execute our two different commands against those connections. A similar strategy is used to combine two commands into one exposed SSRS property.

C#
/// <summary>
/// Gets/sets the current command text.
/// </summary>
/// <remarks>
/// CommandText must be an XML element of the form:
/// <xml leftCommand="TSQL" rightCommand="TSQL"
///      leftMatch="leftJoinColumn" rightMatch="rightJoinColumn" />
/// </remarks>
public String CommandText
{
    get
    {
        string commandText = string.Format("{0}/{1}/{2}/{3}",
            m_leftCommand, m_rightCommand, m_leftMatch, m_rightMatch);

        Debug.WriteLine("IDBCommand.CommandText: Get Value =" + commandText);
        return (commandText);
    }
    set
    {
        Debug.WriteLine("IDBCommand.CommandText: Set Value =" + value);

        XmlDocument xmlDoc = new XmlDocument();
        xmlDoc.LoadXml(value);

        if (xmlDoc["xml"].Attributes["leftCommand"] == null)
        {
            throw (new ArgumentException(
                "Command leftCommand attribute is missing in the command xml", value));
        }
        m_leftCommand = xmlDoc["xml"].Attributes["leftCommand"].Value;

        if (xmlDoc["xml"].Attributes["leftMatch"] == null)
        {
            throw (new ArgumentException(
                "Command leftMatch attribute is missing in the command xml", value));
        }
        m_leftMatch = xmlDoc["xml"].Attributes["leftMatch"].Value;

        if (xmlDoc["xml"].Attributes["rightCommand"] == null)
        {
            throw (new ArgumentException(
                "Command rightCommand attribute is missing in the command xml", value));
        }
        m_rightCommand = xmlDoc["xml"].Attributes["rightCommand"].Value;

        if (xmlDoc["xml"].Attributes["rightMatch"] == null)
        {
            throw (new ArgumentException(
                "Command rightMatch attribute is missing in the command xml", value));
        }
        m_rightMatch = xmlDoc["xml"].Attributes["rightMatch"].Value;
    }
}

The strategy here is to allow a simulated "left join", where the left and right queries will return two data sets, and they will be joined on a property found in the left table against a property found in the right table. At this point, we only want to parse the command text, but will use that text later to access the underlying data stores.
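
For illustration, a command text joining two hypothetical queries on their key columns might look like this (table and column names are made up for the example):

```xml
<xml leftCommand="SELECT EmpId, Name FROM dbo.Employee"
     rightCommand="SELECT Id, AcctBalance FROM dbo.Ledger"
     leftMatch="EmpId"
     rightMatch="Id" />
```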

C#
/// <summary>
/// Retrieves IDataReader interface used to retrieve data and schema information.
/// </summary>
/// <param name="behavior">The requested command behavior</param>
/// <returns>IDataReader Interface</returns>
public IDataReader ExecuteReader(CommandBehavior behavior)
{
    Debug.WriteLine("IDBCommand.ExecuteReader with CommandBehavior." + behavior);

    SqlCommand sqlCmd1 = new SqlCommand(m_leftCommand, m_connection.sqlConnection1);
    SqlCommand sqlCmd2 = new SqlCommand(m_rightCommand, m_connection.sqlConnection2);

    // Note: in this sample, report parameters are forwarded to the left command only
    foreach (DBParameter param in this.Parameters)
    {
        sqlCmd1.Parameters.Add(new SqlParameter(param.ParameterName, param.Value));
    }

    SqlDataAdapter sqlDA1 = new SqlDataAdapter(sqlCmd1);
    System.Data.DataSet dsResult1 = new System.Data.DataSet();
    sqlDA1.Fill(dsResult1);

    SqlDataAdapter sqlDA2 = new SqlDataAdapter(sqlCmd2);
    System.Data.DataSet dsResult2 = new System.Data.DataSet();
    sqlDA2.Fill(dsResult2);

    m_dataTable = MergeTables(dsResult1.Tables[0], dsResult2.Tables[0]);

    return new DBDataReader(this);
}

Here, we will execute the provided connections and commands. The DBConnection object is responsible for opening and closing the connection, so this code is only interested in setting up our command objects. The MergeTables function below will combine our two separate data sets into one data set for our report. The m_dataTable property is shared with the underlying DBDataReader to return data to the reporting environment.

C#
/// <summary>
/// Merge two different tables together based on the left and right 
/// joining column definitions.
/// </summary>
/// <param name="leftTable"></param>
/// <param name="rightTable"></param>
/// <returns></returns>
private System.Data.DataTable MergeTables
	(System.Data.DataTable leftTable, System.Data.DataTable rightTable)
{
    // Setup output table with all columns in both source tables.
    // Skip any right-hand column whose name already exists on the left,
    // since DataColumnCollection.Add throws on a duplicate column name.
    System.Data.DataTable dtblResult = new System.Data.DataTable();
    foreach (System.Data.DataColumn column in leftTable.Columns)
        dtblResult.Columns.Add(column.ColumnName, column.DataType);
    foreach (System.Data.DataColumn column in rightTable.Columns)
        if (!dtblResult.Columns.Contains(column.ColumnName))
            dtblResult.Columns.Add(column.ColumnName, column.DataType);

    // Iterate through left table, joining against right
    foreach (System.Data.DataRow leftRow in leftTable.Rows)
    {
        // Simulated Left Join - Every row in left table will be in our result set
        System.Data.DataRow newRow = dtblResult.NewRow();

        // Assign resulting properties from left table
        foreach (System.Data.DataColumn column in leftTable.Columns)
            newRow[column.ColumnName] = leftRow[column];

        // Determine matching value we are looking for
        object leftMatchVal = leftRow[m_leftMatch];
        foreach (System.Data.DataRow rightRow in rightTable.Rows)
        {
            // Determine value to compare against
            object rightMatchVal = rightRow[m_rightMatch];
            if (leftMatchVal.GetType() == rightMatchVal.GetType())
            {
                // Not good enough to do type insensitive match -
                // need to unbox before comparison succeeds.
                // Once a match has been found, break out of right loop
                // and go to next row in left join table
                if (leftMatchVal.GetType() == typeof(string) &&
                    (string)rightMatchVal == (string)leftMatchVal)
                {
                    // Assign resulting properties from right table
                    foreach (System.Data.DataColumn column in rightTable.Columns)
                        newRow[column.ColumnName] = rightRow[column];

                    break;
                }
                else if (leftMatchVal.GetType() == typeof(int) &&
                    (int)rightMatchVal == (int)leftMatchVal)
                {
                    // Assign resulting properties from right table
                    foreach (System.Data.DataColumn column in rightTable.Columns)
                        newRow[column.ColumnName] = rightRow[column];
                    break;
                }
            }
        }

        // Either found match or completed iteration of
        // all rows in right join table - ready for next row
        dtblResult.Rows.Add(newRow);
    }

    // Completed iteration of all rows in left join table - return result
    return dtblResult;
}

I would strongly suggest optimizing the above match algorithm for a production environment - my goal here was straightforward code, so caching and other run-time optimizations are not being used.
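
One straightforward optimization, sketched below under the same assumptions as MergeTables (string or int join columns, first match wins), is to index the right table by its join column once, turning the O(left × right) nested scan into a single pass plus constant-time lookups:

```csharp
// Sketch: build a one-time index over the right table's join column.
// Boxed ints and strings both compare by value as dictionary keys,
// so this covers the same two cases as the nested-loop version.
private Dictionary<object, System.Data.DataRow> BuildRightIndex(
    System.Data.DataTable rightTable)
{
    var index = new Dictionary<object, System.Data.DataRow>();
    foreach (System.Data.DataRow rightRow in rightTable.Rows)
    {
        object key = rightRow[m_rightMatch];
        if (!index.ContainsKey(key))
            index[key] = rightRow;   // first match wins, as in the nested loop
    }
    return index;
}

// Inside the left-row loop, the inner scan then collapses to:
//     System.Data.DataRow match;
//     if (rightIndex.TryGetValue(leftRow[m_leftMatch], out match))
//         foreach (System.Data.DataColumn column in rightTable.Columns)
//             newRow[column.ColumnName] = match[column];
```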

Once you have managed to build the Data Processing Extension, remember to deploy it to your development and reporting environments, and to modify the appropriate configuration files. I strongly recommend reading the appropriate MSDN tutorials How to: Deploy a Data Processing Extension to a Report Server and How to: Deploy a Data Processing Extension to Report Designer.
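
As those deployment articles describe, registration amounts to an Extension entry under the Data element of RSReportServer.config (and RSReportDesigner.config for the designer). The Name and Type values below are placeholders assuming a namespace and assembly both called DBDPE - substitute your own:

```xml
<Extensions>
  <Data>
    <!-- Type is "FullTypeName,AssemblyName" for the class
         implementing IDbConnection in your DPE -->
    <Extension Name="DBDPE" Type="DBDPE.DBConnection,DBDPE" />
  </Data>
</Extensions>
```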

Likewise, the Code Project article referenced earlier Implementing a Data Processing Extension is an excellent tutorial for setting up a DPE within the development and reporting environments.

Once you have your reporting environments set up, it is time to load up the Reporting Services project within DoubleDBReportProj.zip.

The only report within the project is DoubleDB.rdl, a sample report using our new Data Source. Open the report and look at the Data Sources defined within the RDL.


The data source type should read "Custom DB Activity Extension", which comes from the LocalizedName property of our DBConnection object. The connection string property should be an XML element (named xml) with two attributes: sqlConn1 and sqlConn2. In our example, they are pointed at the same database, which is a trivial implementation - in a true implementation, we would expect these to be two different database connections.


The DataSet for the previously defined Data Source has a query which again consists of a single XML element. Its four attributes are the ones we defined in our DPE to allow two different queries and a way to join those queries against each other. Clicking Refresh Fields will validate that everything is acting as expected. I would strongly recommend spending some time debugging your DPE to watch the flow of events.


With the DoubleDBReportProject open in Visual Studio (BIDS), open a second instance of Visual Studio and load the DBDPE solution. From the DBDPE solution, click Debug -> Attach to Process. In the list of processes on your machine, you should see devenv.exe listed. Select the appropriate devenv.exe instance and click Attach. With breakpoints set in your DBDPE solution, you should be able to watch execution move from connection to command to data reader and back again to the reporting environment.

Good luck!

History

  • First version: 10/22/2011

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)