
Building DynamicTVP

22 Sep 2015

Introduction

Conceptually, a SQL table-valued parameter (TVP) requires the user to create user-defined table types, and over time that becomes a nightmare to maintain if you have multiple tables that constantly need to change.

Background

Now, I am no guru developer (simply a software tester), but when one of our best developers came up with this concept of creating a dynamic TVP using SQL temporary tables, I thought it was impossible, yet took it upon myself to experiment, just to see how far I could get.

The idea behind this concept is not necessarily to build something new that never existed, but simply to make use of existing tools, keeping in mind that the solution needs to work across multiple databases without any restrictions.

The way this concept works is as follows:

1. Create a normal SQL stored procedure that:

  • Acquires the current structure of the table that we need to insert data into.
  • Constructs a global temporary table based on that structure.

2. Make use of the existing SqlBulkCopy class to insert data into the temporary table just created.

3. Manipulate the data as desired, directly from the temporary table.

Using the code

Because we want this to be as dynamic as possible, the stored procedure needs to accept:

  • the name of the temporary table that we want to create: @tblName
  • the destination table that we want to import data into, and on which the temp table is based: @processtbl
  • the name of the database where the destination table resides (not strictly necessary): @processdb

 

CREATE PROCEDURE [dbo].[DynamicTVP]        
        @tblName VARCHAR(200),
        @processtbl VARCHAR(200),
        @processdb VARCHAR(200)
AS

 

Then we simply need to acquire as much information as possible about the table on which the temp table will be based, build that information into a CREATE TABLE string, and execute it to create the temp table:

--DECLARE VARIABLES USED ONLY IN THE Procedure
DECLARE @Columnstring VARCHAR(MAX),
        @Insert VARCHAR(MAX),
        @dropString VARCHAR(200)


IF OBJECT_ID('tempdb..##tblColumns') IS NOT NULL
   DROP TABLE ##tblColumns;
--Drop Global DynamicTVP
SET @dropString = '  DROP TABLE ' + @tblName +';'
IF OBJECT_ID('tempdb..' + @tblName +'') IS NOT NULL
   EXEC(@dropString)

--GET ME THE COLUMNS THAT I NEED FOR MY TEMP TABLE
    SELECT (REPLACE(CAST((CONVERT(varchar,COLUMN_NAME) + ' ' 
            + UPPER(CONVERT(varchar,DATA_TYPE)) 
            + '(' +CONVERT(varchar,ISNULL(CHARACTER_MAXIMUM_LENGTH,' ')) + ')') 
            AS VARCHAR(200)), '(0)', '')) AS ColumnNames
        INTO ##tblColumns
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = @processtbl AND TABLE_CATALOG = @processdb

--CREATE DYNAMIC TVP
SELECT @Columnstring = COALESCE(@Columnstring + ', ', '') + ColumnNames
FROM ##tblColumns
BEGIN
    SET @Columnstring = N'CREATE TABLE ' + @tblName + ' (' + @Columnstring + ');';
    EXEC(@Columnstring)
END
GO

 

Now that we have that stored procedure in our database, the next step is to tackle this from some beginner-level C# code.

The first stop is to create our database connection, which can be parameterised based on what you have, and then create a DataTable in which we will store our data. Note that it is wise to simplify things by giving the DataTable the same column structure as your temp table, so you might need to clean up your data first.

string connectionstring = "Data Source=" + servername + ";Initial Catalog=" + databasename + ";User ID=" + dbusername + ";Password=" + dbpassword + ";Application Name=DynamicTVP";
SqlConnection conn = new SqlConnection(connectionstring);
SqlCommand command;
var dts = new DataTable();
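
If your data does not come from an existing table, you can shape the DataTable by hand so that its columns line up with the destination (and therefore the temp) table, as suggested above. The column names and types below are purely made up for illustration:

// Illustrative only: give the DataTable the same column layout as the destination table.
dts.Columns.Add("CustomerId", typeof(int));
dts.Columns.Add("FirstName", typeof(string));
dts.Columns.Add("LastName", typeof(string));
dts.Columns.Add("CreatedOn", typeof(DateTime));

// Rows can then be added from whatever source you are importing.
dts.Rows.Add(1, "Jane", "Doe", DateTime.UtcNow);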

 

Once the DataTable is available, all I had to do was fill it up; because I was only testing the concept, I simply queried the table that already existed to get my data.

//For testing purposes, fill up my data table first.
//The SELECT below is just an example; any query that returns the data you want to import will do.
command = new SqlCommand("SELECT * FROM " + tblProcess, conn);
SqlDataAdapter da = new SqlDataAdapter(command);
da.Fill(dts);
conn.Close();
da.Dispose();
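
If your source data does not already match the structure of the temp table, this is the point where you would clean it up. The sketch below is just one possible approach, and the column names in it are invented for the example:

// Hypothetical clean-up pass before the bulk copy; adjust to your own columns.
for (int i = dts.Rows.Count - 1; i >= 0; i--)
{
    DataRow row = dts.Rows[i];

    // Drop rows that are missing a value the destination table requires.
    if (row["CustomerId"] == DBNull.Value)
    {
        row.Delete();
        continue;
    }

    // Trim string columns so they fit the declared column lengths.
    if (row["FirstName"] != DBNull.Value)
        row["FirstName"] = row["FirstName"].ToString().Trim();
}
dts.AcceptChanges();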

 

Once I had filled up my DataTable, I needed my temp table up and ready on SQL Server. To do this, I create a new connection and call the stored procedure that creates the table; remember that the procedure is dynamic enough to accept parameters, so it has to be executed with all of them, as shown below:

command = new SqlCommand("DynamicTVP", conn);
conn.Open();
//Execute the stored procedure that creates my temp TVP.
command.CommandType = CommandType.StoredProcedure;
command.Parameters.AddWithValue("@tblName", tblTemp);
command.Parameters.AddWithValue("@processtbl", tblProcess);
command.Parameters.AddWithValue("@processdb", dbProcess);
command.ExecuteNonQuery();

 

Lastly, I had to import that DataTable into SQL Server, and I found SqlBulkCopy quite simple to use, efficient and responsive.

The only hurdle I had was the timeout, which I extended to 200 seconds, and it worked fine. Remember the parameters we passed to the stored procedure in order to create the temp table. Additional information about SqlBulkCopy is readily available online.

SqlBulkCopy dynamicInsert = new SqlBulkCopy(connectionstring, SqlBulkCopyOptions.KeepIdentity);
dynamicInsert.DestinationTableName = tblTemp;
dynamicInsert.BulkCopyTimeout = 200;
dynamicInsert.WriteToServer(dts);
dynamicInsert.Close();

//From here you can execute other stored procedures, but note that the temp table only lives while this connection exists.
conn.Close();
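
To make that last comment concrete: any follow-up work has to happen before the conn.Close() call above, because the global temp table created by the stored procedure only survives while that connection is open. A minimal sketch (the INSERT statement is my own illustration, not part of the original procedure) that moves the bulk-copied rows into the destination table could look like this:

// Sketch only: run this before conn.Close(), while the global temp table still exists.
command.Parameters.Clear();
command.CommandType = CommandType.Text;
command.CommandText = "INSERT INTO " + tblProcess + " SELECT * FROM " + tblTemp + ";";
command.ExecuteNonQuery();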

 

 

Points of Interest

The only problem/challenge I had was maintaining the database connection at all times, as without it you simply lose your temp table. Also, keep in mind that I had just migrated from a simple BULK INSERT in SQL, which worked perfectly on a local SQL instance but not on a remote one. Dealing with legacy applications, where I could not change any code or procedures, required me to come up with a working solution that could import my data in seconds. This was by far the simplest approach that did not affect the existing tables, and I could perform any data cleanups that needed to happen straight away.

