ChoETL is an open source ETL (extract, transform, and load) framework for .NET. It is a code-based library for extracting data from multiple sources, transforming it, and loading it into your very own data warehouse in a .NET environment. You can have data in your data warehouse in no time.
Contents
- Introduction
- Requirement
- "Hello World!" Sample
- Quick write - Data First Approach
- Code First Approach
- Configuration First Approach
- Writing All Records
- Write Records Manually
- Customize CSV Record
- Customize CSV Header
- Customize CSV Fields
- DefaultValue
- ChoFallbackValue
- Type Converters
- Validations
- Excel Field Separator
- Callback Mechanism
- BeginWrite
- EndWrite
- BeforeRecordWrite
- AfterRecordWrite
- RecordWriteError
- BeforeRecordFieldWrite
- AfterRecordFieldWrite
- RecordWriteFieldError
- Customization
- Using Dynamic Object
- Exceptions
- Tips
- Using MetadataType Annotation
- Configuration Choices
- Manual Configuration
- Auto Map Configuration
- Attaching MetadataType class
- ToText Helper Method
- Writing DataReader Helper Method
- Writing DataTable Helper Method
- Advanced Topics
- Override Converters Format Specs
- Currency Support
- Enum Support
- Boolean Support
- DateTime Support
- Fluent API
- WithDelimiter
- WithFirstLineHeader
- WithFields
- WithField
- QuoteAllFields
- History
This article talks about using the ChoCSVWriter component offered by the ChoETL framework. It is a simple utility class to save CSV data to a file.
The corresponding CSVReader article can be found here.
Features
- Follows CSV standard file rules. Gracefully handles data fields that contain commas and line breaks.
- In addition to comma, most delimiting characters can be used, including tab delimited fields.
- Supports culture specific date, currency and number formats while generating files.
- Supports different character encodings.
- Provides fine control of date, currency, enum, boolean, number formats when writing files.
- Detailed and robust error handling, allowing you to quickly find and fix problems.
- Shortens your development time.
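The comma and line-break handling mentioned in the first bullet follows the usual RFC 4180 quoting rules. The standalone sketch below (plain C#, not ChoETL's actual implementation) illustrates the rule: quote a field only when it contains the delimiter, a quote, or a line break, and double any embedded quotes.

```csharp
using System;

public static class CsvEscape
{
    // Quote a field only when it contains the delimiter, a quote, or a
    // line break, doubling any embedded quotes (RFC 4180 rules).
    public static string Escape(string field, char delimiter = ',')
    {
        bool needsQuotes = field.IndexOf(delimiter) >= 0
            || field.Contains("\"")
            || field.Contains("\n")
            || field.Contains("\r");
        return needsQuotes ? "\"" + field.Replace("\"", "\"\"") + "\"" : field;
    }

    public static void Main()
    {
        Console.WriteLine(Escape("Tom"));            // Tom
        Console.WriteLine(Escape("Smith, Tom"));     // "Smith, Tom"
        Console.WriteLine(Escape("He said \"hi\"")); // "He said ""hi"""
    }
}
```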
This framework library is written in C# using .NET 4.5 Framework / .NET Core 2.x.
- Open VS.NET 2017 or higher.
- Create a sample VS.NET (.NET Framework 4.x / .NET Core 2.x) Console Application project.
- Install ChoETL via the Package Manager Console using the NuGet command matching your .NET version:
For .NET Framework:
Install-Package ChoETL
For .NET Core / .NET Standard:
Install-Package ChoETL.NETStandard
- Use the ChoETL namespace.
Let's begin by looking at a simple example of generating the below CSV file with two columns.
Listing 3.1 Sample CSV data file (Emp.csv)
1,Tom
2,Carl
3,Mark
There are a number of ways you can get the CSV file created with minimal setup.
This is the zero-config and quickest approach to create a CSV file in no time. No typed POCO object is needed. The sample code below shows how to generate the sample CSV file using dynamic objects.
Listing 3.1.1 Write list of objects to CSV file
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter("Emp.csv").WithFirstLineHeader())
{
parser.Write(objs);
}
In the above sample, we pass the list of dynamic objects to ChoCSVWriter in one call to write them all to the CSV file.
Sample fiddle: https://dotnetfiddle.net/PZLBAg
Listing 3.1.2 Write each object to CSV file
using (var parser = new ChoCSVWriter("Emp.csv").WithFirstLineHeader())
{
dynamic rec1 = new ExpandoObject();
rec1.Id = 1;
rec1.Name = "Mark";
parser.Write(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 2;
rec2.Name = "Jason";
parser.Write(rec2);
}
In the above sample, we take control of constructing and passing each individual dynamic record to ChoCSVWriter using the Write overload to generate the CSV file.
Sample fiddle: https://dotnetfiddle.net/zWnFtk
This is another zero-config way to generate a CSV file, using a typed POCO class. First, define a simple POCO class to match the underlying CSV file layout:
Listing 3.2.1 Simple POCO entity class
public partial class EmployeeRecSimple
{
public int Id { get; set; }
public string Name { get; set; }
}
In the above, the POCO class defines two properties matching the sample CSV file template.
Listing 3.2.2 Saving to CSV file
List<EmployeeRecSimple> objs = new List<EmployeeRecSimple>();
EmployeeRecSimple rec1 = new EmployeeRecSimple();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
EmployeeRecSimple rec2 = new EmployeeRecSimple();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter<EmployeeRecSimple>("Emp.csv").WithFirstLineHeader())
{
parser.Write(objs);
}
The above sample shows how to create a CSV file from typed POCO class objects.
Sample fiddle: https://dotnetfiddle.net/gQoQq4
In this model, we define the CSV configuration with all the necessary parameters along with CSV columns required to generate the sample CSV file.
Listing 3.3.1 Define CSV configuration
ChoCSVRecordConfiguration config = new ChoCSVRecordConfiguration();
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Id", 1));
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Name", 2));
In the above, the configuration defines two CSV columns matching the sample CSV file template.
Listing 3.3.2 Generate CSV file without POCO object
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 2;
rec2.Name = "Tom";
objs.Add(rec2);
using (var parser = new ChoCSVWriter("Emp.csv", config).WithFirstLineHeader())
{
parser.Write(objs);
}
The above sample code shows how to generate a CSV file from a list of dynamic objects using a predefined CSV configuration setup. In the ChoCSVWriter constructor, we pass the CSV configuration object so the writer obeys the CSV layout schema while creating the file. Any mismatch in the name or count of CSV columns is reported as an error and stops the writing process.
Sample fiddle: https://dotnetfiddle.net/xeC0ww
Listing 3.3.3 Saving CSV file with POCO object
List<EmployeeRecSimple> objs = new List<EmployeeRecSimple>();
EmployeeRecSimple rec1 = new EmployeeRecSimple();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
EmployeeRecSimple rec2 = new EmployeeRecSimple();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter<EmployeeRecSimple>("Emp.csv", config))
{
parser.Write(objs);
}
The above sample code shows how to generate a CSV file from a list of POCO objects with the CSV configuration object. In the ChoCSVWriter constructor, we pass the CSV configuration object.
3.4. Code First with Declarative Configuration
This is a combined approach: define the POCO entity class and attach the CSV configuration parameters to it declaratively. Id is a required column, and Name is an optional column with the default value "XXXX". If Name is not present, the default value is used.
Listing 3.4.1 Define POCO Object
public class EmployeeRec
{
[ChoCSVRecordField(1)]
[Required]
public int? Id
{
get;
set;
}
[ChoCSVRecordField(2)]
[DefaultValue("XXXX")]
public string Name
{
get;
set;
}
public override string ToString()
{
return "{0}. {1}.".FormatString(Id, Name);
}
}
The code above illustrates how to define a POCO object with the necessary attributes required to generate a CSV file. First, each record field is defined as a property decorated with ChoCSVRecordFieldAttribute to qualify it for CSV record mapping. Each property must specify its position in order to be mapped to a CSV column; positions are 1-based. Id is a required property, so it is decorated with RequiredAttribute. Name is given a default value using DefaultValueAttribute, meaning that if the Name value is not set in the object, ChoCSVWriter writes the default value 'XXXX' to the file.
It is very simple and ready to save CSV data in no time.
Listing 3.4.2 Saving CSV file with POCO object
List<EmployeeRec> objs = new List<EmployeeRec>();
EmployeeRec rec1 = new EmployeeRec();
rec1.Id = 10;
rec1.Name = "Mark";
objs.Add(rec1);
EmployeeRec rec2 = new EmployeeRec();
rec2.Id = 200;
rec2.Name = "Lou";
objs.Add(rec2);
using (var parser = new ChoCSVWriter<EmployeeRec>("Emp.csv"))
{
parser.Write(objs);
}
We start by creating a new instance of the ChoCSVWriter object. That's all! All the heavy lifting of generating CSV data from the objects is done by the writer under the hood.
By default, ChoCSVWriter discovers and uses default configuration parameters while saving the CSV file. These can be overridden according to your needs. The following sections give in-depth details about each configuration attribute.
Sample fiddle: https://dotnetfiddle.net/3iOhib
It is as easy as setting up a POCO object to match the CSV file structure, constructing the list of objects, and passing it to ChoCSVWriter's Write method. This writes the entire list of objects to the CSV file in a single call.
Listing 4.1 Write to CSV File
List<EmployeeRec> objs = new List<EmployeeRec>();
...
using (var parser = new ChoCSVWriter<EmployeeRec>("Emp.csv"))
{
parser.Write(objs);
}
or:
Listing 4.2 Write to CSV file stream
List<EmployeeRec> objs = new List<EmployeeRec>();
...
using (var tx = File.OpenWrite("Emp.csv"))
{
using (var parser = new ChoCSVWriter<EmployeeRec>(tx))
{
parser.Write(objs);
}
}
This model keeps your code elegant, clean, easy to read and maintain.
This is an alternative way to write each individual record to the CSV file, in case the POCO objects are constructed in a disconnected way.
Listing 5.1 Writing to CSV file
using (var writer = new ChoCSVWriter<EmployeeRec>("Emp.csv"))
{
EmployeeRec rec1 = new EmployeeRec();
rec1.Id = 10;
rec1.Name = "Mark";
writer.Write(rec1);
EmployeeRec rec2 = new EmployeeRec();
rec2.Id = 11;
rec2.Name = "Tom";
writer.Write(rec2);
}
Sample fiddle: https://dotnetfiddle.net/fgTV2y
In rare situations, you may want to write custom lines to the CSV file (for example, a footer line). This can be done using the WriteFields() method.
using (var w = new ChoCSVWriter(Console.Out)
.WithFirstLineHeader()
.WithField("Id")
.WithField("Name")
)
{
dynamic rec = new ExpandoObject();
rec.Id = 10;
rec.Name = "Mark";
w.Write(rec);
w.WriteFields("RecordCount", 3);
}
In the above sample, after writing all the records to the CSV file, it writes the CSV footer line (RecordCount, 3) at the end of the file using the WriteFields() method.
Sample fiddle: https://dotnetfiddle.net/1QG21M
Normally, CSV files are data centric; very few contain comments. ChoCSVWriter exposes the WriteComment() method to write comments. In order to use it, you must set the comment character via configuration; otherwise, the call throws an exception.
using (var w = new ChoCSVWriter(Console.Out)
.WithFirstLineHeader()
.WithField("Id")
.WithField("Name")
.Configure(c => c.Comment = "#")
)
{
dynamic rec = new ExpandoObject();
rec.Id = 10;
rec.Name = "Mark";
w.WriteComment("CSV Comment Line");
w.Write(rec);
}
Sample fiddle: https://dotnetfiddle.net/EnZey3
Using ChoCSVRecordObjectAttribute, you can customize the POCO entity object declaratively.
Listing 6.1 Customizing POCO object for each record
[ChoCSVRecordObject(Encoding = "Encoding.UTF32",
ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All)]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName ="Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
public string Name { get; set; }
}
Here are the available attributes to customize the CSV write operation:
- Delimiter - The value used to separate the fields in a CSV row. Default is Culture.TextInfo.ListSeparator.
- EOLDelimiter - The value used to separate CSV rows. Default is \r\n (NewLine).
- Culture - The culture info used to read and write.
- IgnoreEmptyLine - N/A
- Comments - N/A
- QuoteChar - The value used to escape fields that contain a delimiter, quote, or line ending.
- QuoteAllFields - A flag that tells the writer whether all fields written should have quotes around them, regardless of whether a field contains anything that should be escaped.
- Encoding - The encoding of the CSV file.
- HasExcelSeparator - A flag that tells the writer to emit the Excel separator line in the output file.
- ColumnCountStrict - A flag that indicates whether an exception should be thrown if there is a CSV field configuration mismatch with the data object members.
- ColumnOrderStrict - N/A
- BufferSize - The size of the internal buffer used by the underlying StreamWriter.
- NullValue - Special null value text to be treated as a null value, at the record level.
- ErrorMode - Indicates how a failure to write an expected field is handled. This can be overridden per property. Possible values are:
  - IgnoreAndContinue - Ignore the error, skip the record, and continue with the next.
  - ReportAndContinue - Report the error to the POCO entity if it is of IChoNotifyRecordWrite type.
  - ThrowAndStop - Throw the error and stop the execution.
- IgnoreFieldValueMode - N/A
- ObjectValidationMode - The type of validation to be performed on the record object. Possible values are:
  - Off - No object validation performed. (Default)
  - MemberLevel - Validation performed before each CSV property is written to the file.
  - ObjectLevel - Validation performed before all the POCO properties are written to the file.
By attaching ChoCSVFileHeaderAttribute to the POCO entity object declaratively, you can direct the writer to generate a CSV header when creating the CSV file.
Listing 6.2 Customizing POCO object for file header
[ChoCSVFileHeader]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName ="Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
public string Name { get; set; }
}
Here are the available members to customize the header according to your needs:
- FillChar - Padding character used when the CSV column header is shorter than the column size (ChoCSVRecordFieldAttribute.Size or ChoCSVRecordFieldConfiguration.Size). Default is '\0', which turns padding off.
- Justification - Column header alignment. Default is Left.
- TrimOption - N/A
- Truncate - A flag that tells the writer to truncate the CSV column header value if it exceeds the column size. Default is false.
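The FillChar/Justification behavior is analogous to standard string padding. Below is a standalone illustration (not ChoETL code; the PadHeader helper is hypothetical) of how a left- or right-justified header fills out to a fixed column size:

```csharp
using System;

public static class PadDemo
{
    // Pad a header to a fixed column size with a fill character.
    // left=true mirrors Justification=Left (pad on the right);
    // left=false mirrors Justification=Right (pad on the left).
    public static string PadHeader(string header, int size, char fillChar, bool left = true)
        => left ? header.PadRight(size, fillChar) : header.PadLeft(size, fillChar);

    public static void Main()
    {
        Console.WriteLine(PadHeader("Id", 5, ' ') + "|");        // "Id   |"
        Console.WriteLine(PadHeader("Id", 5, '*', left: false)); // ***Id
    }
}
```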
7.1 Override CSV Header
In general, ChoCSVWriter automatically generates the header from the objects. Rarely, you may want to override the header with your own text. ChoCSVWriter exposes a callback event you can subscribe to in order to generate the CSV header the way you want. The sample code below shows how to do it:
using (var w = new ChoCSVWriter<Site>(new StringWriter(csv))
.WithFirstLineHeader()
.Setup(s => s.FileHeaderWrite += (o, e) =>
{
e.HeaderText = "ID, House";
})
)
{
w.Write(site);
}
Here is another simple way to write the header to the CSV file, using the WriteHeader() method:
using (var w = new ChoCSVWriter<Site>(new StringWriter(csv))
.WithFirstLineHeader()
)
{
w.WriteHeader("ID", "House");
w.Write(site);
}
For each CSV column, you can specify the mapping in a POCO entity property using ChoCSVRecordFieldAttribute.
Listing 6.3 Customizing POCO object for CSV columns
[ChoCSVFileHeader]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName ="Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
public string Name { get; set; }
}
Here are the available members to customize each property:
- FieldPosition - When mapping by position, the index of the CSV column to use for that property. It is 1-based.
- FieldName - CSV column header name. If not specified, the POCO property name is used as the column header.
- FillChar - Padding character used when the CSV column value is shorter than the column size. Default is '\0', which turns padding off.
- FieldValueJustification - Column value alignment. Default is Left.
- FieldValueTrimOption - N/A
- Truncate - A flag that tells the writer to truncate the CSV column value if it exceeds the column size. Default is false.
- Size - Size of the CSV column value.
- QuoteField - A flag that tells the writer to surround the CSV column value with quotes.
- NullValue - Special null value text to be treated as a null value, at the field level.
- ErrorMode - Indicates how a failure to convert and write a field is handled. Possible values are:
  - IgnoreAndContinue - Ignore the error and continue to load other properties of the record.
  - ReportAndContinue - Report the error to the POCO entity if it is of IChoRecord type.
  - ThrowAndStop - Throw the error and stop the execution.
- IgnoreFieldValueMode - N/A
Any POCO entity property can be given a default value using System.ComponentModel.DefaultValueAttribute. This value is written when the CSV value is null (controlled via IgnoreFieldValueMode).
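As a standalone illustration (not ChoETL's internal lookup, which may differ), a writer can discover such a default via reflection on System.ComponentModel.DefaultValueAttribute:

```csharp
using System;
using System.ComponentModel;
using System.Reflection;

// Hypothetical POCO used only for this illustration.
public class EmployeeRecSketch
{
    public int Id { get; set; }

    [DefaultValue("XXXX")]
    public string Name { get; set; }
}

public static class DefaultValueDemo
{
    // Look up the DefaultValueAttribute on a property, if present.
    public static object GetDefault(Type type, string propertyName)
    {
        var attr = type.GetProperty(propertyName)
            ?.GetCustomAttribute<DefaultValueAttribute>();
        return attr?.Value;
    }

    public static void Main()
    {
        Console.WriteLine(GetDefault(typeof(EmployeeRecSketch), "Name")); // XXXX
    }
}
```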
Any POCO entity property can be given a fallback value using ChoETL.ChoFallbackValueAttribute. This value is used when the property fails to write to CSV. The fallback value is only used when ErrorMode is either IgnoreAndContinue or ReportAndContinue.
Most primitive types are automatically converted to text and saved to the CSV file. If a CSV field value cannot be automatically converted into text, you can specify a custom or built-in .NET converter to convert the value to text. These can be IValueConverter, IChoValueConverter, or TypeConverter converters.
There are a couple of ways you can specify the converters for each field:
- Declarative Approach
- Configuration Approach
This model is applicable to POCO entity objects only. If you have a POCO class, you can attach converters to each property to carry out the necessary conversion. The samples below show how.
Listing 8.3.1.1 Specifying type converters
[ChoCSVFileHeader]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName ="Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
public string Name { get; set; }
}
Listing 8.3.1.2 IntConverter implementation
public class IntConverter : IValueConverter
{
public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
{
return value;
}
public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
{
int intValue = (int)value;
return intValue.ToString("D4");
}
}
In the example above, we defined a custom IntConverter class and showed how to format the 'Id' CSV property with leading zeros.
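The "D4" format string used in ConvertBack is a standard .NET numeric format specifier: it pads an integer with leading zeros to at least four digits. A quick standalone check (the PadToFourDigits helper is only for illustration):

```csharp
using System;
using System.Globalization;

public static class FormatDemo
{
    // "D4" pads an integer with leading zeros to a minimum of four digits;
    // values already longer than four digits are emitted unchanged.
    public static string PadToFourDigits(int n) =>
        n.ToString("D4", CultureInfo.InvariantCulture);

    public static void Main()
    {
        Console.WriteLine(PadToFourDigits(1));     // 0001
        Console.WriteLine(PadToFourDigits(42));    // 0042
        Console.WriteLine(PadToFourDigits(12345)); // 12345
    }
}
```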
This model is applicable to both dynamic and POCO entity objects. It gives you the freedom to attach converters to each property at runtime. These converters take precedence over the declarative converters on POCO classes.
Listing 8.3.2.1 Specifying TypeConverters
ChoCSVRecordConfiguration config = new ChoCSVRecordConfiguration();
config.FileHeaderConfiguration.HasHeaderRecord = true;
config.ThrowAndStopOnMissingField = false;
ChoCSVRecordFieldConfiguration idConfig = new ChoCSVRecordFieldConfiguration("Id", 1);
idConfig.AddConverter(new IntConverter());
config.CSVRecordFieldConfigurations.Add(idConfig);
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Name", 2));
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Name1", 2));
In the above, we construct and attach the IntConverter to the 'Id' field using the AddConverter helper method on the ChoCSVRecordFieldConfiguration object. Likewise, to remove a converter, use the RemoveConverter method on the ChoCSVRecordFieldConfiguration object.
This approach allows you to attach a value converter to each CSV member using the Fluent API. This is a quick way to handle any odd conversion process and avoid creating a value converter class.
Listing 8.3.3.1 POCO class
[ChoCSVFileHeader]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName ="Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
public string Name { get; set; }
}
With the Fluent API, the sample below shows how to attach a value converter to the Id column.
Listing 8.3.3.2 Attaching Value Converter
using (var w = new ChoCSVWriter<EmployeeRec>(@"Test.csv")
.WithFirstLineHeader()
.WithField(c => c.Id, valueConverter: (v) =>
((int)v).ToString("C3", CultureInfo.CurrentCulture))
)
{
w.Write(new EmployeeRec { Id = 1, Name = "Mark" });
}
ChoCSVWriter leverages both System.ComponentModel.DataAnnotations and Validation Block validation attributes to specify validation rules for individual fields of the POCO entity. Refer to the MSDN site for the list of available DataAnnotations validation attributes.
Listing 8.4.1 Using validation attributes in POCO entity
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All,
ThrowAndStopOnMissingField = false)]
public partial class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name")]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
}
In the example above, the Range validation attribute is applied to the Id property and the Required validation attribute to the Name property. ChoCSVWriter performs validation on them before saving the data to the file when Configuration.ObjectValidationMode is set to ChoObjectValidationMode.MemberLevel or ChoObjectValidationMode.ObjectLevel.
In some cases, you may want to take control and perform manual self-validation within the POCO entity class. This can be achieved by implementing the IChoValidatable interface on the POCO object.
Listing 8.4.2 Manual validation on POCO entity
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32",
ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All,
ThrowAndStopOnMissingField = false)]
public partial class EmployeeRec : IChoValidatable
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name")]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool TryValidate(object target,
ICollection<ValidationResult> validationResults)
{
return true;
}
public bool TryValidateFor
(object target, string memberName, ICollection<ValidationResult> validationResults)
{
return true;
}
}
The sample above shows how to implement custom self-validation in POCO object.
The IChoValidatable interface exposes the below methods:
- TryValidate - Validate the entire object; return true if all validations passed, otherwise return false.
- TryValidateFor - Validate a specific property of the object; return true if all validations passed, otherwise return false.
If you want to exclude a POCO class member from CSV output in OptOut mode, decorate it with ChoIgnoreMemberAttribute. The sample below shows the Title member being excluded from the CSV write process.
[ChoCSVFileHeader]
public class EmployeeRec
{
public int Id { get; set; }
public string Name { get; set; }
[ChoIgnoreMember]
public string Title { get; set; }
}
In OptOut mode, you can specify the size of a CSV column using System.ComponentModel.DataAnnotations.StringLengthAttribute.
[ChoCSVFileHeader]
public class EmployeeRec
{
public int Id { get; set; }
[StringLength(25)]
public string Name { get; set; }
[ChoIgnoreMember]
public string Title { get; set; }
}
In OptOut mode, you can specify the name of the CSV column mapped to a member using System.ComponentModel.DataAnnotations.DisplayAttribute.
[ChoCSVFileHeader]
public class EmployeeRec
{
public int Id { get; set; }
[Display(Name="FullName")]
[StringLength(25)]
public string Name { get; set; }
[ChoIgnoreMember]
public string Title { get; set; }
}
8.7. DisplayName
In OptOut mode, you can specify the name of the CSV column mapped to a member using System.ComponentModel.DisplayNameAttribute.
Listing 8.7.1 Specifying name of CSV column
[ChoCSVFileHeader]
public class EmployeeRec
{
public int Id { get; set; }
[DisplayName("FullName")]
[StringLength(25)]
public string Name { get; set; }
[ChoIgnoreMember]
public string Title { get; set; }
}
Set HasExcelSeparator declaratively on the POCO object, or set ChoCSVRecordConfiguration.HasExcelSeparator to true, to generate the Excel field separator line in the data file.
Listing 9.1 Specifying HasExcelSeperator to POCO object declaratively
[ChoCSVFileHeader]
[ChoCSVRecordObject(HasExcelSeparator = true)]
public class EmployeeRec
{
[ChoCSVRecordField(1)]
[Required]
[ChoFallbackValue(100)]
[Range(100, 10000)]
public int? Id
{
get;
set;
}
[ChoCSVRecordField(2)]
[DefaultValue("XXXX")]
public string Name
{
get;
set;
}
public override string ToString()
{
return "{0}. {1}.".FormatString(Id, Name);
}
}
Listing 9.2 Specifying HasExcelSeperator via configuration
ChoCSVRecordConfiguration config = new ChoCSVRecordConfiguration();
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Id", 1));
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Name", 2));
config.HasExcelSeparator = true;
List<EmployeeRecSimple> objs = new List<EmployeeRecSimple>();
EmployeeRecSimple rec1 = new EmployeeRecSimple();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
EmployeeRecSimple rec2 = new EmployeeRecSimple();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter<EmployeeRecSimple>("Emp.csv", config))
{
parser.Write(objs);
}
Listing 9.3 Sample CSV file with Excel field separator
sep=,
1,Mark
2,Jason
ChoCSVWriter offers industry-standard CSV data file generation out of the box to handle most needs. If the generation process does not handle one of your needs, you can use the callback mechanism offered by ChoCSVWriter to handle such situations. In order to participate in the callback mechanism, you can use any of the following models:
- Using event handlers exposed by ChoCSVWriter via the IChoWriter interface.
- Implementing the IChoNotifyRecordWrite / IChoNotifyFileWrite / IChoNotifyRecordFieldWrite interfaces on the POCO entity object.
- Implementing the IChoNotifyRecordWrite / IChoNotifyFileWrite / IChoNotifyRecordFieldWrite interfaces on the DataAnnotation's MetadataType object.
In order to participate in the callback mechanism, either the POCO entity object or the DataAnnotation's MetadataType object must implement the IChoNotifyRecordWrite interface.
Tip: Any exceptions raised out of these interface methods will be ignored.
Altogether, the writer exposes the below callbacks:
- BeginWrite - Invoked at the beginning of the CSV file write.
- EndWrite - Invoked at the end of the CSV file write.
- BeforeRecordWrite - Raised before a CSV record is written.
- AfterRecordWrite - Raised after a CSV record is written.
- RecordWriteError - Raised when a CSV record errors out while writing.
- BeforeRecordFieldWrite - Raised before a CSV column value is written.
- AfterRecordFieldWrite - Raised after a CSV column value is written.
- RecordFieldWriteError - Raised when a CSV column value errors out while writing.
IChoNotifyRecordWrite exposes the below methods:
- BeforeRecordWrite - Raised before a CSV record is written.
- AfterRecordWrite - Raised after a CSV record is written.
- RecordWriteError - Raised when a CSV record write errors out.
IChoNotifyFileWrite exposes the below methods:
- BeginWrite - Invoked at the beginning of the CSV file write.
- EndWrite - Invoked at the end of the CSV file write.
IChoNotifyRecordFieldWrite exposes the below methods:
- BeforeRecordFieldWrite - Raised before a CSV column value is written.
- AfterRecordFieldWrite - Raised after a CSV column value is written.
- RecordFieldWriteError - Raised when a CSV column value write errors out.
IChoNotifyFileHeaderArrange exposes the below methods:
- FileHeaderArrange - Raised before the CSV file header is written, giving an opportunity to rearrange the CSV columns.
IChoNotifyFileHeaderWrite exposes the below methods:
- FileHeaderWrite - Raised before the CSV file header is written, giving an opportunity to customize the header.
This is the most direct and simplest way to subscribe to the callback events and handle odd situations while generating CSV files. The downside is that the code is not reusable, as it is when implementing IChoNotifyRecordWrite on a POCO record object.
The sample below shows how to use the BeforeRecordWrite callback to skip lines starting with the '%' character.
Listing 10.1.1 Using CSVWriter callback events
static void IgnoreLineTest()
{
using (var parser = new ChoCSVWriter("IgnoreLineFile.csv").WithFirstLineHeader())
{
parser.Configuration.Encoding = Encoding.BigEndianUnicode;
dynamic rec = new ExpandoObject();
rec.Id = 1;
rec.Name = "Mark";
parser.BeforeRecordWrite += (o, e) =>
{
if (e.Source != null)
{
e.Skip = ((string)e.Source).StartsWith("%");
}
};
parser.Write(rec);
}
}
Likewise, you can use the other callback methods with ChoCSVWriter.
The sample below shows how to implement the IChoNotifyRecordWrite interface directly on the POCO class.
Listing 10.2.1 Direct POCO callback mechanism implementation
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All, ThrowAndStopOnMissingField = false)]
public partial class EmployeeRec : IChoNotifyRecordWrite
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool AfterRecordWrite(object target, int index, object source)
{
throw new NotImplementedException();
}
public bool BeforeRecordWrite(object target, int index, ref object source)
{
throw new NotImplementedException();
}
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
throw new NotImplementedException();
}
}
The sample below shows how to attach a Metadata class to a POCO class by using MetadataTypeAttribute on it.
Listing 10.2.2 MetaDataType based callback mechanism implementation
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All, ThrowAndStopOnMissingField = false)]
public class EmployeeRecMeta : IChoNotifyRecordWrite
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool AfterRecordWrite(object target, int index, object source)
{
throw new NotImplementedException();
}
public bool BeforeRecordWrite(object target, int index, ref object source)
{
throw new NotImplementedException();
}
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
throw new NotImplementedException();
}
}
[MetadataType(typeof(EmployeeRecMeta))]
public partial class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
}
The sample below shows how to attach a Metadata class to a sealed or third-party POCO class by using ChoMetadataRefTypeAttribute on it.
Listing 10.2.3 MetaDataType based callback mechanism implementation
[ChoMetadataRefType(typeof(EmployeeRec))]
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.IgnoreAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All, ThrowAndStopOnMissingField = false)]
public class EmployeeRecMeta : IChoNotifyRecordWrite
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool AfterRecordWrite(object target, int index, object source)
{
throw new NotImplementedException();
}
public bool BeforeRecordWrite(object target, int index, ref object source)
{
throw new NotImplementedException();
}
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
throw new NotImplementedException();
}
}
public partial class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "id")]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, int.MaxValue, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[Required]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
}
This callback is invoked once at the beginning of the CSV file write. source is the CSV file stream object. Here, you have a chance to inspect the stream. Return true to continue the CSV generation, or return false to stop it.
Listing 10.1.1 BeginWrite Callback Sample
public bool BeginWrite(object source)
{
StreamWriter sw = source as StreamWriter;
return true;
}
This callback is invoked once at the end of the CSV file generation. source is the CSV file stream object. Here, you have a chance to inspect the stream and perform any post steps on it.
Listing 10.2.1 EndWrite Callback Sample
public void EndWrite(object source)
{
StreamWriter sw = source as StreamWriter;
}
This callback is invoked before each POCO record object is written to the CSV file. target is the instance of the POCO record object. index is the line index in the file. source is the CSV record line. Here, you have a chance to inspect the POCO object and generate the CSV record line yourself if needed.
Tip: If you want to skip the record from writing, set source to null.
Tip: If you want to take control of CSV record line generation, set source to valid CSV record line text.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.3.1 BeforeRecordWrite Callback Sample
public bool BeforeRecordWrite(object target, int index, ref object source)
{
source = "1,Raj";
return true;
}
This callback is invoked after each POCO record object is written to the CSV file. target is the instance of the POCO record object. index is the line index in the file. source is the CSV record line. Here, you have a chance to perform any post-step operation on the record line.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.4.1 AfterRecordWrite Callback Sample
public bool AfterRecordWrite(object target, int index, object source)
{
string line = source as string;
return true;
}
This callback is invoked if an error is encountered while writing the POCO record object. target is the instance of the POCO record object. index is the line index in the file. source is the CSV record line. ex is the exception object. Here, you have a chance to handle the exception. This method is invoked only when Configuration.ErrorMode is ReportAndContinue.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.5.1 RecordWriteError Callback Sample
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
string line = source as string;
return true;
}
This callback is invoked before each CSV record column is written to the CSV file. target is the instance of the POCO record object. index is the line index in the file. propName is the CSV record property name. value is the CSV column value. Here, you have a chance to inspect the CSV record property value and perform any custom validations, etc.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.6.1 BeforeRecordFieldWrite Callback Sample
public bool BeforeRecordFieldWrite(object target, int index, string propName, ref object value)
{
return true;
}
This callback is invoked after each CSV record column value is written to the CSV file. target is the instance of the POCO record object. index is the line index in the file. propName is the CSV record property name. value is the CSV column value. Any post-field operation can be performed here, like computing other properties, validations, etc.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.7.1 AfterRecordFieldWrite Callback Sample
public bool AfterRecordFieldWrite(object target, int index, string propName, object value)
{
return true;
}
This callback is invoked when an error is encountered while writing a CSV record column value. target is the instance of the POCO record object. index is the line index in the file. propName is the CSV record property name. value is the CSV column value. ex is the exception object. Here, you have a chance to handle the exception. This method is invoked only after the following two steps are performed by the CSVWriter:
- CSVWriter looks for the FallbackValue of the CSV property. If present, it tries to use it for writing.
- If the FallbackValue is not present and Configuration.ErrorMode is specified as ReportAndContinue, this callback will be executed.
Return true to continue the load process; otherwise, return false to stop it.
Listing 10.8.1 RecordFieldWriteError Callback Sample
public bool RecordFieldWriteError
(object target, int index, string propName, object value, Exception ex)
{
return true;
}
CSVWriter automatically detects and loads the configuration settings from the POCO entity. At runtime, you can customize and tweak these parameters before CSV generation. CSVWriter exposes the Configuration property, of type ChoCSVRecordConfiguration, through which you can perform the customization.
Listing 11.1 Customizing CSVWriter at run-time
class Program
{
static void Main(string[] args)
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter("Emp.csv"))
{
parser.Configuration.ColumnCountStrict = true;
parser.Write(objs);
}
}
}
So far, the article has explained using CSVWriter with POCO objects. CSVWriter also supports generating CSV files without POCO entity objects by leveraging the .NET dynamic feature. The sample below shows how to generate a CSV stream using dynamic objects. The CSV schema is determined from the first object. If a mismatch is found in a dynamic object's member values, an error is raised and the generation process stops.
Listing 12.1 Generating CSV file from dynamic objects
class Program
{
static void Main(string[] args)
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 1;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 2;
rec2.Name = "Jason";
objs.Add(rec2);
using (var parser = new ChoCSVWriter("Emp.csv"))
{
parser.Configuration.ColumnCountStrict = true;
parser.Write(objs);
}
}
}
CSVWriter throws different types of exceptions in different situations:
- ChoParserException - the CSV file is bad and the parser is not able to recover.
- ChoRecordConfigurationException - raised when any invalid configuration settings are specified.
- ChoMissingRecordFieldException - raised when a property is missing for a CSV column.
CSVWriter automatically quotes column values if any of the following conditions is met:
- the value contains newline / delimiter characters
- the value contains the quote character
- the value contains leading or trailing spaces
In other situations, if you want to add quotes around values, set the QuoteField parameter to true.
Listing 14.1.1 Multiline column values in CSV file
[ChoCSVFileHeader]
[ChoCSVRecordObject(HasExcelSeparator = true)]
public class EmployeeRec
{
[ChoCSVRecordField(1, FieldName = "NewId")]
[Required]
[ChoFallbackValue(100)]
[Range(100, 10000)]
public int? Id
{
get;
set;
}
[ChoCSVRecordField(2, QuoteField = true)]
[DefaultValue("XXXX")]
public string Name
{
get;
set;
}
public override string ToString()
{
return "{0}. {1}.".FormatString(Id, Name);
}
}
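For illustration, the auto-quoting conditions described above can be restated as a small standalone helper. This is not the ChoETL API, just a plain C# sketch of the rules:

```csharp
using System;

public static class QuotingRules
{
    // Mirrors the conditions under which CSVWriter quotes a value:
    // delimiter, quote character, newline, or leading/trailing spaces.
    public static bool NeedsQuoting(string value, char delimiter = ',', char quoteChar = '"')
    {
        if (string.IsNullOrEmpty(value))
            return false;
        return value.IndexOf(delimiter) >= 0
            || value.IndexOf(quoteChar) >= 0
            || value.IndexOf('\n') >= 0
            || value.IndexOf('\r') >= 0
            || value[0] == ' '
            || value[value.Length - 1] == ' ';
    }

    public static void Main()
    {
        Console.WriteLine(NeedsQuoting("Mark"));        // False
        Console.WriteLine(NeedsQuoting("Smith, Mark")); // True (contains delimiter)
        Console.WriteLine(NeedsQuoting(" Mark"));       // True (leading space)
    }
}
```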
Cinchoo ETL works better with the data annotation MetadataType model. It is a way to attach a MetaData class to a data model class. In this associated class, you provide additional metadata information that is not in the data model. Its role is to add attributes to a class without having to modify it. You add this attribute, which takes a single parameter, to the data class, and the associated class holds all the attributes. This is useful when the POCO classes are auto-generated by automatic tools (Entity Framework, MVC, etc.), because you can add new behavior without touching the generated file. It also promotes modularization by separating the concerns into multiple classes.
For more information about it, please search MSDN.
Listing 15.1 MetadataType annotation usage sample
[MetadataType(typeof(EmployeeRecMeta))]
public class EmployeeRec
{
public int Id { get; set; }
public string Name { get; set; }
}
[ChoCSVFileHeader]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.ThrowAndStop,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All, ThrowAndStopOnMissingField = false,
ObjectValidationMode = ChoObjectValidationMode.MemberLevel)]
public class EmployeeRecMeta : IChoNotifyRecordWrite, IChoValidatable
{
[ChoCSVRecordField(1, FieldName = "id", ErrorMode = ChoErrorMode.ReportAndContinue )]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, 1, ErrorMessage = "Id must be > 0.")]
[ChoFallbackValue(1)]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[StringLength(1)]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool AfterRecordWrite(object target, int index, object source)
{
throw new NotImplementedException();
}
public bool BeforeRecordWrite(object target, int index, ref object source)
{
throw new NotImplementedException();
}
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
throw new NotImplementedException();
}
public bool TryValidate(object target, ICollection<ValidationResult> validationResults)
{
return true;
}
public bool TryValidateFor
(object target, string memberName, ICollection<ValidationResult> validationResults)
{
return true;
}
}
In the above, EmployeeRec is the data class. It contains only domain-specific properties and operations, which keeps it a very simple class to look at.
The validation, callback mechanism, configuration, etc. are separated into the metadata type class, EmployeeRecMeta.
If the POCO entity class is auto-generated, exposed via a library, or sealed, you cannot attach the CSV schema definition to it declaratively. In such cases, you can choose one of the options below to specify the CSV layout configuration:
- Manual Configuration
- Auto Map Configuration
- Attaching
MetadataType
class
I'm going to show you how to configure the below POCO entity class on each approach.
Listing 16.1 Sealed POCO entity class
public sealed class EmployeeRec
{
public int Id { get; set; }
public string Name { get; set; }
}
Define a brand new configuration object from scratch and add all the necessary CSV fields to the ChoCSVConfiguration.CSVRecordFieldConfigurations collection property. This option gives you greater flexibility to control the CSV parsing configuration. The downside is the possibility of making mistakes, and it is hard to manage when the CSV file layout is large.
Listing 16.1.1 Manual Configuration
ChoCSVRecordConfiguration config = new ChoCSVRecordConfiguration();
config.CSVFileHeaderConfiguration.HasHeaderRecord = true;
config.ThrowAndStopOnMissingField = true;
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Id", 1));
config.CSVRecordFieldConfigurations.Add(new ChoCSVRecordFieldConfiguration("Name", 2));
This is an alternative, much less error-prone approach to auto-map the CSV columns for the POCO entity class.
First, define a schema class for EmployeeRec
POCO entity class as below.
Listing 16.2.1 Auto Map class
public class EmployeeRecMap
{
[ChoCSVRecordField(1, FieldName = "id")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name")]
public string Name { get; set; }
}
Then you can use it to auto map CSV columns by using ChoCSVRecordConfiguration.MapRecordFields
method.
Listing 16.2.2 Using Auto Map configuration
ChoCSVRecordConfiguration config = new ChoCSVRecordConfiguration();
config.MapRecordFields<EmployeeRecMap>();
EmployeeRec rec1 = new EmployeeRec();
rec1.Id = 2;
rec1.Name = "Jason";
using (var w = new ChoCSVWriter<EmployeeRec>("Emp.csv", config))
w.Write(rec1);
This is another approach: attaching a MetadataType class to the POCO entity object. The previous approach only handles the auto-mapping of CSV columns; other configuration properties like property converters, parser parameters, default/fallback values, etc. are not considered.
This model accounts for everything by defining a MetadataType class and specifying the CSV configuration parameters declaratively. It is useful when your POCO entity is sealed and not a partial class. It is also a favorable and less error-prone approach to configuring CSV parsing of a POCO entity.
Listing 16.3.1 Define MetadataType class
[ChoCSVFileHeader()]
[ChoCSVRecordObject(Encoding = "Encoding.UTF32", ErrorMode = ChoErrorMode.ReportAndContinue,
IgnoreFieldValueMode = ChoIgnoreFieldValueMode.All, ThrowAndStopOnMissingField = false,
ObjectValidationMode = ChoObjectValidationMode.MemberLevel)]
public class EmployeeRecMeta : IChoNotifyRecordWrite, IChoValidatable
{
[ChoCSVRecordField(1, FieldName = "id", ErrorMode = ChoErrorMode.ReportAndContinue )]
[ChoTypeConverter(typeof(IntConverter))]
[Range(1, 1, ErrorMessage = "Id must be > 0.")]
public int Id { get; set; }
[ChoCSVRecordField(2, FieldName = "Name", QuoteField = true)]
[StringLength(1)]
[DefaultValue("ZZZ")]
[ChoFallbackValue("XXX")]
public string Name { get; set; }
public bool AfterRecordWrite(object target, int index, object source)
{
throw new NotImplementedException();
}
public bool BeforeRecordWrite(object target, int index, ref object source)
{
throw new NotImplementedException();
}
public bool RecordWriteError(object target, int index, object source, Exception ex)
{
throw new NotImplementedException();
}
public bool TryValidate(object target, ICollection<ValidationResult> validationResults)
{
return true;
}
public bool TryValidateFor(object target, string memberName,
ICollection<ValidationResult> validationResults)
{
return true;
}
}
Listing 16.3.2 Attaching MetadataType class
ChoMetadataObjectCache.Default.Attach<EmployeeRec>(new EmployeeRecMeta());
using (var tx = File.OpenWrite("Emp.csv"))
{
using (var parser = new ChoCSVWriter<EmployeeRec>(tx))
{
parser.Write(objs);
}
}
This is a nifty little helper method to generate CSV-formatted output from a list of objects. It helps you quickly run and play with different options and see the CSV output in a test environment.
static void ToTextTest()
{
List<EmployeeRec> objs = new List<EmployeeRec>();
EmployeeRec rec1 = new EmployeeRec();
rec1.Id = 10;
rec1.Name = "Mark";
objs.Add(rec1);
EmployeeRec rec2 = new EmployeeRec();
rec2.Id = 200;
rec2.Name = "Lou";
objs.Add(rec2);
Console.WriteLine(ChoCSVWriter.ToTextAll(objs));
}
This is a nifty little helper method to generate CSV-formatted output from a single object. It helps you quickly run and play with different options and see the CSV output in a test environment.
static void ToTextTest()
{
EmployeeRec rec1 = new EmployeeRec();
rec1.Id = 10;
rec1.Name = "Mark";
Console.WriteLine(ChoCSVWriter.ToText(rec1));
}
This helper method lets you create a CSV file / stream from an ADO.NET DataReader.
static void WriteDataReaderTest()
{
SqlConnection conn = new SqlConnection(connString);
conn.Open();
SqlCommand cmd = new SqlCommand("SELECT * FROM Members", conn);
IDataReader dr = cmd.ExecuteReader();
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv))
{
parser.Write(dr);
}
Console.WriteLine(csv.ToString());
}
This helper method lets you create a CSV file / stream from an ADO.NET DataTable.
static void WriteDataTableTest()
{
string connString = @"Data Source=(localdb)\v11.0;
Initial Catalog=TestDb;Integrated Security=True";
SqlConnection conn = new SqlConnection(connString);
conn.Open();
SqlCommand cmd = new SqlCommand("SELECT * FROM Members", conn);
SqlDataAdapter da = new SqlDataAdapter(cmd);
DataTable dt = new DataTable();
da.Fill(dt);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(dt);
}
Console.WriteLine(csv.ToString());
}
Cinchoo ETL automatically parses and converts each CSV column value to the corresponding column's underlying data type seamlessly. Most of the basic .NET types are handled automatically without any setup needed.
This is achieved through two key settings in the ETL system:
- ChoCSVRecordConfiguration.CultureInfo - represents information about a specific culture, including the names of the culture, the writing system, and the calendar used, as well as access to culture-specific objects that provide information for common operations, such as formatting dates and sorting strings. Default is 'en-US'.
- ChoTypeConverterFormatSpec - a global format specifier class that holds the formatting specs for all intrinsic .NET types.
In this section, I'm going to talk about changing the default format specs for each .NET intrinsic data types according to parsing needs.
ChoTypeConverterFormatSpec is a singleton class whose instance is exposed via the 'Instance' static member. It is thread local, meaning a separate instance copy is kept on each thread.
There are two sets of format spec members for each intrinsic type, one for loading and another for writing the value, except for the Boolean, Enum, and DateTime types. These types have only one member for both loading and writing operations.
Specifying an intrinsic data type's format specs through ChoTypeConverterFormatSpec has a system-wide impact; i.e., setting ChoTypeConverterFormatSpec.IntNumberStyle = NumberStyles.AllowParentheses allows parentheses for all integer members of CSV objects. If you want to override this behavior for a specific CSV data member and take control of its own unique handling of the CSV value, specify a TypeConverter at the CSV field member level. Refer to section 13.4 for more information.
Listing 20.1.1 ChoTypeConverterFormatSpec Members
public class ChoTypeConverterFormatSpec
{
public static readonly ThreadLocal<ChoTypeConverterFormatSpec>
Instance = new ThreadLocal<ChoTypeConverterFormatSpec>
(() => new ChoTypeConverterFormatSpec());
public string DateTimeFormat { get; set; }
public ChoBooleanFormatSpec BooleanFormat { get; set; }
public ChoEnumFormatSpec EnumFormat { get; set; }
public NumberStyles? CurrencyNumberStyle { get; set; }
public string CurrencyFormat { get; set; }
public NumberStyles? BigIntegerNumberStyle { get; set; }
public string BigIntegerFormat { get; set; }
public NumberStyles? ByteNumberStyle { get; set; }
public string ByteFormat { get; set; }
public NumberStyles? SByteNumberStyle { get; set; }
public string SByteFormat { get; set; }
public NumberStyles? DecimalNumberStyle { get; set; }
public string DecimalFormat { get; set; }
public NumberStyles? DoubleNumberStyle { get; set; }
public string DoubleFormat { get; set; }
public NumberStyles? FloatNumberStyle { get; set; }
public string FloatFormat { get; set; }
public string IntFormat { get; set; }
public NumberStyles? IntNumberStyle { get; set; }
public string UIntFormat { get; set; }
public NumberStyles? UIntNumberStyle { get; set; }
public NumberStyles? LongNumberStyle { get; set; }
public string LongFormat { get; set; }
public NumberStyles? ULongNumberStyle { get; set; }
public string ULongFormat { get; set; }
public NumberStyles? ShortNumberStyle { get; set; }
public string ShortFormat { get; set; }
public NumberStyles? UShortNumberStyle { get; set; }
public string UShortFormat { get; set; }
}
The sample below shows how to generate CSV output with custom format specs using CSVWriter. It sets ChoTypeConverterFormatSpec.Instance.DateTimeFormat to the short date pattern ('d') and ChoTypeConverterFormatSpec.Instance.BooleanFormat to ChoBooleanFormatSpec.YOrN before writing the objects.
Listing 20.1.2 Using ChoTypeConverterFormatSpec in code
static void FormatSpecDynamicTest()
{
ChoTypeConverterFormatSpec.Instance.DateTimeFormat = "d";
ChoTypeConverterFormatSpec.Instance.BooleanFormat = ChoBooleanFormatSpec.YOrN;
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
Cinchoo ETL provides the ChoCurrency object to read and write currency values in CSV files. ChoCurrency is a wrapper class that holds the currency value in a decimal type and supports serializing it in text format during CSV generation.
Listing 20.2.1 Using Currency members in dynamic model
static void CurrencyDynamicTest()
{
ChoTypeConverterFormatSpec.Instance.CurrencyFormat = "C2";
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
The sample above shows how to output currency values using the dynamic object model. Because the formatted currency output contains thousands separators (commas), it would otherwise break the generated CSV file; to overcome this, instruct the writer to quote all fields.
P.S.: The format of the currency value is determined by CSVWriter through ChoRecordConfiguration.Culture and ChoTypeConverterFormatSpec.CurrencyFormat.
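As a standalone illustration of why the currency format introduces commas (this is plain .NET 'C2' formatting, independent of ChoETL):

```csharp
using System;
using System.Globalization;

public class CurrencyFormatDemo
{
    public static void Main()
    {
        decimal salary = 100000m;
        // The "C2" currency format adds the currency symbol and
        // thousands separators, which is why quoting becomes necessary.
        string formatted = salary.ToString("C2", CultureInfo.GetCultureInfo("en-US"));
        Console.WriteLine(formatted); // $100,000.00
    }
}
```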
The sample below shows how to use ChoCurrency
CSV field in POCO entity class.
Listing 20.2.2 Using Currency members in POCO model
public class EmployeeRecWithCurrency
{
public int Id { get; set; }
public string Name { get; set; }
public ChoCurrency Salary { get; set; }
}
static void CurrencyPOCOTest()
{
List<EmployeeRecWithCurrency> objs = new List<EmployeeRecWithCurrency>();
EmployeeRecWithCurrency rec1 = new EmployeeRecWithCurrency();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
EmployeeRecWithCurrency rec2 = new EmployeeRecWithCurrency();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
Cinchoo ETL implicitly handles parsing/writing of enum column values in CSV files. If you want fine control over the formatting of these values, you can specify it globally via ChoTypeConverterFormatSpec.EnumFormat. Default is ChoEnumFormatSpec.Value.
FYI, changing this value has a system-wide impact.
There are three possible values that can be used:
- ChoEnumFormatSpec.Value - the enum value is used.
- ChoEnumFormatSpec.Name - the enum key name is used.
- ChoEnumFormatSpec.Description - if an enum key is decorated with DescriptionAttribute, its value is used.
Listing 20.3.1 Specifying Enum format specs during parsing
public enum EmployeeType
{
[Description("Full Time Employee")]
Permanent = 0,
[Description("Temporary Employee")]
Temporary = 1,
[Description("Contract Employee")]
Contract = 2
}
static void EnumTest()
{
ChoTypeConverterFormatSpec.Instance.EnumFormat = ChoEnumFormatSpec.Description;
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
rec1.Status = EmployeeType.Permanent;
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
rec2.Status = EmployeeType.Contract;
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
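The way a DescriptionAttribute value is resolved for an enum key (the lookup that the ChoEnumFormatSpec.Description mode relies on) can be illustrated with plain reflection. The helper below is illustrative only, not part of the ChoETL API, and redeclares the EmployeeType enum from Listing 20.3.1:

```csharp
using System;
using System.ComponentModel;
using System.Reflection;

public enum EmployeeType
{
    [Description("Full Time Employee")]
    Permanent = 0,
    [Description("Temporary Employee")]
    Temporary = 1,
    [Description("Contract Employee")]
    Contract = 2
}

public static class EnumDescriptionDemo
{
    // Resolves the DescriptionAttribute text of an enum key,
    // falling back to the key name when no attribute is present.
    public static string GetDescription(Enum value)
    {
        FieldInfo fi = value.GetType().GetField(value.ToString());
        var attr = fi.GetCustomAttribute<DescriptionAttribute>();
        return attr != null ? attr.Description : value.ToString();
    }

    public static void Main()
    {
        Console.WriteLine(GetDescription(EmployeeType.Permanent)); // Full Time Employee
        Console.WriteLine(GetDescription(EmployeeType.Contract));  // Contract Employee
    }
}
```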
Cinchoo ETL implicitly handles parsing/writing of boolean CSV column values. If you want fine control over the formatting of these values, you can specify it globally via ChoTypeConverterFormatSpec.BooleanFormat. Default value is ChoBooleanFormatSpec.ZeroOrOne.
FYI, changing this value has a system-wide impact.
There are four possible values that can be used:
- ChoBooleanFormatSpec.ZeroOrOne - '0' for false, '1' for true.
- ChoBooleanFormatSpec.YOrN - 'Y' for true, 'N' for false.
- ChoBooleanFormatSpec.TrueOrFalse - 'True' for true, 'False' for false.
- ChoBooleanFormatSpec.YesOrNo - 'Yes' for true, 'No' for false.
Listing 20.4.1 Specifying boolean format specs during parsing
static void BoolTest()
{
ChoTypeConverterFormatSpec.Instance.BooleanFormat = ChoBooleanFormatSpec.YOrN;
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
rec1.Status = EmployeeType.Permanent;
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
rec2.Status = EmployeeType.Contract;
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
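For reference, the four boolean format mappings listed above can be restated as a plain C# function. This is an illustrative sketch, not the ChoETL implementation:

```csharp
using System;

public static class BooleanFormatDemo
{
    // Restates the four ChoBooleanFormatSpec mappings as a plain function.
    public static string Format(bool value, string spec)
    {
        switch (spec)
        {
            case "ZeroOrOne":   return value ? "1" : "0";
            case "YOrN":        return value ? "Y" : "N";
            case "TrueOrFalse": return value ? "True" : "False";
            case "YesOrNo":     return value ? "Yes" : "No";
            default: throw new ArgumentException("Unknown spec: " + spec);
        }
    }

    public static void Main()
    {
        Console.WriteLine(Format(true, "YOrN"));     // Y
        Console.WriteLine(Format(false, "YesOrNo")); // No
    }
}
```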
Cinchoo ETL implicitly handles parsing/writing of datetime CSV column values using the system culture or a custom-set culture. If you want fine control over the formatting of these values, you can specify it globally via ChoTypeConverterFormatSpec.DateTimeFormat. Default value is 'd'.
FYI, changing this value has a system-wide impact.
You can use any valid standard or custom .NET datetime format specification to format the datetime CSV values in the file.
Listing 20.5.1 Specifying datetime format specs during parsing
static void DateTimeDynamicTest()
{
ChoTypeConverterFormatSpec.Instance.DateTimeFormat = "MMM dd, yyyy";
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
The sample above shows how to write custom datetime values to a CSV file.
Note: Because the datetime values contain the CSV separator (a comma), instruct the writer to quote all fields.
CSVWriter exposes a few frequently used configuration parameters via fluent API methods. This makes programming the generation of CSV files quicker.
This API method sets the CSV field separator on CSVWriter
.
static void QuickDynamicDelimiterTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.WithDelimiter("|")
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
This API method flags whether the CSV file contains a header as the first row. The optional bool parameter specifies whether the first row is a header. Default is true.
static void QuickDynamicTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.QuoteAllFields()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
This API method specifies the list of CSV fields to be considered when writing the CSV file. Other fields will be discarded. Field names are case-insensitive.
static void QuickDynamicTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.WithFields("Id", "Name")
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
This API method is used to add a CSV column with a specific data type, quote flag, and/or quote character. It is helpful in the dynamic object model for specifying each individual CSV column with the appropriate data type.
static void QuickDynamicTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.WithField("Id", typeof(int))
.WithField("Name")
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
This API method is used to specify whether all fields should be surrounded by quotes.
static void QuickDynamicTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.QuoteAllFields()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
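To illustrate what quoting all fields means in practice, here is a minimal, self-contained sketch of RFC 4180-style quoting: every field is wrapped in double quotes, and any embedded double quote is escaped by doubling it. This shows the output shape QuoteAllFields aims for; it is not ChoETL's internal code.

```csharp
using System;

class QuoteAllFieldsDemo
{
    // Wrap a field in double quotes, doubling any embedded quotes
    // (RFC 4180 escaping).
    public static string Quote(string field) =>
        "\"" + field.Replace("\"", "\"\"") + "\"";

    static void Main()
    {
        var fields = new[] { "10", "Mark", "He said \"hi\"" };
        Console.WriteLine(string.Join(",", Array.ConvertAll(fields, Quote)));
        // → "10","Mark","He said ""hi"""
    }
}
```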
This API method instructs the ChoCSVWriter
to perform a column count check on each record before writing it to the CSV file.
static void QuickDynamicTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
rec1.JoinedDate = new DateTime(2001, 2, 2);
rec1.IsActive = true;
rec1.Salary = new ChoCurrency(100000);
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
rec2.JoinedDate = new DateTime(1990, 10, 23);
rec2.IsActive = false;
rec2.Salary = new ChoCurrency(150000);
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.ColumnCountStrict()
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
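The check that ColumnCountStrict enables amounts to comparing each record's field count against the declared header and failing on a mismatch. The following plain C# sketch illustrates that idea only; it is not ChoETL's actual implementation.

```csharp
using System;

class ColumnCountCheckDemo
{
    // Throws when a record's field count differs from the header's,
    // mirroring the kind of validation a strict column-count mode performs.
    public static void EnsureColumnCount(string[] header, string[][] records)
    {
        for (int i = 0; i < records.Length; i++)
        {
            if (records[i].Length != header.Length)
                throw new InvalidOperationException(
                    $"Record {i}: expected {header.Length} columns, found {records[i].Length}.");
        }
    }

    static void Main()
    {
        var header = new[] { "Id", "Name" };
        EnsureColumnCount(header, new[] { new[] { "10", "Mark" } }); // passes

        try
        {
            EnsureColumnCount(header, new[] { new[] { "200", "Lou", "extra" } });
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine(ex.Message); // reports the mismatched record
        }
    }
}
```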
This API method is used to set configuration parameters that are not exposed via the Fluent API.
static void ConfigureTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.Configure(c => c.ErrorMode = ChoErrorMode.ThrowAndStop)
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
This API method is used to set up the writer's parameters / events via the Fluent API.
static void SetupTest()
{
List<ExpandoObject> objs = new List<ExpandoObject>();
dynamic rec1 = new ExpandoObject();
rec1.Id = 10;
rec1.Name = "Mark";
objs.Add(rec1);
dynamic rec2 = new ExpandoObject();
rec2.Id = 200;
rec2.Name = "Lou";
objs.Add(rec2);
StringBuilder csv = new StringBuilder();
using (var parser = new ChoCSVWriter(csv)
.WithFirstLineHeader()
.Setup(r => r.BeforeRecordWrite += (o, e) =>
{
// Inspect or modify the outgoing record here before it is written.
})
)
{
parser.Write(objs);
}
Console.WriteLine(csv.ToString());
}
For more information about Cinchoo ETL, please visit the other CodeProject articles:
Look up existing issues or open new issues at
- 20th December, 2016: Initial version
- 11th April, 2020: Article updated