File Source Encoding

I’ve been migrating data from CSV files over the last few months, which has led to some interesting exercises in cleansing data before it gets to Dynamics.

For all of this work, the code was written using C# and the Dynamics SDK.


I ran into a number of encoding problems in transferring data from the files into Dynamics. Initially, I thought Dynamics could not handle the character set, but on closer inspection, it was my code that wasn’t loading it properly.  I tried a number of different encoding formats, but the one that addressed all scenarios was Encoding.Default, which lets the system decide which encoding to use and when.
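One caveat worth knowing: on the .NET Framework, Encoding.Default maps to the machine’s active ANSI code page, while on .NET Core and .NET 5+ it is always UTF-8, so the “let the system decide” behaviour differs between runtimes. As a rough sketch of why the encoding choice matters (the file contents here are invented for illustration):

```csharp
using System;
using System.Text;

// "café" encoded with ISO-8859-1: the é becomes the single byte 0xE9,
// which is not a valid UTF-8 sequence on its own.
byte[] bytes = Encoding.GetEncoding("ISO-8859-1").GetBytes("café");

// Decoding with the wrong encoding mangles the character...
string asUtf8 = Encoding.UTF8.GetString(bytes);   // "caf" + replacement char

// ...while the matching encoding recovers the original text.
string asLatin1 = Encoding.GetEncoding("ISO-8859-1").GetString(bytes);

Console.WriteLine(asUtf8 == "café");   // False
Console.WriteLine(asLatin1 == "café"); // True
```

The same mechanism is what makes a file look fine in one tool and full of “�” characters in another.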

Therefore, when loading the data, my initial delimiting code looked as follows:
var data = File.ReadAllLines(file, Encoding.Default).Select(a => a.Split(',').ToList());


Since all the data was CSV, there were bound to be commas within the data itself as well. The simplest way around this problem is to identify the patterns in your delimiters and create your own splitting logic.  Most likely you will use ‘\n’ for new lines, which works great for getting individual rows of data.
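One way to sketch that “create your own” splitting is a small quote-aware field splitter, so a comma inside a quoted value no longer breaks the row apart. This is a minimal sketch only; it ignores escaped double quotes and other CSV corner cases:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

static List<string> SplitCsvLine(string line)
{
    var fields = new List<string>();
    var current = new StringBuilder();
    bool inQuotes = false;

    foreach (char c in line)
    {
        if (c == '"')
            inQuotes = !inQuotes;            // entering or leaving a quoted value
        else if (c == ',' && !inQuotes)
        {
            fields.Add(current.ToString());  // comma outside quotes ends the field
            current.Clear();
        }
        else
            current.Append(c);               // commas inside quotes are kept
    }
    fields.Add(current.ToString());          // last field has no trailing comma
    return fields;
}

// "Acme, Inc." stays one field despite the embedded comma.
var fields = SplitCsvLine("\"Acme, Inc.\",Seattle,WA");
Console.WriteLine(string.Join(" | ", fields)); // Acme, Inc. | Seattle | WA
```

For anything beyond simple files, a dedicated CSV parser is the safer choice, but a splitter like this is often enough for a one-off migration.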

Clean your data

Lastly, as with any data migration, it’s all about avoiding “Garbage In, Garbage Out”.  Take the time to properly parse, cleanse and link data to the correct entities or option sets; don’t fall back on wide-open text fields.  It will come back to bite you.  Very simple but often skipped practices such as trimming spaces, doing case-insensitive comparisons and validating inputs can save you a significant amount of time when the customer goes to do QA.
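Those practices are cheap to apply at load time. A minimal sketch (the option set labels and values here are hypothetical) that trims, compares case-insensitively and rejects unknown values instead of silently dumping them into a text field:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical option set: label -> option value. The comparer makes
// lookups case-insensitive, so "ACTIVE" and "active" both match.
var statusOptions = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase)
{
    ["Active"] = 0,
    ["Inactive"] = 1,
};

static string Clean(string raw) => (raw ?? string.Empty).Trim();

string incoming = "  ACTIVE ";   // typical messy CSV value

// Validate instead of passing junk through: fail loudly on unknown labels.
if (!statusOptions.TryGetValue(Clean(incoming), out int optionValue))
    throw new InvalidOperationException($"Unrecognised status: '{incoming}'");

Console.WriteLine(optionValue); // 0
```

Failing the row at load time is far cheaper than tracking down a mismatched option set value after the customer finds it in QA.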
