Posts in Category: dynamics 365

The Dynamics Package Deployer

I was looking to automate some of my customer solution deployments, both to simplify deploying solutions into new environments and to ensure the right steps were followed pre- and post-deployment.

To do this I wanted to leverage the Package Deployer from Dynamics.  There is already some pretty good information on how to get started building your first package; however, I ran into a few problems not covered in the documentation.
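For reference, a minimal package definition is a class deriving from the SDK's ImportExtension base class, with overrides for the pre- and post-deployment hooks. The class, attribute, and override names below follow the Package Deployer SDK, but the package and folder names ("MyPackage") are placeholders of my own:

```csharp
using System.ComponentModel.Composition;
using Microsoft.Xrm.Tooling.PackageDeployment.CrmPackageExtentionBase;

// A minimal Package Deployer definition; "MyPackage" names are placeholders.
[Export(typeof(IImportExtensions))]
public class MyPackage : ImportExtension
{
    // Runs once before the import starts - a good spot for pre-deployment checks.
    public override void InitializeCustomExtension() { }

    // Runs before each import stage; return true to continue the deployment.
    public override bool BeforeImportStage() { return true; }

    // Runs after the main solution import - use it for post-deployment steps.
    public override bool AfterPrimaryImport() { return true; }

    // Folder under PkgFolder that holds the package contents.
    public override string GetImportPackageDataFolderName => "MyPackage";

    public override string GetNameOfImport(bool plural) => "My Package";
    public override string GetImportPackageDescriptionText => "Sample deployment package";
    public override string GetLongNameOfImport => "My Sample Package";
}
```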

Cannot Connect to a Dynamics 365 Instance

There are many articles on this topic, ranging from TLS to security permissions; for me, however, it came down to SDK versions.  I generally work across a number of different instances, customers, and versions, so I need to keep my SDK versions strictly in check.  As such, when trying (and

Read More


Bulk Delete with Dynamics 365

No matter what project you are on, you are at one point or another going to need to delete data.

Your environment will become polluted, and going through each entity yourself or manually queuing up your own delete jobs will not be worth your time.

To bulk delete your entire environment you can make use of the Bulk Delete API with the following lines of code.

QueryExpression bulkquery = new QueryExpression();
bulkquery.EntityName = entity.LogicalName;
bulkquery.Distinct = false;

BulkDeleteRequest request = new BulkDeleteRequest
{
JobName = String.Format("System Purge Requested for Entity [{0}] - All Records", bulkquery.EntityName),
QuerySet = new[] { bulkquery },
StartDateTime = DateTime.Now.ToUniversalTime(),
ToRecipients = new[] { currentUserId },
CCRecipients = new Guid[] { },
SendEmailNotification = false,
RecurrencePattern = String.Empty
};

// BulkDeleteRequest returns a BulkDeleteResponse; its JobId identifies the queued job.
BulkDeleteResponse response = (BulkDeleteResponse)_svc.CrmService.Execute(request);
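To purge the entire environment, one approach is to wrap the request above in a loop over the entity metadata. A sketch using RetrieveAllEntitiesRequest (the service variable is assumed, and in practice you would filter which entities to touch; deleting system entity data is rarely what you want):

```csharp
var metadataRequest = new RetrieveAllEntitiesRequest
{
    EntityFilters = EntityFilters.Entity,
    RetrieveAsIfPublished = false
};
var metadataResponse = (RetrieveAllEntitiesResponse)service.Execute(metadataRequest);

foreach (EntityMetadata entity in metadataResponse.EntityMetadata)
{
    // Restrict the purge to custom entities.
    if (entity.IsCustomEntity == true)
    {
        // Build and execute the BulkDeleteRequest shown above
        // for entity.LogicalName.
    }
}
```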

Of critical importance is the

Read More


File Source Encoding

I’ve been doing some migration from CSV files over the last few months, which resulted in some interesting exercises in cleansing data before it gets to Dynamics.

For all of this work, the code was written using C# and the Dynamics SDK.

Encoding

I ran into a number of encoding problems transferring data from the files into Dynamics. Initially I thought Dynamics could not handle the character set, but on closer inspection it was my code that wasn’t loading it properly.  I tried a number of different Encoding formats, but the one that addressed all scenarios was Encoding.Default, which lets the system decide which code page to use and when.

Therefore, when loading the data, my initial delimited load looked as follows:
var data = File.ReadAllLines(file, Encoding.Default).Select(a => a.Split(',').ToList());
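A naive Split on the delimiter breaks down as soon as a field contains a quoted comma. One alternative I can suggest is .NET's built-in TextFieldParser (in Microsoft.VisualBasic.FileIO, usable from C# via a reference to Microsoft.VisualBasic), which handles both the encoding and quoted fields; the file variable here is a placeholder path:

```csharp
using Microsoft.VisualBasic.FileIO;

// Parse a comma-delimited file, respecting quoted fields, using the
// system default encoding (matching the Encoding.Default approach above).
using (var parser = new TextFieldParser(file, System.Text.Encoding.Default))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true;

    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields();
        // Map fields[] onto the target Dynamics entity here.
    }
}
```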

Delimiters

Since all the data

Read More


Joining Queries in Dynamics

I recently finished a project that required some pretty complicated querying between entities to get at the data.

What I found is that my code was making too many calls to the server to resolve linked data between EntityReferences.

To improve the performance of these calls I made use of the AddLink functionality on the QueryExpression.

Whereas before I was making two calls to get at my entity data, now I was only going to make one.

My initial query looked like this

QueryExpression query = new QueryExpression();
query.EntityName = "ENTITY";
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER1", ConditionOperator.Equal, APPID));
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER2", ConditionOperator.Equal, SearchId));
query.ColumnSet = new ColumnSet(new string[] { ENT_LOOKUP_ID, ENT_LOOKUP_ID2 });

var plink = query.AddLink("ENTITY2", ENT_LOOKUP_ID2D, ENT_LOOKUP_ID2, JoinOperator.Inner);
plink.EntityAlias = "PH";
plink.Columns.AddColumns("ATTR_FILTER4", "ATTR_FILTER5");
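For context, columns from a link-entity come back as AliasedValue attributes keyed by "alias.attribute". A sketch of executing the query above and reading them (the service variable is an assumed IOrganizationService):

```csharp
// Execute the joined query in a single round trip.
EntityCollection results = service.RetrieveMultiple(query);

foreach (Entity e in results.Entities)
{
    // Linked-entity columns are returned as AliasedValue,
    // keyed "<alias>.<attribute>" - "PH" is the alias set above.
    var linked = e.GetAttributeValue<AliasedValue>("PH.ATTR_FILTER4");
    if (linked != null)
    {
        object value = linked.Value;
        // Use value...
    }
}
```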

 

When executed as a normal query, this in turn

Read More


Using the Bulk API to Create Records

I wanted to build a data populator for Dynamics that created a bunch of custom records and did some associations between them.

(The reasoning is so I could be sure I was always validating the same thing).

To do this, I wanted to use the Bulk API.  The Bulk API has a limit of 1,000 requests per message, so in addition to bulk loading my requests, I needed to ensure that multiple batches could be queued up and handled by the server.

Here is some of the sample code I wrote for this interaction.

public void BulkDispatchCreateRequests<T>(List<T> requests)
{
    int MaxBatchSize = 999;
    bool CreateBatchRequest = true;

    ExecuteMultipleRequest requestWithResults = null;

    for (int i = 0; i < requests.Count; i++)
    {
        if (CreateBatchRequest)
        {
            requestWithResults = new ExecuteMultipleRequest()
            {
                // Assign settings that define execution behavior:
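The excerpt above cuts off mid-block. As a sketch of where it is heading (the service variable and the settings chosen are my assumptions, not the author's final code), the batching pattern fills an ExecuteMultipleRequest up to the limit, dispatches it, and starts a fresh batch:

```csharp
public void BulkDispatchCreateRequests(List<Entity> records, IOrganizationService service)
{
    const int MaxBatchSize = 999; // stay under the 1,000-requests-per-message limit

    ExecuteMultipleRequest batch = null;

    for (int i = 0; i < records.Count; i++)
    {
        if (batch == null)
        {
            batch = new ExecuteMultipleRequest
            {
                // Assign settings that define execution behavior.
                Settings = new ExecuteMultipleSettings
                {
                    ContinueOnError = true,   // keep going if one create fails
                    ReturnResponses = false   // skip per-request responses for throughput
                },
                Requests = new OrganizationRequestCollection()
            };
        }

        batch.Requests.Add(new CreateRequest { Target = records[i] });

        // Dispatch when the batch is full or we've queued the last record.
        if (batch.Requests.Count == MaxBatchSize || i == records.Count - 1)
        {
            service.Execute(batch);
            batch = null;
        }
    }
}
```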

Read More