Posts in Category: dynamics 365

Bulk Delete with Dynamics365

No matter what project you are on, you are at one point or another going to need to delete data.

Your environment will become polluted, and going through each entity by hand or manually queuing up your own delete jobs will not be worth your time.

To bulk delete your entire environment, you can make use of the Bulk Delete API with the following lines of code.

// Build a query that matches every record of the target entity.
QueryExpression bulkquery = new QueryExpression();
bulkquery.EntityName = entity.LogicalName;
bulkquery.Distinct = false;

// Queue a bulk delete job for everything the query returns.
BulkDeleteRequest request = new BulkDeleteRequest
{
    JobName = String.Format("System Purge Requested for Entity [{0}] - All Records", bulkquery.EntityName),
    QuerySet = new[] { bulkquery },
    StartDateTime = DateTime.Now.ToUniversalTime(),
    ToRecipients = new[] { currentUserId },
    CCRecipients = new Guid[] { },
    SendEmailNotification = false,
    RecurrencePattern = String.Empty
};

// Execute the request we just built and cast to the matching response type.
BulkDeleteResponse response = (BulkDeleteResponse)_svc.CrmService.Execute(request);
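One thing worth knowing: bulk delete runs as an asynchronous system job, so the Execute call returns before anything is actually deleted. The BulkDeleteResponse carries a JobId that can be matched against the bulkdeleteoperation entity to watch progress. A minimal sketch (bulkResponse is a stand-in name for that response; attribute names are from the standard bulkdeleteoperation entity):

```csharp
// "bulkResponse" stands for the BulkDeleteResponse returned when the
// BulkDeleteRequest above is executed against _svc.CrmService.
QueryExpression jobQuery = new QueryExpression("bulkdeleteoperation");
jobQuery.ColumnSet = new ColumnSet("statecode", "successcount", "failurecount");
jobQuery.Criteria.AddCondition("asyncoperationid", ConditionOperator.Equal, bulkResponse.JobId);

// The job runs in the background; poll this query (or re-run it later)
// until statecode reports the job has completed.
Entity job = _svc.CrmService.RetrieveMultiple(jobQuery).Entities.FirstOrDefault();
```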

Of critical importance is the

Read More


File Source Encoding

I’ve been doing some migration work from CSV files over the last few months, which has resulted in some interesting exercises in cleansing data before it gets to Dynamics.

For all of this work, the code was written using C# and the Dynamics SDK.

Encoding

I ran into a number of encoding problems transferring data from the files into Dynamics. Initially, I thought Dynamics could not handle the character set, but on closer inspection, it was my code that wasn’t loading it properly. I tried a number of different Encoding formats, but the one that addressed all scenarios was Encoding.Default, which lets the system decide which encoding to use and when.

Therefore, when loading the data, my initial delimited read looked as follows:

// ReadAllLines already splits the file on line breaks, so each line
// is then split on the CSV delimiter.
var data = File.ReadAllLines(file, Encoding.Default).Select(a => a.Split(',').ToList());
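To see why the encoding argument matters, here is a small self-contained sketch (the file is a temp file created just for the demo) that writes an accented string and reads it back with two different encodings:

```csharp
using System;
using System.IO;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "Québec, Münster", Encoding.UTF8);

        // Read with the matching encoding: the text round-trips cleanly.
        Console.WriteLine(File.ReadAllText(path, Encoding.UTF8));

        // Read with a mismatched single-byte encoding: the multi-byte
        // characters come back as mojibake (e.g. "QuÃ©bec").
        Console.WriteLine(File.ReadAllText(path, Encoding.GetEncoding("ISO-8859-1")));

        File.Delete(path);
    }
}
```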

Delimiters

Since all the data

Read More


Joining Queries in Dynamics

I recently finished a project that required some pretty complicated querying between entities to get at the data.

What I found is that I was making too many calls to the server to resolve linked data between EntityReferences in my code.

To improve the performance of these calls, I made use of the AddLink functionality on the QueryExpression.

Whereas before I was making two calls to get at my entity data, now I was only going to make one.

My initial query looked like this

QueryExpression query = new QueryExpression();
query.EntityName = "ENTITY";
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER1", ConditionOperator.Equal, APPID));
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER2", ConditionOperator.Equal, SearchId));
query.ColumnSet = new ColumnSet(new string[] { ENT_LOOKUP_ID, ENT_LOOKUP_ID2 });

var plink = query.AddLink("ENTITY2", ENT_LOOKUP_ID2D, ENT_LOOKUP_ID2, JoinOperator.Inner);
plink.EntityAlias = "PH";
plink.Columns.AddColumns("ATTR_FILTER4", "ATTR_FILTER5");
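One detail that trips people up: the linked columns do not come back under their plain attribute names. They arrive as AliasedValue entries keyed by the link's alias. A minimal sketch of reading them out (reusing the "PH" alias and placeholder attribute names from above, and assuming _svc.CrmService is the connection):

```csharp
EntityCollection results = _svc.CrmService.RetrieveMultiple(query);

foreach (Entity e in results.Entities)
{
    // Linked-entity columns are keyed "<alias>.<attribute>" and wrapped in AliasedValue.
    if (e.Contains("PH.ATTR_FILTER4"))
    {
        var linkedValue = ((AliasedValue)e["PH.ATTR_FILTER4"]).Value;
    }
}
```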

 

When executed as a normal query, this in turn

Read More


Using the Bulk API to Create Records

I wanted to build a data populator for Dynamics that created a bunch of custom records and did some associations between them.

(The reasoning: so I could be sure I was always validating against the same data.)

To do this, I wanted to use the Bulk API. The Bulk API has a limit of 1,000 requests per message, so in addition to bulk loading my requests, I needed to ensure that multiple batches could be queued up and handled by the server.

Here is some of the sample code I wrote for this interaction.

public void BulkDispatchCreateRequests<T>(List<T> requests)
{
    int MaxBatchSize = 999;
    bool CreateBatchRequest = true;

    ExecuteMultipleRequest requestWithResults = null;

    for (int i = 0; i < requests.Count(); i++)
    {
        if (CreateBatchRequest)
        {
            requestWithResults = new ExecuteMultipleRequest()
            {
                // Assign settings that define execution behavior:
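The rest of the pattern (a sketch of the standard ExecuteMultipleRequest shape, not the post's full code) fills each batch, dispatches it when it reaches the batch size, and starts a new one. It assumes each item in the list is an OrganizationRequest (e.g. a CreateRequest) and that _svc.CrmService is the connection:

```csharp
ExecuteMultipleRequest batch = null;

foreach (OrganizationRequest req in requests)
{
    if (batch == null)
    {
        batch = new ExecuteMultipleRequest()
        {
            Settings = new ExecuteMultipleSettings()
            {
                ContinueOnError = true,  // keep going past individual failures
                ReturnResponses = true   // collect a response per request
            },
            Requests = new OrganizationRequestCollection()
        };
    }

    batch.Requests.Add(req);

    // Dispatch when the batch is full (the server caps a message at 1,000 requests).
    if (batch.Requests.Count == MaxBatchSize)
    {
        var response = (ExecuteMultipleResponse)_svc.CrmService.Execute(batch);
        batch = null;
    }
}

// Dispatch whatever is left over.
if (batch != null && batch.Requests.Count > 0)
{
    var response = (ExecuteMultipleResponse)_svc.CrmService.Execute(batch);
}
```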

Read More


Dynamics SDK Paging Records

I never thought I’d have a case where I’d need to work with over 5,000 records from the Dynamics SDK.

The scenario here is that I was trying to reproduce some production issues and validate some code changes with a higher threshold of objects. However, as I started testing, I noticed I wasn’t bringing back everything and had hit that magical 5,000-record limit.

Rather than change the limit itself, I modified my query to support the retrieval of 5,000+ items to return to my service to be worked on.

bool KeepSearching = true;
EntityCollection results = null;
Dictionary<Guid, Entity> Accessors = new Dictionary<Guid, Entity>();

QueryExpression query = new QueryExpression();
query.EntityName = "contact";
query.Criteria.AddCondition(new ConditionExpression("contactid", ConditionOperator.Equal, ContactId));
query.ColumnSet = new ColumnSet(new string[] { "contactid", "fullname" });
query.PageInfo = new PagingInfo();
query.PageInfo.PageNumber =
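The loop itself (a sketch of the standard PagingInfo pattern, reusing the KeepSearching flag and Accessors dictionary declared above, and assuming _svc.CrmService as the connection) keeps requesting pages until the server reports there are none left:

```csharp
query.PageInfo.PageNumber = 1;
query.PageInfo.Count = 5000;        // records per page
query.PageInfo.PagingCookie = null;

while (KeepSearching)
{
    results = _svc.CrmService.RetrieveMultiple(query);

    foreach (Entity e in results.Entities)
    {
        Accessors[e.Id] = e;
    }

    if (results.MoreRecords)
    {
        // Carry the server's paging cookie forward to fetch the next page.
        query.PageInfo.PageNumber++;
        query.PageInfo.PagingCookie = results.PagingCookie;
    }
    else
    {
        KeepSearching = false;
    }
}
```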

Read More