Posts in Category: server sdk

Bulk Delete with Dynamics365

No matter what project you are on, you are at one point or another going to need to delete data.

Your environment will become polluted, and going through each entity by hand or manually queuing up your own delete jobs will not be worth your time.

To bulk delete your entire environment you can make use of the Bulk Delete API with the following lines of code.

QueryExpression bulkquery = new QueryExpression();
bulkquery.EntityName = entity.LogicalName;
bulkquery.Distinct = false;

BulkDeleteRequest request = new BulkDeleteRequest
{
    JobName = String.Format("System Purge Requested for Entity [{0}] - All Records", bulkquery.EntityName),
    QuerySet = new[] { bulkquery },
    StartDateTime = DateTime.Now.ToUniversalTime(),
    ToRecipients = new[] { currentUserId },
    CCRecipients = new Guid[] { },
    SendEmailNotification = false,
    RecurrencePattern = String.Empty
};

BulkDeleteResponse response = (BulkDeleteResponse)_svc.CrmService.Execute(request);

Of critical importance is the

Read More

Dynamics SDK Fault Errors

Recently I was running some code against the Dynamics SDK that was taking a significant amount of time to run (transferring of data).  In my naivety, I had thought that by constantly doing “something” on that live connection my token would be refreshed.

Not the case.

It seems tokens from the Dynamics SDK are issued at the time of connection and kept for the duration of that connection; they are not refreshed until you reconnect.

To avoid having my connection cut during this migration (which would have been bad), I added some logic to check when the connection had last logged in, compare that against a threshold of X minutes (in my case I chose 90 minutes, but it could be more), and trigger a reconnect to refresh my token.
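As a minimal, SDK-free sketch of that check (the `ConnectionGuard` type and its members are hypothetical names, not from the original post), the idea is just to compare the connection's age against the threshold before each long-running call:

```csharp
using System;

public class ConnectionGuard
{
    private DateTime _lastConnectedUtc;
    private readonly TimeSpan _maxAge;

    public ConnectionGuard(DateTime lastConnectedUtc, TimeSpan maxAge)
    {
        _lastConnectedUtc = lastConnectedUtc;
        _maxAge = maxAge;
    }

    // True when the connection is older than the threshold and should be
    // re-established (refreshing the token) before the next SDK call.
    public bool NeedsReconnect(DateTime nowUtc)
    {
        return nowUtc - _lastConnectedUtc > _maxAge;
    }

    // Record the new login time after a successful reconnect.
    public void MarkReconnected(DateTime nowUtc)
    {
        _lastConnectedUtc = nowUtc;
    }
}
```

With a 90-minute threshold, a call made 60 minutes after login proceeds as-is, while one made 120 minutes in would first trigger a reconnect.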

The code is nothing spectacular, but understanding

Read More

Joining Queries in Dynamics

I recently finished a project that required some pretty complicated querying between entities to get at the data.

What I found is that I was making too many calls to the server to get linked data between EntityReferences in my code.

To improve the performance of these calls, I made use of the AddLink functionality on the QueryExpression.

Whereas before I was making two calls to get at my entity data, now I was only going to make one.

My initial query looked like this

QueryExpression query = new QueryExpression();
query.EntityName = "ENTITY";
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER1", ConditionOperator.Equal, APPID));
query.Criteria.AddCondition(new ConditionExpression("ATTR_FILTER2", ConditionOperator.Equal, SearchId));
query.ColumnSet = new ColumnSet(new string[] { ENT_LOOKUP_ID, ENT_LOOKUP_ID2 });

var plink = query.AddLink("ENTITY2", ENT_LOOKUP_ID2D, ENT_LOOKUP_ID2, JoinOperator.Inner);
plink.EntityAlias = "PH";
plink.Columns.AddColumns("ATTR_FILTER4", "ATTR_FILTER5");
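One detail worth keeping in mind when reading the joined results back (a sketch, not from the original excerpt): columns pulled in through a link-entity arrive on the parent Entity as AliasedValue attributes, keyed as "alias.attribute" using the EntityAlias set above.

```csharp
// Hypothetical sketch: reading a linked column off a retrieved row.
// "PH" is the EntityAlias assigned to the link above.
Entity row = _svc.CrmService.RetrieveMultiple(query).Entities.First();
AliasedValue linked = row.GetAttributeValue<AliasedValue>("PH.ATTR_FILTER4");
object value = linked != null ? linked.Value : null;
```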


When executed as a normal query, this in turn

Read More

Using the Bulk API to Create Records

I wanted to build a data populator for Dynamics that created a bunch of custom records and did some associations between them.

(The reasoning is so I could be sure I was always validating the same thing).

To do this, I wanted to use the Bulk API.  The Bulk API has a limit of 1,000 requests per message, so in addition to bulk loading my requests, I needed to ensure that multiple batched requests could be queued up and handled by the server.

Here is some of the sample code I wrote for this interaction.

public void BulkDispatchCreateRequests<T>(List<T> requests)
{
    int MaxBatchSize = 999;
    bool CreateBatchRequest = true;

    ExecuteMultipleRequest requestWithResults = null;

    for (int i = 0; i < requests.Count(); i++)
    {
        if (CreateBatchRequest)
        {
            requestWithResults = new ExecuteMultipleRequest()
            {
                // Assign settings that define execution behavior:
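The heart of the approach is the chunking itself. As an SDK-free illustration of that idea (the `Batcher` type and its names are hypothetical, not from the original post), splitting an arbitrary request list into batches of at most 1,000 — the per-message limit — can be sketched as:

```csharp
using System;
using System.Collections.Generic;

public static class Batcher
{
    // ExecuteMultiple accepts at most 1,000 requests per message.
    public const int MaxBatchSize = 1000;

    // Split the full request list into consecutive batches, each no
    // larger than MaxBatchSize; each batch becomes one server message.
    public static List<List<T>> SplitIntoBatches<T>(List<T> requests)
    {
        var batches = new List<List<T>>();
        for (int i = 0; i < requests.Count; i += MaxBatchSize)
        {
            int size = Math.Min(MaxBatchSize, requests.Count - i);
            batches.Add(requests.GetRange(i, size));
        }
        return batches;
    }
}
```

Each batch would then be wrapped in its own ExecuteMultipleRequest and dispatched in turn, which is what the loop in the excerpt above is building toward.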

Read More

Dynamics SDK Paging Records

I never thought I’d have a case when I’d need to work with over 5,000 records from the Dynamics SDK.

The scenario here is that I was trying to reproduce some production issues and validate some code changes with a higher threshold of objects.  However, as I started testing, I noticed I wasn't bringing back everything and had hit that magical 5,000-record limit.

Without wanting to change that limit, I modified my query to support the retrieval of 5,000+ items that would return to my service to be worked on.

bool KeepSearching = true;
EntityCollection results = null;
Dictionary<Guid, Entity> Accessors = new Dictionary<Guid, Entity>();

QueryExpression query = new QueryExpression();
query.EntityName = "contact";
query.Criteria.AddCondition(new ConditionExpression("contactid", ConditionOperator.Equal, ContactId));
query.ColumnSet = new ColumnSet(new string[] { "contactid", "fullname" });
query.PageInfo = new PagingInfo();
query.PageInfo.PageNumber =
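The shape of the paging loop can be sketched without the SDK. In this hypothetical model (none of these names are from the original post), `Page<T>` mirrors the pieces of EntityCollection the loop cares about — the records, the MoreRecords flag, and the PagingCookie — and `fetchPage` stands in for a RetrieveMultiple call that takes the current page number and cookie:

```csharp
using System;
using System.Collections.Generic;

public class Page<T>
{
    public List<T> Records = new List<T>();
    public bool MoreRecords;     // mirrors EntityCollection.MoreRecords
    public string PagingCookie;  // mirrors EntityCollection.PagingCookie
}

public static class Pager
{
    // Keep fetching pages until the server reports no more records,
    // passing each page's cookie into the next request.
    public static List<T> RetrieveAll<T>(Func<int, string, Page<T>> fetchPage)
    {
        var all = new List<T>();
        int pageNumber = 1;   // PagingInfo.PageNumber is 1-based
        string cookie = null; // PagingInfo.PagingCookie from the prior page
        bool keepSearching = true;

        while (keepSearching)
        {
            Page<T> page = fetchPage(pageNumber, cookie);
            all.AddRange(page.Records);
            keepSearching = page.MoreRecords;
            cookie = page.PagingCookie;
            pageNumber++;
        }
        return all;
    }
}
```

With real SDK calls, `fetchPage` would set query.PageInfo.PageNumber and query.PageInfo.PagingCookie before each RetrieveMultiple, which is exactly what the KeepSearching loop in the excerpt is driving.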

Read More