New CRM SDK forces us to use .NET v4.6.2

It’s that time of the year when all older organizations are upgraded to the latest version of Dynamics, and, as usual, that upgrade produces tons of new issues.

First of all, I have noticed that a lot of async jobs started to fail (not every time) due to 2 errors:

  • System.ServiceModel.CommunicationObjectFaultedException: The communication object, System.ServiceModel.Channels.ServiceChannel, cannot be used for communication because it is in the Faulted state
  • System.TimeoutException: Couldn’t complete execution of the custom activity XYZ plug-in within the 2-minute time limit.

The main problem was that I had forgotten to update the SDK on the workflow activities project to the v9.x version, but when I tried to update to the latest SDK version I was stopped by an interesting error.

There is no info about the required .NET Framework version on the NuGet installer screen.

But when you search for the NuGet package on the web, you will find the answer very quickly.

The answer is that from now on you need to use .NET v4.6.2 in all of your projects that will use the new SDK NuGet packages.

This is a smart move by Microsoft that finally confirms that .NET v4.6.2 DLL assemblies are officially supported on D365.

It’s time to update all those .NET v4.5.2 projects to the new version of the framework and take advantage of it.

Embedded canvas apps – dynamic D365 context


Embedded canvas apps have been in public preview for a few weeks now, and I’ve been quite disappointed by the context that is passed to them. The context passed to a canvas app is a static one: it doesn’t reflect changes made to the actual records, so we need to use the data passed as context to make our app dynamic ourselves.

Canvas app control can be added to 2 elements on the form at the moment:

  • Single Line of Text
  • Subgrid

Single line of text control passes current record with all attributes as the context.

Subgrid control passes all records from the subgrid as context, but some fields are missing or just empty.


I wanted to create a canvas app that will allow us to edit contacts that are connected to the account directly from the account form.

Let’s create an app on a single line of text control.

If you ask yourself why we would use a single line of text instead of a subgrid, which is the more intuitive way, the answer is really simple: I still haven’t found a way to do it on the subgrid control because of the missing fields passed to the context (especially lookup ones). Hopefully, that issue will be fixed in the near future.


First, we need to input some test data into our D365 instance to validate our formulas in the canvas app; otherwise, we won’t be sure the final result is OK.

I created 2 accounts with contacts that have email populated:

  • Contoso (Account)
    • John Doe (Contact)
    • Jane Doe (Contact)
  • Span (Account)
    • Ivan Ficko (Contact)

Just like it’s shown in the image below.

The next step is opening the classic Account form editor, because canvas app controls are not supported in the modern designer, and opening the properties of one single line of text field (the Ticker Symbol field in this example).

When the field properties dialog is opened, click on the Controls tab and then the Add control button.

After that you should have Add control dialog opened and there you need to pick Canvas app option.

You will be redirected to the field properties dialog again. A few extra fields were added here, but you should leave them as they are because everything will be set automatically. The only thing you need to change is the Web radio button, from Text Box to Canvas App. After that, just click the Customize button to open the Canvas App designer.

This screen will pop up when you enter the app designer, and as you can see it looks like a blank template with a Gallery control.

The first thing we need to do here is make our list dynamic, because the main object, ModelDrivenFormIntegration.Data, gives us just a static list.

Since we know that we will use Contacts in our Gallery, we need to find the Account ID that will be passed to our Filter function to filter the contacts.

The Account ID can be found by executing the following function:
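A minimal Power Fx sketch of that function could look like the one below (the accountid attribute name is an assumption based on the default Account entity):

```
First(ModelDrivenFormIntegration.Data).accountid
```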

You need to know that the ModelDrivenFormIntegration.Data object will always be a list, so you need to take the first element of the list.

Now that we know the ID of the account, we can start writing the filter function on the Items property of the Gallery control.

We need to change the Data property of the Gallery from Custom to a Dynamics data source by clicking the Add a data source button.

On the next screen click on the New connection button.

From the list choose Dynamics and click on the Create button.

Now you need to select which entity will be shown in the gallery control. Search for contacts and select Contacts checkbox and press Connect button.

If you did everything right, you will see the list of all contacts in your gallery, but we need just the contacts related to the account on our form.

Next step is to write a filter function in Items property of the gallery control.
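A filter along these lines should do the job (a sketch: the accountid attribute name and the direct GUID comparison against the lookup column are assumptions based on the default Dynamics connector):

```
Filter(Contacts,
    'Company Name' = First(ModelDrivenFormIntegration.Data).accountid)
```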

This function filters all contacts by the ‘Company Name’ field, which holds the GUID of the account. The right side of the equation is familiar from the previous step, where we found the GUID of the account. After you type this function into the Items field, you should see the filtered contacts list.

Now we have the first part of the dynamic context done. It’s time to build the edit form for our contacts.

Press the New screen button and add the new Form screen.

Again we need to set the Data property to the Dynamics data source and select the Contacts entity. We don’t need all the columns that are selected by default, so you can just uncheck them and include only the first name, last name and email fields.

Next, you need to set the Item property to the selected item from the gallery on the first screen. In my case, the gallery is called Gallery1 and the property for the selected item is called Selected, so the final formula is shown below:
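Assuming the default control names, the formula is simply:

```
Gallery1.Selected
```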

When you type that into the Item property of the EditForm control, you should see details of the first contact on the list in your app.

Every data link is now connected as it should be, but we still need to add some navigation controls.

First, you need to edit the next arrow on the list screen. You need to add one more action to the OnSelect property, and it’s an easy one that will just open the second screen. The formula in the OnSelect property should look like the one below:
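Something like this (a sketch: Select(Parent) is the default action on the gallery’s next arrow, and the transition value is just one possible choice):

```
Select(Parent); Navigate(Screen2, ScreenTransition.Cover)
```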

Screen2 is the name of the second screen in my example and it can be different in your app. The second parameter is transition animation and you can choose any of those.

The final step is to add a Navigate function on the second screen. Again, change the OnSelect property, but this time on the accept button, which looks like a white tick symbol in the upper right corner of the screen. The function here must look like this:
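A sketch, assuming the default names EditForm1 for the form and Screen1 for the list screen:

```
SubmitForm(EditForm1); Navigate(Screen1, ScreenTransition.UnCover)
```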

The only thing you need to change here is the name of the first screen.

Now you can test your app by opening it, selecting one of the contacts and changing the email on the edit form. If everything is set up well, you should see the changed email when you go back to the list screen.

Your app is still not saved, so save it first via File -> Save. When you have saved your app, you should see the App ID field populated in the field properties dialog.

Finally, we have our canvas app 100% complete. Now we need to test it in the Unified Interface, because it doesn’t work in the classic web view.

Final result on the account form should look like this:

Video showing the full flow is shown below:


This is the trick for adding life to the context of a canvas app on the entity form. I hope that in the future this will be OOB functionality, but in these early days we still need to do hacks to implement some things.

Probably the same thing will be possible on the subgrid control once the context fields bug is fixed.

Virtual Entities Part 3 – Custom Data Provider


This is the final blog post of the Virtual Entities series and this time I will show you how to create and set up a custom data provider for virtual entities.

Entity #1

The first step should be creating a virtual entity in Dynamics.

Creating a virtual entity for a custom connector is quite simple, and I can say it’s even easier than creating one for the OOB connectors. It’s easy because you don’t need to take care of external names, data types or field requirements: everything is defined in the plugin code, not in the UI.

Let’s create a simple entity for our demo. The entity will hold simple sensor data: a temperature and a timestamp for the measurement.

The first step is obviously opening solution explorer and creating a new entity. The only things you need to be aware of here are the Virtual Entity checkbox, which must be checked, and the data source part, which will remain empty for now until we create one later.

After that let’s create 2 fields that will hold our sensor data.

The Name field should be a single line of text (the primary field will do the job).

The Temperature field should be a whole number.

The data type is the only thing you need to set up; leave everything else at the default values. As you can see, we don’t need to think about external names here because the regular field names will be used in our plugin code.

That’s everything we need to set up in our entity, so we are ready to jump to the next step.


The next step is writing a simple plugin that will handle the read operations for our data.

We need to implement some data source logic in our plugin so we can fetch data from a 3rd party system. This example will use a static class with some predefined data to demonstrate how to write the plugin.

First of all, we need to define a class that will hold our data. Code that describes our class is shown in the snippet below.
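A minimal sketch of such a class (the names here are my assumptions for this demo, not taken from any SDK):

```csharp
// Simple POCO representing one sensor measurement
public class SensorData
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Temperature { get; set; }
    public DateTime Timestamp { get; set; }
}
```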

Next, we need to write some service that will fetch data for us. Service in this example is a simple repository that has 2 basic methods Get and GetAll which fetch a single record and all records respectively. Code for that static class is shown below.
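A sketch of that repository, prepopulated with demo data (again, all names are assumptions for this example):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Static in-memory repository that stands in for a real 3rd party service
public static class SensorDataRepository
{
    private static readonly List<SensorData> Records =
        Enumerable.Range(1, 10)
            .Select(i => new SensorData
            {
                Id = Guid.NewGuid(),
                Name = $"Sensor measurement {i}",
                Temperature = 20 + i,
                Timestamp = DateTime.UtcNow.AddHours(-i)
            })
            .ToList();

    // fetch a single record by its id
    public static SensorData Get(Guid id) =>
        Records.First(r => r.Id == id);

    // fetch all records
    public static IEnumerable<SensorData> GetAll() => Records;
}
```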

The class is prepopulated with a list of 10 records that will be shown in Dynamics at the end.

Finally, we got to the part where we need to write the plugin. Writing a custom data provider plugin is not that different from writing one for the Retrieve/RetrieveMultiple messages. All you need to do is set the right output parameter at the end of the plugin code.

Retrieve code that will be used in this example is shown below.
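A sketch of the Retrieve plugin; the entity and field logical names (new_sensordata, new_name, new_temperature) and the repository are assumptions from this demo:

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class SensorDataRetrievePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // the Target of the Retrieve message carries the requested record id
        var target = (EntityReference)context.InputParameters["Target"];

        var record = SensorDataRepository.Get(target.Id);

        // the GUID must be added to the attributes dictionary explicitly
        var entity = new Entity("new_sensordata");
        entity.Attributes.Add("new_sensordataid", record.Id);
        entity.Attributes.Add("new_name", record.Name);
        entity.Attributes.Add("new_temperature", record.Temperature);

        // the data provider expects the result in the BusinessEntity output parameter
        context.OutputParameters["BusinessEntity"] = entity;
    }
}
```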

The Retrieve plugin must implement the IPlugin interface, like any other plugin for Dynamics. The whole logic is defined in the Execute method that is part of the implemented interface. We must fetch a single record here, construct an Entity object, populate it with the retrieved data and pass it as the BusinessEntity output parameter at the end. With that, we are done with our Retrieve method, so we can move forward to RetrieveMultiple.

RetrieveMultiple code that will be used in this example is shown below.
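A sketch of the RetrieveMultiple plugin; the logical names are again assumptions from this demo:

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class SensorDataRetrieveMultiplePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        var collection = new EntityCollection();

        foreach (var record in SensorDataRepository.GetAll())
        {
            var entity = new Entity("new_sensordata");
            // add the record GUID to the attributes dictionary explicitly
            entity.Attributes.Add("new_sensordataid", record.Id);
            entity.Attributes.Add("new_name", record.Name);
            entity.Attributes.Add("new_temperature", record.Temperature);
            collection.Entities.Add(entity);
        }

        // the provider expects the collection in this output parameter
        context.OutputParameters["BusinessEntityCollection"] = collection;
    }
}
```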

The RetrieveMultiple plugin must also implement the IPlugin interface. This time we must fetch a list of data, create an EntityCollection, populate it with Entity objects holding the fetched data and pass the collection to the BusinessEntityCollection output parameter. Even a few small changes that feel unimportant can make this code stop working.

There are a few ways of initializing the Entity object in C#, but only one of them works, and that is the one shown above.

For example, if you try to use initializations like:
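For instance, variants along these lines (sketched with the same assumed logical name):

```csharp
// Both of these compile, but neither worked for me with the data provider:
var entity1 = new Entity("new_sensordata") { Id = record.Id };
var entity2 = new Entity("new_sensordata", record.Id);
```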

It just won’t work, and I don’t understand why it is like that, but you must be careful when initializing the entity object to explicitly add the GUID to the attributes dictionary.

Now that we have our plugins done, we can move on to the next part.

Data Provider

Registering a custom data provider is the part where most people give up because there are some errors and unusual stuff that stop you from doing it the right way.

The data provider can be added only via the newest Plugin Registration Tool, which can be downloaded by clicking the Get the SDK Tools! link.

After you download the tool, you need to register the newly created DLL with the Retrieve and RetrieveMultiple plugins on your organization.

Once you have registered the assembly, it’s time to create a new data provider. This is done by clicking the Register drop-down menu and selecting Register New Data Provider.

After clicking that option, a popup appears. On this screen, you only need to select the Create New Data Source option.

On the next screen, you need to set a Display Name and Plural Name (these can be any names you like), a solution that will hold the data source metadata and a logical name for the data source. Finally, press the Create button.

After you press the Create button, an unpleasant error appears on your screen. Don’t worry: you did everything right and the data source was created successfully. Press the Refresh button in the Plugin Registration Tool and the data source will show up in the list.

Now it’s time to press the Register New Data Provider button again and start creating the data provider. The most important things here are to select the right assembly and the right Retrieve/RetrieveMultiple plugins from the assembly. When you have filled in the needed data, it’s time to hit the Update button. The Plugin Registration Tool can crash at this step, but if it does, just repeat the process until it finishes successfully.

When you have created the data provider, it’s time to leave the Plugin Registration Tool and move on to the next part.

Data Source

Every Virtual Entity needs a data source configuration that is used in entity configuration. Configurations are created in Settings -> Administration -> Virtual Entity Data Sources. Here you just need to hit New, select data source created in the previous step, input some name and press Save & Close button. Now we have our data source configuration.

Entity #2

It’s time to get back to solution explorer and our entities.

The first step is to find the Data Source entity: open the fields list, open the configuration for the ID field and copy the value of the Name field into the External Name field. If this step is not done, nothing else will matter, because Dynamics will not know how to find the Data Source configuration while using your data provider.

The final step in this example is to select the data source configuration in the virtual entity editor by adding value to the Data Source field.

After you have done everything as shown here, you should have your custom data provider Virtual Entity up & running.

Let’s check the Sensor Measurements list to see if everything is working.

If you see a list like the one above on your screen, everything is working just fine. Let’s click on a single record in the list to see if the form opens as expected.


Setting up a custom data provider is kinda hard for someone doing it for the first time because of all the errors that can show up along the way, but I think it’s worth the time spent in the end. The possibilities are endless if you try to implement some complex scenarios, like showing virtual entities in a related records subgrid on the entity form, which is a really great way to use VE.

This was my final post in the Virtual Entities mini-series, but I will try to post more Virtual Entities tips and tricks as I run into problems while using them.

Virtual Entities Part 2 – Cosmos DB


Here is the second post of the series, and it will focus on connecting Virtual Entities with Azure Cosmos DB. Last time I covered the basic concepts of virtual entities; now it’s time to explore a more friendly & flexible OOB connector for Cosmos DB.

Cosmos DB Connector

Last time we jumped straight to creating a new data source in Dynamics 365, but this time we don’t even have a connector installed on our instance, so the first step will be installing the connector.

Thankfully installing the connector is not that hard since it’s available on AppSource. You can open it by clicking Settings -> Microsoft AppSource on your instance.

When you get the AppSource popup, all you need to do is search for “Cosmos DB” and, after you find it, click the Get it now button.

You will just need to select the organization on which you want to install the connector and agree to the terms and conditions.

It may take a while before the connector becomes available on your instance, but when you see that the DocumentDBDataProvider solution has been added to your instance, you are ready to start exploring.

Azure Cosmos DB

Before we get to the data source creation part, we first need to get an instance of Azure Cosmos DB.

First, you need to open the Azure portal and create a new resource. Click the Create a resource button, then search for “Azure Cosmos DB” and, after you select it, press the Create button.

Now you need to input some crucial information for your Cosmos DB. After you select your subscription and resource group, you need to choose the name for your DB, which will be used later in the data source configuration. The most important thing here is the API part. You MUST select SQL here, because virtual entities don’t support any other API method mentioned in the dropdown menu. The other parameters are your own choice.

When you are done with your parameters, the form should look like the image above; just be sure that API is set to SQL. After clicking the Review + create button, you must wait a few minutes while your DB is being created in Azure.

When the provisioning process is finished, we need to create the collection that will hold our documents. First, open the Cosmos DB resource that was created, then choose Data Explorer in the left menu. After the data explorer opens, press the New Collection button, which will lead you to the collection configuration form.

You need to populate a few fields on the configuration form. The first is, of course, creating a new database, and after that you need to choose a name for your collection (the sensor-data collection will be used in this tutorial). I suggest you choose the Fixed (10 GB) option for the Storage capacity field and 400 RU/s for the Throughput field as the lowest-cost option. Of course, you can change these to whatever suits you best, but since this one is just for testing purposes I will go with the lowest-cost option here.

Pressing OK button will create your database and collection.

Creating Data Source

We have our database set up so we can jump to creating a data source. You need to open Settings -> Administration -> Virtual Entity Data Sources in order to create a new data source.

Clicking New opens a dialog where you need to select the Azure Cosmos DB for DocumentDB API Data Provider option from the dropdown menu. When you do so, you will get the Cosmos DB connector configuration form.

The first parameter is a name, which is not that important and can be anything you want.

The second parameter is Collection Name, which is quite undescriptive and will probably lead to an error on your first try, because it’s not actually the collection name. In this field, you need to type your database name (db in our case).

The third parameter is the Authorization Key, which can be found on the Cosmos DB resource page in the Azure portal. When you open your Cosmos DB resource, select Keys in the left menu. There you have your primary and secondary keys; copy either of those 2 values into the Authorization Key field.

The fourth parameter is the Uri, which can also be found in the Keys section, so copy/paste that value into the Uri parameter too.

You can leave Timeout in seconds field on the default value and click Save.

Now we have our data source ready to use.

Creating a Virtual Entity

The next step is to create a virtual entity on our Dynamics 365 instance. Go to your solution and create a new entity. First, you need to tick the Virtual Entity option, and then you will get the virtual-entity-related parameters to fill in. The most important ones are External Name, External Collection Name and Data Source. External Name and External Collection Name must hold the same value, and that value is the collection name of our Cosmos DB collection (sensor-data in our case). In Data Source we must select the newly created data source from the previous step. The final form must look like the one below.

We need to set the primary field’s requirement level to Optional, because we will not use it in this example.

Next, we need to add an External Name to our primary key field. Since we will use lowercase keys in our Cosmos DB documents, we will set the value to id.

After that, we will add 2 more fields: a Temperature field and a Sensor Type field.

The Temperature field will be a Whole Number field with External Name equal to temp.

The Sensor Type field will be a Lookup field on the Sensor Type entity (a basic entity with just a name field that must be created) with External Name equal to type.

When we have all the fields created, let’s create some documents in Cosmos DB.

Documents in Cosmos DB

First, you need to create a few Sensor Type records, so we have GUID values for those records before heading over to creating documents.

Documents must be JSON-formatted objects, so in our case an example would be a simple JSON object like this one:
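For example (the GUID values here are just illustrative placeholders; the type value should be the GUID of one of your Sensor Type records):

```json
{
  "id": "9f8a4b21-3c6d-4e0f-8a5b-1d2e3f4a5b6c",
  "temp": 23,
  "type": "1a2b3c4d-5e6f-4a8b-9c0d-1e2f3a4b5c6d"
}
```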

JSON Parameters that will be used are:

  1. id – randomly generated GUID (e.g. from a GUID generator)
  2. temp – temperature value that will hold integer values
  3. type – GUID of a Sensor Type entity record

Add a few documents by going to Data Explorer -> db -> sensor-data -> Documents and click on New Document button. Input your JSON object and hit Save.

When you have a few documents in DB it’s time to try our Virtual Entity in action.

Viewing Virtual Entity Data

The most basic way to see if your virtual entity is working is going to the list of records of your virtual entity. If you did everything right, your records should appear in the list.

Hell yeah! Now you have working virtual entities connected to the Azure Cosmos DB.


As you can see it’s not that hard to create a solution with Cosmos DB and Virtual entities, but there are still some bugs to be fixed.

We used a lookup field in our virtual entity, and it is populated just fine by simply passing the GUID to the lookup field. You can click on the lookup field and the related record will be opened.

If that is working, you might try an Advanced Find query that includes a filter by that lookup field, or add a subgrid on the Sensor Type form that shows only the records for that particular Sensor Type. Unfortunately, everything other than the list view is not possible with virtual entities: it will just show an empty list in both cases.

I’m really looking forward to the future of Cosmos DB Virtual Entities because they have such huge potential. Imagine a very fast DB (in this case Cosmos DB) that holds tons of data (offloaded from Dynamics 365) that you can view inside an entity form with just a few clicks of configuration.

I hope that future comes as soon as possible and makes our lives much easier, but until then we need to create custom connectors for virtual entities that will do the magic for us.

The next one up in the series, as you can expect from my last sentence, will be creating a custom connector for a virtual entity.


Virtual Entities Part 1 – OData v4


It’s been a while since virtual entities came out, but I don’t see people using them on a regular basis. The big question is WHY?

I can remember when I tried to make a simple OData example just after the release and it was not possible due to various bugs that were present at the time. Now after almost a year, I thought maybe it’s the right moment to try them again and go for a deep dive this time.

I plan to split the series into a few smaller posts to cover multiple data connectors for virtual entities, so today it’s time for the one that was released first, and you all know that it’s the OData v4 connector.

OData v4 Connector

The first thing we need to make an integration is an OData v4 web service that will be the source of our data.

I will use public OData service that can be found on:

It’s an official example of an OData v4 service made against all OData standards, and it has several collections that can be used. The one used in this tutorial is the Advertisements collection (the others cannot be used with virtual entities).

Adding new Data Source

Now that we have our web service URL ready, it’s time to add a new data source to the D365 instance.

Data sources can be found at Settings > Administration

When you open Virtual Entity Data Sources section it’s time to add a new Data source.

When you open create window you need to input few parameters:

  • Name – random name that will describe your data source
  • URL – base URL of your web service
  • Timeout – time in seconds after which D365 will timeout
  • Pagination Mode – use client-side if your service doesn’t support filtering parameters and if it does set it to server-side
  • Return Inline Count – true if your service returns the number of records retrieved (supports the $inlinecount parameter); otherwise, you should put false here

Here is the populated form for our example

After saving this one we have our data source ready and we are heading to the next part.

Creating a new virtual entity

A virtual entity is created like any other entity, but the magic starts when you tick the Virtual Entity checkbox.

There are 3 new fields we need to populate after selecting the Virtual Entity checkbox. For Data Source it’s more than obvious: we need to select our newly created data source as the value there. The other 2 fields are a bit more tricky and can make you freak out if you don’t understand the OData structure.

First, you need to input an external collection name that will be used while fetching the data for the virtual entity; as we said earlier, we will use the Advertisements collection. We will use this collection only because it’s the only one that uses GUIDs for record identifiers, and you can’t use an OData source if your records don’t have a GUID as a unique ID in the output.

The External Name field is the other one we need to populate, and it can be hard to put in the right value if you don’t know where to find it.

You need to check the OData metadata for the collection you will use as the data source. The answer in our case can be found at the service’s $metadata#Advertisements endpoint.

The response is a quite large XML file, but all we need to find is the EntityType node that has the same parameters as the response we get when we query the Advertisements collection; the one we need is shown below.
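For the Advertisements collection, the relevant node looks along these lines (a trimmed sketch of the metadata response, so treat the exact attributes as an approximation):

```xml
<EntityType Name="Advertisement" HasStream="true">
  <Key>
    <PropertyRef Name="ID" />
  </Key>
  <Property Name="ID" Type="Edm.Guid" Nullable="false" />
  <Property Name="Name" Type="Edm.String" />
  <Property Name="AirDate" Type="Edm.DateTimeOffset" Nullable="false" />
</EntityType>
```

The EntityType Name value (Advertisement) is what goes into the External Name field.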

Now we have all the parameters that are needed on the general information tab.

Next one up is field setup.

Before we hit save, we should switch to the Primary Field tab to set a few more things. It’s crucial that you set the primary field right, because in virtual entities every single field that is created must be mapped to one of the fields in the OData output.

Here you need to pay attention to a few things.

The first one is the External Name parameter, which can be found in the same XML snippet we used before; look at the line below:
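The line in question is the Name property (sketched from the metadata):

```xml
<Property Name="Name" Type="Edm.String" />
```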

The Property Name value is the one you need to input into the External Name parameter, but be sure to type/copy it exactly as it’s shown in the metadata, because the parameter is case sensitive and it will not work if the case doesn’t match.

The next parameter is Data Type, which can also lead to tons of errors. You need to specify the right data type for the OData parameter. In this example, it’s quite easy to select Single Line of Text because you can spot the type String in the XML, but you must be careful with number types because the match must be 100%. For example, if there is a decimal in the XML you can’t use a whole number type, because it will fail when it tries to cast the decimal to an integer.

The final thing is Field Requirement parameter that can also lead to errors if not set correctly.

For example, let’s assume that the XML returned is modified a little bit:
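Say the Name property carried a Nullable attribute, like this (a hypothetical variant for illustration):

```xml
<Property Name="Name" Type="Edm.String" Nullable="false" />
```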

Now you can see that there is also a Nullable attribute in the property node. That nullable attribute can be translated to our Field Requirement parameter as follows:

  • Business Required
    • Nullable=”false”
  • Optional
    • Nullable=”true” OR no Nullable attribute at all

In the end, primary field configuration should look like one on the image below.

When you are done adding the fields you want, like we did in the previous example, there is only one thing left to do: setting the External Name on the Virtual Entity ID field.

The ID field is defined in the XML line below.
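That line, sketched from the metadata, is:

```xml
<Property Name="ID" Type="Edm.Guid" Nullable="false" />
```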

The Property Name has the uppercase value ID, which must be typed into the External Name of the entity’s GUID field in the fields configuration; it should look like the image below.

Finally, your virtual entity is ready to use. You should just add all the fields you want on the form and views and you are ready to go.

After the fields are added to the view, you can try to open the newly created entity from the ribbon to open the record list.

If everything is configured right you should see the list of records as shown below.

Tips & Tricks

The most common error that you can face when configuring virtual entities is:

Lately, Microsoft added a Download Log File button, but in the past versions, it was not possible to download the log file for errors like this one.

If you are one of the “lucky” ones without a log button in the popup, there is a solution hidden in the plugin trace log.

The plugin trace log must be enabled if you want to get the error. If you don’t have it enabled, you can enable it in Settings -> Administration -> System Settings.

Enabling is done by setting “Enable logging to plug-in trace log” to All or Exception Only.

After you have activated the plug-in trace log, you can reproduce the error and check what is happening in Settings -> Plug-In Trace Log.

When you open individual record you can find out details about the error in Exception Details field.


I hope that I managed to help you to create your first virtual entity with OData v4 data source.

The series will continue in a few days with connecting virtual entities to Cosmos DB, which is quite a new feature, but not that complex to achieve.

Daily async job report with Flow

It’s kinda annoying when you don’t know that something went wrong with background workflows. Last time I used a console application that was running on a dedicated server, but it kind of died with the end of the server subscription.

I thought let’s make the same thing, but this time using Flow.

The first thing we need to add is a schedule component that will run our Flow every day. I think the best way is to schedule it after midnight, because all workflows that started yesterday will be finished by that time (I like to pick 2:00 AM as my schedule time on almost all flows that do such things).


The example schedule component is configured to run once a day at 2:00 AM; the most important part here is to set your timezone correctly, otherwise it will run in the UTC zone by default.

The next step is to add a few variables that will be used in several places in the flow. Those 2 variables are Organization URL and Organization Name. We can’t get the organization URL or name via the OOB Flow connectors, so we need to type them in manually.


Simply add 2 Initialize variable components to the flow and configure them as string variables, as shown in the image above.

After we have those variables, it’s time to get some data from Dynamics, so we add a Dynamics 365 – List records component. After the component is added, we need to do some configuration, which is the main part here. The most important field is the Filter Query, which defines the dataset for our report.


The entity we need to select is the System Jobs (asyncoperation) entity. The next step is defining the filter query. Since we will run the report every night for the day before, it’s crucial to filter the records by yesterday’s date. We can’t use the well-known Yesterday operator from Advanced Find, so we need to construct the query the other way around, or should I say the old way.

The Modified On date must be less than today and greater than or equal to yesterday. That can be done with simple scripts inserted into our query, shown as purple fx rectangles in the query.

First, we need to write a script that returns today’s date in yyyy-MM-dd format, which is done by typing the script into the function textbox.

The script we need to use is pretty straightforward.
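A minimal version of that script, built from standard Flow functions, could look like this:

```
formatDateTime(utcNow(), 'yyyy-MM-dd')
```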

The second one is a bit more complex than the first, but still pretty straightforward if you are familiar with Flow functions. This one gets us yesterday’s date in the same format.
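A possible form of that second script combines addDays with the same formatting:

```
formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd')
```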

With those 2 scripts, we have all the dynamic parts needed in the query filter of our report.

We also need to filter the results by status. The statuses we need to track are Waiting, Waiting For Resources and Failed.

The Failed status will always point us to a failed workflow, but Waiting and Waiting For Resources will not, so we need to add a few more filters.

The most important of those filters is that the Error Code column must contain data. A workflow can end up in Waiting or Waiting For Resources status while actually being in a virtual failed state, which this filter lets us detect.
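Putting all of this together, the complete Filter Query could look roughly like the sketch below. The statuscode values (0 = Waiting For Resources, 10 = Waiting, 31 = Failed) are the standard asyncoperation option set values, so verify them in your instance; the @{} parts are the two date expressions entered through the fx box:

```
(statuscode eq 0 or statuscode eq 10 or statuscode eq 31)
and errorcode ne null
and modifiedon ge @{formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd')}
and modifiedon lt @{formatDateTime(utcNow(), 'yyyy-MM-dd')}
```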

Sorting can be set on the Modified On column to show the results in ascending order in the final report.

With that, the List records component is fully configured and our dataset is ready to be used for reporting.

Once we have our dataset, we can add a condition component to check whether the previous component returned any data. I want to send the mail only if at least one record was retrieved.

Image 313.png

The length of the dataset array can be determined by adding a simple script to the textbox.

‘List_records’ is the default name of the first List records component you add to your flow (it can be changed only after adding the component and before configuring it).
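With that default component name, the length check can be written as:

```
length(body('List_records')?['value'])
```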

If the value of the script above is greater than 0, we want to send an email with the dataset presented as a table.

We can use the Create HTML table component for this.

Image 314.png

The Columns field must be set to Custom so we can add our own values.

The first column is Job Name, which will be an HTML a tag linking to the workflow definition, so we can click the link and open the workflow designer page.

The Message column will be used to display a user-friendly message that describes the error in the workflow.

The Time column shows when the workflow actually failed. The value for this column must be converted to a more readable format, since date-time fields are returned as ISO-formatted strings; this can be done by adding a simple script to the value field.

The last parameter is the date-time format string, which you can change as you like; since this is a daily report, it’s enough to show just the time part.
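As a sketch, assuming a time-only format string like 'HH:mm' (item() refers to the current row of the dataset):

```
formatDateTime(item()?['modifiedon'], 'HH:mm')
```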

Finally, the last column of our table is a link to the actual workflow execution record, where we can find all the information that can help us solve the problem.
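One way to build such a link column, sketched using the OrganizationURL variable from earlier; the URL pattern and the link text are assumptions to adapt to your own setup:

```
concat('<a href="', variables('OrganizationURL'),
       '/main.aspx?etn=asyncoperation&pagetype=entityrecord&id=',
       item()?['asyncoperationid'], '">Open</a>')
```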

After the table is ready, it’s time to send it via email. We will use the Mail – Send an email notification component, since it doesn’t require any mail configuration and uses the OOB Flow mailing service as the email provider.

Image 316.png

The subject can, of course, contain any text, but I think the best approach is to put the project name and yesterday’s date there so you can easily track your reports. We use addDays again to get yesterday’s date.

If you just pass the output of Create HTML table into the Body of the email, it will not work as expected: all the a tags will be displayed as HTML code instead of clickable links. The solution is to use the replace function to replace all the &lt; and &gt; entities with actual < and > signs via an expression.
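The expression can nest two replace calls, for example:

```
replace(replace(body('Create_HTML_table'), '&lt;', '<'), '&gt;', '>')
```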

Once you set this expression, your report is finally ready to be executed.

The final result for a report that arrived a few days ago on one of my D365 instances is shown in the picture below (the email component has some additional CSS for table styling, so ignore that part for now).

Image 317.png
Here is an image of the whole flow, in case someone wonders what it looks like in the bigger picture.
Image 318.png

I hope you will find this Flow useful in your daily routine and that it saves you some time when you need to investigate issues on your D365 instances. The great thing about Flow is that you can export this Flow, import it into any online instance and, with a little manual work, make it run for that new instance.

Web API v8.2 – Bad request when setting lookup

I work on lots of v8.2 instances, and this issue was one of the most annoying ones I have experienced.


You want to create or update a custom entity (it’s not a problem on OOB ones) with a value in one of its custom lookup fields. CRM REST Builder will always be my number one tool for creating Web API requests in Dynamics, but if you try it on v8.2 for the problem above, it will fail.

Here is the simple jQuery code that will update one lookup on the custom entity.

There is an “ic_consulting” entity that contains a lookup to Contact with the logical name “ic_contact”. If you try this code on v9.0 or any other version that supports the Web API, it will work flawlessly, but not on v8.2.

When you execute that code, you will get a 400 Bad Request error message that says:

An undeclared property ‘ic_contact’ which only has property annotations in the payload but no property value was found in the payload. In OData, only declared navigation properties and declared named streams can be represented as properties without values.

The error message does not point us to the problem, because it says there is no property named “ic_contact”, which obviously exists since we created it.


The problem lies in the second row of the code. Only in v8.2 do we need to change the property name to the schema name instead of the logical name for lookup fields.

The schema name can be found in the fields list under the Schema Name column.

After we find the schema name for our field, it’s time to replace the logical name with it.

Working code for this example would be:

The only change here was the capital letter C: the logical name “ic_contact” became “ic_Contact”. In your case it can be a bigger change, but usually it’s only a capitalization change.

After we run this code, everything works as expected and we get response code 204 No Content, which is returned when the record is created successfully in Dynamics.


Be aware that this issue is ONLY present when you use the v8.2 Web API in Dynamics. In most cases, the schema name is just the logical name with the first letter capitalized, so you don’t need to look up the schema name every time.

Change webresource height dynamically


One feature I really miss in the form designer is the ability to set more properties on web resources. One thing that gave me headaches is web resource height: it’s either too small or it can’t be made big enough to fit all needs.

I’ve been using a small JS snippet to overcome such issues, so I can show the web resource the way I want.

All the snippet does is take the height of the HTML body tag and set the height of the web resource container to that value.

There is always a catch with unsupported changes like this one. If you don’t configure the Row Layout section of the properties the right way, the JS snippet above will not work as intended. Everything you need to set is shown in the image below; it’s located in the Row Layout section.

The “Number of rows” setting is not that important for large web resources, because in most cases you would need more than 40 rows anyway, so you can just set it to 1.

“Automatically expand to use available space” is where the magic happens. You need to set it for web resources that change the height of the body on form events. For example, if your HTML is not always the same height (e.g. a grid that displays datasets of different sizes), you need to check this checkbox; if you don’t, the height will not change dynamically every time the HTML height changes. Instead, the height will be calculated on the initial load and the container will get a fixed padding that prevents you from manipulating the height just by setting the height value.

A checked checkbox solves all those problems. Once it’s set, you can change the height any time you want and it will be displayed as expected.

Of course, there is a form limitation that allows only one checked checkbox per form, so make sure you design your forms in a way that doesn’t require more than one, or just put the web resources at the bottom of the page, where the whitespace added below the web resource container doesn’t matter that much.

Final result

I wanted to show the result on a small demo web resource that displays a dataset based on the date selected in a dropdown menu. Every dataset results in a different HTML height, and I call the JS function after every change of the date value in the dropdown.

Situation #1

Situation #2

As you can see, there is the same amount of whitespace between the Orders section and the Consulting notes section, because the height is calculated dynamically after every change of the date dropdown field.


Obviously this is not a supported way of doing it, and it can break with a future Dynamics 365 update, but for now it’s my way of doing it until Microsoft implements a supported way to achieve things like this.


Strange whitespace on form


Today I faced a strange issue that was driving me crazy. I noticed that some entities on an Online instance (v8.2.2) have strange whitespace on the right side of the screen. I couldn’t even find a DOM element sitting in that whitespace. A quick JS fix could be applied to set the width of the left element (the actual form) to 100% on form load, and everything would work just fine, but that’s not how I wanted to solve this issue. The issue is shown in the image below.

Image 298

I tried creating a new form, changing entity properties, etc., but nothing helped. I even created a new entity, and it had the same problem; every newly created entity had this issue.


The problem can be resolved only by publishing the main form with some configuration changes in the Display tab.

Fields that can be changed are:

  • Show navigation items
  • Show image in the form
  • Max Width

When you change any of those fields and publish your form, the issue will be gone.

Digging deeper, I exported the solution with the affected entity and compared customization.xml before and after the fix, and there were some differences.

Before the fix, there was just the blank form tag defined in XML:

After the fix, there were some additional parameters defined:

But there is also a navigation node added:


Change to

Hi everyone!

Since I had some problems when trying to post code snippets on and for me it’s an important thing that I can post code snippets in my posts I decided to set up version of the blog.

From now on the blog has a new URL and it can be found on the

I will keep the old blog up till I change all the links posted around to the new address, but new posts will only be posted on the new blog.

I can’t tell you how happy I am that I can finally format my code snippets the way I always wanted.