How to embed Forms Pro in Dynamics 365

Introduction

It’s been a while since Forms Pro became available as a public preview and I must say that I fell in love with them when I saw them for the first time. From the very first moment, I was thinking about how I could use them inside the Dynamics 365 interface.

There are 3 types of embedded controls for Forms Pro:

  • Inline
  • Pop-up
  • Button

The first one that got my attention was Pop-up because I thought that it might serve me as a dialog (especially now that the old dialogs are deprecated). Sadly, that option disappointed me really fast because of CORS (Cross-Origin Resource Sharing). When you add a ribbon button or field event that calls a simple function for showing the pop-up, you will get a CORS warning in the console and nothing will happen in the end.

The problem is that the authentication request posts to the microsoft.com domain and not to dynamics.com, so there is no way to make it work in any kind of supported manner.

The same thing happens if you try to call the other two types of embedded controls (Inline and Button) from the ribbon or a form event.

The only way to overcome the CORS limitation is to put the code into an IFRAME and invoke it from there, but let’s make it more useful by passing the Dynamics form context to the survey response.

Goal

Our goal for this one will be to create a survey for the contact form. The survey will have 2 questions about customer satisfaction with a CRM product. We will concatenate those answers into one string, update the survey response record (an activity entity), set the Regarding field to the contact and finally show it in the Timeline on the contact form.

Solution

The first thing we need to do is create the survey that will be shown on the contact form.

The first question will be a choice between 3 answers:

  • Salesforce
  • Dynamics 365
  • Zoho

The second question will be a rating question that holds values from 1 to 5.

The next step is the Send Survey tab, in which we need to choose an Embed option. Since we will do it on the form inside an IFRAME, the best option here is to choose Inline.

Since it will be embedded on the contact form, we need to create a custom parameter that will hold the id of the contact.

After adding the custom parameter it’s time to press the generate code button. You will get something like this:
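The exact snippet depends on your survey, but the shape is roughly the one below (reproduced from memory, so treat the SurveyEmbed class name, the CDN/base URLs and the survey id as placeholders and use the snippet generated for your own survey):

    <div id="surveyDiv"></div>
    <script src="https://<forms-pro-cdn>/Embed.js"></script>
    <script>
        var parentElementId = "surveyDiv";          // id of the DIV that hosts the survey
        var context = { "contactId": "" };          // our custom parameter, empty for now
        var se = new SurveyEmbed("<survey id>", "<forms pro base url>", "<forms pro cdn url>", "true");
        se.renderInline(parentElementId, context);  // Inline embed option
    </script>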

Key things here are 2 parameters:

  • parentElementId
  • context

The parentElementId parameter is just a string value that represents the id of the div that will be used as the survey container.

The context parameter is used for sending additional data to the survey response activity that is created once you submit the survey.

Let’s make some small changes to the code to make it work in our case. We need to add a container DIV for the survey and get the contact id from the form context.
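A sketch of the adjusted web resource could look like this, assuming the same placeholder embed snippet as above and using window.parent.Xrm.Page (unsupported, but the simplest option at the time) to read the contact id from the hosting form:

    <html>
      <head>
        <script src="https://<forms-pro-cdn>/Embed.js"></script>
      </head>
      <body>
        <!-- container DIV that will hold the rendered survey -->
        <div id="surveyDiv"></div>
        <script>
          // read the contact id from the hosting Dynamics form and strip the curly braces
          var contactId = window.parent.Xrm.Page.data.entity.getId()
              .replace("{", "").replace("}", "");

          // the custom parameter we defined in the survey designer
          var context = { "contactId": contactId };

          var se = new SurveyEmbed("<survey id>", "<forms pro base url>", "<forms pro cdn url>", "true");
          se.renderInline("surveyDiv", context);
        </script>
      </body>
    </html>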

Now that we have made those modifications, we need to create a new HTML web resource in Dynamics. It will be called fic_Embed.html and its content will be just the same as the code above.

After that add a new tab to the contact form and inside it add a section with a web resource (in this case fic_Embed.html).

You should see something like this when you save and publish the form and open the Survey tab.

Congratulations, you have successfully added a survey to your Dynamics form.

Now we need to see what is going on with the data created in the background. Let’s test the survey by choosing Dynamics 365 and rating it 5 stars. After we submit the answers, some records will be created in the background.

The first record that is created is the Survey response record, which, by the way, is an activity entity. It contains a few fields that are important in our case.

The Subject field is inherited from the activity entity and it’s blank when you submit the survey, which is not great in our case because we want to show the activity in the Timeline.

The Context Data field contains a JSON object with one core node called EmbedContextParameters, which holds all the custom parameters (only contactId in our case) that we passed in the JS code.
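With our single custom parameter, the Context Data value ends up looking roughly like this (the GUID is a placeholder):

    {
      "EmbedContextParameters": {
        "contactId": "00000000-0000-0000-0000-000000000001"
      }
    }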

Every Survey response record has its child records called Survey question responses. Those records hold the values of the submitted answers. Answers are stored in the Response field in plain text format.

Now that we know how the data is stored, it’s time to make a Flow that will transform the data the way we want.

Let’s make a blank Flow to start.

The trigger that will be used is the Common Data Service one that fires when a Survey response record is created.

The first action will be used to retrieve all Survey question responses related to the record that triggered the Flow.

The only parameter we need to set here is a filter query that returns all question responses related to our survey response. The Survey response GUID is hidden in the Activity parameter from the trigger. Question responses are always returned in the same order as the question numbers sorted ascending.
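As a sketch, the filter ends up being something along these lines, assuming the question response lookup to the survey response is called msfp_surveyresponseid and the trigger exposes the GUID as activityid (both are assumptions, so check the schema on your own instance):

    _msfp_surveyresponseid_value eq @{triggerBody()?['activityid']}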

Next, we need to parse the JSON that is holding EmbedContextParameters from the Context Data field of Survey response entity.

The schema generated from the payload looks like this:
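For a payload like the one above, the generated schema comes out along these lines:

    {
      "type": "object",
      "properties": {
        "EmbedContextParameters": {
          "type": "object",
          "properties": {
            "contactId": { "type": "string" }
          }
        }
      }
    }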

Finally in the last step of the Flow we will update the Survey response entity.

As the record ID we will use the Activity field from the trigger. The Regarding field has 2 parts: Regarding needs to be set to the contactId value that we got from the JSON in the previous step and the Regarding Type needs to be set to contacts.

The subject is the part that will be shown in the Timeline, so we need to put something human readable here. Let’s concatenate the answers of the questions with a “-” sign in between. We know that in the returned Survey question responses dataset the answer to the first question will be the first item in the array and the answer to the second one will be the second item. Knowing that, it’s easy to make the concatenation.

The first body(…) part of the Subject field in the picture must be equal to:
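Assuming the List records action kept its default name List_records and the answer text lives in a column called msfp_response (both assumptions, so adjust to your own flow), the first expression looks like:

    body('List_records')?['value'][0]?['msfp_response']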

The second body(…) part of the Subject field in the picture must be equal to:
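Under the same assumptions, the second one just takes the second item of the same array:

    body('List_records')?['value'][1]?['msfp_response']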

After we set those parameters in the last action and start the Flow, everything is ready for the final test.

Let’s submit new answers to the same contact as before.

After we submit the answers we should wait for a few seconds for Flow execution and then refresh the Timeline on the contact entity.

Well done if you see the submitted answers on the Timeline!

Conclusion

It’s quite easy and intuitive to work with survey answers once you understand the data model behind them, but I see there is still so much to be done. As I said at the beginning, I see big potential in this for dialog scenarios in the future, but first they need to change the authentication method to the same domain as Dynamics so we can use it anywhere on the form.

I’m really looking forward to this feature and I think that big things will come really soon because the feedback from the community regarding the feature is just huge.

When smart matching became dumb matching

There are 2 ways of tracking emails in Dynamics 365: folder-level tracking and the correlation method. Folder-level tracking will fit only a few cases, but the correlation method will do the job for most cases and it’s the way to go for most customers.

When you decide to go the correlation route there are also 2 (or even 0) ways to do it. The first one is to use a tracking token, which is the most precise method, but also the least accepted by customers because it adds a token to the subject of the mail. The second one is the smart matching method that is suggested by most people because it gives the flexibility to match emails that were started from a new thread, but is it really worth it?

What is smart matching?

When you look up the definition of smart matching in the docs, it says that matching is done by checking the email subject, sender and recipients to find the matching email thread among the emails that are synced to Dynamics.

Subject matching is done by setting a regex expression that removes all unnecessary characters from the subject to improve the matching algorithm.

Sender and recipient matching allows us to set parameters for the maximum number of recipients to get better results, allowing us to tweak the matching to our specific needs. In most cases it takes some time until you find the best match for your specific case.

Sounds like a perfect solution, doesn’t it? There is always a but!

Problematic case

I had a problem with some emails that were not synced to Dynamics for a long period of time. I tried to find some pattern among the unsynced emails, but with no luck.

The problematic emails were automatic emails sent every now and then with the same subject and to nearly the same recipients by the same sender. Half of the emails were synced to Dynamics and the other half ended in an error. The error message on the mailbox just doesn’t tell much about the actual issue.

Emails that were synced successfully looked just like the ones that didn’t when you look at the metadata of the email message, so there should have been no reason for the sync to fail.

Investigation

The investigation started with the EWS Editor for Dynamics 365 tool, which is used to troubleshoot issues with Exchange server integrations with Dynamics 365. With this tool you can view all the metadata of a particular email and test whether that mail can be synced to Dynamics. We had no luck with the tool in this case, but maybe it can point you in the right direction while you troubleshoot your issues.

After a deep dive into the problem one strange pattern emerged.

Emails with the exact same subject and recipients started to fail after an initial period of successful syncs and after that they failed to sync every single time. Since it was an on-premises installation, we started the trace log on the server to monitor the errors coming from Dynamics.

You can force a sync of the same email by modifying one property of the email or just moving it to a different folder and then back to the inbox.

Every single time the mail failed to sync with the exact same error message, and it’s quite a simple one: “SQL timeout expired”. That leads us to a problem with the SQL queries that are executed on email sync.

When we scrolled through the trace log we found one SQL query above the timeout messages that looked like this one:

When we tried to execute the query in SQL Management Studio it ran for almost a minute, which is a serious problem if you consider that it runs every time a single email is synced to Dynamics.

The problem lies in the EmailHashBase table, which has a lot of records in it, and with all that filtering and those joins it really takes time to execute the query. After a call with Microsoft, we were told that it looks like the smart matching fallback SQL query.

Finally, when we had that information, we tried to switch off the smart matching feature and everything started working like a charm. Looks like when we made our solution less smart, everything started working better.

Conclusion

The query on the email hash table causes the issue if you receive emails with the same subject and recipients over and over again.

You can increase the timeout limit, which is not the smartest thing to do, or you can just get rid of the smart matching feature before it kills your integration. Switching off smart matching is not that bad an idea if you consider that it’s a much older feature than the correlation method. It may look like it’s just an addition to the correlation method because it is shown as a nested setting under correlation, but it really isn’t. It’s only a fallback option if the correlation method fails to find the right match.

There will always be a case where the smart matching feature is a good one to use, but you should really think about whether you will hit the situation described above before you decide to tick that checkbox.

I can’t believe I can say that a “dumb” solution made me so happy that day. Looks like you don’t need to go the smart route every time; maybe the dumb one will just save your time and nerves.

D365 Forms – VE vs Embedded Canvas App

A lot of people have asked me for an opinion on using embedded canvas apps instead of virtual entities to show additional information on the form, so I decided to put those thoughts into a blog post.

Both approaches have a bright future and I’m 100% sure of that because they fit different design scenarios. Let’s discuss the differences between those two and when to use them.

Comparison

                              Virtual Entities     Embedded Canvas Apps
  OOB Data Sources            2                    200+
  Custom Data Sources         Any Data Source      Only OData v4
  Form Context                All Fields           All Fields
  Related Entities Context    All Fields           Limited Fields
  Appearance                  Dynamics View        Canvas App
  Elements Per Page           Unlimited            1

Virtual Entities

Virtual Entities look like a forgotten feature, but some people from Microsoft have confirmed that serious work on Virtual Entities is going on in the background. On the other hand, some people have reverse engineered the Plugin Registration Tool and found out that everything is already set up for full CRUD support in the tool, but it’s not accessible via the UI. I’m a big fan of VEs, so I was really happy when I heard that.

Data sources are really the thing that can make you go either way. Virtual entities can handle 2 OOB connectors: OData v4 and Cosmos DB (buggy at the moment). The OData v4 connector is the most stable connector in the VE world right now and it’s definitely the way to go if you have web services that meet the standard. On the other hand, you can literally connect to any data source by writing C# code for a custom connector, but you must be aware that this type of integration will take a lot of your time because you (or your friendly developer) need to implement every little thing (fetching, filtering, sorting, paging, …) yourself.

You should use virtual entities if you want to use that data in places other than your entity form (related records subgrid), for example charts, dashboards, reports or Advanced Find.

Ribbon buttons are also a great way to interact with records shown on the subgrid (select a few records and fire an action), and at the moment it’s not possible to configure such buttons to interact with elements inside an embedded canvas app.

The limit on the number of subgrids is also a plus for VEs. When you are using UCI you can put as many subgrids as you want on the form, but when you use the Classic view that number is capped at 10.

Editing the actual UI representation is easier because it can be done by anyone that has used Dynamics views before.

Embedded Canvas App

This feature is still marked as a preview, but lately I can see that it has become more stable, so you should definitely consider using it. Microsoft is pointing us in the direction of using Canvas Apps as much as we can and I think we should definitely give them a try, even in embedded scenarios.

When it comes to data sources it’s pretty clear that more than 200 OOB connectors are a big plus for the Canvas Apps approach, but the situation changes when we need to use data outside those OOB connectors. Data that you want to fetch must be available as a REST web service, and we can expect SOAP support in the near future.

You should consider the canvas approach if you need to enrich your record list with additional graphical elements (e.g. images) or add buttons that perform some action on each record.

Getting callback actions from the Canvas App back to the form is not available at the moment, but I think it’s a must-have feature for the future and that we will not wait too long for it.

The limit of only one canvas app per page is the con here, but you can make your app as complex as you want to show more than one list. If you go with a complex app you should consider the performance impact of fetching data from multiple data sources.

Canvas Apps give us more flexibility when building the actual UI of the app, but they also require people that are familiar with designing Canvas Apps, and it can take some time before you can realize your idea inside the app.

Conclusion

You should really consider every single limitation of VEs and Embedded Canvas Apps before you make the final decision about which way is the right one. Sometimes both ways are good and will do the job, but when you think about future changes, one may simply be the better choice. I’m sure that there are scenarios that will make you stick to one option because the other one will not fit the requirements from the start.

If you ever find yourself in the position where you really need to choose either the VE or the Canvas Apps approach and you are sure that both will do the job just fine, I suggest you go the Canvas Apps route because of future compatibility and the general popularity of Canvas Apps, which makes Microsoft rapidly develop new features that will help you even more.

Virtual Entities Part 3 – Custom Data Provider

Introduction

This is the final blog post of the Virtual Entities series and this time I will show you how to create and set up a custom data provider for virtual entities.

Entity #1

The first step should be creating a virtual entity in Dynamics.

Creating a virtual entity for a custom connector is quite simple and I can say it’s even easier than creating one for the OOB connectors. It’s easy because you don’t need to take care of external names, data types or field requirements; everything you do will be defined in the plugin code and not in the UI.

Let’s create a simple entity for our demo. The entity will hold simple sensor data that includes a temperature and a timestamp for the measurement.

The first step is obviously opening the solution explorer and creating a new entity. The only things you need to be aware of here are the Virtual Entity checkbox, which must be checked, and the data source part, which will remain empty for now until we create one later.

After that let’s create 2 fields that will hold our sensor data.

The Name field should be a single line of text (the primary field will do the job).

The Temperature field should be a whole number.

The data type is the only thing that you need to set up; leave everything else at the default values. As you can see, we don’t need to think about external names here because the regular field names will be used in our plugin code.

That’s everything we need to set up in our entity, so we are ready to jump to the next step.

Plugin

The next step is writing a simple plugin that will handle the read operations for our data.

We need to implement some data source logic in our plugin so we can fetch data from a 3rd party system. This example will use a static class with some predefined data to demonstrate how to write the plugin.

First of all, we need to define a class that will hold our data. Code that describes our class is shown in the snippet below.
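A minimal sketch of such a class (the property names are my own choice, not a requirement):

    using System;

    public class SensorData
    {
        public Guid Id { get; set; }
        public string Name { get; set; }
        public int Temperature { get; set; }
    }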

Next, we need to write a service that will fetch data for us. The service in this example is a simple repository that has 2 basic methods, Get and GetAll, which fetch a single record and all records respectively. The code for that static class is shown below.
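A sketch of that repository, prepopulated with deterministic GUIDs so that Retrieve can later find the same record again between plugin executions:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public static class SensorDataRepository
    {
        // 10 predefined records that will show up in Dynamics;
        // deterministic GUIDs keep a record id stable between executions
        private static readonly List<SensorData> Data =
            Enumerable.Range(1, 10)
                .Select(i => new SensorData
                {
                    Id = new Guid(i, 0, 0, new byte[8]),
                    Name = "Sensor measurement " + i,
                    Temperature = 20 + i
                })
                .ToList();

        public static SensorData Get(Guid id)
        {
            return Data.FirstOrDefault(x => x.Id == id);
        }

        public static List<SensorData> GetAll()
        {
            return Data;
        }
    }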

The class is prepopulated with a list of 10 records that will be shown in Dynamics at the end.

Finally, we get to the part where we need to write the plugins. Writing a custom data provider plugin is not that different from writing one for the Retrieve/RetrieveMultiple messages. All you need to do here is set the right output parameter at the end of the plugin code.

The Retrieve code that will be used in this example is shown below.
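Here is a sketch of the Retrieve plugin, assuming the virtual entity’s logical name is ic_sensordata with fields ic_name and ic_temperature (adjust the names to your own publisher prefix):

    using System;
    using Microsoft.Xrm.Sdk;

    public class SensorDataRetrievePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider
                .GetService(typeof(IPluginExecutionContext));

            // for a Retrieve message the Target input parameter holds the requested record id
            var target = (EntityReference)context.InputParameters["Target"];

            var record = SensorDataRepository.Get(target.Id);

            // construct the Entity object and populate it with the retrieved data
            var entity = new Entity("ic_sensordata");
            entity["ic_sensordataid"] = record.Id;   // the GUID goes into the attribute dictionary
            entity["ic_name"] = record.Name;
            entity["ic_temperature"] = record.Temperature;

            // the key step: pass the record back as the BusinessEntity output parameter
            context.OutputParameters["BusinessEntity"] = entity;
        }
    }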

The Retrieve plugin must implement the IPlugin interface like any other plugin for Dynamics. The whole logic is defined in the Execute method that is part of the implemented interface. We must fetch a single record here, construct the Entity object, populate it with the retrieved data and pass it as the BusinessEntity output parameter in the end. With that we are done with our Retrieve plugin, so we can move on to RetrieveMultiple.

The RetrieveMultiple code that will be used in this example is shown below.
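And a sketch of the RetrieveMultiple plugin with the same assumed names (a real provider would also translate the Query input parameter into filtering and paging, which is skipped here for brevity):

    using System;
    using Microsoft.Xrm.Sdk;

    public class SensorDataRetrieveMultiplePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider
                .GetService(typeof(IPluginExecutionContext));

            var collection = new EntityCollection();

            foreach (var record in SensorDataRepository.GetAll())
            {
                var entity = new Entity("ic_sensordata");
                entity["ic_sensordataid"] = record.Id;   // GUID added to the attribute dictionary
                entity["ic_name"] = record.Name;
                entity["ic_temperature"] = record.Temperature;

                collection.Entities.Add(entity);
            }

            // pass the collection back as the BusinessEntityCollection output parameter
            context.OutputParameters["BusinessEntityCollection"] = collection;
        }
    }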

The RetrieveMultiple plugin must also implement the IPlugin interface. This time we must fetch a list of data, create an EntityCollection, populate it with Entity objects built from the fetched data and pass the collection to the BusinessEntityCollection output parameter. Even a few small changes that feel unimportant can make this code stop working.

There are a few ways of initializing the Entity object in C#, but only one of them works and that is the one shown above.

For example, if you try to use initializations like:
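For example (using the same assumed logical names), initializations like these:

    var entity = new Entity("ic_sensordata", record.Id);

    var entity = new Entity("ic_sensordata") { Id = record.Id };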

They just won’t work and I don’t understand why that is, but you must be careful when initializing the entity object to strictly add the GUID to the attribute dictionary.

Now that we have our plugins done we can move to the next part.

Data Provider

Registering a custom data provider is the part where most people give up, because there are some errors and unusual things that stop you from doing it the right way.

The data provider can be added only via the newest Plugin Registration Tool which can be downloaded on https://xrm.tools/ by clicking on the Get the SDK Tools! link.

After you download the tool you need to register the newly created DLL with the Retrieve and RetrieveMultiple plugins to your organization.

When you have registered the assembly it’s time to create a new data provider. This is done by clicking on the Register drop-down menu and selecting Register New Data Provider.

After clicking that option a popup appears. On this screen you only need to select the Create New Data Source option.

On the next screen you need to set the Display and Plural Names, which can be any names you like, a solution that will hold the data source metadata, and the logical name for the data source. Finally, you press the Create button.

After you press the Create button an unpleasant error appears on your screen, but don’t worry, you did everything right and the data source was created successfully. Press the Refresh button in the Plugin Registration Tool and the data source will be shown in the list.

Now it’s time to press the Register New Data Provider button again and start creating the data provider. The most important things here are to select the right assembly and the right Retrieve/RetrieveMultiple plugins from the assembly. When you have filled in the required data it’s time to hit that Update button. The Plugin Registration Tool can crash in this step, but if it does, just repeat the process until it finishes successfully.

When you have created the data provider it’s time to leave the Plugin Registration Tool and move on to the next part.

Data Source

Every virtual entity needs a data source configuration that is used in the entity configuration. Configurations are created in Settings -> Administration -> Virtual Entity Data Sources. Here you just need to hit New, select the data source created in the previous step, input a name and press the Save & Close button. Now we have our data source configuration.

Entity #2

It’s time to get back to solution explorer and our entities.

The first step is to find the Data Source entity, open the fields list, open the configuration for the ID field and copy the value of the Name field into the External Name field. If this step is not done, nothing else will matter, because Dynamics will not know how to find the Data Source configuration while using your data provider.

The final step in this example is to select the data source configuration in the virtual entity editor by adding value to the Data Source field.

If you did everything as shown here, you should have your custom data provider virtual entity live and running.

Let’s check the Sensor Measurements list to see if everything is working.

If you see a list like the one above on your screen, everything is working just fine. Let’s try to click on a single record in the list to see if the form opens as expected.

Conclusion

Setting up a custom data provider is kinda hard for someone doing it for the first time because of all the errors that can show up along the way, but I think it is worth the time spent in the end. The possibilities are endless if you try to implement some complex scenarios, like showing virtual entities in a related records subgrid on the entity form, which is really a great way to use VEs.

This was my final post in the Virtual Entities mini-series, but I will try to post more Virtual Entities tips and tricks as I run into problems while using them.

Virtual Entities Part 2 – Cosmos DB

Introduction

Here is the second post of the series, focusing on connecting Virtual Entities with Azure Cosmos DB. Last time I covered the basic concepts of virtual entities and now it’s time to explore the more friendly and flexible OOB connector for Cosmos DB.

Cosmos DB Connector

Last time we jumped straight to creating a new data source in Dynamics 365, but this time we don’t even have a connector installed on our instance. So the first step will be the installation of the connector.

Thankfully, installing the connector is not that hard since it’s available on AppSource. You can open it by clicking Settings -> Microsoft AppSource on your instance.

When you get the AppSource popup, all you need to do is search for “Cosmos DB” and, after you find it, click on the Get it now button.

You will just need to select the organization on which you want to install the connector and agree with the terms and conditions.

It may take a while before the connector becomes available on your instance, but when you see that the DocumentDBDataProvider solution has been added to your instance you are ready to start exploring.

Azure Cosmos DB

Before we get to creating the data source we first need to get an instance of Azure Cosmos DB.

First you need to open https://portal.azure.com and create a new resource. Click on the Create a resource button, then search for “Azure Cosmos DB” and, after you select it, press the Create button.

Now you need to input some crucial information for your Cosmos DB. After you select your subscription and resource group, you need to choose the name for your DB, which will be used later in the data source configuration. The most important thing here is the API part. You MUST select SQL here because virtual entities don’t support any other API method mentioned in the dropdown menu. The other parameters are your own choice.

When you are done with your parameters the form should look like the image above; just be sure that the API is set to SQL. After clicking on the Review + create button you must wait a few minutes while your DB is being created in Azure.

When the provisioning process is finished we need to create the collection which will hold our documents. First you need to open the Cosmos DB resource that was created, then choose Data Explorer from the left menu. After the data explorer is opened you need to press the New Collection button, which will lead you to the collection configuration form.

You need to populate a few fields on the configuration form. The first is, of course, creating a new database, and after that you need to choose a name for your collection (a sensor-data collection will be used in this tutorial). I suggest you choose the Fixed (10 GB) option for the Storage capacity field and 400 RU/s for the Throughput field as the lowest cost option; of course you can change it to whatever suits you best, but since this one is just for testing purposes I will go with the lowest cost option here.

Pressing the OK button will create your database and collection.

Creating Data Source

We have our database set up so we can jump to creating a data source. You need to open Settings -> Administration -> Virtual Entity Data Sources in order to create a new data source.

Clicking on New opens a dialog where you need to select the Azure Cosmos DB for DocumentDB API Data Provider option from the dropdown menu. When you do so you will get the Cosmos DB connector configuration form.

The first parameter is a name, which is not that important and can be anything you want.

The second parameter is Collection Name, which is quite undescriptive and will probably lead to an error on your first try because it’s not actually the collection name. In this field you need to type your database name (db in our case).

The third parameter is the Authorization Key, which can be found on the Cosmos DB resource page in the Azure portal. When you open your Cosmos DB resource you need to select Keys from the left menu. There you have your primary and secondary authentication keys. Copy either of those 2 values to the Authorization Key field.

The fourth parameter is the Uri, which can also be found in the Keys section, so copy/paste that value to the Uri parameter too.

You can leave the Timeout in seconds field at the default value and click Save.

Now we have our data source ready to use.

Creating a Virtual Entity

The next step is to create a virtual entity on our Dynamics 365 instance. Go to your solution and create a new entity. First you need to tick the Virtual Entity option and then you will get the virtual entity related parameters to fill in. The most important ones are External Name, External Collection Name and Data Source. External Name and External Collection Name must hold the same value, and that value is the collection name of our Cosmos DB collection (sensor-data in our case). In Data Source we must select the newly created data source from the previous step. The final form must look like the one below.

We need to set the primary field requirement to Optional because we will not use it in this example.

Next, we need to add an External Name to our primary key field. Since we will use a lowercase key in our Cosmos DB documents we will set the value to id.

After that, we will add 2 more fields: the Temperature and Sensor Type fields.

The Temperature field will be a Whole Number field with the External Name set to temp.

The Sensor Type field will be a Lookup field to the Sensor Type entity (a basic entity with just a name field that must be created) with the External Name set to type.

When we have all the fields created, let’s create some documents in Cosmos DB.

Documents in Cosmos DB

First, you need to create a few Sensor Type records so we have GUID values for those records before heading over to creating documents.

Documents must be JSON formatted objects, so in our case an example would be a simple JSON object like this one:
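For example (the id and type GUIDs below are placeholders; type must be the GUID of an existing Sensor Type record):

    {
      "id": "00000000-0000-0000-0000-000000000001",
      "temp": 23,
      "type": "00000000-0000-0000-0000-000000000002"
    }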

JSON Parameters that will be used are:

  1. id – randomly generated GUID (eg. GUID generator)
  2. temp – temperature value that will hold integer values
  3. type – GUID of a Sensor Type entity record

Add a few documents by going to Data Explorer -> db -> sensor-data -> Documents and clicking on the New Document button. Input your JSON object and hit Save.

When you have a few documents in the DB it’s time to see our virtual entity in action.

Viewing Virtual Entity Data

The most basic way to see if your virtual entity is working is to go to the list of records of your virtual entity. If you did everything right your records should appear in the list.

Hell yeah! Now you have working virtual entities connected to Azure Cosmos DB.

Conclusion

As you can see it’s not that hard to create a solution with Cosmos DB and virtual entities, but there are still some bugs to be fixed.

We used a lookup field in our virtual entity, and it is populated just fine by simply passing the GUID to the lookup field. You can click on the lookup value and the Sensor Type record will be opened.

If that is working, you might think you could run an Advanced Find query that filters by that lookup field, or add a subgrid on the Sensor Type form that shows only the records for that particular Sensor Type. Unfortunately, anything other than the list view is not possible with these virtual entities; you will just get an empty list in both cases.

I’m really looking forward to the future of Cosmos DB virtual entities because they have such huge potential. Imagine a very fast DB (in this case Cosmos DB) that holds tons of data (offloaded from Dynamics 365) that you can view inside an entity form with just a few clicks of configuration.

I hope that this future comes as soon as possible and makes our lives so much easier, but until then we need to create custom connectors for virtual entities that will do the magic for us.

The next one up in the series, as you can expect from my last sentence, will be creating a custom connector for a virtual entity.

 

Virtual Entities Part 1 – OData v4

Introduction

It’s been a while since virtual entities came out, but I don’t see people using them on a regular basis. The big question is WHY?

I can remember trying to make a simple OData example just after the release and it was not possible due to various bugs that were present at the time. Now, after almost a year, I thought maybe it’s the right moment to try them again and go for a deep dive this time.

I plan to split the series into a few smaller posts to cover multiple data connectors for virtual entities, so today it’s time for the one that was released first, and you all know that it’s the OData v4 connector.

OData v4 Connector

The first thing we need to make the integration work is an OData v4 web service that will be the source for our data.

I will use a public OData service that can be found at:

https://services.odata.org/V4/OData/OData.svc

It’s an official example of an OData v4 service made against all OData standards and it has several collections that can be used. The one that will be used in this tutorial is the Advertisements collection (the others cannot be used with virtual entities).

Adding new Data Source

Now that we have our web service URL ready it’s time to add a new data source to the D365 instance.

Data sources can be found at Settings > Administration

When you open the Virtual Entity Data Sources section it’s time to add a new data source.

When you open the create window you need to input a few parameters:

  • Name – random name that will describe your data source
  • URL – base URL of your web service
  • Timeout – time in seconds after which D365 will timeout
  • Pagination Mode – use client-side if your service doesn’t support filtering parameters; if it does, set it to server-side
  • Return Inline Count – true if your service returns the number of records retrieved (supports the $inlinecount parameter), otherwise you should put false here

Here is the populated form for our example

After saving this one we have our data source ready and we are heading to the next part.

Creating a new virtual entity

The virtual entity is created like any other entity, but the magic starts when you tick the Virtual entity checkbox.

There are 3 new fields that we need to populate after selecting the Virtual Entity checkbox. For Data Source it’s more than obvious that we need to select our newly created data source as the value. The other 2 fields are a bit more tricky and can make you freak out if you don’t understand the OData structure.

First you need to input an external collection name that will be used while fetching the data for the virtual entity, and as we said earlier we will use the Advertisements collection. We use this collection only because it’s the only one that uses GUIDs as record identifiers, and you can’t use an OData source if your records don’t have a GUID as the unique ID in the output.

The External Name field is the other one we need to populate, and it can be hard to put in the right value if you don’t know where to find it.

You need to check the OData metadata for the collection that you will use as the data source. The answer for our case can be found at:

https://services.odata.org/V4/OData/OData.svc/$metadata#Advertisements

The response is quite a large XML file, but all we need to find is the EntityType node that has the same parameters as the response we get when we query the Advertisements collection; the one that we need is shown below.
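From the public demo service the Advertisement entity type looks roughly like this (trimmed and reproduced from memory, so verify it against the live metadata):

    <EntityType Name="Advertisement" HasStream="true">
      <Key>
        <PropertyRef Name="ID" />
      </Key>
      <Property Name="ID" Type="Edm.Guid" Nullable="false" />
      <Property Name="Name" Type="Edm.String" />
      <Property Name="AirDate" Type="Edm.DateTimeOffset" Nullable="false" />
      <NavigationProperty Name="FeaturedProduct" Type="ODataDemo.FeaturedProduct" Partner="Advertisement" />
    </EntityType>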

Now we have all the parameters that are needed on the general information tab.

Next up is the field setup.

Before we hit save we should switch to the Primary Field tab to set a few more things. It’s crucial that you set the primary field right because in virtual entities every single field that is created must be mapped to one of the fields in the OData output.

Here you need to pay attention to a few things.

The first one is the External Name parameter, which can be found in the same XML snippet we used before; look at the line below:
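From the metadata snippet above, that is the line describing the Name property:

    <Property Name="Name" Type="Edm.String" />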

The Property Name value is the one that you need to input in the External Name parameter, but you need to be sure that you type/copy it just the way it’s shown in the metadata, because the parameter is case sensitive and it will not work if the case doesn’t match.

The next parameter is Data Type, which can also lead to tons of errors. You need to specify the right data type for the OData property. In this example it’s quite easy to select Single Line of Text because you can spot the String type in the XML, but you must be careful with number types because they must match 100%. For example, if there is a decimal in the XML you can’t use the whole number type because it will fail when it tries to cast the decimal to an integer.

The final thing is the Field Requirement parameter, which can also lead to errors if not set correctly.

For example, let’s assume that the returned XML is modified a little bit:
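Say the Name property came back like this (a hypothetical variation for illustration):

    <Property Name="Name" Type="Edm.String" Nullable="false" />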

Now you can see that there is also a Nullable attribute in the property node. That Nullable attribute translates to our Field Requirement parameter as follows:

  • Business Required
    • Nullable=”false”
  • Optional
    • Nullable=”true” OR no Nullable attribute at all

In the end, the primary field configuration should look like the one in the image below.

When you are done adding the fields that you want, like we did in the previous example, there is only one thing left to do and that is setting the External Name on the virtual entity Id field.

The ID field is defined in the XML line below.
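Again from the metadata snippet above (same caveat as before):

    <Property Name="ID" Type="Edm.Guid" Nullable="false" />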

The Property Name has the uppercase value ID, which must be typed into the External Name of the entity GUID field in the fields configuration; it should look like the image below.

Finally, your virtual entity is ready to use. You should just add all the fields you want to the form and views and you are ready to go.

After the fields are added to the view you can try to open the newly created entity from the ribbon to open the record list.

If everything is configured right you should see the list of records as shown below.

Tips & Tricks

The most common error that you can face when configuring virtual entities is a generic error dialog that doesn’t say much about the actual cause.

Lately, Microsoft added a Download Log File button to it, but in past versions it was not possible to download the log file for errors like this one.

If you are the unlucky one that doesn’t have a log button in the popup, there is a solution hidden in the plugin trace log.

The plugin trace log must be enabled if you want to get the error details. If you don’t have it enabled you can turn it on in Settings -> Administration -> System Settings.

Enabling is done by setting “Enable logging to plug-in trace log” to All or Exception Only.

After you activate the plug-in trace log you can reproduce the error and check out what is happening in Settings -> Plug-In Trace Log.

When you open an individual record you can find the details about the error in the Exception Details field.

Conclusion

I hope that I managed to help you create your first virtual entity with an OData v4 data source.

The series will continue in a few days with connecting virtual entities to Cosmos DB, which is quite a new feature, but not that complex to achieve.

Daily async job report with Flow

It’s kinda annoying when you don’t know that something went wrong with your background workflows. Last time I used a console application that was running on a dedicated server, but it kind of died with the end of the server subscription.

I thought let’s make the same thing, but this time using Flow.

The first thing we need to do is add a schedule component that will run our Flow every day. I think the best way to do it is to schedule it after midnight because all workflows that started yesterday will be finished by that time (I like to pick 2:00 AM as my schedule time on almost all flows that do such things).


The example schedule component is configured to run once a day at 2:00 AM and the most important part here is to set your timezone correctly, otherwise it will run in the UTC zone by default.

The next step is to add a few variables that will be used in several places in the flow. Those 2 variables are Organization URL and Organization Name. We can’t get the organization URL or organization name via the OOB Flow connectors, so we need to type them in manually.


Simply add 2 Initialize variable components to the flow and configure them as string variables as shown in the image above.

After we have those variables it’s time to get some data from Dynamics, so we add a Dynamics 365 – List records component. After the component is added we need to do some configuration, which is the main part here. The most important field is the Filter Query, which defines the dataset for our report.


The entity we need to select is the System Jobs (asyncoperation) entity. The next step is to define the filter query. Since we run the report every night for the day before, it’s crucial to filter the records by yesterday’s date. We can’t use the well-known Yesterday operator from Advanced Find, so we need to construct the query the other way around, or should I say the old way.

The Modified On date must be less than today and greater than or equal to yesterday. That can be done by using simple scripts that will be inserted into our query and are presented as purple FX rectangles in the query.

First, we need to write a script to get today’s date in yyyy-MM-dd format, which is done by typing the script into the functions textbox.

The script we need to use is pretty straightforward.
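It’s just a formatDateTime over utcNow():

    formatDateTime(utcNow(), 'yyyy-MM-dd')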

The second one is a bit more complex than the first one, but still pretty straightforward if you are familiar with the Flow functions. This one will get us yesterday’s date in the same format.
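The only addition is an addDays call with -1:

    formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd')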

With those 2 scripts we have all the dynamic stuff that is needed for the query filter of our report.

We need to filter the results by statuses. The values that we need to track are Waiting, Waiting For Resources and Failed.

The Failed status will always point us to a failed workflow, but Waiting and Waiting For Resources will not, so we need to add a few more filters.

The most important of those filters is on the Error Code column, which must contain data. A workflow can end up in the Waiting or Waiting For Resources status while it actually ended in a virtual failed state, which can only be spotted if we add this filter.
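Put together, the whole filter ends up roughly like this (31, 10 and 0 are the standard system job status code values for Failed, Waiting and Waiting For Resources, but double-check them on your instance; the two date parts are the scripts shown earlier):

    (statuscode eq 31 or ((statuscode eq 0 or statuscode eq 10) and errorcode ne null))
    and modifiedon lt @{formatDateTime(utcNow(), 'yyyy-MM-dd')}
    and modifiedon ge @{formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd')}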

Sorting can be set on the Modified On column to view the results in ascending order in the final report.

Finally, we came to the end of configuring the List records component and our dataset is ready to be used in the report.

When we have our dataset we can add a condition component to check if there is any data in the results of the previous component. I will send the mail only if there is at least one record retrieved.


Length of the dataset array can be determined by adding a simple script to the textbox.
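The script just takes the length of the value array returned by the List records action:

    length(body('List_records')?['value'])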


‘List_records’ is the default name for the first List records component that you add to your flow (it can be changed only after adding it and before configuring it).

If the value of the script above is greater than 0, we want to send an email with the information from the dataset, formatted as a table.

We can use the Create HTML table component for this.


The columns field value must be set to Custom so we can add our own values.

The first column shown is the Job Name column, which will be an HTML a tag link to the definition of the workflow, so we can click on the link and open the workflow designer page.

The Message column will be used for displaying a user-friendly message that describes the error in the workflow.

The Time column will show us the time when the workflow actually failed, and the value for this column must be converted to a more readable format since date time fields are returned as ISO formatted date time strings. This can be done by adding a simple script to the value field.
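Assuming we show the Modified On value of the system job, the conversion is just a formatDateTime over the current item:

    formatDateTime(item()?['modifiedon'], 'HH:mm')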

The last parameter is the date time format string, which can be changed as you like, but since I do a daily report it’s enough to show just the time part.

Finally, the last column of our table is a link to the actual workflow execution record where we can find all the useful information that can help us solve the problem.

After we have our table ready it’s time to send it via email. We will use the Mail – Send an email notification component since it does not require any mail configuration and it uses the OOB Flow mailing service as the email provider.


The subject can, of course, contain any kind of text, but I think the best approach is to put the name of the project and yesterday’s date there so you can easily track your reports. We used addDays again to get yesterday’s date.

If you just pass the output of Create HTML table into the Body of the email, it will not work as expected. All the a tags will be displayed as HTML code and not as the clickable links you expected. The solution is to use the replace function to replace all the &lt; and &gt; entities with actual < and > signs via an expression.
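The expression just wraps the table output in two nested replace calls (assuming the action kept its default name Create_HTML_table):

    replace(replace(body('Create_HTML_table'), '&lt;', '<'), '&gt;', '>')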

When you set this expression your report is finally ready to be executed.

The final result is shown in the picture below (of course the email component has some additional CSS for table styling, so ignore that part for now) for a report that came in a few days ago for one of my D365 instances.

Here is the image of the whole flow if someone wondered what it looks like as a bigger picture.

I hope that you will find this Flow useful in your daily routine and that it saves you some time when you need to investigate issues on your D365 instances. The great thing about Flow is that you can export this Flow, import it into any online instance and, with a small amount of manual work, make it run for that new instance.

Web API v8.2 – Bad request when setting lookup

I’m working on lots of v8.2 instances and this was one of the most annoying issues I have experienced.

Problem

You want to create or update a custom entity (it’s not a problem on OOB ones) with a value in one of the custom lookup fields. CRM REST Builder will always be my number one tool when it comes to creating Web API requests for Dynamics, but if you try it on v8.2 for the problem above it will fail.

Here is the simple jQuery code that will update one lookup on the custom entity.
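A minimal sketch of that request, assuming an entity set name of ic_consultings and a placeholder contact GUID (both are assumptions; the important part is the @odata.bind property on the second row of the script):

    var entity = {};
    entity["ic_contact@odata.bind"] = "/contacts(00000000-0000-0000-0000-000000000001)"; // logical name of the lookup

    $.ajax({
        type: "POST",
        contentType: "application/json; charset=utf-8",
        datatype: "json",
        url: Xrm.Page.context.getClientUrl() + "/api/data/v8.2/ic_consultings",
        data: JSON.stringify(entity),
        beforeSend: function (xhr) {
            xhr.setRequestHeader("OData-MaxVersion", "4.0");
            xhr.setRequestHeader("OData-Version", "4.0");
            xhr.setRequestHeader("Accept", "application/json");
        },
        success: function (data, textStatus, xhr) {
            console.log("Record created");
        },
        error: function (xhr, textStatus, errorThrown) {
            console.log(xhr.responseText);
        }
    });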

There is an “ic_consulting” entity that contains one lookup to contact with the logical name “ic_contact”. If you try this code on v9.0 or any other version that supports the Web API it will work flawlessly, but not on v8.2.

When you execute that code you will get a 400 Bad Request error message that says:

An undeclared property ‘ic_contact’ which only has property annotations in the payload but no property value was found in the payload. In OData, only declared navigation properties and declared named streams can be represented as properties without values.

The error message will not point us to the problem because it says that there is no property with the name “ic_contact”, which obviously exists since we created it.

Solution

The problem lies in the second row of the code. In v8.2 only, we need to change the property name to match the schema name instead of the logical name for lookup fields.

The schema name can be found in the fields list under the Schema Name column.

After we find the schema name for our field it’s time to replace the logical name with the newly found schema name.

The working code for this example would be:
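The same sketch as above, with the only change being the schema name in the @odata.bind property:

    var entity = {};
    entity["ic_Contact@odata.bind"] = "/contacts(00000000-0000-0000-0000-000000000001)"; // schema name instead of logical name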

The only change here was the capital letter C: the logical name “ic_contact” is changed to “ic_Contact”. In your case it can be an even bigger change, but usually it’s only a capital letter.

After we run this code everything works as expected and we get response code 204 No Content, which is sent when the record is created successfully in Dynamics.

Conclusion

You must be aware that this issue is ONLY present when you use the v8.2 Web API in Dynamics. In most cases the schema name will just be the logical name with the first letter after the prefix capitalized, so you don’t need to look up the schema name every time.

Change webresource height dynamically

Challenge

The feature that I really miss in the form designer is the ability to set more properties on web resources. One thing that gave me headaches is the web resource height. It’s just too small, or it can’t be made big enough to fit all the needs.

I’ve been using a small JS snippet to overcome such issues so I can show the web resource the way I want.

All the snippet does is take the height of the HTML body tag and set the height of the web resource container to that value.
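A minimal sketch of the snippet (unsupported, as discussed below, and assuming it runs inside the HTML web resource itself):

    function adjustHeight() {
        // take the rendered height of the HTML body...
        var height = document.body.scrollHeight;

        // ...and push it onto the IFRAME element that hosts this web resource on the form
        if (window.frameElement) {
            window.frameElement.style.height = height + "px";
        }
    }

    // run it on load and call it again whenever the content height changes
    window.addEventListener("load", adjustHeight);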

There is always a catch with unsupported changes like this that you make to the forms. The problem is that if you don’t set the Row Layout section in the properties the right way, the JS snippet above will not work the way we want. Everything that you need to set is shown in the image below and it’s located in the Row Layout section.

The Number of Rows setting is not that important for large web resources, because you would need more than 40 rows in most cases, so you can just set it to 1 here.

“Automatically expand to use available space” is where the magic happens. You need to set this for web resources that will change the height of the body on form events. For example, if your HTML is not always the same height (e.g. a grid that displays a few datasets that are not the same size) you need to check the checkbox, because if you don’t, the height will not change dynamically every time the HTML height changes. The height will be calculated on the initial load and the container will get a fixed padding that will not let you manipulate the height just by setting the height value.

Checking the checkbox will solve all those problems for you. Once it’s set you will be able to change the height any time you want and the web resource will be displayed as expected.

Of course, there is a form limitation that allows only one checked checkbox per form, so you need to design your forms in a way that you don’t need multiple, or just put the web resources at the bottom of the page so the whitespace added below the web resource container doesn’t matter that much.

Final result

I wanted to show you the result on a small demo web resource that displays a dataset based on the date selected in a dropdown menu. Every dataset results in a different HTML height and I call the JS function after every change of the date value in the dropdown.

Situation #1

Situation #2

As you can see there is the same amount of whitespace between the Orders section and the Consulting notes section, because the height is calculated dynamically after every change of the date dropdown field.

Conclusion

It’s obvious that this is not a supported way of doing it and it can break with a future Dynamics 365 update, but for now it’s my way of doing it until Microsoft implements a supported way to achieve things like this.

 

Strange whitespace on form

Problem

Today I faced a strange issue that was driving me crazy. I noticed that some entities on an Online instance (v8.2.2) have strange whitespace on the right side of the screen. I couldn’t even find the DOM element that was sitting there in this whitespace. A quick JS fix could be applied to set the width of the left element (the actual form) to 100% on form load and everything would work just fine, but that’s not how I wanted to solve this issue. The issue is shown in the image below.


I tried to create a new form, change entity properties, etc., but nothing happened. I even created a new entity and it had the same problem. Every single newly created entity had this issue.

Solution

The problem can be resolved only by publishing the main form with some configuration changes in the Display tab.

Fields that can be changed are:

  • Show navigation items
  • Show image in the form
  • Max Width

When you change any of those fields and publish your form, the issue will not be there anymore.

When I dug deeper, I exported the solution with the affected entity and compared customizations.xml before and after the fix, and there were some differences in it.

Before the fix, there was just the blank form tag defined in XML:

After the fix, there were some additional parameters defined:

But there is also a navigation node added: