Daily async job report with Flow

It’s kinda annoying when you don’t know that something went wrong with background workflows. Last time I used a console application running on a dedicated server, but it kind of died with the end of the server subscription.

I thought, let’s make the same thing, but this time using Flow.

The first thing we need to do is add a schedule component that will run our Flow every day. I think the best way to do it is to schedule it just after midnight, because all workflows that started yesterday will be finished by that time (I like to pick 2:00 AM as my schedule time on almost all flows that do such things).

Image 310.png

The example schedule component is configured to run once a day at 2:00 AM. The most important part here is to set your time zone correctly; otherwise, it will run in the UTC zone by default.

The next step is to add a few variables that will be used in several places in the flow. Those 2 variables are Organization URL and Organization name. We can’t get the organization URL or organization name via OOB Flow connectors, so we need to type them in manually.

Image 311.png

Simply add 2 Initialize variable components to the flow and configure them as string variables, as shown in the image above.
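
For example (hypothetical values, substitute your own instance details):

Organization URL: https://yourorg.crm4.dynamics.com
Organization name: yourorg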

After we have those variables, it’s time to get some data from Dynamics by adding the Dynamics 365 – List records component. Once the component is added, we need to do some configuration, which is the main part here. The most important field is the Filter Query one, which defines the dataset for our report.

Image 312.png

The entity we need to select is the System Jobs (asyncoperation) entity. The next step is to define the filter query. Since we will run the report every night for the day before, it’s crucial to filter the records by yesterday’s date. We can’t use the well-known Yesterday operator from Advanced Find, so we need to construct the query the other way around, or should I say the old way.

The Modified On date must be less than today and greater than or equal to yesterday. That can be done by using simple scripts that are inserted in our query and presented as purple FX rectangles in it.

First, we need to write a script to get today’s date in yyyy-MM-dd format, which is done by typing the script in the functions textbox.

The script we need to use is pretty straightforward.
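
Something like the following should do the trick (formatDateTime and utcNow are standard Flow functions):

formatDateTime(utcNow(), 'yyyy-MM-dd')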

The second one is a bit more complex than the first one, but still pretty straightforward if you are familiar with Flow functions. This one will get us yesterday’s date in the same format.
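
A sketch using addDays, which accepts an optional format string as its third parameter:

addDays(utcNow(), -1, 'yyyy-MM-dd')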

With those 2 scripts, we have all the dynamic parts needed for the query filter of our report.

We also need to filter the results by status. The values we need to track are Waiting, Waiting For Resources and Failed.

The Failed status will always point us to a failed workflow, but Waiting and Waiting For Resources will not, so we need to add a few more filters.

The most important of those filters is on the Error Code column, which must contain data. A workflow can end up in the Waiting or Waiting For Resources status while actually being in a virtual failed state, which can only be detected if we add this filter.

Sorting can be set on the Modified On column to view the results in ascending order in the final report.
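
Putting it all together, the Filter Query should look roughly like this (a sketch, assuming the standard asyncoperation status reason values: 0 = Waiting For Resources, 10 = Waiting, 31 = Failed; the purple FX rectangles are the two date expressions from above):

(statuscode eq 0 or statuscode eq 10 or statuscode eq 31) and errorcode ne null and modifiedon ge @{addDays(utcNow(), -1, 'yyyy-MM-dd')} and modifiedon lt @{formatDateTime(utcNow(), 'yyyy-MM-dd')}

The Order By field can then simply be set to modifiedon asc.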

With that, we are done configuring the List records component, and our dataset is ready to be used in the report.

Once we have our dataset, we can add a Condition component to check if there is any data in the results of the previous component. I will send the mail only if there is at least one record retrieved.

Image 313.png

The length of the dataset array can be determined by adding a simple script to the textbox.
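
The script is a simple length call over the value array returned by the List records step:

length(body('List_records')?['value'])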


‘List_records’ is the default name of the first List records component that you add to your flow (it can be changed only right after adding it and before configuring it).

If the value of the above script is greater than 0, then we want to send an email with the information from the dataset, formatted as a table.

We can use the Create HTML table component for this.

Image 314.png

The Columns field value must be set to Custom so we can add our own values.

The first column shown is the Job Name column, which will be an HTML a tag linking to the definition of the workflow, so we can click the link and open the workflow designer page.
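
The value for this column can be built with a concat expression. A minimal sketch, assuming the classic main.aspx record URL format and that the workflow definition is reachable through the _workflowactivationid_value lookup (check the attribute name in your own instance):

concat('<a href="', variables('Organization URL'), '/main.aspx?etn=workflow&pagetype=entityrecord&id=', item()?['_workflowactivationid_value'], '">', item()?['name'], '</a>')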

The Message column will be used to display a user-friendly message describing the error in the workflow.
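
This presumably just maps the Friendly message field from the record:

item()?['friendlymessage']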

The Time column will show us when the workflow actually failed. The value for this column must be converted to a more readable format, since date/time fields are returned as ISO-formatted strings. This can be done by adding a simple script to the value field.
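
A sketch of that script, assuming the Modified On field is the one being displayed (the screenshot may use a different date field):

formatDateTime(item()?['modifiedon'], 'HH:mm:ss')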

The last parameter is the date/time format string, which can be changed as you like, but since I do a daily report, it’s enough to show just the time part.

Finally, the last column of our table is a link to the actual workflow execution record, where we can find all the useful information that can help us solve the problem.
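
This link can be built the same way as the Job Name one, this time pointing at the System Job record itself (again a sketch using the classic record URL format):

concat('<a href="', variables('Organization URL'), '/main.aspx?etn=asyncoperation&pagetype=entityrecord&id=', item()?['asyncoperationid'], '">Open</a>')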

After our table is ready, it’s time to send it via email. We will use the Mail – Send an email notification component, since it does not require any mail configuration and uses the OOB Flow mailing service as the email provider.

Image 316.png

The subject can, of course, contain any kind of text, but I think the best approach is to put the name of the project and yesterday’s date there, so you can easily track your reports. We used addDays again to get yesterday’s date.
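
A hypothetical subject expression along those lines:

concat(variables('Organization name'), ' - async job report - ', addDays(utcNow(), -1, 'yyyy-MM-dd'))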

If you just pass the output of Create HTML table into the Body of the email, it will not work as expected: all the a tags will be displayed as HTML code rather than as clickable links. The solution is to use the replace function to replace all the &lt; and &gt; entities with actual < and > signs via an expression.
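
The expression, assuming the default ‘Create_HTML_table’ action name:

replace(replace(body('Create_HTML_table'), '&lt;', '<'), '&gt;', '>')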

When you set this expression, your report is finally ready to be executed.

The final result is shown in the picture below: a report that arrived a few days ago for one of my D365 instances (the email component has some additional CSS for table styling, so ignore that part for now).

Image 317.png

Here is an image of the whole flow, in case someone wondered what it looks like in the bigger picture.

Image 318.png

I hope you will find this Flow useful in your daily routine and that it saves you some time when you need to investigate issues on your D365 instances. The great thing about Flow is that you can export this Flow, import it into any online instance, and with a little manual work make it run against that new instance.