
How to monitor your pipelines using Azure Data Factory Analytics from Microsoft Azure Cloud

Sogeti Labs
December 02, 2020

Don’t be caught off guard! Never lose focus and check the details

One of the most important aspects of designing a good ELT solution is being able to monitor its performance. It is of little use to have the pipelines running if you cannot tell whether they are running correctly or whether, on the contrary, they are causing failures. That is why tools such as Azure Data Factory Analytics are increasingly necessary.

Azure Data Factory Analytics is a preview service in the Microsoft cloud that integrates information from Azure Data Factory through Log Analytics. It gives us a simple overview of the state of our ADF instance and lets us drill down into the details of what happened in order to diagnose issues. As a prerequisite, note that a Log Analytics workspace is required.

Today we will describe the complete deployment process for this solution. Let's proceed.

The first step is to create a Log Analytics workspace. To do this, we register a Log Analytics (OMS) service in our resource group.

Then we complete the fields of the form, indicating the name of our workspace and its location.
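If you prefer to script this step instead of using the portal, here is a minimal sketch with the Python azure-mgmt-loganalytics SDK. The subscription, resource group, workspace name and region are placeholders, and the call assumes a recent version of the SDK.

```python
# Minimal sketch: create a Log Analytics workspace with the Python management SDK.
# Subscription, resource group, workspace name and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

subscription_id = "<subscription-id>"
client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

# Long-running operation: returns once the workspace is provisioned.
workspace = client.workspaces.begin_create_or_update(
    resource_group_name="rg-adf-monitoring",   # placeholder resource group
    workspace_name="law-adf-monitoring",       # placeholder workspace name
    parameters={
        "location": "westeurope",              # placeholder region
        "sku": {"name": "PerGB2018"},          # pay-as-you-go pricing tier
    },
).result()

print(workspace.id)  # the ARM resource ID, useful when linking Data Factory later
```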

Perfect, we have completed the first step of our journey. Now we must jump to our Azure Data Factory instance to link it with the newly created Log Analytics workspace.

This is done by clicking on the Diagnostic settings option in the menu on the left.

Once inside, we select all the logs and metrics that we want to monitor, choose the destination(s), and finally indicate the Log Analytics workspace. In our case we select the first three logs and the full set of metrics; as for the destination, we use our newly created workspace.
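The same linkage can also be scripted. The sketch below uses the Python azure-mgmt-monitor SDK to create a diagnostic setting that sends the three pipeline-run log categories and all metrics to the workspace; the resource IDs and the setting name are placeholders.

```python
# Minimal sketch: send ADF logs and metrics to the workspace via a diagnostic setting.
# Resource IDs are placeholders; the log categories are the standard ADF categories.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource,
    LogSettings,
    MetricSettings,
)

subscription_id = "<subscription-id>"
monitor = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# Placeholder ARM resource IDs for the Data Factory and the Log Analytics workspace.
adf_id = (
    "/subscriptions/<subscription-id>/resourceGroups/rg-adf-monitoring"
    "/providers/Microsoft.DataFactory/factories/<factory-name>"
)
workspace_id = (
    "/subscriptions/<subscription-id>/resourceGroups/rg-adf-monitoring"
    "/providers/Microsoft.OperationalInsights/workspaces/law-adf-monitoring"
)

monitor.diagnostic_settings.create_or_update(
    resource_uri=adf_id,
    name="adf-to-log-analytics",
    parameters=DiagnosticSettingsResource(
        workspace_id=workspace_id,
        logs=[
            LogSettings(category="PipelineRuns", enabled=True),
            LogSettings(category="TriggerRuns", enabled=True),
            LogSettings(category="ActivityRuns", enabled=True),
        ],
        metrics=[MetricSettings(category="AllMetrics", enabled=True)],
    ),
)
```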

We check that it is correctly configured and go to the next point.

To recap, we now have a Log Analytics workspace linked to our Azure Data Factory instance. What is left? Deploying the Azure Data Factory Analytics solution. To do this, within the resource group, click on Add and indicate the name of the new service. The result should look something like this.

When clicking on Create, the screen below will open, where we must select the Log Analytics workspace that we want. In our case it corresponds to the one created previously.
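For automated deployments, the solution itself can also be created programmatically. The sketch below uses the generic resource API from azure-mgmt-resource; the solution name pattern, plan product and API version follow the usual convention for OMS gallery solutions, but treat them as assumptions and verify them against your own subscription.

```python
# Minimal sketch: deploy the "AzureDataFactoryAnalytics" solution on top of the workspace
# using the generic resource API. The solution name, plan product and API version follow
# the usual OMS gallery convention and should be treated as assumptions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import GenericResource, Plan

subscription_id = "<subscription-id>"
resource_group = "rg-adf-monitoring"            # placeholder resource group
workspace_name = "law-adf-monitoring"           # placeholder workspace name
workspace_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
    f"/providers/Microsoft.OperationalInsights/workspaces/{workspace_name}"
)
solution_name = f"AzureDataFactoryAnalytics({workspace_name})"

resources = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

resources.resources.begin_create_or_update(
    resource_group_name=resource_group,
    resource_provider_namespace="Microsoft.OperationsManagement",
    parent_resource_path="",
    resource_type="solutions",
    resource_name=solution_name,
    api_version="2015-11-01-preview",           # assumed API version for solutions
    parameters=GenericResource(
        location="westeurope",                  # placeholder region, same as the workspace
        plan=Plan(
            name=solution_name,
            publisher="Microsoft",
            product="OMSGallery/AzureDataFactoryAnalytics",
            promotion_code="",
        ),
        properties={"workspaceResourceId": workspace_id},
    ),
).result()
```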

Once completed, we will be able to see the set of KPIs from our Azure Data Factory instance. This allows us to visualize the main control metrics from a single place. In addition, we can obtain a more detailed analysis simply by clicking on the elements.

General Screen

If we click on the summary graph, we will access a higher level of detail.

And finally, if we click on any of the details presented, we will be able to access the query area. Let's look at an example.
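Beyond the portal, the same logs can be queried from code. The sketch below uses the azure-monitor-query package to list failed pipeline runs from the last day. It assumes the resource-specific ADFPipelineRun table; if your diagnostic setting writes to the default AzureDiagnostics table, the query needs to target that table and its suffixed column names instead.

```python
# Minimal sketch: query failed pipeline runs from the last 24 hours with azure-monitor-query.
# Assumes the resource-specific ADFPipelineRun table; the workspace ID here is the
# workspace's GUID (customer ID), not its ARM resource ID.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

response = client.query_workspace(
    workspace_id="<workspace-guid>",   # placeholder workspace (customer) ID
    query=query,
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```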

I hope you liked this short example of how to enable Azure Data Factory Analytics in your Azure account. Although it is a preview service, it will most likely end up being permanently included in Microsoft’s portfolio.

About the author

SogetiLabs gathers distinguished technology leaders from around the Sogeti world. It is an initiative explaining not how IT works, but what IT means for business.
