Introduction
This blog post demonstrates how you can configure an alert system for your ADF pipelines without relying on the built-in alert rule system provided by Azure Data Factory.
But you may wonder: why do I need to do that when Azure Monitor already provides a built-in alert rule system for us?
The answer is simple. A manually created alert system within the pipeline can help in the following situations:
1. Let's say you are part of a distribution email group that receives an alert when a pipeline fails. If, due to unusual circumstances, a pipeline failure does not trigger the alert email, it becomes a critical issue: some pipelines belong to the production environment, and a missed alert could certainly impact the business. By creating custom alert emails via a Logic App from the ADF pipeline, you can isolate the problem, i.e. determine whether the distribution email group is unresponsive on the SMTP server or whether the ADF alert rule itself is at fault.
2. To send an alert to the stakeholders immediately after an initial activity fails, rather than waiting for the whole pipeline to fail.
3. For some users, custom alert creation is more reliable and better suits the business logic on which the ETL pipelines are built.
Fig 1: Failure Alert Mechanism via Logic App Integration
Custom Alert System Creation Via Logic Apps
Follow the steps below to create your own custom alert system:
Step-1: Create pipeline in Azure Data Factory
Let's start by creating a new ADF pipeline and adding Copy and Web activities to the canvas. Connect the Copy activity's failure output to the Web activity, so that a failure of the pipeline triggers the Web activity.
Fig 2: Pipeline Overview For Alert Rule Creation
Once the activities are on the canvas, we have to configure the source and destination for the Copy activity. We need to ensure that the Copy activity fails.
How can we intentionally fail the ADF pipeline?
Step-2: Configure Linked Service & Datasets For Copy Activity
First, create a linked service and configure the path of the blob storage where the actual file is present. In our case, we have a storage account named "testlogicapp" and a container named "test". Inside the container there is one file named "source_table.csv".

Fig 3: Source Dataset View in Storage Account
Now let's jump to the ADF pipeline and create the source and destination datasets via a linked service that points to the storage account.
The source dataset can be configured as:
Fig 4: Source Dataset creation in ADF
Since the above dataset is correctly configured and we need the pipeline to fail, we shall tweak the source dataset and provide an incorrect value in the "directory" path, as displayed in the image below:
Fig 5: Wrong File Path For Source Dataset
We deliberately give a wrong file path to the source dataset so that when the Copy activity tries to fetch data from the path provided, it fails.
Similarly, we can configure the sink dataset as shown below:
Fig 6: File Path For Sink Dataset
In the above image, we set the output file path where we want the data to be stored while creating the sink dataset for the Copy activity.
Now, since we have intentionally configured a wrong path for the source dataset, the Copy activity is bound to fail. Hence we can connect the failure output of the Copy activity to the Web activity, as shown in Fig 2.
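Incidentally, once this failure wiring exists, the alert body does not have to hardcode the failure details: the Web activity can pull the real error text from the failed activity with an ADF expression. As a sketch, assuming the Copy activity is named "Copy data1" (adjust to your activity's actual name), the relevant fragment of the Web activity body would look like:

```
"errorMessage": "@{activity('Copy data1').error.message}"
```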
Step-3: Configure Logic App In ADF Pipeline
3.1 Let's create the Logic App. On the Azure portal, search for "Logic App" and create one, choosing the "Multi-tenant" pricing model.
3.2 Go to the service and open "Logic App Designer" under the "Development Tools" pane. In the designer you will see a canvas on which you can design your workflow. Click on "Add a trigger", which will show you the multiple trigger options provided by Logic Apps.
In the search box, type "When a HTTP request is received", and you will get the following configuration window:
Fig 6: Configuration For Logic App
3.3 Just click the "Save" option at the top of the Logic App Designer, and you will see the URL generated for this HTTP request trigger. Refer to the image below.
Fig 7: URL After Saving Logic App
3.4 Now copy this URL and paste it into the "URL" setting of the Web activity in the ADF pipeline.
Fig 8: Web Activity Configuration In ADF
3.5 Select the "POST" method and paste the following JSON into the body section. Note that the @{trigger().Name} and @{trigger().scheduledTime} expressions resolve only when the pipeline is started by a trigger (they will fail in a Debug run), and scheduledTime is populated only for schedule triggers:
{
    "pipelineName": "@{pipeline().Pipeline}",
    "pipelineRunId": "@{pipeline().RunId}",
    "triggerName": "@{trigger().Name}",
    "triggerTime": "@{trigger().scheduledTime}",
    "failureActivity": "CopyFromBlobToSQL",
    "errorMessage": "Copy activity failed while transferring data from Blob to SQL",
    "environment": "Production",
    "project": "CustomerETL",
    "timeStamp": "@{utcNow()}"
}
3.6 Copy the above JSON and reopen the HTTP trigger component. As shown in Fig 6, the "Use sample payload to generate schema" option is visible. Click on it, and a schema-builder page will open. Paste the JSON there, and the result will look like the image below:
Fig 9: Sample JSON Code In HTTP Request Trigger
3.7 After clicking "Done", the "Request Body JSON Schema" field will contain a JSON schema ready to be used by the subsequent Logic App components.
Fig 10: Final HTTP Request Trigger Configuration
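For reference, since every field in the sample payload above is a string, the generated schema should look roughly like the following sketch (the generator's exact output may differ slightly):

```
{
    "type": "object",
    "properties": {
        "pipelineName": { "type": "string" },
        "pipelineRunId": { "type": "string" },
        "triggerName": { "type": "string" },
        "triggerTime": { "type": "string" },
        "failureActivity": { "type": "string" },
        "errorMessage": { "type": "string" },
        "environment": { "type": "string" },
        "project": { "type": "string" },
        "timeStamp": { "type": "string" }
    }
}
```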
3.8 Let's go back to the Logic App Designer in the Logic App we created, and click the add (+) button to include one more component, which will send the alert emails to the stakeholders.
Click "Add an action", search for "send email" in the search box, and select "Send Email (V2)" under the Email section.
Fig 11: Adding Another Action To Logic App
3.9 If you are using this action for the first time, it will ask for the email address from which you want to send the alerts, and then redirect you to the following configuration pane, where you can configure the email address to which the alert should be delivered.
Fig 12: Configuration Setting For "Send Email" Action
Moreover, you can choose the importance level and the subject from the same pane.
The body of the email is where we use the JSON parameters that were passed in the request body by the ADF Web activity. Refer to Fig 8.
Please follow the images below to understand the flow and configure the body section of the "Send Email (V2)" action:
Click on the lightning icon to open the dynamic content picker.
Similarly, you can configure the other parameters. On completion, the complete body will look like the image below.
Fig 13: Parametrized Settings For Body In Send Email(v2)
3.10 On clicking "Save" in the Logic App Designer, you may see the following error on the "Send Email" action:

Fig 14: Error Message Pop-up In Send Email(v2)
Don't worry; it doesn't mean you have missed any configuration. Just reopen the Logic App and the issue will disappear. A word of warning, though: save it before closing the Logic App.
Step-4: Run the ADF Pipeline
Our final step is to run the pipeline. Once it fails, check your inbox for the alert message.
Fig 15: Logic App Trigger Via Web Activity
After the Copy Data activity fails, the Web activity sends an HTTP POST request to the Logic App, which in turn sends email alerts to the address configured in the "Send Email (V2)" action.
The alert email arriving in the inbox will look like the following image:
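If you want to verify the Logic App independently of ADF, you can post the same kind of payload to the trigger URL yourself. Below is a minimal Python sketch using only the standard library; the URL is a placeholder (substitute the one copied in step 3.4), and the function and field values are illustrative, mirroring the Web activity body with its expressions already resolved:

```python
import json
import urllib.request

# Placeholder: paste the HTTP trigger URL copied from the Logic App (step 3.4).
LOGIC_APP_URL = "https://<region>.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke"

def build_alert_payload(pipeline_name, run_id, failed_activity, error_message):
    """Build the same JSON body the ADF Web activity sends, with expressions resolved."""
    return {
        "pipelineName": pipeline_name,
        "pipelineRunId": run_id,
        "failureActivity": failed_activity,
        "errorMessage": error_message,
        "environment": "Production",
        "project": "CustomerETL",
    }

def send_alert(url, payload):
    """POST the payload as JSON to the Logic App's HTTP trigger, like the Web activity does."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Calling send_alert(LOGIC_APP_URL, build_alert_payload(...)) against a correctly wired Logic App should land an alert email in the configured inbox, which makes it easy to separate Logic App issues from ADF issues.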
Fig 16: Alert Email Regarding Pipeline Failure
Conclusion
This blog provided a step-by-step process for creating a custom alert system that is triggered via a Logic App from within an ADF pipeline. I hope you had a great time!
Thanks For Reading!
- MANAN CHOUDHARY