Hello everyone,
Today I’ll talk about a useful Apache Airflow feature: its REST API.
I frequently have customers asking about Apache Airflow’s integration with their own applications. “How can I execute a job from my application?” or “How can I get my job status into my dashboard?” are among the questions I receive most often.
I’ll use the following question from a customer to show this great feature in Apache Airflow:
“I would like to call one specific job orchestrated in my Apache Airflow environment from my application. Is it possible?”
Quick answer: “Yes, all you need to do is trigger the Airflow DAG through the REST API.”
Details:
The simplest way to show how to achieve this is to use curl to call my Apache Airflow environment. I have one DAG that runs a bash operator (a minimal sketch of that DAG follows the curl call below). Quick example:
curl -X POST \
  http://localhost:8080/api/experimental/dags/my_bash_oeprator/dag_runs \
  -H 'Cache-Control: no-cache' \
  -H 'Content-Type: application/json' \
  -d '{"conf":"{\"key\":\"value\"}"}'
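For reference, here is a minimal, illustrative sketch of what the DAG on the Airflow side could look like. The original DAG code isn’t shown in this post, so treat this as an assumption: the dag_id matches the one used in the curl calls, and the echo command is just a placeholder for whatever the real bash operator runs.

# Minimal sketch of a DAG with a bash operator (Airflow 1.10.x style,
# the series that ships the experimental REST API used here).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "airflow",
    "start_date": datetime(2020, 4, 1),
}

with DAG(
    dag_id="my_bash_oeprator",          # same id used in the curl calls
    default_args=default_args,
    schedule_interval=None,             # no schedule: runs only when triggered externally
) as dag:
    BashOperator(
        task_id="run_echo",
        # dag_run.conf exposes the JSON passed in the REST call's "conf" field
        bash_command='echo "conf received: {{ dag_run.conf }}"',
    )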
The curl call above returns the execution date of the new DAG run; you can use this value to check the run’s status, like this:
curl -X GET http://localhost:8080/api/experimental/dags/my_bash_oeprator/dag_runs/2020-04-05T00:26:35
{"state":"running"}
This call can also return other states, such as {"state":"failed"} or {"state":"success"}.
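If you would rather do this from application code than from curl, the same two calls can be made with Python’s requests library. The sketch below is only an illustration under the same assumptions as the curl examples: a local Airflow webserver at localhost:8080, the same DAG id, and an execution date that you take from the trigger response (the exact response fields vary by Airflow version) or from your own records.

# Sketch: trigger a DAG run and poll its state via the experimental REST API.
import time

import requests

AIRFLOW_URL = "http://localhost:8080/api/experimental"
DAG_ID = "my_bash_oeprator"

# Trigger a new DAG run, passing the same conf payload as the curl example.
resp = requests.post(
    f"{AIRFLOW_URL}/dags/{DAG_ID}/dag_runs",
    json={"conf": '{"key":"value"}'},
    headers={"Cache-Control": "no-cache"},
)
resp.raise_for_status()
print(resp.json())  # the response identifies the new run (fields vary by version)

# Execution date of the run to check; assumed known from the trigger response.
execution_date = "2020-04-05T00:26:35"

# Poll the run state until it is no longer "running".
while True:
    state = requests.get(
        f"{AIRFLOW_URL}/dags/{DAG_ID}/dag_runs/{execution_date}"
    ).json()["state"]
    print(f"current state: {state}")
    if state != "running":
        break
    time.sleep(10)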
I hope you enjoy it!