Hitting the Airflow API through Astronomer


#1

What’s the best way to hit the Airflow API through Astronomer?

I’ve tried using a request like this:

curl -v -X POST https://AIRFLOW_DOMAIN/api/experimental/dags/airflow_test_basics_hello_world/dag_runs -H -H 'Authorization: <token>' -H 'Cache-Control: no-cache' -H 'content-type: application/json' -d '{}'

where the token is a service account token.

However, it seems to redirect me back to the login page.

Trying <IP>...
* TCP_NODELAY set
* Connected to empty-twinkling-7054-airflow.astro.miamed.de (<IP>) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/cert.pem
  CApath: none
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Client hello (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS change cipher, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
* ALPN, server accepted to use h2
* Server certificate:
*  subject: CN=*.astro.miamed.de
*  start date: Oct 10 21:41:53 2018 GMT
*  expire date: Jan  8 21:41:53 2019 GMT
*  subjectAltName: host "DOMAIN" matched cert's "*.astro.miamed.de"
*  issuer: C=US; O=Let's Encrypt; CN=Let's Encrypt Authority X3
*  SSL certificate verify ok.
* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
* Using Stream ID: 1 (easy handle 0x7ff662805400)
> POST /api/experimental/dags/airflow_test_basics_hello_world/dag_runs HTTP/2
> Host: empty-twinkling-7054-airflow.astro.miamed.de
> User-Agent: curl/7.54.0
> Accept: */*
> Cache-Control: no-cache
> content-type: application/json
> Content-Length: 2
>
* Connection state changed (MAX_CONCURRENT_STREAMS updated)!
* We are completely uploaded and fine
< HTTP/2 302
< server: nginx/1.15.2
< date: Mon, 26 Nov 2018 21:07:09 GMT
< content-type: text/html
< content-length: 161
< location: https://DOMAIN/login?rd=https://empty-twinkling-7054-airflow.astro.miamed.de%2Fapi%2Fexperimental%2Fdags%2Fairflow_test_basics_hello_world%2Fdag_runs
< strict-transport-security: max-age=15724800
<
<html>
<head><title>302 Found</title></head>
<body bgcolor="white">
<center><h1>302 Found</h1></center>
<hr><center>nginx/1.15.2</center>
</body>
</html>
* Connection #0 to host empty-twinkling-7054-airflow.astro.miamed.de left intact
Note: Unnecessary use of -X or --request, POST is already inferred.
* Rebuilt URL to: Authorization: <token>/
* Port number ended with ' '
* Closing connection -1
curl: (3) Port number ended with ' '

#2

I got this to work - there was an extra -H in the request above, so curl treated the Authorization header value as a second URL (you can see it in the trace: "Rebuilt URL to: Authorization: <token>/") and the actual request went out without the token, which is why it redirected to the login page.

  1. Generate a service account token for the Airflow instance
  2. curl -X POST https://true-aphelion-2649-airflow.astronomer.cloud/api/experimental/dags/example_dag/dag_runs -H 'Authorization: TOKEN' -H 'Cache-Control: no-cache' -H 'content-type: application/json' -d '{}'

You can read about the endpoints available here:

Note: the Airflow API is considered experimental, so use it at your own risk.
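If you are triggering DAG runs from code rather than curl, the working request above can be sketched in Python with only the standard library. The deployment URL, DAG id, and token below are placeholders - substitute your own. The key point from this thread is that there must be exactly one Authorization header:

```python
import json
import urllib.request

# Hypothetical values -- replace with your own deployment URL, DAG id,
# and service account token.
BASE_URL = "https://your-deployment-airflow.astronomer.cloud"
DAG_ID = "example_dag"
TOKEN = "<service-account-token>"

def build_trigger_request(base_url, dag_id, token, conf=None):
    """Build a POST request for the experimental dag_runs endpoint.

    Note the single Authorization header -- the stray extra -H in the
    original curl command is what caused the redirect to /login.
    """
    url = f"{base_url}/api/experimental/dags/{dag_id}/dag_runs"
    body = json.dumps(conf or {}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": token,
            "Cache-Control": "no-cache",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_trigger_request(BASE_URL, DAG_ID, TOKEN)
# Sending is omitted here; to actually trigger the DAG you would call
# urllib.request.urlopen(req) and inspect the JSON response.
```

This only constructs the request; a real call needs network access to your deployment and a valid token.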