What's the best way to get connections into astronomer cloud?

I am accustomed to loading connections with env var script or CLI.

How do you recommend we do this with astronomer cloud? Or enterprise?

One complication with using env variables is that a running instance won't pick them up until it restarts. Does Astronomer restart the cluster when you save an env var? Also, env variables are pretty front and center in the Astro web UI, which is maybe a little uncomfortable for credentials…

We can load connections in the Airflow UI, but that is very manual and error-prone, and it would be a pain if you are spinning up new clusters with any regularity.

Perhaps CI / CD tools would help with this?

Perhaps they should be baked into the image?

What have other customers done and what do you recommend?

Right now, .env files and airflow_settings.yaml only apply to local dev. They do not apply to projects pushed out to clusters (Cloud or Enterprise). You wouldn't want those files pushed out to a git repo, so using them that way may be hard to keep in sync across teams.

One option is to build a DAG that syncs connections from a centralized repository like Vault or AWS Secrets Manager. You would need to manually create one Airflow connection to talk to the central repo, but after that, you could have a task that loops through and creates a connection for each secret in your repo. This DAG could be run manually or on a schedule to sync every hour or so; a rough sketch is below.
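For illustration only, here's a minimal sketch of that idea, assuming AWS Secrets Manager as the central store, secrets named with a hypothetical `airflow/connections/<conn_id>` prefix whose value is an Airflow connection URI, and boto3 credentials already available to the workers. The prefix, schedule, and DAG name are placeholders, not anything Astronomer-specific:

```python
# Sketch: sync Airflow connections from AWS Secrets Manager on a schedule.
# Assumes secrets are named "airflow/connections/<conn_id>" and each value
# is a connection URI (e.g. "postgres://user:pass@host:5432/db").
import boto3
from datetime import datetime

from airflow import DAG, settings
from airflow.models import Connection
from airflow.operators.python_operator import PythonOperator

SECRET_PREFIX = "airflow/connections/"  # hypothetical naming convention


def sync_connections():
    client = boto3.client("secretsmanager")
    session = settings.Session()

    paginator = client.get_paginator("list_secrets")
    for page in paginator.paginate():
        for secret in page["SecretList"]:
            name = secret["Name"]
            if not name.startswith(SECRET_PREFIX):
                continue
            conn_id = name[len(SECRET_PREFIX):]
            uri = client.get_secret_value(SecretId=name)["SecretString"]

            # Drop any existing row with this conn_id, then recreate it,
            # so re-running the sync stays idempotent.
            session.query(Connection).filter(
                Connection.conn_id == conn_id
            ).delete(synchronize_session=False)
            session.add(Connection(conn_id=conn_id, uri=uri))
    session.commit()


with DAG(
    dag_id="sync_connections_from_secrets_manager",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="sync_connections",
        python_callable=sync_connections,
    )
```

The delete-and-recreate step is just one way to handle updates; you could instead update existing rows in place if other tooling depends on the connection records persisting.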

Interesting idea @AndrewHarmon. Do you happen to have some sample code of what that DAG would look like?

Now there is a secrets backend for this.

Yes! Thanks for the update here, @dstandish. Support for secrets backends (e.g. HashiCorp Vault, SSM Parameter Store, etc.) came with Airflow 1.10.10 and is fully supported on Astronomer.

For details on how to get that working on Astro, check out our “Managing Secrets” doc.
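As a rough example of what the configuration looks like, assuming the AWS SSM Parameter Store backend on Airflow 1.10.10+ (the class path and kwargs below follow the 1.10.x contrib layout; check the doc above for the exact values for your version and backend):

```python
# Sketch: using a secrets backend instead of a sync DAG (assumes Airflow 1.10.10+
# with the AWS SSM Parameter Store backend; paths/prefixes are illustrative).
#
# Configure the backend on the deployment via environment variables, e.g.:
#   AIRFLOW__SECRETS__BACKEND=airflow.contrib.secrets.aws_systems_manager.SystemsManagerParameterStoreBackend
#   AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "/airflow/connections"}
#
# A connection URI stored at /airflow/connections/my_postgres is then resolved
# by hooks the same way a UI- or env-var-defined connection would be:
from airflow.hooks.base_hook import BaseHook

conn = BaseHook.get_connection("my_postgres")  # hypothetical conn_id
print(conn.host, conn.login)
```

With this in place, no connections need to be created in the Airflow UI or baked into the image; the only secret the deployment needs is whatever lets it authenticate to the backend itself.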

Note: Support for Azure Key Vault is coming soon; the in-progress Airflow PR is here.