We will be using the Astro CLI to export Airflow Connections. (If you are not yet using the Astro CLI, you are missing out on the fun of local development with Airflow. Install the Astro CLI now!)
- Spin up your local Airflow environment, if you haven't already, by following these steps (the corresponding commands are sketched after this list):
a. Install the Astro CLI
b. Create an Astro project
c. Start your Airflow (see "Create and run your first DAG on Astro" in the Astronomer documentation)
- Create, update, and save your Connections using the Airflow UI
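For reference, steps a through c map to a few CLI commands. This is a minimal sketch assuming macOS with Homebrew; other installation methods are covered in the Astro CLI documentation:
# install the Astro CLI (shown here via Homebrew on macOS)
brew install astro
# create a new Astro project in an empty directory
astro dev init
# start a local Airflow environment in Docker
astro dev start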
- To export connections in URI format, use the following command:
astro dev object export --env-export --env="my.env"
This command also exports other Airflow objects stored in your Airflow metadata database, such as variables and pools.
You can then easily upload these to your secrets manager. For more options and ways to export Airflow Connections, refer to "Exporting Connections" in the Airflow docs.
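For illustration, each connection in the exported env file is an AIRFLOW_CONN_<CONN_ID> variable holding the connection URI. The connection id, credentials, and secret path below are made up, and the AWS CLI command is just one possible way to load the value into a secrets backend, assuming the Amazon Secrets Manager backend with its default airflow/connections prefix:
# hypothetical line from the exported my.env file (conn id and credentials are examples)
AIRFLOW_CONN_MY_POSTGRES='postgres://user:pass@db.example.com:5432/analytics'
# one way to push that URI into AWS Secrets Manager under the default connections prefix
aws secretsmanager create-secret \
  --name airflow/connections/my_postgres \
  --secret-string 'postgres://user:pass@db.example.com:5432/analytics'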
Examples of exporting from your Astro/local environment
- URI
# export all airflow connections, variables and pools to `.env` file
astro dev object export --env-export
# export only airflow connections to the .env file
astro dev object export --env-export --connections
# export only airflow variables to the .env file
astro dev object export --env-export --variables
# print the connections in the default URI format to STDOUT.
astro dev run connections export - --file-format=env
- YAML
# export connections in YAML format to conns.yaml in your astro project's include dir
astro dev run connections export --file-format=yaml /usr/local/airflow/include/conns.yaml
- JSON
# export connections in JSON format to conns.json in your astro project's include dir
astro dev run connections export --file-format=json /usr/local/airflow/include/conns.json
# print the connections to STDOUT in env format with JSON-serialized values
astro dev run connections export - --file-format=env --serialization-format=json
# export variables in JSON format to vars.json in your astro project's include dir
astro dev run variables export /usr/local/airflow/include/vars.json
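As a rough illustration of what the YAML export above produces, each connection appears under its connection id with its individual fields broken out (the JSON export carries the same fields as a JSON object). The connection id and values here are made up, and the exact field list can vary by Airflow version:
# hypothetical excerpt from conns.yaml
my_postgres:
  conn_type: postgres
  host: db.example.com
  login: user
  password: pass
  schema: analytics
  port: 5432
  extra: null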