I am trying to update the variables on deployment using the CI process. Is it possible? Airflow has a CLI to update variables (https://airflow.apache.org/cli.html#variables), but I don’t think I can use that with a remote cloud deployment. How can I achieve this?
Hi @shwetha! Right now, you unfortunately cannot programmatically update Variables + Connections on
astro airflow deploy for a remote deployment on Astronomer Cloud. You can always use the
airflow_settings.yaml file for doing so locally and we’re actively working on expanding the functionality of that feature.
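For reference, a minimal airflow_settings.yaml for local development might look like the sketch below; the connection and variable values are placeholders, and the exact field names may vary by astro CLI version, so check the CLI docs for your release:

```yaml
airflow:
  connections:
    - conn_id: my_postgres      # placeholder connection
      conn_type: postgres
      conn_host: localhost
      conn_schema: analytics
      conn_login: user
      conn_password: changeme
      conn_port: 5432
  variables:
    - variable_name: my_var     # placeholder variable
      variable_value: some_value
```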
We explored a workaround on doing so via a CI/CD process with the API, but unfortunately found it was not a viable solution.
We’ll keep you posted on both that feature and any other viable workarounds if we find any.
One option may be to create a DAG that updates these variables for you and trigger that DAG somehow upon deployment, or have it run on a schedule.
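A minimal sketch of that idea, assuming the variables arrive as a JSON payload your deployment can fetch (the payload format, the fetch step, and all names here are hypothetical):

```python
# Sketch: sync Airflow Variables from an externally fetched JSON payload.
# The payload shape and fetch mechanism are assumptions, not Astronomer APIs.
import json


def variables_from_payload(payload: str) -> dict:
    """Parse a JSON object of {variable_name: value} pairs."""
    data = json.loads(payload)
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object mapping names to values")
    # Airflow stores Variable values as strings, so coerce here.
    return {name: str(value) for name, value in data.items()}


def sync_variables(payload: str, setter) -> int:
    """Apply each parsed variable via `setter` and return the count applied."""
    items = variables_from_payload(payload)
    for name, value in items.items():
        setter(name, value)
    return len(items)


# Inside a real DAG you would pass Airflow's own setter, roughly:
#
#   from airflow.models import Variable
#   from airflow.operators.python_operator import PythonOperator
#
#   PythonOperator(
#       task_id="sync_variables",
#       python_callable=lambda: sync_variables(fetch_payload(), Variable.set),
#       dag=dag,
#   )
```

Triggering it on deploy could then be as simple as scheduling the DAG frequently, or calling the trigger endpoint from your CI job.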
I am trying to create Airflow connections.
Using airflow_settings.yaml requires hard-coding the values for secrets such as username and password.
Even for local Docker (Astronomer), we don’t want to hard-code the username and password.
Any update on adding connections and variables to Astronomer Enterprise?
What is that process? Please share a link to the documentation if it’s available.
The options so far:
- Create a DAG that gets data from Secrets Manager & SSM parameters and creates connections; this needs to be run every time connections or values change.
- Have a bootstrap script that runs in Docker and also through the CI/CD pipeline when Airflow deploys to other environments, including test, non-prod, and prod.
Before I start on option 1 or 2, I want to make sure whether Astronomer provides any approach for creating connections and variables.
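Option 1 might be sketched as follows; the secret layout, the boto3 fetch shown in comments, and the helper name are all assumptions, not an Astronomer-provided API:

```python
# Sketch of option 1: assemble an Airflow-style connection URI from secret
# values pulled out of AWS Secrets Manager / SSM. Only the URI-building
# logic is shown as runnable code; the fetch step depends on your AWS setup.
from urllib.parse import quote


def build_conn_uri(conn_type, login, password, host, port, schema=""):
    """Assemble an Airflow-style connection URI, URL-escaping credentials."""
    return "{}://{}:{}@{}:{}/{}".format(
        conn_type,
        quote(login, safe=""),
        quote(password, safe=""),
        host,
        port,
        schema,
    )


# With boto3 the secret fetch might look like (not run here; the secret
# name is hypothetical):
#
#   import boto3
#   client = boto3.client("secretsmanager")
#   secret = client.get_secret_value(SecretId="airflow/connections/my_db")
#
# The resulting URI could then be registered from within a DAG, or via the
# 1.10 CLI: `airflow connections --add --conn_id my_db --conn_uri <uri>`.
```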
Hi @ravi! Thanks for reaching out here. Airflow 1.10.10, released a few weeks ago, actually now has a feature that allows you to use Environment Variables to sync Airflow Connections + Variables to secrets held in a few different secret backends, including HashiCorp Vault, GCP Secret Manager, and AWS Systems Manager Parameter Store.
On Astronomer’s 1.10.10 image, the following additional backends are included (not yet available in core Airflow’s 1.10.10 release):
- AWS Secrets Manager
- AWS Vault
Hope this answers your question. Let us know!
Thanks @paola for the reply.
As per the Airflow documentation, this should be configured in airflow.cfg.
In the Astronomer Docker project, I don’t see a way to set this, as there is no airflow.cfg file.
I checked airflow.cfg inside the Astronomer Docker container, and the backend is not set, as noted below:
How do I set these values?
# Full class name of secrets backend to enable (will precede env vars and metastore in search path)
# Example: backend = airflow.contrib.secrets.aws_systems_manager.SystemsManagerParameterStoreBackend
@ravi The best way to configure your airflow.cfg file on Astronomer is by leveraging Environment Variables, which you can define either via the Astronomer UI or via your Dockerfile.
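For example, the two commented-out [secrets] settings from the airflow.cfg snippet above map to the following Environment Variables, which you could set in your Dockerfile; the base image tag and the parameter-path prefixes here are assumptions, so substitute your own:

```dockerfile
FROM astronomerinc/ap-airflow:1.10.10-buster-onbuild

# airflow.cfg [secrets] backend / backend_kwargs, expressed as env vars
# (AIRFLOW__<SECTION>__<KEY>); prefixes below are example SSM paths.
ENV AIRFLOW__SECRETS__BACKEND=airflow.contrib.secrets.aws_systems_manager.SystemsManagerParameterStoreBackend
ENV AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_prefix": "/airflow/connections", "variables_prefix": "/airflow/variables"}'
```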
Two other forum posts you can reference: