Calling an environment variable when using the TaskFlow API

Following Marc Lamberti's course on the power of environment variables: https://academy.astronomer.io/astronomer-certification-apache-airflow-dag-authoring-preparation/890829

  1. When I add a variable to the Dockerfile from inside the Docker container, then restart with astro dev stop && astro dev start and reconnect VS Code to the attached container, the environment variable is gone. It looks like the container is recreated, so the change is lost.

  2. Regarding the code below, is this a good way to read the environment variable?

Dockerfile

ENV AIRFLOW_VAR_WORK_DATA='{ "company": "amazon", "statut": "Solution Architect" }'
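
From what I understand, Airflow strips the AIRFLOW_VAR_ prefix and looks the key up case-insensitively, so this quick check (my own sketch, not from the course) should print the parsed value:

from airflow.models import Variable

# Resolves "work_data" from the AIRFLOW_VAR_WORK_DATA environment variable;
# deserialize_json=True parses the JSON string into a dict
work_data = Variable.get("work_data", deserialize_json=True)
print(work_data["company"])

And the DAG itself: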
from datetime import datetime, timedelta

from airflow.decorators import dag, task
from airflow.models import Variable



@dag("mywork",
     start_date=datetime(2024,10,31), 
     tags=['work'],
     schedule_interval=timedelta(minutes=2),
     catchup=False
     )
def work_dag():
            
    @task
    def hello():
        # datetime.now is a method, so it has to be called
        print(f"today is {datetime.now()}")

    @task
    def work():
        # Jinja templates are not rendered in values returned from a @task
        # function, so read the Variable directly; the "work_data" key is
        # resolved from AIRFLOW_VAR_WORK_DATA and parsed as JSON
        data = Variable.get("work_data", deserialize_json=True)
        return data["company"]

    hello() >> work()
        
work_dag()
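
From what I understand, the original return "{{ var.json.work_data.name }}" would come back as a literal, unrendered string, because Jinja is only applied to templated operator fields. A minimal sketch of that alternative (my own example, assuming Airflow 2.4+ and the work_data Variable above; the dag_id is just for illustration and company is a key from the JSON):

from datetime import datetime

from airflow.decorators import dag
from airflow.operators.bash import BashOperator


@dag("mywork_templated", start_date=datetime(2024, 10, 31), schedule=None, catchup=False)
def templated_dag():
    # bash_command is a templated field, so the Jinja expression is rendered at runtime
    BashOperator(
        task_id="print_work",
        bash_command='echo "company: {{ var.json.work_data.company }}"',
    )


templated_dag()
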
  • I created a file named environ.env and put my variables in it.
  • Then I restarted with astro dev restart --env environ.env.
  • In the task I just read the variable like this: print(os.getenv("AIRFLOW_VAR_WORK_DATA")) (see the sketch after this list).
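
Here is a minimal sketch of that task (my own example; json.loads is needed because os.getenv returns the raw string, or None if the variable is unset):

import json
import os

from airflow.decorators import task


@task
def read_work_data():
    # Raw JSON string from the container environment
    raw = os.getenv("AIRFLOW_VAR_WORK_DATA")
    print(raw)
    # Parse it to access individual keys such as "company"
    return json.loads(raw)["company"]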

Normally this works fine. But is it a good way to protect sensitive data?