I received mail from Ben about updating the CLI version; I will do that first, make sure the Dockerfile is pointing to the latest Airflow version, and try again.
Currently:
Downloaded latest build: Astro CLI Version: 0.12.0
Dockerfile shows: FROM astronomerinc/ap-airflow:latest-onbuild
The error is: "Broken DAG: [/usr/local/airflow/dags/list_s3_files.py] No module named 'airflow.providers'"
Attempt 2: As indicated earlier (by Viraj), I updated the Dockerfile to pin the version:
"FROM astronomerinc/ap-airflow:1.10.7-alpine3.10-onbuild"
and rebuilt the image (astro dev stop and astro dev start), but the error was the same. Then:
a) Removed all the images and containers from Docker;
b) Ran astro dev start, which downloaded the "astronomerinc/ap-airflow:1.10.7-alpine3.10-onbuild" image.
Same error: "Broken DAG: [/usr/local/airflow/dags/list_s3_files.py] No module named 'airflow.providers'"
- You can use any operators on either Cloud and Enterprise; our support covers Airflow itself, not the specific operator. You'll just have to make sure you are importing them from where the operators exist on that particular branch (e.g. you are running Airflow 1.10.7, in which the operators exist in the path airflow.contrib.operators...)
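Concretely, "airflow.providers" only exists in Airflow 2.x; on 1.10.x the S3 operators live under airflow.contrib. A version-tolerant sketch (the exact module paths are assumptions based on typical Airflow layouts) that resolves the operator from whichever path the installed Airflow provides:

```python
# Sketch: resolve S3ListOperator across Airflow versions.
# "airflow.providers" is Airflow 2.x only; 1.10.x ships it in contrib.
import importlib


def resolve_s3_list_operator():
    """Return the S3ListOperator class for the installed Airflow, or None."""
    candidates = [
        "airflow.providers.amazon.aws.operators.s3",       # Airflow 2.x (newer layout)
        "airflow.providers.amazon.aws.operators.s3_list",  # Airflow 2.x (older layout)
        "airflow.contrib.operators.s3_list_operator",      # Airflow 1.10.x
    ]
    for module_name in candidates:
        try:
            return getattr(importlib.import_module(module_name), "S3ListOperator")
        except (ImportError, AttributeError):
            continue
    return None


S3ListOperator = resolve_s3_list_operator()
```

For the image pinned above (1.10.7), the direct fix in list_s3_files.py would be the contrib import, i.e. `from airflow.contrib.operators.s3_list_operator import S3ListOperator`.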
Are those separate creds you want to bake in? You can add files to the same directory as the Dockerfile, and everything will be baked into the image.
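For instance, a credentials file placed next to the Dockerfile can also be copied to an explicit location in the image (the filename and destination here are hypothetical placeholders):

```dockerfile
FROM astronomerinc/ap-airflow:1.10.7-alpine3.10-onbuild
# Hypothetical file sitting next to this Dockerfile; the onbuild image
# also copies the project directory into the image automatically.
COPY aws-creds.json /usr/local/airflow/aws-creds.json
```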
While developing with Docker locally, we first sign in through Okta, which grants AWS credentials for a period of time; those credentials are stored locally on the Windows machine under /usr/username/.aws.
How can I map this into Docker so the DAG can use the credentials and connect to AWS?
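One approach (a sketch, not verified against this exact setup) is to mount the local .aws directory into the Airflow containers with a docker-compose.override.yml in the Astro project directory, which the Astro CLI picks up on astro dev start. The host path and the in-container home directory are assumptions and need adjusting for the actual machine and base image:

```yaml
# docker-compose.override.yml (sketch; paths are placeholders)
version: "2"
services:
  scheduler:
    volumes:
      # host ~/.aws -> container home .aws, where boto3 looks by default
      - /c/Users/username/.aws:/usr/local/airflow/.aws:ro
  webserver:
    volumes:
      - /c/Users/username/.aws:/usr/local/airflow/.aws:ro
```

Alternatively, the temporary keys can be stored in an Airflow connection (e.g. aws_default) and referenced by the operator, which avoids mounting host files at all.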