When I try to use the contrib operator, it gives an import error:
Broken DAG: [/usr/local/airflow/dags/list_s3_files.py] No module named 'airflow.contrib.operators.s3_copy_object_operator'
Are we not supposed to use the contrib operators?
This is the import line in the DAG:
from airflow.contrib.operators.s3_copy_object_operator import S3CopyObjectOperator
Thanks Viraj.
I received a mail from Ben about updating the CLI version; I will do that first, make sure the Dockerfile is pointing to the latest Airflow version, and try again.
Currently the Dockerfile shows:
FROM astronomerinc/ap-airflow:0.8.2-1.10.2-onbuild
Attempt 1:
Downloaded the latest build: Astro CLI version 0.12.0.
The Dockerfile now shows: FROM astronomerinc/ap-airflow:latest-onbuild
With this, the error is: "Broken DAG: [/usr/local/airflow/dags/list_s3_files.py] No module named 'airflow.providers'"
Attempt 2: As indicated earlier by Viraj,
I updated the Dockerfile to the version:
FROM astronomerinc/ap-airflow:1.10.7-alpine3.10-onbuild
and rebuilt the image (astro dev stop and astro dev start),
but the error is the same.
Attempt 3:
a) Removed all the images and containers in Docker.
b) Ran astro dev start, which downloaded the astronomerinc/ap-airflow:1.10.7-alpine3.10-onbuild image.
Same error: "Broken DAG: [/usr/local/airflow/dags/list_s3_files.py] No module named 'airflow.providers'"
The airflow.providers path is not available in the version of the Astronomer image you are running (it's only on the master branch of Airflow right now).
from airflow.contrib.operators.s3_copy_object_operator import S3CopyObjectOperator
from airflow.contrib.operators.s3_delete_objects_operator import S3DeleteObjectsOperator
Those match the 1.10.7 file paths in Airflow.
You can use any operators on both Cloud and Enterprise; our support covers Airflow itself, not the specific operator. You'll just have to make sure you are importing them from where they exist on that particular version (e.g. you are running Airflow 1.10.7, in which these operators live under airflow.contrib.operators...).
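For reference, here is a minimal sketch of a 1.10.x DAG using those contrib imports; the bucket and key names are placeholders, and an aws_default Airflow connection is assumed to be configured:

# Minimal sketch for Airflow 1.10.x; bucket/key names and aws_conn_id are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.contrib.operators.s3_copy_object_operator import S3CopyObjectOperator
from airflow.contrib.operators.s3_delete_objects_operator import S3DeleteObjectsOperator

with DAG(
    dag_id="list_s3_files",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    copy_file = S3CopyObjectOperator(
        task_id="copy_file",
        source_bucket_name="my-source-bucket",  # placeholder
        source_bucket_key="input/data.csv",     # placeholder
        dest_bucket_name="my-dest-bucket",      # placeholder
        dest_bucket_key="archive/data.csv",     # placeholder
        aws_conn_id="aws_default",              # assumed connection ID
    )
    delete_source = S3DeleteObjectsOperator(
        task_id="delete_source",
        bucket="my-source-bucket",              # placeholder
        keys="input/data.csv",                  # placeholder
        aws_conn_id="aws_default",              # assumed connection ID
    )
    copy_file >> delete_source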
Are those separate creds you want to bake in? You can add files to the same directory as the Dockerfile, and everything will be baked into the image.
Hi Viraj,
While developing with Docker locally,
we first sign in through Okta, which gets AWS credentials for a period of time; those credentials are stored locally on the Windows machine under /usr/username/.aws.
How can I map this into Docker so the DAG can use the credentials and connect to AWS?
The entire project directory is built into your image when you build it, so you should be able to just put the credentials inside that directory. By default they will end up under /usr/local/airflow.
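If it helps, here is a minimal sketch of a task using credentials baked into the image that way; it assumes you copy your local .aws folder into the project so it ends up at /usr/local/airflow/.aws, and the bucket name is a placeholder:

# Sketch only: assumes the local .aws/ folder was copied into the project,
# so it ends up at /usr/local/airflow/.aws inside the image.
import os
from datetime import datetime
import boto3
from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Point boto3 at the baked-in credentials file (assumed path).
os.environ.setdefault("AWS_SHARED_CREDENTIALS_FILE", "/usr/local/airflow/.aws/credentials")

def list_s3_files():
    # List keys in a bucket using the baked-in credentials (bucket name is a placeholder).
    s3 = boto3.client("s3")
    response = s3.list_objects_v2(Bucket="my-source-bucket")
    for obj in response.get("Contents", []):
        print(obj["Key"])

with DAG(
    dag_id="list_s3_files_with_baked_creds",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    PythonOperator(
        task_id="list_files",
        python_callable=list_s3_files,
    )

Keep in mind those Okta-issued credentials are temporary, so you'd need to rebuild the image whenever they rotate.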