Connect to Google API via Python in Astronomer

I’m using the Astronomer CLI to create my environment, and I’m trying to run a DAG that connects to GCP.

(My code is shown below the traceback.)

When I run my DAG I get:

[2021-05-12 07:50:17,653] {} WARNING - Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: timed out
[2021-05-12 07:50:20,657] {} WARNING - Compute Engine Metadata server unavailable on attempt 2 of 3. Reason: timed out
[2021-05-12 07:50:20,664] {} WARNING - Compute Engine Metadata server unavailable on attempt 3 of 3. Reason: [Errno 111] Connection refused
[2021-05-12 07:50:20,664] {} WARNING - Authentication failed using Compute Engine authentication due to unavailable metadata server.
[2021-05-12 07:50:20,665] {} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/", line 1113, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/", line 1286, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/", line 1316, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/", line 117, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/", line 128, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/usr/local/airflow/dags/", line 94, in _create_execution
    session = get_session()
  File "/usr/local/airflow/dags/", line 83, in get_session
    admin_creds, _ = google.auth.default(_SCOPES)
  File "/usr/local/lib/python3.7/site-packages/google/auth/", line 364, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see
[2021-05-12 07:50:20,680] {} INFO - Marking task as FAILED. dag_id=example_notebooks_executor, task_id=notebooks_executor, execution_date=20210512T075012, start_date=20210512T075014, end_date=20210512T075020
[2021-05-12 07:50:20,761] {} INFO - Task exited with return code 1
def get_session():
  """Gets an HTTP authenticated session.

  Args:
    logger: A MigrationLogger used to print information in StackDriver.

  Returns:
    A google.auth.transport.requests.AuthorizedSession object.
  """
  admin_creds, _ = google.auth.default(_SCOPES)
  session = AuthorizedSession(credentials=admin_creds)
  return session
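One way to sidestep Application Default Credentials entirely is to load the keyfile explicitly instead of calling `google.auth.default()`. This is only a sketch, not the code from the question; the function name, the keyfile path argument, and the scope are my assumptions:

```python
def get_session_from_keyfile(keyfile_path):
    """Sketch: build an AuthorizedSession from an explicit service-account
    keyfile instead of relying on Application Default Credentials.
    keyfile_path is assumed to point at a JSON key inside the container."""
    # Imported inside the function so the DAG file still parses even if
    # google-auth is missing from the image.
    from google.oauth2 import service_account
    from google.auth.transport.requests import AuthorizedSession

    # Scope is an assumption; replace with the _SCOPES used above.
    creds = service_account.Credentials.from_service_account_file(
        keyfile_path, scopes=["https://www.googleapis.com/auth/cloud-platform"])
    return AuthorizedSession(creds)
```

With explicit credentials the client never falls back to the Compute Engine metadata server, so the three "Metadata server unavailable" warnings should disappear.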

This same code works locally on my Mac.


What I have tried:

  1. Add a GCP connection in Airflow:
    a) Include the project ID
    b) Include the OAuth scope
    c) Point the keyfile path at /include/mykey.json (since I noticed this folder may be loaded as a volume)
    d) Hardcode the keyfile path in the Python code via an environment variable:

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/include/mykey.json"
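For option (d) to work, two things must hold: the variable has to be set in the same process before `google.auth.default()` runs, and the path has to exist inside the scheduler/worker container, not on your host. In Astronomer images the project directory is typically copied to /usr/local/airflow, so include/ usually ends up at /usr/local/airflow/include rather than /include. A minimal sketch (the exact path is an assumption):

```python
import os

# Hypothetical keyfile location inside the container; in Astronomer
# images the project's include/ folder is usually found under
# /usr/local/airflow/include, not at /include.
keyfile = "/usr/local/airflow/include/mykey.json"
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = keyfile

# google.auth.default() raises DefaultCredentialsError when the file is
# missing, so failing fast with a clearer message makes debugging easier.
if not os.path.exists(keyfile):
    print(f"warning: keyfile not found inside the container: {keyfile}")
```

This also explains the "File /include/mykey.json was not found" error below: the env var points at a host-style path that doesn't exist inside the container.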

I get a different error:

google.auth.exceptions.DefaultCredentialsError: File /include/mykey.json was not found.


Astronomer Certified: [ **v2.0.0.post5** ]( based on Apache Airflow [ **v2.0.0** ](
Git Version: **.release:2.0.0+astro.5+206c6a66cb671f7fd331fd2c0faa1843b8682792**

I ended up copying my keyfile into the container from my Dockerfile:

COPY keyfile.json .
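A sketch of that Dockerfile approach (the base image tag and destination path are my assumptions; since the key is baked into the image, this is only sensible for local development, never for an image you push anywhere):

```dockerfile
# Assumed Astronomer Certified 2.0.0 base image; match yours.
FROM quay.io/astronomer/ap-airflow:2.0.0-buster-onbuild

# Copy the key to a known path and point ADC at it.
COPY keyfile.json /usr/local/airflow/keyfile.json
ENV GOOGLE_APPLICATION_CREDENTIALS=/usr/local/airflow/keyfile.json
```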

I’m having the same issue. Is manually modifying the Dockerfile for local development the suggested solution?

I also tried putting keyfile.json into the “include” directory, but my DAGs were unable to read the file when the env var pointed inside that folder.