Connect to Google API via Python in Astronomer

I’m using the Astronomer CLI to create my local environment, and I’m trying to run a DAG that connects to GCP.

Here is my code (from example-notebooks-executor.py):

import google.auth
from google.auth.transport.requests import AuthorizedSession

# OAuth scope used when requesting default credentials.
_SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]


def get_session():
  """Gets an HTTP-authenticated session.

  Returns:
    A google.auth.transport.requests.AuthorizedSession object.
  """
  admin_creds, _ = google.auth.default(_SCOPES)
  session = AuthorizedSession(credentials=admin_creds)
  return session
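For context, the wiring implied by the traceback below looks roughly like this (a sketch only; the real `_create_execution` does more than call `get_session`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def _create_execution():
    # get_session() is the function shown above; it raises once
    # google.auth.default() exhausts the credential chain in the container.
    session = get_session()
    ...


with DAG(
    dag_id="example_notebooks_executor",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    PythonOperator(
        task_id="notebooks_executor",
        python_callable=_create_execution,
    )
```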

When I run my DAG I get:

[2021-05-12 07:50:17,653] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: timed out
[2021-05-12 07:50:20,657] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 2 of 3. Reason: timed out
[2021-05-12 07:50:20,664] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 3 of 3. Reason: [Errno 111] Connection refused
[2021-05-12 07:50:20,664] {_default.py:250} WARNING - Authentication failed using Compute Engine authentication due to unavailable metadata server.
[2021-05-12 07:50:20,665] {taskinstance.py:1457} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1113, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1286, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1316, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 117, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 128, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/usr/local/airflow/dags/example-notebooks-executor.py", line 94, in _create_execution
    session = get_session()
  File "/usr/local/airflow/dags/example-notebooks-executor.py", line 83, in get_session
    admin_creds, _ = google.auth.default(_SCOPES)
  File "/usr/local/lib/python3.7/site-packages/google/auth/_default.py", line 364, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
[2021-05-12 07:50:20,680] {taskinstance.py:1507} INFO - Marking task as FAILED. dag_id=example_notebooks_executor, task_id=notebooks_executor, execution_date=20210512T075012, start_date=20210512T075014, end_date=20210512T075020
[2021-05-12 07:50:20,761] {local_task_job.py:142} INFO - Task exited with return code 1
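These warnings are google.auth.default() walking the Application Default Credentials chain: the GOOGLE_APPLICATION_CREDENTIALS variable, then gcloud user credentials, then the Compute Engine metadata server, which does not exist inside a local Docker container, hence the timeouts. A quick diagnostic sketch (run inside the scheduler container, e.g. via docker exec) to see what ADC resolves to:

```python
import google.auth
from google.auth.exceptions import DefaultCredentialsError

try:
    creds, project = google.auth.default()
    print(f"ADC resolved {type(creds).__name__} for project {project!r}")
except DefaultCredentialsError as exc:
    print(f"ADC found no credentials: {exc}")
```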

This same code works locally on my Mac.

Troubleshooting so far:

1. Added a GCP connection in Airflow:
   a) Included the project ID
   b) Included the OAuth scope https://www.googleapis.com/auth/cloud-platform
   c) Set the keyfile path to /include/mykey.json (I noticed this folder may be mounted as a volume)
   d) Hardcoded the keyfile path in the Python code via an environment variable:

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/include/mykey.json"
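Note that steps a) through c) configure an Airflow connection, which only takes effect when code goes through Airflow's Google hooks; get_session() calls google.auth.default() directly and never reads that connection. A sketch of reusing the connection's keyfile path without a hook (the connection id "google_cloud_default" and the prefixed extra-field name are assumptions based on the Google provider's conventions):

```python
from airflow.hooks.base import BaseHook
from google.oauth2 import service_account

# Assumptions: the connection id and the extra-field key below match
# how the GCP connection was actually saved in the Airflow UI.
conn = BaseHook.get_connection("google_cloud_default")
keyfile_path = conn.extra_dejson["extra__google_cloud_platform__key_path"]
credentials = service_account.Credentials.from_service_account_file(
    keyfile_path, scopes=["https://www.googleapis.com/auth/cloud-platform"])
```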

With option d) I get a different error:

google.auth.exceptions.DefaultCredentialsError: File /include/mykey.json was not found.
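One hint from the traceback: the DAG file runs from /usr/local/airflow/dags/..., which suggests the whole project is placed under /usr/local/airflow inside the container, so the include/ folder would be at /usr/local/airflow/include rather than /include. A sketch of option d) with that path (worth verifying the actual location with ls inside the container):

```python
import os

# Assumed path, based on /usr/local/airflow/dags/... in the traceback above.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/usr/local/airflow/include/mykey.json"
```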

Version

Astronomer Certified: [**v2.0.0.post5**](https://www.astronomer.io/downloads/ac/v2-0-0) based on Apache Airflow [**v2.0.0**](https://pypi.python.org/pypi/apache-airflow/2.0.0)
Git Version: **.release:2.0.0+astro.5+206c6a66cb671f7fd331fd2c0faa1843b8682792**

I ended up copying my keyfile into the container by adding this to my Dockerfile:

COPY keyfile.json .

(Assuming the image's working directory is /usr/local/airflow, the file lands at /usr/local/airflow/keyfile.json, and GOOGLE_APPLICATION_CREDENTIALS has to point there.)

I'm having the same issue. Is manually modifying the Dockerfile for local development the suggested solution?

I also tried putting keyfile.json in the include/ directory, but my DAGs were unable to read the file when the env var pointed inside that folder.
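If the environment-variable route stays brittle, another option is to sidestep ADC entirely and load the service-account key explicitly in get_session(). A sketch, assuming the keyfile is readable at the mounted include/ path (adjust _KEYFILE to wherever the file actually lands in your image):

```python
from google.auth.transport.requests import AuthorizedSession
from google.oauth2 import service_account

_SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
# Assumed location; must match the real mount/copy path in the container.
_KEYFILE = "/usr/local/airflow/include/mykey.json"


def get_session():
  """Builds an AuthorizedSession from an explicit service-account key."""
  creds = service_account.Credentials.from_service_account_file(
      _KEYFILE, scopes=_SCOPES)
  return AuthorizedSession(credentials=creds)
```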