Google Metadata Server unavailable in local testing

Hi Astronomer Team,

I really hope you can help me. I am in the process of updating to Airflow 2.0 and have been stuck on a problem for days.

A task in which I create a new dataset in Google BigQuery no longer works. I have a service account, and this configuration has always worked. But now, with Airflow 1.10.14 and the Google provider package installed, I always get this error message:

[2020-12-20 13:27:20,183] {bigquery.py:437} INFO - Creating dataset: xxx_gt_staging in project: xxx-test-271011
[2020-12-20 13:27:20,192] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: [Errno 111] Connection refused
[2020-12-20 13:27:20,194] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 2 of 3. Reason: [Errno 111] Connection refused
[2020-12-20 13:27:20,196] {_metadata.py:104} WARNING - Compute Engine Metadata server unavailable on attempt 3 of 3. Reason: [Errno 111] Connection refused
[2020-12-20 13:27:20,196] {_default.py:246} WARNING - Authentication failed using Compute Engine authentication due to unavailable metadata server.
[2020-12-20 13:27:20,196] {taskinstance.py:1150} ERROR - Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/google/cloud/operators/bigquery.py", line 1424, in execute
    exists_ok=False,
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/google/common/hooks/base_google.py", line 383, in inner_wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 438, in create_empty_dataset
    self.get_client(location=location).create_dataset(dataset=dataset, exists_ok=exists_ok)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 136, in get_client
    credentials=self._get_credentials(),
  File "/usr/local/lib/python3.7/site-packages/google/cloud/bigquery/client.py", line 180, in __init__
    _http=_http,
  File "/usr/local/lib/python3.7/site-packages/google/cloud/client.py", line 249, in __init__
    _ClientProjectMixin.__init__(self, project=project)
  File "/usr/local/lib/python3.7/site-packages/google/cloud/client.py", line 201, in __init__
    project = self._determine_default(project)
  File "/usr/local/lib/python3.7/site-packages/google/cloud/client.py", line 216, in _determine_default
    return _determine_default_project(project)
  File "/usr/local/lib/python3.7/site-packages/google/cloud/_helpers.py", line 186, in _determine_default_project
    _, project = google.auth.default()
  File "/usr/local/lib/python3.7/site-packages/google/auth/_default.py", line 356, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started

I first thought it was a code problem, but I deployed everything to a new Astronomer test instance and the error does not occur there; everything runs as usual. So I only get the error message when I test locally.

Could the problem be in the configuration of the Docker container? I am really starting to get desperate here :frowning:

Please help me!

Thank you for submitting this issue!

If you would like more eyes on the issue and are currently a customer, I would definitely recommend submitting it to Astronomer Support. You can also provide information about your deployment securely via the Zendesk portal, so we can hop on your deployment and check if there’s anything out of the ordinary.


As for the actual ticket: are you sure the GOOGLE_APPLICATION_CREDENTIALS environment variable is set in your local instance? It can be set through the .env file or in the Dockerfile. Your Astronomer deployment likely has it configured already, which would explain why the task only fails locally.
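For example, a minimal sketch of the Dockerfile approach — note that the key filename and paths here are placeholders I made up, so adjust them to wherever your service account key actually lives in your project:

```dockerfile
# Added to your project's Dockerfile (paths are examples, not Astronomer defaults):
# bake the service account key into the image and point google-auth at it.
COPY include/gcp-key.json /usr/local/airflow/include/gcp-key.json
ENV GOOGLE_APPLICATION_CREDENTIALS=/usr/local/airflow/include/gcp-key.json
```

Alternatively, the same variable can go in your project's .env file (e.g. `GOOGLE_APPLICATION_CREDENTIALS=/usr/local/airflow/include/gcp-key.json`) so it is injected into the local containers on start — either way, the file must actually exist at that path inside the container, not just on your host machine.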

You can read more about it in our documentation on Environment Variables.
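To narrow it down, a quick sanity check (just a sketch, not Astronomer tooling) you can run inside the local scheduler container is to confirm the variable is set and that the key file exists at that path:

```python
import os


def check_gcp_credentials():
    """Report whether GOOGLE_APPLICATION_CREDENTIALS is set and points at a real file."""
    key_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not key_path:
        return "GOOGLE_APPLICATION_CREDENTIALS is not set"
    if not os.path.exists(key_path):
        return f"Set to {key_path}, but no file exists at that path in the container"
    return f"Key file found at {key_path}"


if __name__ == "__main__":
    print(check_gcp_credentials())
```

If this reports the variable as unset inside the container, that matches your traceback exactly: google-auth finds no explicit credentials, falls back to the Compute Engine metadata server, and that server only exists on GCP machines, hence the connection refused warnings locally.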