Astronomer environment connection: Google Cloud Platform connection is not imported correctly into local Airflow development environment

I define a Google Cloud Platform connection in the Astronomer GUI to share it with local development environments.

However, the connection is not imported correctly into my local dev environment: instead of the Google Cloud connection type, it is imported with the Generic type. As a result, the keyfile is missing, which presumably leads to the JSON parsing error that prevents the associated DAG from running.

I would like to know whether I am doing anything wrong or whether this is a bug.


I perform the following steps to set up and use the connection.

Astronomer GUI

  • Workspace > Environment > + Connection > Google Cloud Platform (Keyfile)
  • Name the connection and paste the content of the keyfile (a quick validity check is sketched below).
  • Organization Settings > Edit details > ENVIRONMENT SECRETS FETCHING > Enabled
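Since a malformed keyfile surfaces later as the JSON parsing error shown further down, it can help to confirm that the pasted content is valid JSON first. A minimal sketch, assuming a standard service-account keyfile at a hypothetical local path:

# Sanity check before pasting the keyfile into the Astro UI.
# Assumes a standard Google service-account keyfile; the path is hypothetical.
import json

with open("/path/to/keyfile.json") as f:
    keyfile = json.load(f)  # raises json.JSONDecodeError if the file is not valid JSON

print(keyfile["type"])        # "service_account"
print(keyfile["project_id"])  # the GCP project the connection points at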

Astro CLI

  • astro dev stop
  • astro config set -g disable_env_objects false
  • astro dev start --workspace-id {the ws-id where I defined the connection}
  • The connection is added:
Airflow is starting up!
Added Connection: google_cloud_platform

Project is running! All components are now available.
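To see exactly how the connection was imported, it can also be inspected from inside the local environment, for example from a throwaway task or a shell in the scheduler container. A minimal sketch, assuming the conn_id google_cloud_platform from the startup log above:

# Inspect how the connection landed in the local Airflow metadata DB.
# Run inside the local Airflow environment.
from airflow.hooks.base import BaseHook

conn = BaseHook.get_connection("google_cloud_platform")
print(conn.conn_type)  # expected "google_cloud_platform"; comes back as the generic type here
print(conn.extra)      # for a keyfile-based GCP connection this should be a JSON string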

Local Airflow GUI

  • The connection is imported with the Generic connection type instead of Google Cloud, and the keyfile is missing.

  • Hence, the DAG run throws the following error.
[2024-03-14, 10:49:43 UTC] {connection.py:477} ERROR - Failed parsing the json for conn_id google_cloud_platform
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/airflow/models/connection.py", line 474, in extra_dejson
    obj = json.loads(self.extra)
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 1 column 51 (char 50)
  • If I manually change the connection type in Airflow to Google Cloud and provide the content of the keyfile JSON, the DAG runs as expected, i.e. a connection can be established.
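For reference, this is roughly what the working connection looks like when it is defined in code instead of flipped manually in the UI. This is only a sketch: the conn_id matches the one above, the keyfile path is hypothetical, and the name of the extra key depends on the installed Google provider version (keyfile_dict in recent releases, extra__google_cloud_platform__keyfile_dict in older ones).

# Sketch of a working Google Cloud connection defined in code; its extra field
# parses cleanly, unlike the Generic connection created by the import above.
import json
from airflow.models import Connection

with open("/path/to/keyfile.json") as f:  # hypothetical path
    keyfile = f.read()

conn = Connection(
    conn_id="google_cloud_platform",
    conn_type="google_cloud_platform",
    extra=json.dumps({"keyfile_dict": keyfile}),
)
print(conn.extra_dejson.keys())  # dict_keys(['keyfile_dict']), no JSONDecodeError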

Thanks @netzstreuner for reporting this. Just to confirm: the same connection that you created in the Astro UI is showing as Generic, and when you flip it to GCP, the DAG that uses that connection works fine?

Thanks
Manmeet