Local Airflow using SecretsManager as secrets backend: "Unable to locate credentials"

I'm running a local Airflow 2.0.2 instance on Docker. Everything works fine, and I'm now trying to set up SecretsManager as an alternative secrets backend, for connections only.

I’ve added this to the airflow.cfg file:

[secrets]
backend = airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": null}
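As far as I understand, `backend_kwargs` must parse as valid JSON (hence `null`, not Python's `None`, to disable the variables prefix). A quick stdlib sanity check on the string from my config:

```python
import json

# The backend_kwargs value from airflow.cfg; Airflow parses it as JSON,
# so JSON null (not Python None) is what disables the variables prefix.
backend_kwargs = '{"connections_prefix": "airflow/connections", "variables_prefix": null}'

parsed = json.loads(backend_kwargs)
print(parsed["connections_prefix"])  # airflow/connections
print(parsed["variables_prefix"])    # None
```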

Then I configured an aws_default connection in Airflow Connections with:

Type: Amazon Web Services
Name: aws_default
Login: <aws_access_key>
Password: <aws_secret_access_key>

I verified that the credentials work by using a function that instantiates an AWSHook with this connection ID.

I also attached the SecretsManagerReadWrite policy to the IAM user that these credentials belong to.

I then try to use the SnowflakeOperator to run a query against Snowflake, expecting the connection to be fetched from SecretsManager, but I get the error below, as if the SecretsManager backend cannot locate any AWS credentials to use when calling the SecretsManager API.

BTW, the xxxx_snowflake_operator and xxxx_snowflake_hook in the traceback are just wrappers around snowflake_operator and snowflake_hook; they do exactly the same thing without any changes (only a different color in the UI).

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/airflow/plugins/operators/xxxx_snowflake_operator.py", line 84, in execute
    raise ex
  File "/usr/local/airflow/plugins/operators/xxxx_snowflake_operator.py", line 78, in execute
    parameters=self.parameters)
  File "/usr/local/lib/python3.7/site-packages/airflow/hooks/dbapi.py", line 173, in run
    with closing(self.get_conn()) as conn:
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 215, in get_conn
    conn_config = self._get_conn_params()
  File "/usr/local/airflow/plugins/hooks/xxxx_snowflake_hook.py", line 22, in _get_conn_params
    conn = self.get_connection(self.snowflake_conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/hooks/base.py", line 67, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/secrets/base_secrets.py", line 64, in get_connection
    conn_uri = self.get_conn_uri(conn_id=conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/secrets/secrets_manager.py", line 115, in get_conn_uri
    return self._get_secret(self.connections_prefix, conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/secrets/secrets_manager.py", line 153, in _get_secret
    SecretId=secrets_path,
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 663, in _make_api_call
    operation_model, request_dict, request_context)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 682, in _make_request
    return self._endpoint.make_request(operation_model, request_dict)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 102, in make_request
    return self._send_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 132, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 116, in create_request
    operation_name=operation_model.name)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 90, in handler
    return self.sign(operation_name, request)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 162, in sign
    auth.add_auth(request)
  File "/usr/local/lib/python3.7/site-packages/botocore/auth.py", line 373, in add_auth
    raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials

How come the SecretsManager Airflow backend can't locate the credentials? Doesn't it use the default AWS connection ID (aws_default) for its botocore API requests?

I got it to work by creating a config file at ~/.aws/config that contains the AWS access and secret keys. I wonder if I can make it use the aws_default connection in Airflow instead.
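For reference, the workaround file looks roughly like this (key values are placeholders, and the region is just an example):

```
# ~/.aws/config
[default]
aws_access_key_id = <aws_access_key>
aws_secret_access_key = <aws_secret_access_key>
region = us-east-1
```

Since Airflow runs inside Docker, this file (or the equivalent AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables) has to be visible inside the containers, not just on the host.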