How do I set my postgres username and password to access the ad hoc query feature locally?

#1

Hello, I am looking to set (or get) my postgres db username and password for ad hoc queries.

Is there an environment variable I can set in the Dockerfile?

Or is there something I can set in here?

$ cat .astro/config.yaml 
project:
  name: airflow-pipelines
webserver: 
  port: 8081
postgres:
  port: 5432

I’m looking to connect to the Airflow metadata DB from a database client, both in a cloud-based deployment and in my local test environment (astro airflow start).

It looks like in the cloud deployment, it’s:
host: [host]-pgbouncer
schema: [host]-metadata
login: [host]_airflow
password: ?

Thanks!


#2

When developing locally, you will need to set up your local Postgres connection to query the Airflow metadata database via the “Ad Hoc Query” feature (found under Data Profiling/Ad Hoc Query in the top menu).

NOTE: Airflow uses Pandas to power the Ad Hoc Query engine, which requires the additional system dependency “libstdc++” to be added to your packages.txt file.
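For example, after adding that dependency your packages.txt would simply contain the line:

```
libstdc++
```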

In the airflow_db connection object, change the Conn Type to Postgres and add the following connection information:

host: postgres
schema: postgres
login: postgres
password: postgres
port (optional): 5432

The port will be set to 5432 by default, but if you’ve configured a custom port, you’ll need to update it here as well.
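If you’d rather not set this through the UI, the same details can be expressed as a Postgres connection URI and exported as an environment variable (e.g. in your Dockerfile), following Airflow’s AIRFLOW_CONN_{CONN_ID} convention. A minimal sketch, assuming the connection ID airflow_db and the default local credentials above:

```python
# Build the connection URI for the airflow_db connection from the
# default local values shown above. Adjust the port if you've
# customized it in .astro/config.yaml.
host = "postgres"
schema = "postgres"
login = "postgres"
password = "postgres"
port = 5432

uri = f"postgresql://{login}:{password}@{host}:{port}/{schema}"
print(uri)  # postgresql://postgres:postgres@postgres:5432/postgres
```

In a Dockerfile this would become ENV AIRFLOW_CONN_AIRFLOW_DB=postgresql://postgres:postgres@postgres:5432/postgres, which Airflow picks up in place of a connection object defined in the UI.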


#3

Quick Update: If you’re interested in using Airflow’s Ad-Hoc Query feature for your “remote” deployment on Astronomer Cloud, check out this forum post.

