dbt on Airflow (Astronomer)

I would like to set up dbt jobs on Airflow (Astronomer).
When I run dbt run in the astro container, I get an error:

Running with dbt=0.19.1
Encountered an error:
[Errno 30] Read-only file system: 'logs/dbt.log'

I suspect we need to customize the container for this. Is there something I am missing?
I've been following this doc: https://www.astronomer.io/blog/airflow-dbt-1

On running the DAG, I get an error:

Output:
INFO - Running with dbt=0.19.1
INFO - Encountered an error while reading the project:
INFO -   ERROR: Runtime Error
INFO -   Could not find profile named 'snowflakeprofile'
INFO - Encountered an error:
INFO - Runtime Error
INFO -   Could not run dbt
INFO - Command exited with return code 2
ERROR - Bash command failed. The command returned a non-zero exit code.

However, dbt debug runs okay and the connection test passes.

Hi @Eva!

Are you still having trouble running dbt on Astronomer? Are the jobs running locally as a CLI-initialized project or on Astronomer Cloud?

A side note as well: the repository referenced in that blog post has been updated to be more robust, with added functionality and examples, and it works locally out of the box on the same dbt version (dbt==0.19.1). Its tasks execute dbt run with the --profiles-dir and --project-dir flags set, pointing to /usr/local/airflow/dbt. Those flags may be what you need (see the sketch below) but, again, let us know if you are still seeing this issue.
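
For reference, here is a minimal sketch of that task pattern. The DAG id, schedule, and Airflow 2-style imports are my own placeholders; the /usr/local/airflow/dbt path is where the repo points the flags:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Path the repo points dbt at; adjust to your own layout.
DBT_DIR = "/usr/local/airflow/dbt"

with DAG(
    dag_id="dbt_run_example",  # placeholder name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # The flags keep dbt from looking for the project and profiles
        # in their default locations.
        bash_command=f"dbt run --profiles-dir {DBT_DIR} --project-dir {DBT_DIR}",
    )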

This is because your dbt project home is located inside the dags directory.

In a deployment other than the local one started by the astro CLI, this usually works.

The local deployment spun up by the astro CLI, however, mounts the dags directory from your local machine into the containers.

When you execute dbt run, it will try to create a logs directory in the dbt root directory.

 <dbt-root-dir>/logs/dbt.log 

Since your dbt root sits inside the dags directory, dbt therefore attempts to create that logs directory under dags.

Due to the way the dags directory is mounted in your local deployment, Docker will not allow any user in the container to write to it.

As you can see in the docker inspect output below, the mounted directory is read-only ("ro").

docker inspect <scheduler_container>
...
            {
                "Type": "bind",
                "Source": "/host_mnt/Users/alan/projects/astro/dags",
                "Destination": "/usr/local/airflow/dags",
                "Mode": "ro",
                "RW": false,
                "Propagation": "rprivate"
            },

This is where the error comes in.

[Errno 30] Read-only file system: 'logs/dbt.log'
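
You can confirm this from inside the scheduler container with a quick check (the dags/dbt path below is illustrative; use wherever your dbt project sits under dags):

import os

# Creating dbt's logs directory under the mounted dags folder fails
# with the same Errno 30 that dbt run reports.
try:
    os.makedirs("/usr/local/airflow/dags/dbt/logs")
except OSError as err:
    print(err)  # [Errno 30] Read-only file system: ...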

A way around this is to move your dbt root directory up into the top-level astro project folder:

|- astro
   |- dbt-home
   |- dags
   |- Dockerfile 
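
Since the astro build copies everything next to the Dockerfile into the image, dbt-home then lives at /usr/local/airflow/dbt-home inside the container, and because it is part of the image filesystem rather than a read-only mount, dbt can write its logs and target directories there. A rough sketch of a task using that layout (the DAG id is a placeholder, and I am assuming profiles.yml also sits in dbt-home):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Baked into the image at build time, so it is writable.
DBT_HOME = "/usr/local/airflow/dbt-home"

with DAG(
    dag_id="dbt_home_example",  # placeholder name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # cd first so dbt creates logs/ and target/ inside dbt-home.
        bash_command=f"cd {DBT_HOME} && dbt run --profiles-dir {DBT_HOME}",
    )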

Hi @Eva and @Alan
I wanted to follow up to see if any of the above strategies worked for you.
I too am facing a similar problem when I run astro airflow in my local environment.

I have tried different strategies for disabling dbt logs, but none of them have worked.
The dbt commands in my implementation are invoked through the BashOperator, so I am looking for a way to work around this issue.

@Alan I took your advice and tried writing logs into the dbt root directory, with no success:

cmd: ['bash', '-c', 'dbt --no-write-json run --profiles-dir /usr/local/airflow/dags/dbt/dbt_profiles --project-dir /usr/local/airflow/dags/dbt/product_health_edp_dbt']

Failures:
tried: /var/local/airflow, got error: [Errno 30] Read-only file system: 'target/partial_parse.msgpack'

tried: /usr/local/airflow/dags/, got error: Read-only file system: '/usr/local/airflow/dags/dbt.log.legacy'

tried: /usr/local/, got error: [Errno 13] Permission denied: '/usr/local/dbt.log.legacy'

Any tip or advice would be really helpful.

Thanks