DBT on Airflow (Astronomer)

I would like to set up dbt jobs on Airflow (Astronomer).
When I run dbt run in the astro container, I get an error:

Running with dbt=0.19.1
Encountered an error:
[Errno 30] Read-only file system: 'logs/dbt.log'

I suspect we need to customize the container for this. Is there something I am missing?
Been following this doc: https://www.astronomer.io/blog/airflow-dbt-1

On running the dag, I get an error:

Output:
INFO - Running with dbt=0.19.1
INFO - Encountered an error while reading the project:
INFO -   ERROR: Runtime Error
INFO -   Could not find profile named 'snowflakeprofile'
INFO - Encountered an error:
INFO - Runtime Error
INFO -   Could not run dbt
INFO - Command exited with return code 2
ERROR - Bash command failed. The command returned a non-zero exit code.

However, dbt debug runs okay and the connection test is OK.

Hi @Eva!

Are you still having trouble running dbt on Astronomer? Are the jobs running locally as a CLI-initialized project or on Astronomer Cloud?

Side note for you as well: the repository referenced in that blog post has been updated to be more robust, with some added functionality and examples, and it works locally out-of-the-box on the same dbt version (dbt==0.19.1). Its tasks execute dbt run but also set the --profiles-dir and --project-dir flags, pointing both at /usr/local/airflow/dbt. Maybe these flags will help you, but, again, let us know if you are still seeing this issue.
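To make that concrete, a task in that style might look roughly like the sketch below. This is only an illustration, not the repo's exact code: the dag_id, schedule, and start date are made up, and the import path is the Airflow 2 one (use airflow.operators.bash_operator on Airflow 1.10.x).

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/usr/local/airflow/dbt"  # dbt project and profiles.yml baked into the image here

with DAG(
    dag_id="dbt_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:

    dbt_run = BashOperator(
        task_id="dbt_run",
        # Point dbt explicitly at the project and profiles instead of relying on
        # the current working directory or ~/.dbt/profiles.yml
        bash_command=f"dbt run --profiles-dir {DBT_DIR} --project-dir {DBT_DIR}",
    )

Passing --profiles-dir may also explain the "Could not find profile" error: by default dbt looks for profiles.yml in ~/.dbt/, which may exist where you ran dbt debug but not in the scheduler and worker containers where the DAG's bash command runs.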

This is because your dbt project home is located inside the dags directory.

In a deployment other than the local one started by the astro CLI, this usually works fine.

However, the local deployment spun up by the astro CLI bind-mounts the dags directory from your local machine into the containers.

When you execute dbt run, it will try to create a logs directory in the dbt root directory.

 <dbt-root-dir>/logs/dbt.log 

Since your dbt root lives inside the dags directory, dbt will attempt to create that logs directory under dags/.

Due to the way the dags directory is mounted in your local deployment, Docker will not allow any user in the container to write to that mounted directory.

As you can see in the docker inspect output below, the mounted directory is read-only ("ro").

docker inspect <scheduler_container>
...
            {
                "Type": "bind",
                "Source": "/host_mnt/Users/alan/projects/astro/dags",
                "Destination": "/usr/local/airflow/dags",
                "Mode": "ro",
                "RW": false,
                "Propagation": "rprivate"
            },

This is where the error comes in.

[Errno 30] Read-only file system: 'logs/dbt.log'

A way around this is to move your dbt root directory out of dags and into the root of your astro project folder:

|- astro
   |- dbt-home
   |- dags
   |- Dockerfile
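With that layout, the dbt project is copied into the image when the astro project builds (assuming your Dockerfile is based on one of the onbuild images that copy the project contents into /usr/local/airflow), rather than being bind-mounted read-only like dags/, so dbt can create its logs directory. The task then just points at the new location; a minimal sketch, assuming dbt-home ends up at /usr/local/airflow/dbt-home inside the container:

from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator on Airflow 1.10.x

DBT_HOME = "/usr/local/airflow/dbt-home"

# Goes inside your `with DAG(...)` block
dbt_run = BashOperator(
    task_id="dbt_run",
    # dbt-home is part of the image (writable), not a read-only bind mount like dags/
    bash_command=f"dbt run --profiles-dir {DBT_HOME} --project-dir {DBT_HOME}",
)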