Hi all,
I’m having a hard time setting up my first dbt-core task group with Airflow. I’m using the virtualenv execution mode when running my DAG:
dbt_tg = DbtTaskGroup(
    dbt_project_name="bq_test",
    dbt_root_path="/opt/airflow/dbt",
    conn_id="google-cloud-" + ENV,
    execution_mode="virtualenv",
    operator_args={
        "project_dir": "/opt/airflow/dbt/bq_test",
        "py_system_site_packages": False,
        "py_requirements": [
            "dbt-core==1.5.3",
            "dbt-postgres==1.5.3",
            "dbt-bigquery",
            "pandas",
            "matplotlib",
            "seaborn",
        ],
    },
    profile_args={
        "dataset": "TestDataset",
        "project": "project-test",
    },
)
The virtualenv gets created, but I keep getting the following error:
[2023-07-20, 19:52:48 UTC] {subprocess.py:66} INFO - Running command: ['/tmp/cosmos-venvrjf3b7d0/bin/dbt', 'run', '--models', 'distributors_staging', '--profile', 'bq_test', '--target', 'cosmos_target']
[2023-07-20, 19:52:48 UTC] {subprocess.py:77} INFO - Output:
[2023-07-20, 19:52:58 UTC] {subprocess.py:87} INFO - 19:52:58 Running with dbt=1.5.3
[2023-07-20, 19:53:06 UTC] {subprocess.py:87} INFO - 19:53:06 Registered adapter: bigquery=1.5.3
[2023-07-20, 19:53:06 UTC] {subprocess.py:87} INFO - 19:53:06 Encountered an error:
[2023-07-20, 19:53:06 UTC] {subprocess.py:87} INFO - Compilation Error
[2023-07-20, 19:53:06 UTC] {subprocess.py:87} INFO - dbt found 1 package(s) specified in packages.yml, but only 0 package(s) installed in dbt_packages. Run "dbt deps" to install package dependencies.
[2023-07-20, 19:53:08 UTC] {subprocess.py:91} INFO - Command exited with return code 2
[2023-07-20, 19:53:08 UTC] {taskinstance.py:1847} ERROR - Task failed with exception
My packages.yml file is in the root of the project, and its content is:
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1
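For context, when I run the project outside Airflow, the standard fix for this compilation error is just to run dbt deps in the project directory before dbt run (paths as in my config above), so I would have expected something equivalent to happen inside the ephemeral virtualenv:

cd /opt/airflow/dbt/bq_test
dbt deps   # installs dbt-labs/dbt_utils into dbt_packages/
dbt run --models distributors_staging

Since Cosmos creates a fresh /tmp/cosmos-venv* environment for each run, any dbt_packages directory installed elsewhere presumably isn’t visible to it.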
Any hints as to why Cosmos cannot install the package, despite finding and reading packages.yml, would be appreciated.