Hi there,
I'm currently trying to get Astronomer working locally on my Windows 10 WSL system.
I found this post here in the forum: DAGs aren't showing up in my Astronomer deployment, but I see them locally
But it seems I have the opposite problem: I can see my DAGs in Astronomer Cloud after deploying, but Airflow can't find them locally:
```
/c/Users/richard/airflow$ ll dags
total 10242
drwxrwxrwx 0 root root  512 Jul 10 13:59 ./
drwxrwxrwx 0 root root  512 Jul 17 08:20 ../
-rwxrwxrwx 1 root root 1950 Jul  3 09:59 example-dag.py*
-rwxrwxrwx 1 root root 5964 Jul  9 08:38 get_token.py*
```
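To rule out a mount problem, I was going to check whether the scheduler container actually sees these files (a sketch; `<scheduler-container>` is a placeholder for whatever name `docker ps` reports for the scheduler):

```
# find the scheduler container started by `astro airflow start`
docker ps --filter "name=scheduler" --format "{{.Names}}"

# list the DAG folder from inside that container
docker exec <scheduler-container> ls -la /usr/local/airflow/dags
```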
Considering the scheduler output below, I also put the DAGs (unnecessarily?) into /usr/local/airflow/dags on the WSL host:
```
/c/Users/richard/airflow$ ll /usr/local/airflow/dags
total 12
drwxr-xr-x 0 root root  512 Jul  9 09:29 ./
drwxr-xr-x 0 root root  512 Jul  9 09:29 ../
-rwxr-xr-x 1 root root 1950 Jul  9 09:29 example-dag.py*
-rwxr-xr-x 1 root root 5964 Jul  9 09:29 get_token.py*
```
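As far as I understand it, the path in the scheduler log refers to /usr/local/airflow/dags *inside* the container, not on the WSL host, and `astro airflow start` is supposed to bind-mount my project's dags/ folder there, so copying the files to that host path shouldn't be needed. To see what actually gets mounted, I figured I could inspect the container (same placeholder as above):

```
# print the container's bind mounts as JSON; the project's dags/ folder
# should show up with Destination /usr/local/airflow/dags
docker inspect <scheduler-container> --format '{{ json .Mounts }}'
```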
In the scheduler logs I see:
```
/c/Users/richard/airflow$ astro airflow logs -s -f
scheduler_1 | Waiting for host: postgres 5432
scheduler_1 | Initializing airflow database...
scheduler_1 | [2019-07-17 06:29:54,802] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
scheduler_1 | [2019-07-17 06:29:55,273] {__init__.py:51} INFO - Using executor LocalExecutor
scheduler_1 | DB: postgresql://postgres:***@postgres:5432
scheduler_1 | [2019-07-17 06:29:55,713] {db.py:338} INFO - Creating tables
scheduler_1 | INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
scheduler_1 | INFO [alembic.runtime.migration] Will assume transactional DDL.
scheduler_1 | Done.
scheduler_1 | [2019-07-17 06:29:56,807] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
scheduler_1 | [2019-07-17 06:29:57,069] {__init__.py:51} INFO - Using executor LocalExecutor
scheduler_1 |   ____________       _____________
scheduler_1 |  ____    |__( )_________  __/__  /________      __
scheduler_1 | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
scheduler_1 | ___  ___ |  / _  / _  __/ _  / / /_/ /_ |/ |/ /
scheduler_1 |  _/_/  |_/_/  /_/  /_/  /_/  \____/____/|__/
scheduler_1 |
scheduler_1 | [2019-07-17 06:29:57,329] {jobs.py:1545} INFO - Starting the scheduler
scheduler_1 | [2019-07-17 06:29:57,329] {jobs.py:1559} INFO - Processing files using up to 2 processes at a time
scheduler_1 | [2019-07-17 06:29:57,330] {jobs.py:1560} INFO - Running execute loop for -1 seconds
scheduler_1 | [2019-07-17 06:29:57,331] {jobs.py:1561} INFO - Processing each file at most -1 times
scheduler_1 | [2019-07-17 06:29:57,331] {jobs.py:1564} INFO - Process each file at most once every 0 seconds
scheduler_1 | [2019-07-17 06:29:57,331] {jobs.py:1568} INFO - Checking for new files in /usr/local/airflow/dags every 300 seconds
scheduler_1 | [2019-07-17 06:29:57,332] {jobs.py:1571} INFO - Searching for files in /usr/local/airflow/dags
scheduler_1 | [2019-07-17 06:29:57,332] {jobs.py:1573} INFO - There are 0 files in /usr/local/airflow/dags
scheduler_1 | [2019-07-17 06:29:57,580] {jobs.py:1635} INFO - Resetting orphaned tasks for active dag runs
```
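Since this is WSL with Docker for Windows, my suspicion is that the daemon silently mounts an empty directory when it can't resolve the /c/Users/... path. As a sanity check (a sketch; alpine is just a throwaway image here), I wanted to test the bind mount directly:

```
# if this lists nothing, the daemon can't see the WSL path, and the
# scheduler is likewise getting an empty /usr/local/airflow/dags
docker run --rm -v /c/Users/richard/airflow/dags:/test alpine ls -la /test
```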
Is there anything I can try?
Richard