Can't see the logs for a local CLI task run

I’m accessing the Airflow CLI as documented by Astronomer and running a single task with

airflow run dag_name task_name execution_date

(following the Airflow cli docs)

I see some output and the task appears to be running, but I can’t see the full logs I would see in the webserver interface. I get little more than a message that the task is running.

I’ve tried following the Astronomer logs with astro airflow logs -f but this also doesn’t show the full logs.

Is it possible to view the full logs in the CLI?

hi @ollieglass

Hm, those logs should be there from astro airflow logs

Can you try running the command with:

airflow run --interactive dag_name task_name execution_date

and check for logs?

If you still don’t see them, you can always jump into the container and find the logs (though that isn’t ideal):

$ docker ps
CONTAINER ID        IMAGE                                        COMMAND                  CREATED             STATUS              PORTS                                        NAMES
aee0b9c4a9d6        airflow-example-dags_b96b2a/airflow:latest   "tini -- /entrypoint…"   3 seconds ago       Up 3 seconds        5555/tcp, 8793/tcp, 0.0.0.0:8080->8080/tcp   airflowexampledagsb96b2a_webserver_1
245cfbbd5a48        airflow-example-dags_b96b2a/airflow:latest   "tini -- /entrypoint…"   4 seconds ago       Up 3 seconds        5555/tcp, 8080/tcp, 8793/tcp                 airflowexampledagsb96b2a_scheduler_1
$ docker exec -it airflowexampledagsb96b2a_scheduler_1 /bin/bash
bash-5.0$ ls
Dockerfile             airflow.cfg            dags                   logs                   packages.txt           requirements.txt
LICENSE                airflow_settings.yaml  include                me.txt                 plugins                unittests.cfg
bash-5.0$ cd logs/
bash-5.0$ ls
dag_processor_manager   example_kubernetes_pod  scheduler
bash-5.0$ 
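
Once inside the container, individual task logs follow Airflow 1.x's default `log_filename_template` (`{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log`), so you can compose the path and tail it. A minimal sketch, with hypothetical DAG/task names and execution date:

```shell
# Compose the default Airflow 1.x task log path by hand.
# The dag id, task id, and execution date below are placeholder examples.
DAG_ID=example_dag
TASK_ID=example_task
EXECUTION_DATE=2019-01-01T00:00:00+00:00
TRY_NUMBER=1

# Default layout: logs/<dag_id>/<task_id>/<execution_date>/<try_number>.log
LOG_PATH="logs/${DAG_ID}/${TASK_ID}/${EXECUTION_DATE}/${TRY_NUMBER}.log"
echo "$LOG_PATH"
```

From `/usr/local/airflow` inside the scheduler container you could then run `tail -f -n 100 "$LOG_PATH"` to follow that task's log.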

Hi, I didn’t find a way to see those logs using the astro CLI, so I created my own airflow.cfg in my project and changed the log_filename_template variable to log_filename_template = {{ ti.dag_id }}.log (this only applies to local development). Then I tail the file to see the task logs per DAG:
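
For reference, the override described above would look something like this in a local airflow.cfg (in Airflow 1.x this setting lives under the `[core]` section):

```ini
[core]
# Write one log file per DAG instead of the default per-task/per-try layout.
# Only use this for local development.
log_filename_template = {{ ti.dag_id }}.log
```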

docker exec -ti $(docker ps | grep webserver | sed -e 's/\s.*//') sh -c "tail -f -n 100 /usr/local/airflow/logs/{dag_id}.log"

I hope this will fix your problem.