Hi there
I have an Airflow setup running via docker compose, connected to an on-premises database and a Redis cache.
For some time now, when a DAG run completes successfully with all of its subtasks, Airflow correctly marks it as completed and halts execution. However, if a subtask fails or times out, Airflow is unable to mark it as Failed internally, so it keeps re-running the subtask and spamming our internal Slack channel.
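For reference, retry behaviour is normally bounded by the DAG's default_args. A minimal sketch of what such a configuration looks like (all values here are placeholders, not our actual DAG):

```python
from datetime import timedelta

# Illustrative default_args only -- owner, email address, and the
# retry values are placeholders, not taken from the real setup.
default_args = {
    "owner": "airflow",
    "retries": 3,                       # cap retries so a failing task cannot loop forever
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,           # one alert per failure, not a stream
    "email": ["alerts@example.com"],    # placeholder address
}
```

With a finite `retries` value like this, a failing task should be marked Failed after the retries are exhausted rather than rescheduled indefinitely.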
I’ve verified that the Redis cache is working correctly, and I’ve also deleted and reinstalled the Postgres DB to isolate the problem. Could someone shed some light on this weird behaviour? Thanks. Here is the relevant log output:
AssertionError
[2020-06-22 07:58:29,862] {models.py:1791} INFO - Marking task as FAILED.
[2020-06-22 07:58:30,098] {models.py:1795} ERROR - Failed to send email to: ['<redacted>']
[2020-06-22 07:58:30,098] {models.py:1796} ERROR - Connection unexpectedly closed
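The "Connection unexpectedly closed" error suggests the SMTP connection is being dropped when Airflow tries to send the failure email. Those settings live in the [smtp] section of airflow.cfg; for context, a sketch with placeholder values (host, port, and TLS settings are assumptions, not our real configuration):

```ini
[smtp]
# Placeholder values -- the real host and sender are redacted
smtp_host = smtp.example.com
smtp_starttls = True
smtp_ssl = False
smtp_port = 587
smtp_mail_from = airflow@example.com
```

A mismatch between the port and the TLS settings (e.g. plain port 25 with smtp_starttls enabled, or port 465 without smtp_ssl) is a common cause of the server closing the connection mid-handshake.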