Airflow Unable to mark a task inside a DAG as failed

Hi there

I have an Airflow setup running via Docker Compose, connected to an on-premises DB and a Redis cache.
For some time now, when a DAG run completes successfully with all its subtasks, Airflow correctly marks it as completed and halts execution. However, if a subtask fails or times out, Airflow is unable to mark it as failed internally, and it keeps re-running the subtask and spamming our internal Slack channel.
I've verified that the Redis cache is working correctly, and the Postgres DB has also been deleted and reinstalled to isolate the problem. Could someone shed some light on this weird behaviour? Thanks

AssertionError
[2020-06-22 07:58:29,862] {models.py:1791} INFO - Marking task as FAILED.
[2020-06-22 07:58:30,098] {models.py:1795} ERROR - Failed to send email to: ['<redacted>']
[2020-06-22 07:58:30,098] {models.py:1796} ERROR - Connection unexpectedly closed

Hi @lalit-cm, welcome to our forum!

A few questions for you:

  1. What version of Airflow are you running?
  2. Have you tried tweaking the retries param on the task that's failing? You can see an example of how that parameter works in this Airflow tutorial; it tells the task to retry only that many times after a failure (i.e. if you set it to 0, the task fails immediately without any retries, which would send only one failure notification to your Slack channel). See the minimal sketch after this list.
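
For reference, a minimal sketch of setting retries, both in default_args and as a per-task override. The DAG id, task id, and bash command are hypothetical placeholders, and the import path assumes an Airflow 1.x install; adjust to your setup:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "airflow",
    "retries": 0,                         # no automatic re-runs on failure
    "retry_delay": timedelta(minutes=5),  # only relevant when retries > 0
}

with DAG(
    dag_id="example_retry_settings",      # hypothetical DAG id
    default_args=default_args,
    start_date=datetime(2020, 6, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    flaky_task = BashOperator(
        task_id="flaky_task",             # hypothetical task id
        bash_command="exit 1",            # stand-in command that always fails
        retries=0,                        # per-task override of default_args
    )

With retries=0, a failure should produce a single FAILED state and one notification rather than repeated attempts.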