Airflow with Oracle backend - DAGs not running

Hi, I’ve set up an instance of Airflow using Oracle as the metadata backend. I ran into a few issues with the Airflow DB migration scripts not being compatible (they tried to use data types that don’t exist in Oracle, etc.), so I had to apply some changes manually and make a few decisions along the way. Once I believed everything was working, I triggered a DAG. Airflow marks it as running, and in the UI the “DAG Runs” column shows one running run, but nothing shows up in the “Recent Tasks” column, and I can see that no Kubernetes pods have started for the DAG’s tasks.

I’m wondering whether one change I had to make because of an Oracle limitation may be part of the problem: “timestamp with time zone” columns cannot be part of a primary key or unique constraint, so I changed all of those to “timestamp with local time zone” - but I’m not sure. Has anyone else here faced similar issues with an Oracle backend?
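For anyone wanting to check the same symptom: querying the task_instance table directly should show whether any task instances are being created for the run at all, and what state and execution_date come back. A rough SQLAlchemy sketch is below; the connection string and DAG id are placeholders, and the table/column names are the standard Airflow 1.10 metadata schema.

    # Rough check: is the scheduler creating task_instance rows for the run,
    # and what state / execution_date do they come back with?
    # The connection string and DAG id are placeholders.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "oracle+cx_oracle://airflow:secret@dbhost:1521/?service_name=AIRFLOW"
    )

    with engine.connect() as conn:
        rows = conn.execute(
            text("""
                SELECT dag_id, task_id, execution_date, state, start_date
                FROM task_instance
                WHERE dag_id = :dag_id
                ORDER BY execution_date DESC
            """),
            {"dag_id": "example_dag"},
        )
        for dag_id, task_id, execution_date, state, start_date in rows:
            print(dag_id, task_id, execution_date, state, start_date)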

Thanks in advance.

Running with Oracle as the metadata backend is totally unsupported, so you are largely on your own here, sorry. I can give you a few pointers though.

Changing the timezone behaviour could cause this, if the value is written as one TZ (UTC) but comes back as another (local). A better option, if it’s available, would be “timestamp withOUT time zone”. That way the code in Airflow that ensures the TZ is always written as UTC (see https://github.com/apache/airflow/blob/v1-10-stable/airflow/utils/sqlalchemy.py) will not get confused and convert from local to UTC when the value is read back out.
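For context, the pattern in that module looks roughly like the sketch below (simplified, not the exact Airflow code): values are normalised to UTC on the way in, and anything coming back without a timezone is assumed to already be UTC.

    # Simplified sketch of the UTC-normalising column type pattern used in
    # airflow/utils/sqlalchemy.py - not the exact Airflow code.
    import datetime

    from sqlalchemy.types import DateTime, TypeDecorator


    class UtcDateTime(TypeDecorator):
        """Always stores UTC; expects timezone-aware values on the way in."""

        impl = DateTime(timezone=True)

        def process_bind_param(self, value, dialect):
            if value is None:
                return value
            if value.tzinfo is None:
                raise ValueError("naive datetime is not allowed")
            # Always write UTC, regardless of the incoming timezone.
            return value.astimezone(datetime.timezone.utc)

        def process_result_value(self, value, dialect):
            # A naive datetime coming back from the DB is assumed to be UTC.
            # A backend that silently converts stored values to *local* time
            # on read breaks this assumption.
            if value is not None and value.tzinfo is None:
                value = value.replace(tzinfo=datetime.timezone.utc)
            return value

If Oracle’s “timestamp with local time zone” type converts the stored value on the way back out, the round trip no longer returns what was written, which is exactly the kind of mismatch described above.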