After deploying an image, I see the following exception when navigating to the deployment's webserver:
Ooops!
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.

Python version: 3.7.10
Airflow version: 2.0.2
Node: heliocentric-exploration-2937-webserver-5cb9fdb9d6-vn5gw

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.UndefinedColumn: column dag.last_parsed_time does not exist
LINE 1: ...AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_p...
                                                                       ^

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.7/site-packages/airflow/www/auth.py", line 34, in decorated
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 497, in index
    filter_dag_ids = current_app.appbuilder.sm.get_accessible_dag_ids(g.user)
  File "/usr/local/lib/python3.7/site-packages/airflow/www/security.py", line 273, in get_accessible_dag_ids
    return {dag.dag_id for dag in accessible_dags}
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
    return self._execute_and_instances(context)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3560, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
    distilled_params,
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column dag.last_parsed_time does not exist
LINE 1: ...AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_p...
                                                                       ^

[SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_parsed_time AS dag_last_parsed_time, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS dag_schedule_interval, dag.concurrency AS dag_concurrency, dag.has_task_concurrency_limits AS dag_has_task_concurrency_limits, dag.next_dagrun AS dag_next_dagrun, dag.next_dagrun_create_after AS dag_next_dagrun_create_after
FROM dag]
(Background on this error at: )
The issue looks like an incompatibility between the Airflow version in the deployment and the one in my image, but I have confirmed that my image (from Quay) uses Airflow 2.0.0 and that the deployment's Airflow version is also 2.0.0.
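For reference, the compatibility check involved here amounts to comparing two version strings. A minimal shell sketch of that comparison using `sort -V`, which orders version strings numerically (the two values below are illustrative placeholders, not read from any deployment):

```shell
# Compare two Airflow version strings; sort -V puts the older one first.
image_version="2.0.0"
deployment_version="2.0.2"
older=$(printf '%s\n%s\n' "$image_version" "$deployment_version" | sort -V | head -n 1)
if [ "$image_version" = "$deployment_version" ]; then
  echo "versions match"
else
  echo "mismatch: older side is $older"
fi
```

Note that `sort -V` handles multi-digit components correctly (e.g. 2.0.10 sorts after 2.0.2), which a plain lexicographic comparison would get wrong.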
I also tried deploying to a 2.0.2 deployment, but there I get an email that correctly tells me the versions are incompatible and that I first need to upgrade (downgrade?) the deployment:
Your recent attempt to upgrade Analytics Service 2.0 (Airflow 2.0.2) to Airflow 2.0.0 failed. To upgrade your Deployment on Astronomer to a new version of Apache Airflow, you must:

1. Initialize the Airflow Upgrade via the Deployment Settings page of the Astronomer UI or via the CLI ($ astro deployment airflow upgrade), THEN
2. Change the Astronomer Certified Image in your Dockerfile to match your new, desired version of Apache Airflow
3. Deploy changes to Astronomer ($ astro deploy)

Please complete the steps above before deploying again to Astronomer. Until you take action, your Airflow Deployment will continue to run 2.0.2. For guidelines on how to cancel this upgrade and other information, refer to our [Airflow Versioning] documentation.
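The sequence the email describes can be sketched as a small shell function (the function name is mine; the two `astro` commands are the ones quoted in the email, and step 2 remains a manual Dockerfile edit):

```shell
# Sketch of the upgrade sequence from the email above; it runs the two CLI
# steps in order and stops if the first one fails.
upgrade_deployment() {
  # 1. Initialize the Airflow upgrade on the Astronomer side.
  astro deployment airflow upgrade || return 1
  # 2. Manual step: change the Astronomer Certified Image in your Dockerfile
  #    to match the new Airflow version before running the final deploy.
  # 3. Push the updated project to Astronomer.
  astro deploy
}
```

In my case the email implies I would have to run this to move the deployment to the image's version (or rebuild the image on the deployment's version) before `astro deploy` is accepted again.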