Why do I get an error after logging in to Apache Airflow?



I want to log in to Apache Airflow, but when I log in I get an error, even though I have already created the user in Ubuntu. I don't know what the problem is or how to fix it. Does anyone know a solution? I installed Apache Airflow version 2.1.3 on Ubuntu (running under Windows 10). I am not using Kubernetes or Docker, and my SQLAlchemy version is 1.3.24.

Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.8.10
Airflow version: 2.1.3
Node: ANONYM
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
self.dialect.do_execute(
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
sqlite3.OperationalError: no such column: dag.last_parsed_time
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/airflow/www/auth.py", line 34, in decorated
return func(*args, **kwargs)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/airflow/www/views.py", line 551, in index
filter_dag_ids = current_app.appbuilder.sm.get_accessible_dag_ids(g.user)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/airflow/www/security.py", line 298, in get_accessible_dag_ids
return {dag.dag_id for dag in accessible_dags}
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
return self._execute_and_instances(context)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3560, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement
ret = self._execute_context(
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
self._handle_dbapi_exception(
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
util.raise_(
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
self.dialect.do_execute(
File "/home/faustinaleo18/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such column: dag.last_parsed_time
[SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_parsed_time AS dag_last_parsed_time, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS dag_schedule_interval, dag.concurrency AS dag_concurrency, dag.has_task_concurrency_limits AS dag_has_task_concurrency_limits, dag.next_dagrun AS dag_next_dagrun, dag.next_dagrun_create_after AS dag_next_dagrun_create_after 
FROM dag]
(Background on this error at: http://sqlalche.me/e/13/e3q8)

It looks like you haven't initialized the Airflow metadata database.

I suggest you take a look at this link: https://airflow.apache.org/docs/apache-airflow/stable/start/local.html
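The error `no such column: dag.last_parsed_time` means the metadata database schema is older than what Airflow 2.1.3 expects, which typically happens when the database was created by an earlier version (or never initialized) and the migrations were never applied. A minimal sketch of the usual fix, assuming the default SQLite metadata database and that the `airflow` command is on your PATH (the username, password, and email below are placeholders, not values from your setup):

```shell
# Apply pending schema migrations to the metadata database;
# on a brand-new database this also creates the schema.
airflow db upgrade

# If this is a fresh install with no data worth keeping,
# initializing from scratch also works:
# airflow db init

# Recreate the web UI user if needed, then restart the webserver.
airflow users create \
    --username admin \
    --password admin \
    --firstname Admin \
    --lastname User \
    --role Admin \
    --email admin@example.com
airflow webserver --port 8080
```

`airflow db upgrade` is non-destructive and preserves existing DAG runs, so it is worth trying before resetting anything.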

In my case, on AWS MWAA, I had this problem because a Python DAG file had been deleted from S3, and I got the reported error message.

After re-uploading the file to its original location in S3, it started working again.
