Apache Airflow error when running a DAG (error - [Errno 2] No such file or directory)



I'm trying to figure out why this error occurs. Is a dependency missing? Is it a version issue? And why does it happen only for this one DAG and not for any others?

The error is:

FileNotFoundError: [Errno 2] No such file or directory: /home/airflow/composer_kube_config
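Before looking at the DAG itself, one quick way to narrow this down is to check on the worker whether the kube config file the operator expects actually exists. A minimal, stdlib-only sketch (the helper name `resolve_config_file` is hypothetical, not part of Airflow):

```python
import os

# Path that Cloud Composer's KubernetesPodOperator defaults to,
# taken from the error message above.
COMPOSER_KUBE_CONFIG = '/home/airflow/composer_kube_config'

def resolve_config_file(path=COMPOSER_KUBE_CONFIG):
    # Return the path if the file exists on this machine, else None,
    # so the caller can fall back to an explicit config_file instead
    # of hitting FileNotFoundError inside the operator.
    return path if os.path.exists(path) else None
```

Running this from a one-off task (or `gcloud composer environments run ... ` shell access) tells you whether the file is simply absent on the worker that scheduled this DAG.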

Here is our DAG:

import datetime

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.contrib.kubernetes.secret import Secret
from airflow.contrib.kubernetes.volume import Volume
from airflow.contrib.kubernetes.volume_mount import VolumeMount
# from airflow.contrib.kubernetes.pod import Port
from utils.constants import (DEFAULT_ARGS, DUMB_BUCKET, SCHEMA_BUCKET, PROJECT,
                             CLOUD_COMPOSER_SERVICE_ACCOUNT_SECRET, STAGING_BUCKET)

volume_mount = VolumeMount(
    'secret',
    mount_path='/etc/secret',
    sub_path=None,
    read_only=True
)

volume_config = {'persistentVolumeClaim': {'claimName': 'all-ftp'}}
volume = Volume(name='secret', configs=volume_config)

with DAG(
    'ftp_file_poller',
    schedule_interval="55 6 * * *",
    start_date=datetime.datetime(2020, 7, 1)
) as dag:
    poller = KubernetesPodOperator(
        secrets=[CLOUD_COMPOSER_SERVICE_ACCOUNT_SECRET],
        task_id='ftp-file-poller',
        name='ftp-polling',
        cmds=['ftp-poller'],
        namespace='default',
        image='us.gcr.io/<our gcp project>/ftp-poller:v7',
        is_delete_operator_pod=True,
        get_logs=True,
        volumes=[volume],
        volume_mounts=[volume_mount]
    )
    poller.doc = """
    about this DAG info
    """

Here is the one note about this file that I found in the documentation:

# Only name, namespace, image, and task_id are required to create a
# KubernetesPodOperator. In Cloud Composer, currently the operator defaults
# to using the config file found at `/home/airflow/composer_kube_config` if
# no `config_file` parameter is specified. By default it will contain the
# credentials for Cloud Composer's Google Kubernetes Engine cluster that is
# created upon environment creation.
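Given that note, one option worth trying is to pass `config_file` explicitly instead of relying on the implicit default. As a hypothetical sketch (the `pod_kwargs` dict is just illustration; you would splat it into the operator as `KubernetesPodOperator(**pod_kwargs)`):

```python
# Sketch: pin the kube config path explicitly via the config_file
# parameter that the documentation comment above refers to.
pod_kwargs = {
    'task_id': 'ftp-file-poller',
    'name': 'ftp-polling',
    'namespace': 'default',
    'image': 'us.gcr.io/<our gcp project>/ftp-poller:v7',
    # Making the default explicit; any readable kubeconfig path works here.
    'config_file': '/home/airflow/composer_kube_config',
}
```

This at least makes the dependency on that file visible in the DAG source rather than buried in the operator's defaults.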

This was resolved by adding the DEFAULT_ARGS constant to the DAG definition, like so:

with DAG(dag_id='ftp_file_poller',
         schedule_interval="55 6 * * *",
         start_date=datetime.datetime(2020, 7, 1),
         default_args=DEFAULT_ARGS) as dag:
