Using the S3FileTransformOperator in an MWAA environment



I'm trying to use the S3FileTransformOperator in an MWAA environment, but I'm getting a permission error on the script file:

PermissionError: [Errno 13] Permission denied

I tried adding a BashOperator before the task to chmod the script, but it didn't work.

Has anyone used the S3FileTransformOperator in MWAA?

Hi!
Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referenced in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to move the contents of your .py file into a PythonOperator and use the S3Hook to retrieve and store the file.
Thanks!
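
A minimal sketch of that second suggestion, assuming Airflow 2 with the Amazon provider installed; the DAG id, bucket name, object keys, and the line-by-line transform are placeholders, not anything from the thread:

import os
import tempfile
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def transform_fn():
    hook = S3Hook()  # picks up the MWAA execution role credentials by default
    with tempfile.TemporaryDirectory() as tmp:
        # download the source object to a local temp file
        src = hook.download_file(key="path/to/sample.csv",
                                 bucket_name="bucket", local_path=tmp)
        dest = os.path.join(tmp, "result.csv")
        with open(src) as f_in, open(dest, "w") as f_out:
            for line in f_in:
                f_out.write(line.upper())  # placeholder transform
        # upload the transformed file back to S3
        hook.load_file(filename=dest, key="path/to/result.csv",
                       bucket_name="bucket", replace=True)

with DAG("s3_transform_via_hook", start_date=datetime(2021, 1, 1),
         schedule_interval=None) as dag:
    transform = PythonOperator(task_id="transform",
                               python_callable=transform_fn)

Because the download, transform, and upload all happen inside one task, nothing depends on file permissions in the read-only /dags folder or on state shared between ephemeral workers.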

Reference: https://repost.aws/questions/QUvsbZds_NQTG7JSxCQ11djQ/s-3-file-transform-operator-permission-denied-on-script

I tried that, but now the first step fails with a different error:

from airflow import DAG
from airflow.operators.bash import BashOperator
# deprecated shim path; on Airflow 2 the provider module
# airflow.providers.amazon.aws.operators.s3_file_transform is preferred
from airflow.operators.s3_file_transform_operator import S3FileTransformOperator

with DAG(...) as dag:
    # try to make the script executable before the transform runs
    chmod = BashOperator(
        task_id="chmod",
        bash_command="chmod +x /usr/local/airflow/dags/transform.py",
    )
    transform = S3FileTransformOperator(
        task_id="transform",
        source_s3_key="s3://bucket/path/to/sample.csv",
        dest_s3_key="s3://bucket/path/to/result.csv",
        transform_script="/usr/local/airflow/dags/transform.py",
    )
    chmod >> transform  # was "chmod >> expose", which is undefined
The chmod task itself fails with:

chmod: changing permissions of ‘/usr/local/airflow/dags/transform.py’: Read-only file system

Other reference material from AWS shows that running a bash script as the transform works, but I haven't managed to make it work with Python yet.

https://docs.aws.amazon.com/mwaa/latest/userguide/t-apache-airflow-202.html#op-s3-transform
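
A minimal sketch of that bash-script pattern, assuming an Amazon provider version before 4.0 (where the operator lives at airflow.providers.amazon.aws.operators.s3_file_transform). The operator downloads source_s3_key to a local temp file, runs transform_script with the local source and destination paths as its first two arguments, then uploads the destination file to dest_s3_key. The script contents and keys below are placeholders, not the exact sample from the AWS page:

# The operator runs: transform_script <local_source> <local_dest> [args],
# so the shell script reads $1 and writes $2. Placeholder contents for
# /usr/local/airflow/dags/transform.sh:
#
#   #!/bin/bash
#   cp "$1" "$2"
#
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.s3_file_transform import S3FileTransformOperator

with DAG("s3_transform_sh_sketch", start_date=datetime(2021, 1, 1),
         schedule_interval=None) as dag:
    transform = S3FileTransformOperator(
        task_id="transform",
        source_s3_key="s3://bucket/path/to/sample.csv",
        dest_s3_key="s3://bucket/path/to/result.csv",
        transform_script="/usr/local/airflow/dags/transform.sh",
    )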
