I want to SSH from Composer to a VM over its internal IP (gcloud compute ssh --internal-ip), but I am getting a permission error. I tested SSH with the VM and Composer on the same VPC/subnet and got the results below. What could be causing this, and how can I get the SSH connection to succeed?
DAG
"""A liveness prober dag for monitoring composer.googleapis.com/environment/healthy."""
import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import timedelta
default_args = {
'start_date': airflow.utils.dates.days_ago(0),
'retries': 1,
'retry_delay': timedelta(minutes=5)
}
dag = DAG(
'in-ssh',
default_args=default_args,
description='liveness monitoring dag',
schedule_interval=None,
dagrun_timeout=timedelta(minutes=20))
# priority_weight has type int in Airflow DB, uses the maximum.
t1 = BashOperator(
task_id='in-ssh',
bash_command='gcloud beta compute ssh --zone "asia-northeast1-a" "dev-testserver01" --internal-ip --project "project"',
dag=dag,
depends_on_past=False,
priority_weight=2**31-1)
Error log
[2021-04-01 03:12:52,404] {bash_operator.py:158} INFO - Updating project ssh metadata...
[2021-04-01 03:13:09,570] {bash_operator.py:158} INFO - .....................................................................................Updated [https://www.googleapis.com/compute/beta/projects/project].
[2021-04-01 03:13:10,257] {bash_operator.py:158} INFO - ...done.
[2021-04-01 03:13:10,301] {bash_operator.py:158} INFO - Waiting for SSH key to propagate.
[2021-04-01 03:13:10,573] {bash_operator.py:158} INFO - Warning: Permanently added 'compute.6676320815635940303' (ECDSA) to the list of known hosts.
[2021-04-01 03:13:10,665] {bash_operator.py:158} INFO - airflow@192.168.10.8: Permission denied (publickey).
[2021-04-01 03:14:07,028] {bash_operator.py:158} INFO - ERROR: (gcloud.beta.compute.ssh) Could not SSH into the instance. It is possible that your SSH key has not propagated to the instance yet. Try running this command again. If you still cannot connect, verify that the firewall and instance are set to accept ssh traffic.
[2021-04-01 03:14:07,654] {bash_operator.py:162} INFO - Command exited with return code 1
[2021-04-01 03:14:07,697] {taskinstance.py:1152} ERROR - Bash command failed
Instead of using BashOperator() to SSH into the VM instance, use ComputeEngineSSHHook() and pass your parameters to it. For reference on this hook, see this document.
You can also refer to this GitHub link, example_compute_ssh.py. It shows how to use ComputeEngineSSHHook() to connect to VMs with different settings.
Here is a snippet from the GitHub link:
import os

from airflow.providers.google.cloud.hooks.compute_ssh import ComputeEngineSSHHook
from airflow.providers.ssh.operators.ssh import SSHOperator

GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')
GCE_ZONE = os.environ.get('GCE_ZONE', 'europe-west2-a')
GCE_INSTANCE = os.environ.get('GCE_INSTANCE', 'target-instance')

vm_ssh = SSHOperator(
    task_id="vm_ssh",
    ssh_hook=ComputeEngineSSHHook(
        instance_name=GCE_INSTANCE,
        zone=GCE_ZONE,
        project_id=GCP_PROJECT_ID,
        use_oslogin=True,
        use_iap_tunnel=False,
        use_internal_ip=True,  # include this line if you are using internal ip
    ),
    command="echo vm_ssh",
)
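Applied to your setup, here is a minimal sketch of how your original DAG could be adapted, reusing the zone, instance, and project values from your gcloud command and the DAG settings you already have. It assumes OS Login is enabled on the instance and that the Composer environment's service account has the IAM permissions OS Login requires; adjust the command and values to your environment.

"""Adapted liveness prober dag using ComputeEngineSSHHook over the internal IP."""
from datetime import timedelta

import airflow
from airflow import DAG
from airflow.providers.google.cloud.hooks.compute_ssh import ComputeEngineSSHHook
from airflow.providers.ssh.operators.ssh import SSHOperator

default_args = {
    'start_date': airflow.utils.dates.days_ago(0),
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG(
    'in-ssh',
    default_args=default_args,
    description='liveness monitoring dag',
    schedule_interval=None,
    dagrun_timeout=timedelta(minutes=20))

t1 = SSHOperator(
    task_id='in-ssh',
    ssh_hook=ComputeEngineSSHHook(
        instance_name='dev-testserver01',   # values taken from your gcloud command
        zone='asia-northeast1-a',
        project_id='project',
        use_oslogin=True,        # assumes OS Login is enabled on the instance
        use_iap_tunnel=False,
        use_internal_ip=True,    # equivalent to the --internal-ip flag you used
    ),
    command='echo connected',    # replace with the command you want to run on the VM
    dag=dag)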