Terraform, AWS, Databricks error: cannot create instance profile



I am trying to create S3 buckets in Databricks and mount them. The file structure (see the sketch after the list):

  • Main (parent) module - creates the VPC and calls the Workspace and S3_bucket modules
  • Child module 1 - Workspace - creates the cross-account IAM role, the root bucket and the Databricks workspace
  • Child module 2 - S3_bucket - creates the cross-account policy and role for the S3 buckets, and creates and mounts the buckets themselves
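
Roughly, the parent module's main.tf wires this together as sketched below (module paths and the variables passed in are simplified for illustration):

module "workspace" {
  source = "./modules/workspace"

  # cross-account IAM role, root bucket and the Databricks workspace
  dbx_account_id = var.dbx_account_id
  prefix         = var.prefix
}

module "s3_buckets" {
  source = "./modules/s3_bucket"

  # cross-account policy/role for the S3 buckets, bucket creation and mount
  dbx_account_id = var.dbx_account_id
  prefix         = var.prefix
  aws_region     = var.aws_region
}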

The S3_bucket module is based on the example here: https://registry.terraform.io/providers/databricks/databricks/0.3.3/docs/resources/aws_s3_mount. The only difference is that I split the part that creates the IAM role and policy from the part that creates the S3 buckets and mounts them.
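
For context, the bucket-and-mount half of that example boils down to roughly the following on a 1.x provider (databricks_mount has replaced the old databricks_aws_s3_mount; the bucket and mount names here are just placeholders):

resource "aws_s3_bucket" "this" {
  bucket        = "${var.prefix}-experiments"
  force_destroy = true
}

resource "databricks_mount" "experiments" {
  name = "experiments"
  s3 {
    # instance profile registered in Databricks (defined in S3_bucket_policy.tf below)
    instance_profile = databricks_instance_profile.dbx_ip.id
    bucket_name      = aws_s3_bucket.this.bucket
  }
}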

Here is a snippet from S3_bucket_policy.tf:

data "databricks_aws_assume_role_policy" "s3_arp" {
external_id = var.dbx_account_id
}
// Step 9: Grant Databricks full access to VPC resources
resource "aws_iam_role" "s3_cross_account" {
#for_each = aws_iam_role.s3_cross_account == null ? [  var.aws_s3_cxrossaccount_iam_role_name ] : []
name               = var.prefix != null ? "${var.prefix}-${var.aws_s3_cxrossaccount_iam_role_name}" : "${var.aws_s3_cxrossaccount_iam_role_name}"
assume_role_policy = data.databricks_aws_assume_role_policy.s3_arp.json
description        = "Grants Databricks full access to VPC resources"
tags               = var.tags
}
# resource "time_sleep" "wait10s_3" {
#   depends_on = [aws_iam_role.s3_cross_account]
#   create_duration = "10s"
# }
// Step 11: Register cross-account role for multi-workspace scenario (only if you're using multi-workspace setup)
resource "databricks_mws_credentials" "s3_dbx_cred" {
  # for_each   = databricks_mws_credentials.s3_dbx_cred == null ? [var.dbx_s3_credential_name] : []
  # provider   = databricks.mws
  # depends_on = [time_sleep.wait10s_3]
  account_id       = var.dbx_account_id
  credentials_name = var.prefix != null ? "${var.prefix}-${var.dbx_s3_credential_name}" : "${var.dbx_s3_credential_name}"
  role_arn         = aws_iam_role.s3_cross_account.arn
}
// Step 12: Register your data role with instance profile
resource "aws_iam_instance_profile" "iam_ip" {
  # for_each = aws_iam_instance_profile.iam_ip == null ? [var.aws_s3_instance_profile_name] : []
  name = var.prefix != null ? "${var.prefix}-${var.aws_s3_instance_profile_name}" : "${var.aws_s3_instance_profile_name}"
  role = aws_iam_role.s3_cross_account.name
  tags = var.tags
}
# resource "time_sleep" "wait10s_4" {
#   depends_on = [aws_iam_instance_profile.iam_ip]
#   create_duration = "10s"
# }
// Step 13: Register instance profile at Databricks
resource "databricks_instance_profile" "dbx_ip" {
  # for_each   = databricks_instance_profile.dbx_ip == null ? ["OK"] : []
  # depends_on = [time_sleep.wait10s_4]
  instance_profile_arn = aws_iam_instance_profile.iam_ip.arn
}
// Step 14: now you can do `%fs ls /mnt/experiments` in notebooks
// https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mount#migration-from-other-mount-resources
resource "databricks_secret" "mount_config" {
#for_each = databricks_secret.mount_config == null ? [ "OK" ] : []
key   = "mount-config"
scope = "AWS"

string_value = jsonencode({
"s3" = {
"region"  = var.aws_region
"endpoint" = "s3.${var.aws_region}.amazonaws.com"
"enable_multipart_upload" = true
"max_retry_count" = 10
"max_parallel_uploads" = 5
}
})
}

The last two resources fail with the error message: cannot create secret: HTTP method POST is not supported by this URL

Here is what the debug log file shows:

2023-04-11T17:55:20.055+0200 [DEBUG] module.s3_buckets.databricks_secret.mount_config: applying the planned Create change
2023-04-11T17:55:20.056+0200 [INFO]  Starting apply for module.s3_buckets.databricks_instance_profile.dbx_ip
2023-04-11T17:55:20.057+0200 [DEBUG] module.s3_buckets.databricks_instance_profile.dbx_ip: applying the planned Create change
2023-04-11T17:55:21.441+0200 [DEBUG] provider.terraform-provider-databricks_v1.14.2.exe: POST /api/2.0/instance-profiles/add
> {
>   "instance_profile_arn": "arn:aws:iam::072498559190:instance-profile/vsemouch-dbx-aws-aws-s3-instance-profile"
> }
< HTTP/2.0 405 Method Not Allowed
[non-JSON document of 333 bytes]: timestamp=2023-04-11T17:55:21.436+0200
2023-04-11T17:55:21.441+0200 [ERROR] provider.terraform-provider-databricks_v1.14.2.exe: Response contains error diagnostic: diagnostic_detail= diagnostic_severity=ERROR tf_proto_version=5.3 tf_provider_addr=registry.terraform.io/databricks/databricks tf_req_id=71bf8cf3-b5b7-e56d-bb83-99ac3496d85d tf_rpc=ApplyResourceChange @module=sdk.proto diagnostic_summary="cannot create instance profile: HTTP method POST is not supported by this URL" tf_resource_type=databricks_instance_profile @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/diag/diagnostics.go:55 timestamp=2023-04-11T17:55:21.436+0200
2023-04-11T17:55:21.473+0200 [DEBUG] provider.terraform-provider-databricks_v1.14.2.exe: POST /api/2.0/secrets/put
> {
>   "key": "mount-config",
>   "scope": "AWS",
>   "string_value": "**REDACTED**"
> }
< HTTP/2.0 405 Method Not Allowed
[non-JSON document of 323 bytes]: timestamp=2023-04-11T17:55:21.472+0200
2023-04-11T17:55:21.473+0200 [ERROR] provider.terraform-provider-databricks_v1.14.2.exe: Response contains error diagnostic: tf_resource_type=databricks_secret tf_rpc=ApplyResourceChange @module=sdk.proto diagnostic_detail= diagnostic_summary="cannot create secret: HTTP method POST is not supported by this URL" tf_provider_addr=registry.terraform.io/databricks/databricks @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/diag/diagnostics.go:55 diagnostic_severity=ERROR tf_proto_version=5.3 tf_req_id=07d5f476-3415-0626-c2fa-b8ad9b3b580f timestamp=2023-04-11T17:55:21.472+0200
2023-04-11T17:55:21.634+0200 [ERROR] vertex "module.s3_buckets.databricks_secret.mount_config" error: cannot create secret: HTTP method POST is not supported by this URL
2023-04-11T17:55:21.682+0200 [ERROR] vertex "module.s3_buckets.databricks_instance_profile.dbx_ip" error: cannot create instance profile: HTTP method POST is not supported by this URL
2023-04-11T17:55:21.798+0200 [DEBUG] provider.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = transport is closing"
2023-04-11T17:55:21.817+0200 [DEBUG] provider: plugin process exited: path=.terraform/providers/registry.terraform.io/databricks/databricks/1.14.2/windows_amd64/terraform-provider-databricks_v1.14.2.exe pid=96460
2023-04-11T17:55:21.817+0200 [DEBUG] provider: plugin exited

I would be very glad if one of you experts could help me.

You need to use two different provider configurations for the different resources:

  • Account level - for creating the workspace and related objects
  • Workspace level - for creating secret scopes, secrets, etc. inside the workspace

Each resource should then reference the matching provider configuration, and you also need explicit dependencies between them.

// account level
provider "databricks" {
  alias    = "mws"
  host     = "https://accounts.cloud.databricks.com"
  username = var.databricks_account_username
  password = var.databricks_account_password
}
// workspace level
provider "databricks" {
  alias = "ws"
  host  = module.e2.workspace_url
  token = module.e2.token_value
}

and use them as follows:

  • Account-level objects:
resource "databricks_mws_workspaces" "this" {
  provider = databricks.mws
  ....
}
  • Workspace-level objects:
resource "databricks_secret" "mount_config" {
  provider = databricks.ws
  key      = "mount-config"
  scope    = "AWS"
  string_value = "..."
  depends_on   = [databricks_mws_workspaces.this]
}
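
Because your databricks_secret, databricks_instance_profile, etc. are created inside a child module, that module also has to receive the workspace-level provider configuration explicitly. A minimal sketch, assuming the alias from above and an illustrative module path:

# in the child module, declare the expected provider alias (e.g. modules/s3_bucket/versions.tf)
terraform {
  required_providers {
    databricks = {
      source                = "databricks/databricks"
      configuration_aliases = [databricks.ws]
    }
  }
}

# in the parent module, pass the workspace-level configuration down
module "s3_buckets" {
  source = "./modules/s3_bucket"
  providers = {
    databricks.ws = databricks.ws
  }
  # ...
}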

Note: also take a look at the provider's troubleshooting guide.
