Moving a file from one bucket to another with the googleCloudStorageR library



I have an R script in a Docker container so that I can execute it on Google Cloud Run.

After fully processing the file X.csv located in the bucket "Input", I want to move it to the bucket "Done". How can I do that with googleCloudStorageR?

`googleCloudStorageR::gcs_upload("myfile.csv")` 

does not seem to accept the gs:// syntax:

googleCloudStorageR::gcs_upload("gs://Input/X")

returns the error:

Path 'gs://Input/X.csv' does not exist

Also, the bucket name is not among the gcs_upload() function's arguments. Do I have to set the default bucket to "Done"?

googleCloudStorageR::gcs_global_bucket("Done")

Thanks.

If you are using the CRAN version (cran.r-project), this documentation shows how to upload objects:

## upload a file - type will be guessed from file extension, or supply type
filename <- "mtcars.csv"   ## define the local filename used below
write.csv(mtcars, file = filename)
gcs_upload(filename)

## upload an R data.frame directly - will be converted to csv via write.csv
gcs_upload(mtcars)

## upload an R list - will be converted to json via jsonlite::toJSON
gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))

## upload an R data.frame directly, with a custom function
## function should have arguments 'input' and 'output'
## safest to supply type too
f <- function(input, output) write.csv(input, row.names = FALSE, file = output)
gcs_upload(mtcars,
           object_function = f,
           type = "text/csv")

If you are using the cloudyr (development) version, the gcs_upload() signature is:

gcs_upload(
  file,
  bucket = gcs_get_global_bucket(),
  type = NULL,
  name = deparse(substitute(file)),
  object_function = NULL,
  object_metadata = NULL,
  predefinedAcl = c("private", "bucketLevel", "authenticatedRead",
                    "bucketOwnerFullControl", "bucketOwnerRead", "projectPrivate",
                    "publicRead", "default"),
  upload_type = c("simple", "resumable")
)

gcs_upload_set_limit(upload_limit = 5000000L)
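
With the bucket argument shown above, the move you describe can be built from three calls: download the object, re-upload it to the destination bucket, and delete the source copy. A minimal sketch, assuming your credentials can read "Input" and write "Done" (gcs_get_object and gcs_delete_object come from the same package):

library(googleCloudStorageR)

## download X.csv from the "Input" bucket to a local temp file
tmp <- tempfile(fileext = ".csv")
gcs_get_object("X.csv", bucket = "Input", saveToDisk = tmp)

## re-upload it to the "Done" bucket under the same object name
gcs_upload(tmp, bucket = "Done", name = "X.csv")

## delete the original so the object is moved rather than copied
gcs_delete_object("X.csv", bucket = "Input")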

If you want to set the bucket:

## set global bucket so don't need to keep supplying in future calls
gcs_global_bucket("my-bucket")
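
Once the global bucket is set, later calls can omit the bucket argument; for example (assuming X.csv exists locally):

gcs_global_bucket("Done")
gcs_upload("X.csv")  ## uploaded to the global bucket "Done"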

You can find this documentation at:

googleCloudStorageR master/docs/reference/gcs_upload.html
