How to export data files stored in Google BigQuery to GZ files in a GCS folder



I use code similar to the snippet below to extract gzipped files from BigQuery to GCS. Sometimes I need to extract around 90 files, and I would like to extract one compressed folder instead of sending the files one by one.
Note: I am working in Jupyter.
Thanks for your help.

from google.cloud import bigquery

client = bigquery.Client()
project_id = 'fh-bigquery'
dataset_id = 'public_dump'
table_id = 'afinn_en_165'

bucket_name = 'your_bucket'
destination_uri = 'gs://{}/{}'.format(bucket_name, 'file.csv.gz')
dataset_ref = client.dataset(dataset_id, project=project_id)
table_ref = dataset_ref.table(table_id)

# Ask BigQuery to gzip the exported CSV.
job_config = bigquery.job.ExtractJobConfig()
job_config.compression = 'GZIP'

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
)
extract_job.result()  # Wait for the export to finish.
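One caveat worth noting: BigQuery can only export a table larger than 1 GB when the destination URI contains a wildcard, in which case it shards the output across multiple objects. A minimal sketch of such a URI, reusing the `your_bucket` name from the code above:

```python
bucket_name = 'your_bucket'
# A '*' in the object name lets BigQuery shard the export across
# several files, which is required for tables larger than 1 GB.
destination_uri = 'gs://{}/{}'.format(bucket_name, 'file-*.csv.gz')
```

The same `destination_uri` can then be passed to `client.extract_table()` unchanged.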

I don't think it is possible to extract an entire dataset with a single API request. To export each table of the dataset to the Google Cloud Storage bucket, I would iterate over the table IDs one at a time with the following code:

from google.cloud import bigquery
from google.oauth2 import service_account

key_path = "SERVICE_ACCOUNT_PATH"
credentials = service_account.Credentials.from_service_account_file(
    key_path,
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
# Pass the credentials to the client, otherwise they are never used.
client = bigquery.Client(credentials=credentials)

project_id = 'PROJECT_ID'
dataset_id = 'DATASET_ID'
bucket_name = 'BUCKET_NAME'
dataset_ref = client.dataset(dataset_id, project=project_id)

for t in client.list_tables(dataset_ref):
    print("Extracting table {}".format(t.table_id))
    # GZIP compression produces .gz files, so name the objects accordingly.
    gz_file = '{}.csv.gz'.format(t.table_id)
    destination_uri = 'gs://{}/{}'.format(bucket_name, gz_file)
    table_ref = dataset_ref.table(t.table_id)
    job_config = bigquery.job.ExtractJobConfig()
    job_config.compression = 'GZIP'
    extract_job = client.extract_table(
        table_ref,
        destination_uri,
        job_config=job_config,
    )
    extract_job.result()  # Block until this table's export completes.
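Since the extract job can only write per-table objects, one way to end up with a single archive (the original goal) is to download the exported `.csv.gz` objects afterwards and bundle them into one zip file locally. This is a sketch, not a tested pipeline: the download part (commented out) assumes the `google-cloud-storage` package and the `BUCKET_NAME` bucket from the loop above, while the bundling helper itself uses only the standard library:

```python
import zipfile
from pathlib import Path

def bundle_into_zip(file_paths, archive_path):
    """Pack the given local files into a single zip archive."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in file_paths:
            # Store each file under its bare name, not its full path.
            zf.write(p, arcname=Path(p).name)
    return archive_path

# Downloading the exported objects first (assumes google-cloud-storage
# is installed and BUCKET_NAME matches the bucket used above):
#
# from google.cloud import storage
# storage_client = storage.Client()
# local_files = []
# for blob in storage_client.list_blobs('BUCKET_NAME'):
#     blob.download_to_filename(blob.name)
#     local_files.append(blob.name)
# bundle_into_zip(local_files, 'all_tables.zip')
```

Note that re-zipping already-gzipped files saves little extra space; the point of the archive is only to get one object to hand off instead of ~90.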
