I am copying data from an Azure SQL DB to blob storage via a query.
Here is the activity script:
{
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "select distinct a.*, b.Name from [dbo].[Transactxxxxxxx] a join dbo.Anxxxxx b on a.[Clixxxxx] = b.[Fixxxxxx] where b.Name = 'associations'"
        },
        "sink": {
            "type": "BlobSink",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
        }
    },
    "inputs": [
        {
            "name": "Txnsxxxxxxxxxxx"
        }
    ],
    "outputs": [
        {
            "name": "Txnxxxxxxxxxxxx"
        }
    ],
    "policy": {
        "timeout": "01:00:00",
        "concurrency": 1,
        "retry": 3
    },
    "scheduler": {
        "frequency": "Hour",
        "interval": 1
    },
    "name": "Copyxxxxxxxxxx"
}
The activity appears to run successfully, but it does not put any files into the sink.
The dataset points to the correct container.
Based on the information you provided, I found logs of successful runs in our service. I noticed that the target blob is specified as "实验输入/Inxxx_To_xx_Associations.csv/Inxxx_To_xx.csv". The blob name is static, so multiple slice runs overwrite the same blob file. You can use the partitionedBy property to get dynamic blob names. See this article for more details: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-blob-connector/#azure-blob-dataset-type-properties.
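As an illustration, a blob dataset's typeProperties can use partitionedBy with SliceStart-based DateTime variables so each hourly slice writes to its own path. This is a sketch with placeholder container and file names, not your actual dataset definition:

```json
{
    "typeProperties": {
        "folderPath": "mycontainer/transactions/year={Year}/month={Month}/day={Day}/hour={Hour}",
        "fileName": "output.csv",
        "format": {
            "type": "TextFormat",
            "columnDelimiter": ","
        },
        "partitionedBy": [
            { "name": "Year", "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
            { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
            { "name": "Day", "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } },
            { "name": "Hour", "value": { "type": "DateTime", "date": "SliceStart", "format": "HH" } }
        ]
    }
}
```

With this setup, a slice starting at 2016-10-20 10:00 would land in mycontainer/transactions/year=2016/month=10/day=20/hour=10/output.csv, so later slices no longer overwrite earlier ones.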