In an Azure Data Factory pipeline, can I have a copy activity with two sinks? I have one source and two sinks (an Azure Data Lake Store for downstream processing, and blob storage for archiving).
Absolutely possible. Just add a second activity to the same pipeline that uses the same input dataset but a different output dataset.
The pipeline JSON then looks something like this:
{
  "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Pipeline.json",
  "name": "CopyActivity1",
  "properties": {
    "description": "Copy data from blob to a sql server table",
    "activities": [
      {
        "name": "CopyActivityTemplate",
        "type": "Copy",
        "inputs": [
          {
            "name": "AzureBlobLocation1"
          }
        ],
        "outputs": [
          {
            "name": "AzureSqlTableLocation1"
          }
        ],
        "typeProperties": {
          "source": {
            "type": "BlobSource"
          },
          "sink": {
            "type": "SqlSink"
          }
        }
      },
      {
        "name": "CopyActivityTemplate2",
        "type": "Copy",
        "inputs": [
          {
            "name": "AzureBlobLocation1"
          }
        ],
        "outputs": [
          {
            "name": "AzureSqlTableLocation2"
          }
        ],
        "typeProperties": {
          "source": {
            "type": "BlobSource"
          },
          "sink": {
            "type": "SqlSink"
          }
        }
      }
    ],
    "start": "2016-12-05T22:00:00Z",
    "end": "2016-12-06T01:00:00Z"
  }
}
So essentially you just need to add another activity.
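For the scenario in the question (one copy into Azure Data Lake Store for downstream processing and one copy into blob storage for archiving), the two activities would share the same blob input dataset and differ only in the output dataset and sink type. Below is a minimal sketch of just the activities array; the dataset names (AzureBlobInput1, AzureDataLakeStoreOutput1, ArchiveBlobOutput1) are placeholders you would replace with your own datasets:

"activities": [
  {
    "name": "CopyToDataLake",
    "description": "Sketch only: writes the source slice to an Azure Data Lake Store dataset (placeholder names)",
    "type": "Copy",
    "inputs": [
      {
        "name": "AzureBlobInput1"
      }
    ],
    "outputs": [
      {
        "name": "AzureDataLakeStoreOutput1"
      }
    ],
    "typeProperties": {
      "source": {
        "type": "BlobSource"
      },
      "sink": {
        "type": "AzureDataLakeStoreSink"
      }
    }
  },
  {
    "name": "CopyToArchiveBlob",
    "description": "Sketch only: archives the same source slice to a second blob dataset (placeholder names)",
    "type": "Copy",
    "inputs": [
      {
        "name": "AzureBlobInput1"
      }
    ],
    "outputs": [
      {
        "name": "ArchiveBlobOutput1"
      }
    ],
    "typeProperties": {
      "source": {
        "type": "BlobSource"
      },
      "sink": {
        "type": "BlobSink"
      }
    }
  }
]

Because each copy activity has its own output dataset, the two writes are scheduled and retried independently, so a failure in the archive copy does not block the Data Lake copy (and vice versa).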