Data Factory: scheduling multiple activities once a day doesn't work

After many tests with variations of the pipeline below, I am posting here to ask for expert help. The basic idea of the pipeline is: Activity-1 runs a U-SQL script that does some computation and writes its results to Data Lake Store. Activity-2 then takes the data produced by Activity-1 and copies it to Azure SQL. Both activities are scheduled to run once a day. However, the pipeline is never triggered. If I schedule it to run every 15 minutes, it works fine. What am I doing wrong?

{
        "name": "IncrementalLoad_Pipeline",
        "properties": {
            "description": "This is a pipeline to to pick files from Data Lake as per the slice start date time.",
            "activities": [
                {
                    "type": "DataLakeAnalyticsU-SQL",
                    "typeProperties": {
                        "scriptPath": "andeblobcontainer\script.usql",
                        "scriptLinkedService": "AzureStorageLinkedService",
                        "degreeOfParallelism": 3,
                        "priority": 100,
                        "parameters": {
                            "in": "$$Text.Format('/Input/SyncToCentralDataLog_{0:dd_MM_yyyy}.txt', Date.AddDays(SliceStart,-7))",
                            "out": "$$Text.Format('/Output/incremental_load/StcAnalytics_{0:dd_MM_yyyy}.tsv', Date.AddDays(SliceStart,-7))"
                        }
                    },
                    "inputs": [
                        {
                            "name": "IncrementalLoad_Input"
                        }
                    ],
                    "outputs": [
                        {
                            "name": "IncrementalLoad_Output"
                        }
                    ],
                    "scheduler": {
                        "frequency": "Day",
                        "interval": 1
                    },
                    "name": "IncrementalLoad",
                    "linkedServiceName": "AzureDataLakeAnalyticsLinkedService"
                },
                {
                    "type": "Copy",
                    "typeProperties": {
                        "source": {
                            "type": "AzureDataLakeStoreSource",
                            "recursive": false
                        },
                        "sink": {
                            "type": "SqlSink",
                            "writeBatchSize": 0,
                            "writeBatchTimeout": "00:00:00"
                        }
                    },
                    "inputs": [
                        {
                            "name": "IncrementalLoad_Input2"
                        },
                        {
                            "name": "IncrementalLoad_Output"
                        }
                    ],
                    "outputs": [
                        {
                            "name": "AzureSQLDatasetOutput"
                        }
                    ],
                    "scheduler": {
                        "frequency": "Day",
                        "interval": 1
                    },
                    "name": "CopyToAzureSql"
                }
            ],
            "start": "2016-09-12T23:45:00Z",
            "end": "2016-09-13T01:00:00Z",
            "isPaused": false,
            "hubName": "vijaytest-datafactory_hub",
            "pipelineMode": "Scheduled"
        }
    }

The start and end period in your pipeline JSON isn't big enough. ADF can't provision a set of daily time slices in a window shorter than a day: your pipeline is only active from 2016-09-12T23:45:00Z to 2016-09-13T01:00:00Z, a span of 1 hour and 15 minutes, so no daily slice ever falls inside it.

Try increasing the start and end period to cover a week. For example:

        "start": "2016-09-12",
        "end": "2016-09-18",

You should be able to extend the end date without having to drop and recreate the pipeline.
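
One thing worth double-checking alongside the active period: in ADF v1, slices are generated from each dataset's "availability" section, and the activity's "scheduler" must match it. Here is a minimal sketch of what the output dataset could look like with a matching daily availability (the dataset name comes from your pipeline; the linked service name, folderPath, and format are my assumptions):

        {
            "name": "IncrementalLoad_Output",
            "properties": {
                "type": "AzureDataLakeStore",
                "linkedServiceName": "AzureDataLakeStoreLinkedService",
                "typeProperties": {
                    "folderPath": "Output/incremental_load",
                    "format": {
                        "type": "TextFormat",
                        "columnDelimiter": "\t"
                    }
                },
                "availability": {
                    "frequency": "Day",
                    "interval": 1
                }
            }
        }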

Hope this helps.
