Azure Function timer trigger fails when I add my script to __init__.py



I wrote a Python script that pulls JSON documents from a MongoDB database and runs an ETL process over them. My problem: when I run the script on its own it works perfectly fine, but when I put the code into the function's main __init__.py file so it can run on a timer trigger, it fails.

Am I doing something wrong in the main __init__.py file, or is there something I have missed?

This is the error I get when debugging locally:

Executed 'Functions.TestTimerTrigger' (Failed, Id=XXXX, Duration=30285ms)
[2021-01-18T10:41:11.323Z] System.Private.CoreLib: Exception while executing function: Functions.TestTimerTrigger. System.Private.CoreLib: Result: Failure
Exception: ServerSelectionTimeoutError: XXXX:27017: timed out, Timeout: 30s, Topology Description: <TopologyDescription id: 600565a938892b2d0c79ba96, topology_type: Single, servers: [<ServerDescription ('XXXX', 27017) server_type: Unknown, rtt: None, error=NetworkTimeout('XXXX:27017: timed out')>]>
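A ServerSelectionTimeoutError means pymongo gave up after the default 30-second server-selection window without reaching any server, so the failure is a connectivity problem rather than something in the function body. As a quick diagnostic, here is a minimal sketch (reusing the same placeholder connection string) that fails fast and logs the underlying network error:

import logging
from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

# Fail after 5 seconds instead of the default 30 so the network problem surfaces quickly.
client = MongoClient("XXXX", serverSelectionTimeoutMS=5000)

try:
    client.admin.command("ping")  # forces server selection immediately
    logging.info("MongoDB is reachable from this host")
except ServerSelectionTimeoutError as exc:
    logging.error("Cannot reach MongoDB from this host: %s", exc)

Running this locally and again from the deployed Function App makes it easy to see whether the database is reachable from each environment.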

The Python script run on its own:

from pymongo import MongoClient
import pandas as pd
from azure.storage.filedatalake import DataLakeServiceClient
from azure.core._match_conditions import MatchConditions
from azure.storage.filedatalake._models import ContentSettings
from pandas import json_normalize
from datetime import datetime, timedelta

mongo_client = MongoClient("XXXX")  # MongoDB connection string
db = mongo_client.x_db              # database name
table = db.levels                   # collection name
document = table.find()
mongo_docs = list(document)
mongo_docs = json_normalize(mongo_docs)
mongo_docs.to_csv("test.csv", sep=",", index=False)

try:
    global service_client

    service_client = DataLakeServiceClient(account_url="{}://{}.dfs.core.windows.net".format(
        "https", "XXXX"), credential='XXXX')

    file_system_client = service_client.get_file_system_client(file_system="root")  # file system name
    directory_client = file_system_client.get_directory_client("testfolder")
    file_client = directory_client.create_file("test.csv")  # name of the file being created
    local_file = open(r"XXXtest.csv", 'rb')  # local file to upload
    file_contents = local_file.read()
    file_client.upload_data(file_contents, overwrite=True)
except Exception as e:
    print(e)

The same script placed in the main __init__.py file:

import datetime
import logging
import json
import requests
import azure.functions as func
from pymongo import MongoClient
import pandas as pd
from azure.storage.filedatalake import DataLakeServiceClient
from azure.core._match_conditions import MatchConditions
from azure.storage.filedatalake._models import ContentSettings
from pandas import json_normalize


def main(mytimer: func.TimerRequest) -> None:
    mongo_client = MongoClient("XXXX")  # MongoDB connection string
    db = mongo_client.x_db              # database name
    table = db.levels                   # collection name
    document = table.find()
    mongo_docs = list(document)
    mongo_docs = json_normalize(mongo_docs)
    mongo_docs.to_csv("test.csv", sep=",", index=False)

    try:
        global service_client

        service_client = DataLakeServiceClient(account_url="{}://{}.dfs.core.windows.net".format(
            "https", "XXXX"), credential='XXXX')

        file_system_client = service_client.get_file_system_client(file_system="root")
        directory_client = file_system_client.get_directory_client("testfolder")
        file_client = directory_client.create_file("test.csv")  # name of the file being created
        local_file = open(r"XXXXtest.csv", 'rb')  # local file to upload
        file_contents = local_file.read()
        file_client.upload_data(file_contents, overwrite=True)
    except Exception as e:
        print(e)

    utc_timestamp = datetime.datetime.utcnow().replace(
        tzinfo=datetime.timezone.utc).isoformat()
    if mytimer.past_due:
        logging.info('The timer is past due!')
    logging.info('Python timer trigger function ran at %s', utc_timestamp)

Any help would be greatly appreciated.

I think the error message says it all:

error=NetworkTimeout('XXXX:27017: timed out')

You are connecting to something that you cannot reach from the Function App. It may be unreachable because it sits on a private network, because the Function App has not been added to the resource's firewall/IP allow list, or because the port is wrong.

Either way: make sure the resources your code connects to are reachable from the Azure Function App.
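If it helps, one way to verify this from inside the Function App (for example, temporarily from the timer function) is a plain TCP check against the MongoDB host and port. The host below is a placeholder and can_reach is just an illustrative helper:

import socket

def can_reach(host: str, port: int = 27017, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. logging.info("MongoDB reachable: %s", can_reach("XXXX"))

If this returns False in Azure but True locally, the database's firewall or network configuration is blocking the Function App rather than anything in __init__.py.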
