C#: How can I insert a large amount of data more efficiently using a List?



I am trying to insert data from a third party's (Sage) REST API directly into Azure SQL using C# code.

When I try to add around 30,000 rows of data, I get the error shown at the bottom at line 66, which is where I have

resultList.AddRange(result.Data);

I am only trying three columns of data, and I cannot add any more columns.

Currently, I am using the "Basic" tier of the Azure SQL database.

I am not sure whether I can improve the C# code to make it more efficient (for example, by using a better way of adding to the List).

Or should I just upgrade the Azure SQL database tier to get more DTUs?

Or should I not convert (serialize) the JSON for Azure SQL at all, but instead insert the JSON directly into Cosmos DB first and move it to Azure SQL afterwards?

The error message is below:

System.Data.SqlClient.SqlException
HResult=0x80131904
Message=Timeout expired.  The timeout period elapsed prior to completion of the operation or 
the server is not responding.
Source=Core .Net SqlClient Data Provider
StackTrace:
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean 
breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, 
Boolean callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParserStateObject.ThrowExceptionAndWarning(Boolean 
callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, 
UInt32 error)
at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, 
SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject 
stateObj, Boolean& dataReady)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
at Sage0413.Program.Main(String[] args) in C:UserssourcereposSageMain.cs:line 66

Here is the part of the C# code that has List.AddRange:

List<dynamic> resultList = new List<dynamic>();
do
{
    // Page through the Sage API results, pagesize rows at a time
    query.Offset = offset;
    query.PageSize = pagesize;
    Task<OnlineResponse> task = client.Execute(query);
    OnlineResponse response = task.Result;
    Result result = response.Results[0];
    offset += pagesize;
    // Accumulate every page into one in-memory list
    resultList.AddRange(result.Data);
} while (offset < 30000);
resultJsonString = JsonConvert.SerializeObject(resultList);
return resultJsonString;

Here is the part of the C# code that has SqlBulkCopy:

foreach (var record in data)
{
    // One DataRow per record, with the three columns being inserted
    table.Rows.Add(new object[] { record.Detail.RecordNo, record.Detail.BatchKey, record.Detail.WhenModified });
}

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";
    conn.Open();
    bulk.WriteToServer(table);
}

You need to adjust the timeout value to allow more time than the default (30 seconds):

bulk.BulkCopyTimeout = 60; // seconds
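
As a minimal sketch, the setting goes on the SqlBulkCopy instance before WriteToServer is called. The BatchSize line below is an assumption added for illustration, not part of the original code; it splits the 30,000 rows into smaller commits, which tends to be gentler on a Basic-tier database.

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";
    bulk.BulkCopyTimeout = 60;  // seconds; 0 means no timeout limit
    bulk.BatchSize = 5000;      // assumption: commit in chunks instead of one 30,000-row batch
    conn.Open();
    bulk.WriteToServer(table);
}

If 60 seconds is still not enough on the Basic tier, raising the timeout only masks the slowness; tuning the batch size or upgrading the tier for more DTUs addresses the underlying throughput limit.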

Reference:

  • https://learn.microsoft.com/en-us/dotnet/api/system.data.sqlclient.sqlbulkcopy.bulkcopytimeout
