CsvHelper stream too long



I have a problem saving a large amount of data (>2GB) to Azure Blob Storage with CsvHelper: I get the error "Stream was too long." Can anyone help me solve this? Thanks in advance! Here is my code:

public static void EXPORT_CSV(DataTable dt, string fileName, ILogger log)
{
    try
    {
        // Retrieve storage account from connection string.
        var cnStorage = Environment.GetEnvironmentVariable("cnStorage");
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(cnStorage);
        // Create the blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        // Retrieve reference to a previously created container.
        CloudBlobContainer container = blobClient.GetContainerReference("dataexport");
        bool exists = container.CreateIfNotExists();
        // Retrieve reference to a blob named "fileName".
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
        var stream = new MemoryStream();
        using (var writer = new StreamWriter(stream))
        using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            csvWriter.Configuration.TypeConverterOptionsCache.GetOptions<DateTime>().Formats = new[] { "dd/MM/yyyy" };
            foreach (DataColumn column in dt.Columns)
            {
                csvWriter.WriteField(column.ColumnName);
            }
            csvWriter.NextRecord();
            foreach (DataRow row in dt.Rows)
            {
                for (var i = 0; i < dt.Columns.Count; i++)
                {
                    csvWriter.WriteField(row[i]);
                }
                csvWriter.NextRecord();
            }
            csvWriter.Flush();
            writer.Flush();
            stream.Position = 0;
            log.LogInformation($"C# BatchDataExportCSVsegnalazioni START UploadFromStream  at: {DateTime.Now}");
            blockBlob.UploadFromStream(stream);
            log.LogInformation($"C# BatchDataExportCSVsegnalazioni END UploadFromStream  at: {DateTime.Now}");
        }
    }
    catch (Exception ex)
    {
        log.LogError("Error upload BatchDataExportCSVsegnalazioni: " + ex.Message);
    }
}

The error is most likely caused by buffering the data in a MemoryStream rather than by CsvHelper itself: a MemoryStream is backed by a single byte array, which cannot grow past about 2 GB, so it throws "Stream was too long." See whether the problem can be fixed by:

  1. Writing the data to a FileStream instead of a MemoryStream, so the CSV is buffered on disk rather than in memory (a fuller sketch follows after this list):

    using (var fileStream = File.Create(path))
    // or: using (var fileStream = new FileStream(path, FileMode.OpenOrCreate))
    using (var writer = new StreamWriter(fileStream))
    using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        // ... write the header and rows exactly as before ...
    }
  2. Or, creating the file in Azure Storage directly through a writable blob stream, using the CloudBlockBlob API, or the equivalent in the assembly Azure.Storage.Blobs, namespace Azure.Storage.Blobs.Specialized (a sketch for this option follows the references below).
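
A minimal sketch of the first option, assuming the same Microsoft.Azure.Storage SDK and CsvHelper setup as the question; the helper name WriteCsvViaTempFile is hypothetical. The CSV is written to a temp file, then uploaded with CloudBlockBlob.UploadFromFile, which streams the file to the blob in blocks:

using System;
using System.Data;
using System.Globalization;
using System.IO;
using CsvHelper;
using Microsoft.Azure.Storage.Blob;

public static class CsvExport
{
    // Hypothetical helper: write the DataTable to a temp file, then upload it.
    public static void WriteCsvViaTempFile(DataTable dt, CloudBlockBlob blob)
    {
        var path = Path.GetTempFileName();
        try
        {
            using (var fileStream = File.Create(path))
            using (var writer = new StreamWriter(fileStream))
            using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
            {
                // Header row.
                foreach (DataColumn column in dt.Columns)
                {
                    csvWriter.WriteField(column.ColumnName);
                }
                csvWriter.NextRecord();
                // One CSV record per DataRow.
                foreach (DataRow row in dt.Rows)
                {
                    for (var i = 0; i < dt.Columns.Count; i++)
                    {
                        csvWriter.WriteField(row[i]);
                    }
                    csvWriter.NextRecord();
                }
            } // Disposing the writers flushes everything to disk.
            // UploadFromFile streams the file to the blob in blocks.
            blob.UploadFromFile(path);
        }
        finally
        {
            File.Delete(path); // Clean up the temp file.
        }
    }
}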

See "Handling large files in Azure with Blob storage streams", e.g.:

var stream = blob.OpenWrite()

Also see "Considerations when uploading files to Azure Blob Storage with .NET streams".
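
And a minimal sketch of the second option, assuming the Azure.Storage.Blobs v12 package and the existing "dataexport" container from the question; the helper name WriteCsvToBlob is hypothetical. BlockBlobClient.OpenWrite(overwrite: true) returns a stream that commits blocks as you write, so the full CSV never has to fit in memory:

using System.Data;
using System.Globalization;
using System.IO;
using Azure.Storage.Blobs.Specialized;
using CsvHelper;

public static class CsvExportV12
{
    // Hypothetical helper built on the Azure.Storage.Blobs v12 SDK.
    public static void WriteCsvToBlob(DataTable dt, string connectionString, string fileName)
    {
        // Assumes the "dataexport" container already exists.
        var blob = new BlockBlobClient(connectionString, "dataexport", fileName);
        // OpenWrite(overwrite: true) gives a stream that uploads in blocks.
        using (Stream blobStream = blob.OpenWrite(overwrite: true))
        using (var writer = new StreamWriter(blobStream))
        using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            // Header row.
            foreach (DataColumn column in dt.Columns)
            {
                csvWriter.WriteField(column.ColumnName);
            }
            csvWriter.NextRecord();
            // One CSV record per DataRow.
            foreach (DataRow row in dt.Rows)
            {
                for (var i = 0; i < dt.Columns.Count; i++)
                {
                    csvWriter.WriteField(row[i]);
                }
                csvWriter.NextRecord();
            }
        }
    }
}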

I solved it by writing directly to Azure Blob Storage using blob.OpenWriteAsync():

public static async Task UPLOAD_CSVAsync(DataTable dt, string fileName, ILogger log)
{
    try
    {
        // Retrieve storage account from connection string.
        var cnStorage = Environment.GetEnvironmentVariable("cnStorage");
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(cnStorage);
        // Create the blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        // Retrieve reference to a previously created container.
        CloudBlobContainer container = blobClient.GetContainerReference("dataexport");
        bool exists = container.CreateIfNotExists();
        // Retrieve reference to a blob named "fileName".
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
        log.LogInformation($"C# BatchExpCSVsegnalazioni START UploadFromStream  at: {DateTime.Now}");
        await WriteDataTableToBlob(dt, blockBlob);
        log.LogInformation($"C# BatchExpCSVsegnalazioni END UploadFromStream  at: {DateTime.Now}");
    }
    catch (Exception ex)
    {
        log.LogError("error upload BatchExpCSVsegnalazioni: " + ex.Message);
    }
}

public static async Task WriteDataTableToBlob(DataTable dt, CloudBlockBlob blob)
{
    // OpenWriteAsync returns a stream that writes straight to the blob.
    using (var writer = await blob.OpenWriteAsync())
    using (var streamWriter = new StreamWriter(writer))
    using (var csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture))
    {
        csvWriter.Configuration.TypeConverterOptionsCache.GetOptions<DateTime>().Formats = new[] { "dd/MM/yyyy" };
        // Header row.
        foreach (DataColumn column in dt.Columns)
        {
            csvWriter.WriteField(column.ColumnName);
        }
        csvWriter.NextRecord();
        // One CSV record per DataRow.
        foreach (DataRow row in dt.Rows)
        {
            for (var i = 0; i < dt.Columns.Count; i++)
            {
                csvWriter.WriteField(row[i]);
            }
            csvWriter.NextRecord();
        }
        csvWriter.Flush();
    }
}
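
The difference from EXPORT_CSV is that OpenWriteAsync hands CsvHelper a stream that writes through to the blob a block at a time, so memory use stays roughly bounded by the block size instead of growing with the file, and the ~2 GB MemoryStream ceiling never comes into play.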
