Can't upload a ~200 MB file from the ASP.NET Core server side to Azure Blob Storage



I have a server part on ASP.NET Core that receives files with the Content-Type: multipart/form-data header and then sends them as a stream to Azure Blob Storage. But when I send a file of about 200 MB, I get the error

"请求主体太大,超过了最大允许的限制"

From what I found when searching, this can happen with old versions of WindowsAzure.Storage, but I have version 9.1.1. And when I looked deeper into the method, UploadFromStreamAsync uploads the blob in 4 MB blocks by default. So I don't know what to do and I'm asking for your help. My controller:

public async Task<IActionResult> Post(string folder)
    {
        string azureBlobConnectionString = _configuration.GetConnectionString("BlobConnection");
        // Retrieve storage account from connection string.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(azureBlobConnectionString);
        HttpResponseUploadClass responseUploadClass = await Request.StreamFile(folder, storageAccount);
        FormValueProvider formModel = responseUploadClass.FormValueProvider;
        var viewModel = new MyViewModel();
        var bindingSuccessful = await TryUpdateModelAsync(viewModel, prefix: "",
            valueProvider: formModel);
        if (!bindingSuccessful)
        {
            if (!ModelState.IsValid)
            {
                return BadRequest(ModelState);
            }
        }
        return Ok(responseUploadClass.Url);
    }

And the class where I send the file stream to the Azure blob:
 public static async Task<HttpResponseUploadClass> StreamFile(this HttpRequest request, string folder, CloudStorageAccount blobAccount)
    {
        CloudBlobClient blobClient = blobAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference(folder);
        CloudBlockBlob blockBlob = null;
        if (!MultipartRequestHelper.IsMultipartContentType(request.ContentType))
        {
            throw new Exception($"Expected a multipart request, but got {request.ContentType}");
        }
        var formAccumulator = new KeyValueAccumulator();
        var boundary = MultipartRequestHelper.GetBoundary(
            MediaTypeHeaderValue.Parse(request.ContentType),
            DefaultFormOptions.MultipartBoundaryLengthLimit);
        var reader = new MultipartReader(boundary, request.Body);
        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            ContentDispositionHeaderValue contentDisposition;
            var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(section.ContentDisposition, out contentDisposition);
            var disposition = ContentDispositionHeaderValue.Parse(section.ContentDisposition);
            if (hasContentDispositionHeader)
            {
                if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
                {
                    try
                    {
                        string fileName = HttpUtility.UrlEncode(disposition.FileName.Value.Replace("\"", ""), Encoding.UTF8);
                        blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString());
                        blockBlob.Properties.ContentType = GetMimeTypeByWindowsRegistry(fileName);
                        blockBlob.Properties.ContentDisposition = "attachment; filename*=UTF-8''" + fileName;
                        await blockBlob.UploadFromStreamAsync(section.Body);
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e);
                        throw;
                    }
                }
                else if (MultipartRequestHelper.HasFormDataContentDisposition(contentDisposition))
                {
                    var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name);
                    var encoding = GetEncoding(section);
                    using (var streamReader = new StreamReader(
                        section.Body,
                        encoding,
                        detectEncodingFromByteOrderMarks: true,
                        bufferSize: 1024,
                        leaveOpen: true))
                    {
                        var value = await streamReader.ReadToEndAsync();
                        if (String.Equals(value, "undefined", StringComparison.OrdinalIgnoreCase))
                        {
                            value = String.Empty;
                        }
                        formAccumulator.Append(key.Value, value);
                        if (formAccumulator.ValueCount > DefaultFormOptions.ValueCountLimit)
                        {
                            throw new InvalidDataException($"Form key count limit {DefaultFormOptions.ValueCountLimit} exceeded.");
                        }
                    }
                }
            }
            section = await reader.ReadNextSectionAsync();
        }
        var formValueProvider = new FormValueProvider(
            BindingSource.Form,
            new FormCollection(formAccumulator.GetResults()),
            CultureInfo.CurrentCulture);
        return new HttpResponseUploadClass{FormValueProvider = formValueProvider, Url = blockBlob?.Uri.ToString()};
    }
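A side note on the 4 MB default mentioned in the question: in WindowsAzure.Storage 9.x the block size that UploadFromStreamAsync uses is controlled by CloudBlockBlob.StreamWriteSizeInBytes. A minimal sketch of raising it just before the upload inside StreamFile (the 8 MB value is an arbitrary assumption, not something from the original code):

// Sketch: raise the per-block size used by UploadFromStreamAsync from the 4 MB default.
// Assumes the blockBlob created in StreamFile above; 8 MB is an example value.
blockBlob.StreamWriteSizeInBytes = 8 * 1024 * 1024;
await blockBlob.UploadFromStreamAsync(section.Body);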

As you said, each block in a block blob can be a different size, up to a maximum of 100 MB (4 MB for requests using REST versions before 2016-05-31), and a block blob can include up to 50,000 blocks.
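To illustrate the block mechanics described above, here is a rough sketch (not part of the original answer) of what block-based upload looks like with the classic SDK: blocks are staged with PutBlockAsync and then committed with PutBlockListAsync. The helper name and the 8 MB block size are assumptions:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlockUploadSketch
{
    // Uploads a stream as explicitly staged blocks and commits the block list.
    // Block IDs must be Base64-encoded and all have the same length.
    public static async Task UploadInBlocksAsync(CloudBlockBlob blob, Stream source, int blockSize = 8 * 1024 * 1024)
    {
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];
        int blockNumber = 0;
        int read;
        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // Partial reads simply produce smaller blocks, which is still valid.
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
            using (var blockData = new MemoryStream(buffer, 0, read))
            {
                await blob.PutBlockAsync(blockId, blockData, null); // stage one block
            }
            blockIds.Add(blockId);
            blockNumber++;
        }
        await blob.PutBlockListAsync(blockIds); // the blob only becomes visible after this commit
    }
}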

If you are writing a block blob that is no more than 256 MB (64 MB for requests using REST versions before 2016-05-31) in size, you can upload it in its entirety with a single write operation; see Put Blob.

The storage client defaults to a 32 MB maximum single blob upload, which can be set with the SingleBlobUploadThresholdInBytes property. When a block blob upload is larger than the value in this property, the storage client breaks the file into blocks. You can set the number of threads used to upload the blocks in parallel with the ParallelOperationThreadCount property.

BlobRequestOptions requestoptions = new BlobRequestOptions()
{
    // Anything larger than 50 MB is uploaded as blocks instead of a single Put Blob.
    SingleBlobUploadThresholdInBytes = 1024 * 1024 * 50, // 50 MB
    // Up to 12 blocks are uploaded in parallel.
    ParallelOperationThreadCount = 12,
};
CloudStorageAccount account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobclient = account.CreateCloudBlobClient();
// The options apply to every request made through this client.
blobclient.DefaultRequestOptions = requestoptions;
CloudBlobContainer blobcontainer = blobclient.GetContainerReference("uploadfiles");
blobcontainer.CreateIfNotExists();
CloudBlockBlob blockblob = blobcontainer.GetBlockBlobReference("bigfiles");
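The snippet above only configures the client and gets a blob reference; the actual transfer still goes through UploadFromStreamAsync, which will now split anything larger than the 50 MB threshold into blocks uploaded on up to 12 parallel threads. A short usage sketch continuing from the snippet (the local file path is a made-up example, and the call has to sit inside an async method):

using (var fileStream = File.OpenRead(@"C:\temp\bigfile.bin")) // example path, not from the original
{
    await blockblob.UploadFromStreamAsync(fileStream);
}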

For more details, you can refer to this thread.
