I have an API method that, when called with an array of file keys, downloads them from S3. Instead of downloading to disk, I want to stream them, zip the files, and return the archive to the client.
Here is what my current code looks like:
reports.get('/xxx/:filenames', async (req, res) => {
  var AWS = require('aws-sdk');
  var s3 = new AWS.S3();
  var str_array = req.params.filenames.split(',');
  for (var i = 0; i < str_array.length; i++) {
    var filename = str_array[i].trim();
    var localFileName = './' + filename;
    var params = {
      Bucket: config.reportBucket,
      Key: filename
    };
    var file = require('fs').createWriteStream(localFileName);
    s3.getObject(params).createReadStream()
      .on('error', (err) => console.error(err))
      .pipe(file);
  }
});
How would I stream the files instead of downloading them to disk, and how do I zip them and return the archive to the client?
The main problem is zipping multiple files.
More specifically, downloading them in bulk from AWS S3.
I searched the AWS SDK but did not find a batch S3 operation.
Which brings us to a possible solution:
- Load the files one by one and store them in a folder
- Zip the folder (with a package such as zip-folder)
- Send the zipped folder
Here is a rough and untested example, but it may give you the idea:
// Always import packages at the beginning of the file.
const AWS = require('aws-sdk');
const fs = require('fs');
const zipFolder = require('zip-folder');
const s3 = new AWS.S3();

reports.get('/xxx/:filenames', async (req, res) => {
  const filesArray = req.params.filenames.split(',');
  for (const fileName of filesArray) {
    const localFileName = './' + fileName.trim();
    const params = {
      Bucket: config.reportBucket,
      Key: fileName.trim()
    };
    // Probably you'll need some Promise logic here, to detect when the stream operation ends.
    const fileStream = fs.createWriteStream(localFileName);
    s3.getObject(params).createReadStream().pipe(fileStream);
  }
  // After that, all required files will be in some target folder.
  // Now you need to compress the folder and send it back to the user.
  // We wrap the callback in a promise, to make the code read in a "sync" way.
  await new Promise((resolve, reject) =>
    zipFolder('/path/to/the/folder', '/path/to/archive.zip', (err) => (err ? reject(err) : resolve()))
  );
  // And now you can send the zipped folder to the user (also using streams).
  fs.createReadStream('/path/to/archive.zip').pipe(res);
});
For more information about stream piping, see link and link.
Note: given the nature of streams, you may run into async-behaviour issues, so first check that all files are actually stored in the folder before zipping.
Just to mention, I haven't tested this code. So if anything goes wrong, let's debug it together.