Flink S3 read error: data read has a different length than the expected



We are running Flink 1.7.0 (the same problem also shows up on Flink 1.8.0). When reading gzip objects from S3 through Flink's .readFile source (set up roughly as in the sketch after the stack trace), we frequently, but somewhat randomly, hit the following error:

org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Data read has a different length than the expected: dataLength=9713156; expectedLength=9770429; includeSkipped=true; in.getClass()=class org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client$2; markedSupported=false; marked=0; resetSinceLastMarked=false; markCount=0; resetCount=0
at org.apache.flink.fs.s3base.shaded.com.amazonaws.util.LengthCheckInputStream.checkLength(LengthCheckInputStream.java:151)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.util.LengthCheckInputStream.read(LengthCheckInputStream.java:93)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:76)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AInputStream.closeStream(S3AInputStream.java:529)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AInputStream.close(S3AInputStream.java:490)
at java.io.FilterInputStream.close(FilterInputStream.java:181)
at org.apache.flink.fs.s3.common.hadoop.HadoopDataInputStream.close(HadoopDataInputStream.java:89)
at java.util.zip.InflaterInputStream.close(InflaterInputStream.java:227)
at java.util.zip.GZIPInputStream.close(GZIPInputStream.java:136)
at org.apache.flink.api.common.io.InputStreamFSInputWrapper.close(InputStreamFSInputWrapper.java:46)
at org.apache.flink.api.common.io.FileInputFormat.close(FileInputFormat.java:861)
at org.apache.flink.api.common.io.DelimitedInputFormat.close(DelimitedInputFormat.java:536)
at org.apache.flink.streaming.api.functions.source.ContinuousFileReaderOperator$SplitReader.run(ContinuousFileReaderOperator.java:336)
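
For context, the read path looks roughly like the sketch below. This is a minimal reconstruction rather than the actual job: the bucket path, watch mode, and scan interval are assumptions, and gzip objects are decompressed transparently by FileInputFormat based on the .gz extension.

import org.apache.flink.api.java.io.TextInputFormat;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.FileProcessingMode;

public class S3GzipReadJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical bucket/prefix; .gz objects are decompressed by
        // FileInputFormat automatically based on the file extension.
        String path = "s3://my-bucket/input/";
        TextInputFormat format = new TextInputFormat(new Path(path));

        DataStream<String> lines = env.readFile(
                format,
                path,
                FileProcessingMode.PROCESS_CONTINUOUSLY, // keep re-scanning the prefix
                60_000L);                                // scan interval in ms (assumed)

        lines.print();
        env.execute("s3-gzip-read");
    }
}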

In a given job we generally see many/most of the reads succeed, but there is almost always at least one failure (say, out of 50 files).

The error seems to originate in the AWS client itself, so perhaps it has nothing to do with Flink, but I'm hoping someone has an idea of how to make these reads work reliably.

When the error occurs, it ends up killing the source and cancelling all connected operators. I'm still new to Flink, but I would think this is something that can be recovered from a previous snapshot? Should I expect Flink to retry reading the file when such an exception occurs?
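
For what it's worth, an uncaught exception like this fails the task, and whether the job then restarts and resumes from the last checkpoint is governed by the checkpointing and restart-strategy configuration rather than by the source retrying the read on its own. Below is a minimal sketch of enabling both; the interval, attempt count, and delay are illustrative assumptions.

import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.common.time.Time;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RestartConfigSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoint every 60s so a restart can resume from the last
        // successful checkpoint (interval is an assumption).
        env.enableCheckpointing(60_000L);

        // Restart the job a bounded number of times on transient failures
        // such as the S3 read error above (values are assumptions).
        env.setRestartStrategy(RestartStrategies.fixedDelayRestart(
                3,                   // up to 3 restart attempts
                Time.seconds(10)));  // wait 10s between attempts

        // ... define sources/operators and call env.execute() as usual ...
    }
}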

Perhaps you could try allowing more connections for s3a, for example:

flink:
  ...
  config: |
    fs.s3a.connection.maximum: 320
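
If you are not deploying through a values file like the one above, the same key can also go straight into flink-conf.yaml; Flink's bundled S3 filesystem forwards fs.s3a.* entries from the Flink configuration to the shaded Hadoop S3A client, and fs.s3a.connection.maximum controls the size of the S3A connection pool.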
