Writing an invalid avro file: Length is negative: -40



I'm trying to write an avro file from Python, mostly following the official tutorial.

I have a schema that appears to be valid:

{"namespace": "example.avro",
 "type": "record",
 "name": "Stock",
 "fields": [
     {"name": "ticker_symbol", "type": "string"},
     {"name": "sector",  "type": "string"},
     {"name": "change", "type": "float"},
     {"name": "price",  "type": "float"}
 ]
}

Here is the relevant code:

from io import BytesIO

from avro import schema
from avro.datafile import DataFileWriter
from avro.io import DatumWriter

avro_schema = schema.parse(open("stock.avsc", "rb").read())
output = BytesIO()
writer = DataFileWriter(output, DatumWriter(), avro_schema)
for i in range(1000):
    # _generate_fake_data() returns a dict matching the Stock schema
    writer.append(_generate_fake_data())
writer.flush()
with open('record.avro', 'wb') as f:
    f.write(output.getvalue())

However, when I try to read the output from this file with the avro-tools CLI:

avro-tools fragtojson --schema-file stock.avsc ./record.avro  --no-pretty

I get the following error:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/usr/local/Cellar/avro-tools/1.8.2/libexec/avro-tools-1.8.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -40
    at org.apache.avro.io.BinaryDecoder.doReadBytes(BinaryDecoder.java:336)
    at org.apache.avro.io.BinaryDecoder.readString(BinaryDecoder.java:263)
    at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:201)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:422)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:414)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:181)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:232)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:222)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:175)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:145)
    at org.apache.avro.tool.BinaryFragmentToJsonTool.run(BinaryFragmentToJsonTool.java:82)
    at org.apache.avro.tool.Main.run(Main.java:87)
    at org.apache.avro.tool.Main.main(Main.java:76)

I'm fairly sure the relevant error is

 Malformed data. Length is negative: -40

but I can't tell what I'm doing wrong. I suspect I'm writing the avro file incorrectly.

I want to write to a byte buffer (rather than directly to a file as in the example) because I will ultimately send this avro buffer to AWS Kinesis Firehose using boto3.

It turns out I was using the wrong tool to read the file. I should have used

avro-tools tojson ./record.avro

instead of fragtojson as in the question. The difference is that fragtojson is for a single avro datum, while tojson is for a whole avro container file.
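A quick way to sanity-check the container file from Python is to read it back with DataFileReader, the reader counterpart of DataFileWriter. A minimal sketch, assuming record.avro is the file written by the question's code:

from avro.datafile import DataFileReader
from avro.io import DatumReader

# Read back the container file written with DataFileWriter and print each record
reader = DataFileReader(open("record.avro", "rb"), DatumReader())
for record in reader:
    print(record)
reader.close()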

As for writing to a byte buffer (rather than directly to a file as in the example) so that it can eventually be sent to AWS Kinesis Firehose with boto3: you don't need a DataFileWriter for that. What you need is this:

from io import BytesIO

from avro.io import BinaryEncoder, DatumWriter

datum_writer = DatumWriter(avro_schema)
output = BytesIO()
encoder = BinaryEncoder(output)
for i in range(1000):
    # Write each datum directly to the buffer, without the container-file framing
    datum_writer.write(_generate_fake_data(), encoder)
data_bytes = output.getvalue()

If you want to print the contents of data_bytes, you can decode them with a binary decoder.
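A minimal sketch of that, assuming the same avro_schema and data_bytes as above:

from io import BytesIO

from avro.io import BinaryDecoder, DatumReader

# Decode the raw datums back using the same schema they were written with
datum_reader = DatumReader(avro_schema)
decoder = BinaryDecoder(BytesIO(data_bytes))
for _ in range(1000):
    print(datum_reader.read(decoder))

Note that a raw fragment carries no framing or record count, so you have to know how many datums were written (1000 here) to know when to stop reading.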
