I'm using Flink's (1.7.2) Kafka consumer. How can I deserialize several case classes that extend the same trait? For example:
import spray.json.{DefaultJsonProtocol, RootJsonFormat}

trait Foo
case class Boo(name: String) extends Foo
case class Buzz(name: String, age: Int) extends Foo

object Formats extends DefaultJsonProtocol {
  implicit val booFormat: RootJsonFormat[Boo] =
    jsonFormat1(Boo.apply)

  implicit val buzzFormat: RootJsonFormat[Buzz] =
    jsonFormat2(Buzz.apply)
}
I'm consuming from Kafka with this DeserializationSchema:
import java.nio.charset.StandardCharsets

import org.apache.flink.api.common.serialization.DeserializationSchema
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._
import org.slf4j.LoggerFactory
import spray.json._

class FooSchema extends DeserializationSchema[Foo] {

  @transient lazy val log = LoggerFactory.getLogger(this.getClass)

  implicit val typeInfo = createTypeInformation[Foo]

  override def deserialize(bytes: Array[Byte]): Foo = {
    val foo = new String(bytes, StandardCharsets.UTF_8).parseJson
      .convertTo[Foo] // doesn't compile, I need to deserialize to both Boo and Buzz
    log.debug(s"Received $foo")
    foo
  }

  override def isEndOfStream(t: Foo): Boolean = false

  override def getProducedType: TypeInformation[Foo] = createTypeInformation[Foo]
}
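For reference, this is roughly how I plug the schema into the Kafka source. The topic name, properties and job class here are just placeholders, and the exact consumer class (FlinkKafkaConsumer vs. a versioned one) depends on which connector artifact you use with 1.7:

import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

object FooJob {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Placeholder Kafka settings; adjust bootstrap servers, group id and topic.
    val props = new Properties()
    props.setProperty("bootstrap.servers", "localhost:9092")
    props.setProperty("group.id", "foo-consumer")

    // FooSchema above decides how each record's bytes become a Foo.
    val source = new FlinkKafkaConsumer[Foo]("foo-topic", new FooSchema, props)

    env.addSource(source).print()
    env.execute("foo-job")
  }
}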
Any ideas would be appreciated.
Try spray-json-shapeless, which can automatically derive JSON formats for ADTs, like so:
import spray.json._
import fommil.sjs.FamilyFormats

sealed trait Foo
case class Boo(name: String) extends Foo
case class Buzz(name: String, age: Int) extends Foo

object MyFormats extends DefaultJsonProtocol with FamilyFormats {
  implicit val formats = shapeless.cachedImplicit[JsonFormat[Foo]]
}
Remember to make the trait sealed. Also note that the raw JSON needs to contain a type field as a disambiguator for rawJsonString.parseJson.convertTo[Foo] to work, e.g.
object Main extends App {
  import spray.json._
  import MyFormats._

  val rawJsonBuzz =
    """
      |{
      |  "name": "Picard",
      |  "age": 60,
      |  "type": "Buzz"
      |}
    """.stripMargin

  val buzz = rawJsonBuzz.parseJson.convertTo[Foo]
  println(buzz)

  val rawJsonBoo =
    """
      |{
      |  "name": "Picard",
      |  "type": "Boo"
      |}
    """.stripMargin

  val boo = rawJsonBoo.parseJson.convertTo[Foo]
  println(boo)
}
This should print:
Buzz(Picard,60)
Boo(Picard)
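To tie this back to the Flink question: the FooSchema can then simply delegate to the derived format. A minimal sketch, assuming the MyFormats object above is in scope (not tested against 1.7.2):

import java.nio.charset.StandardCharsets

import org.apache.flink.api.common.serialization.DeserializationSchema
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._
import spray.json._

class FooSchema extends DeserializationSchema[Foo] {

  // Bring the shapeless-derived JsonFormat[Foo] into implicit scope.
  import MyFormats._

  // The "type" field in the JSON decides whether a Boo or a Buzz is produced.
  override def deserialize(bytes: Array[Byte]): Foo =
    new String(bytes, StandardCharsets.UTF_8).parseJson.convertTo[Foo]

  override def isEndOfStream(t: Foo): Boolean = false

  override def getProducedType: TypeInformation[Foo] = createTypeInformation[Foo]
}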