Scala Spark: trying to avoid type erasure when using overloading



I'm fairly new to Scala/Spark.

I'm trying to overload a function based on the element type of a DStream:

def persist(service1DStream: DStream[Service1]): Unit = {...}
def persist(service2DStream: DStream[Service2]): Unit = {...}

I'm getting a compilation error:

persist(_root_.org.apache.spark.streaming.dstream.DStream) is already defined in the scope

It seems to be due to type erasure. How can I make the compiler recognize that DStream[Service1] is different from DStream[Service2]?
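The same collision can be reproduced with any parametrized type, e.g. List (a minimal illustration, not from my actual code):

object ErasureDemo {
  def persist(xs: List[String]): Unit = println("strings")
  // Uncommenting this overload reproduces the error: after erasure,
  // both methods have the same signature, persist(xs: List): Unit
  // def persist(xs: List[Int]): Unit = println("ints")
}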

Thanks.

One option is to accept a DStream[Any] and pattern match on the element type:

def persist(serviceDStream: DStream[Any]): Unit = serviceDStream match {
  case _: DStream[Service1] => println("it is a Service1")
  case _: DStream[Service2] => println("it is a Service2")
  case _ => println("who knows")
}

Note, however, that the type arguments are erased at runtime, so both cases produce unchecked warnings and the first case matches any DStream.
An improved solution that uses shapeless to work around runtime type erasure (see the shapeless guide for details):

import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.dstream.DStream
import shapeless.TypeCase

object Test {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .getOrCreate()

    case class Service1(a: String)
    case class Service2(a: Int)

    // TypeCase builds extractors that keep the element type in the
    // pattern, instead of matching only the erased DStream class.
    val Service1Typed = TypeCase[DStream[Service1]]
    val Service2Typed = TypeCase[DStream[Service2]]

    def persist(serviceDStream: DStream[Any]): Unit = serviceDStream match {
      case Service1Typed(_) => println("it is a Service1")
      case Service2Typed(_) => println("it is a Service2")
      case _                => println("who knows")
    }
  }
}

You can also use import scala.reflect.ClassTag. More details: ClassTag example.
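A minimal sketch of that route (my own illustration, assuming the same Service1/Service2 case classes as above): replace the two overloads with a single generic method whose implicit ClassTag carries the element type to runtime:

import org.apache.spark.streaming.dstream.DStream
import scala.reflect.ClassTag

object ClassTagPersist {
  case class Service1(a: String)
  case class Service2(a: Int)

  // One generic method instead of two overloads: the implicit ClassTag
  // captures T at each call site, so it survives erasure.
  def persist[T](serviceDStream: DStream[T])(implicit ct: ClassTag[T]): Unit =
    ct.runtimeClass match {
      case c if c == classOf[Service1] => println("it is a Service1")
      case c if c == classOf[Service2] => println("it is a Service2")
      case _                           => println("who knows")
    }
}

Because the ClassTag is resolved from the static type at the call site, persist on a value typed DStream[Service1] prints "it is a Service1" even though the DStream itself is erased.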
