Cannot create DataFrame from RDD



I am trying to create a DataFrame with a dynamically generated schema. Here is the code snippet:

import org.apache.spark.sql.{DataFrame, Row}
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.{ArrayType, StructField, StructType}

case class Metric(name: String, count: Long)

def mapMetricList(row: Row): Seq[Metric] = ???

val fields = Seq("Field1", "Field2")

def convertMetricList(df: DataFrame): DataFrame = {
  val outputFields = df.schema.fieldNames.filter(f => fields.contains(f))
  val rdd = df.rdd.map { row =>
    val schema = row.schema
    val metrics = mapMetricList(row)
    val s = outputFields.map(name => row.get(schema.fieldIndex(name)))
    Row.fromSeq(s ++ Seq(metrics))
  }
  val nonMetricsSchema = outputFields.map(f => df.schema.apply(f))
  val metricField = StructField(
    "total",
    ArrayType(ScalaReflection.schemaFor[Metric].dataType.asInstanceOf[StructType]),
    nullable = true)
  val schema = StructType(nonMetricsSchema ++ Seq(metricField))
  schema.printTreeString()
  val dff = spark.createDataFrame(rdd, schema)
  dff
}

However, at runtime I keep getting this exception:

Caused by: java.lang.RuntimeException: Metric is not a valid external type for schema of struct<name:string,count:bigint>
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.evalIfCondExpr3$(Unknown Source)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.evalIfFalseExpr4$(Unknown Source)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
    at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:290)

I am using Spark 2.1.0.
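For context, the exception message itself points at Spark's external-type check: when `createDataFrame(rdd, schema)` is given an explicit schema, a `struct<name:string,count:bigint>` column must contain `Row` values, not case class instances such as `Metric`. A minimal sketch of that conversion (a hypothetical helper, reusing the `Metric` class from the question) looks like:

```scala
import org.apache.spark.sql.Row

case class Metric(name: String, count: Long)

// With an explicit StructType schema, Spark 2.x validates "external types":
// a struct column must hold Row objects, so each Metric case class instance
// is converted to Row(name, count) before being placed in the output row.
def metricsToRows(metrics: Seq[Metric]): Seq[Row] =
  metrics.map(m => Row(m.name, m.count))

// Inside the question's map over df.rdd, the output row would then be built as:
//   Row.fromSeq(s ++ Seq(metricsToRows(metrics)))
```

This keeps the dynamically derived schema unchanged; only the values placed into the row are converted.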

The same function works fine for me on Spark 1.6; I printed the result of the `convertMetricList` function. The issue may be in the type of the `count` field inside `metricField`: the trace you posted shows `bigint`, while in my environment the type is `LongType`:

StructField(total,ArrayType(
    StructType(StructField(name,StringType,true), 
    StructField(count,LongType,false)
),true),true)

You can check the `metricField` type in your environment; if it differs, a workaround is to hard-code the Metric struct.
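Hard-coding that struct is a small change: instead of deriving it via `ScalaReflection.schemaFor[Metric]`, spell out the fields so the schema is identical on every environment. A sketch, matching `case class Metric(name: String, count: Long)` from the question:

```scala
import org.apache.spark.sql.types.{ArrayType, LongType, StringType, StructField, StructType}

// Hard-coded equivalent of ScalaReflection.schemaFor[Metric].dataType:
// count is a non-nullable Long, which Spark SQL prints as bigint.
val metricStruct = StructType(Seq(
  StructField("name", StringType, nullable = true),
  StructField("count", LongType, nullable = false)))

// Drop-in replacement for the derived metricField in convertMetricList.
val metricField = StructField("total", ArrayType(metricStruct), nullable = true)
```

Note that `bigint` is simply the SQL name Spark prints for `LongType`, so the hard-coded version pins down the schema without changing the column's underlying type.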
