DataFrame: converting an array column into RDD[Array[String]]



Given a DataFrame:

+---+----------+
|key|     value|
+---+----------+
|foo|       bar|
|bar|  one, two|
+---+----------+

I then want to use the value column as the input to FPGrowth, which expects an RDD[Array[String]]:

import org.apache.spark.mllib.fpm.{FPGrowth, FPGrowthModel}

val transactions: RDD[Array[String]] = df.select("value").rdd.map(x => x.getList(0).toArray.map(_.toString))
val fpg = new FPGrowth().setMinSupport(0.01)
val model = fpg.run(transactions)

I get this exception:

  org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 141.0 failed 1 times, most recent failure: Lost task 7.0 in stage 141.0 (TID 2232, localhost): java.lang.ClassCastException: java.lang.String cannot be cast to scala.collection.Seq

Any suggestions are welcome!

Instead of:

val transactions: RDD[Array[String]] = df.select("value").rdd.map(x => x.getList(0).toArray.map(_.toString))

try:

val transactions = df.select("value").rdd.map(_.toString.stripPrefix("[").stripSuffix("]").split(","))

The ClassCastException occurs because value is a StringType column, not an ArrayType, so Row.getList cannot cast it to a Seq.

It gives the desired output, i.e. RDD[Array[String]], as expected:

scala> val transactions = df.select("value").rdd.map(_.toString.stripPrefix("[").stripSuffix("]").split(","))
transactions: org.apache.spark.rdd.RDD[Array[String]] = MapPartitionsRDD[10] at map at <console>:33
scala> transactions.take(2)
res21: Array[Array[String]] = Array(Array(bar), Array(one, two))

To remove the "[" and "]", use the stripPrefix and stripSuffix functions before calling split.
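The string-parsing step can be sketched in plain Scala, without a running Spark session. Row.toString renders a single-column row like "[one, two]", so stripping the brackets and splitting on "," recovers the items. Note that split(",") keeps any whitespace after the comma (" two"); the added map(_.trim) is an assumption on my part, a small improvement over the answer's exact code:

```scala
object ParseDemo {
  // Parse a Row.toString rendering such as "[one, two]" into its items.
  // stripPrefix/stripSuffix drop the surrounding brackets; split breaks on
  // commas; trim (an addition, not in the original answer) removes the
  // leading space that "one, two".split(",") would otherwise leave in " two".
  def parse(rowString: String): Array[String] =
    rowString.stripPrefix("[").stripSuffix("]").split(",").map(_.trim)

  def main(args: Array[String]): Unit = {
    println(parse("[one, two]").mkString("|")) // one|two
    println(parse("[bar]").mkString("|"))      // bar
  }
}
```

One caveat with this approach: it treats the column's string rendering as the source of truth, so it breaks if an item itself contains "," or brackets. If the column were a true ArrayType, x.getSeq[String](0).toArray would be the cleaner route.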
