I have a dataset:
+---+--------+------+
| id|    date|errors|
+---+--------+------+
|  1|20170319|error1|
|  1|20170319|error2|
|  1|20170319|error2|
|  1|20170319|error1|
|  2|20170319|  err6|
|  1|20170319|error2|
+---+--------+------+
I need the count of each error, date-wise. Expected output:
+--------+------+-----+
|    date|errors|count|
+--------+------+-----+
|20170319|error1|    2|
|20170319|error2|    3|
|20170319|  err6|    1|
+--------+------+-----+
val dataset = spark.read.json(path)
val c = dataset.groupBy("date").count()
// how do I go on to count the errors per date?
I tried grouping date-wise in Spark Scala SQL, but I couldn't get the per-error counts. Do I need to convert to an RDD and find an approach there?
You just need to groupBy both date and errors.
val c = dataset.groupBy("date", "errors").count()
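
Here is a minimal self-contained sketch of the whole flow, using a hypothetical in-memory DataFrame in place of your JSON file so it runs as-is (in your code, keep spark.read.json(path)):

import org.apache.spark.sql.SparkSession

object ErrorCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ErrorCounts")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for spark.read.json(path), built from the rows in the question
    val dataset = Seq(
      (1, "20170319", "error1"),
      (1, "20170319", "error2"),
      (1, "20170319", "error2"),
      (1, "20170319", "error1"),
      (2, "20170319", "err6"),
      (1, "20170319", "error2")
    ).toDF("id", "date", "errors")

    // Grouping by both columns gives one count per (date, error) pair
    val c = dataset.groupBy("date", "errors").count()
    c.show()
    // Produces the counts from the expected output
    // (row order is not guaranteed):
    //   20170319 | error1 | 2
    //   20170319 | error2 | 3
    //   20170319 | err6   | 1

    spark.stop()
  }
}

groupBy accepts any number of column names, so no RDD conversion is needed; count() then yields one row per distinct combination of the grouping columns.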