This is my basic DataFrame:
root
 |-- user_id: string (nullable = true)
 |-- review_id: string (nullable = true)
 |-- review_influence: double (nullable = false)
The goal is to get the sum of review_influence for each user_id. So I tried to aggregate the data and sum it like this:
val review_influence_listDF = review_with_influenceDF
  .groupBy("user_id")
  .agg(collect_list("review_id").as("list_review_id"),
       collect_list("review_influence").as("list_review_influence"))
  .agg(sum($"list_review_influence"))
But I get this error:
org.apache.spark.sql.AnalysisException: cannot resolve 'sum(`list_review_influence`)' due to data type mismatch: function sum requires numeric types, not ArrayType(DoubleType,true);;
What should I do?
The problem is that collect_list("review_influence") produces an ArrayType column, and sum only accepts numeric columns. You can sum the numeric column directly in the agg function instead:
// requires: import org.apache.spark.sql.functions._ and import spark.implicits._ (for the $ syntax)
review_with_influenceDF
  .groupBy("user_id")
  .agg(collect_list($"review_id").as("list_review_id"),
       sum($"review_influence").as("sum_review_influence"))