CollectionAccumulator dependency not resolving in IntelliJ



I am new to Scala and Spark. I am writing a sample program that uses CollectionAccumulator, but IntelliJ cannot resolve the CollectionAccumulator dependency.

val slist : CollectionAccumulator[String] = new CollectionAccumulator()
sc.register(slist,"Myslist")

The code in use is shown above. I tried replacing CollectionAccumulator[String] with Accumulator[String], and Accumulator resolves fine.

I have imported the following:

import org.apache.log4j._
import org.apache.spark.{Accumulator, SparkContext}
import org.apache.spark.util._

Dependencies in pom.xml:

<dependencies>
    <!-- Scala and Spark dependencies -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.2.0-cdh5.3.1</version>
    </dependency>
</dependencies>

Please help.

CollectionAccumulator is supported from Spark 2.0 onward; you are on a Spark 1.2.0 CDH build. Reference: https://spark.apache.org/docs/2.0.0/api/scala/index.html#org.apache.spark.util.CollectionAccumulator

Replace your Spark dependency with:
<dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.0.cloudera1</version>
</dependency>

Also make sure that "${scala.version}" resolves to Scala 2.11.
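In Maven, "${scala.version}" is typically defined as a property in the pom.xml. If yours is not, adding one that points at a 2.11.x release keeps the scala-library dependency in sync with spark-core_2.11 (2.11.8 here is illustrative; use the release that matches your cluster):

```xml
<properties>
    <!-- Illustrative value; pick the 2.11.x release matching your cluster -->
    <scala.version>2.11.8</scala.version>
</properties>
```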

CollectionAccumulator is only available from Spark v2.0.0 onward, so simply update your Spark version to 2.0+.

Example build.sbt:

name := "smartad-spark-songplaycount"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.2.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

Example sbt console session with the above .sbt:
sbt console
scala> import org.apache.spark.util.CollectionAccumulator
import org.apache.spark.util.CollectionAccumulator
scala> val slist : CollectionAccumulator[String] = new CollectionAccumulator()
slist: org.apache.spark.util.CollectionAccumulator[String] = Un-registered Accumulator: CollectionAccumulator
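For context on what the class you are importing actually does: in Spark 2.x, CollectionAccumulator is a thin AccumulatorV2 subclass whose state is a synchronized list. The following is a self-contained sketch of that contract (add, merge, reset, isZero, value), not Spark's actual implementation:

```scala
// Conceptual sketch of the idea behind Spark 2.x's CollectionAccumulator.
// NOT Spark's actual code: just an illustration of the AccumulatorV2-style
// contract that the real class implements.
class ListAccumulator[T] {
  private var elems: List[T] = Nil // accumulated values, newest first

  // Record one value; synchronized because tasks may add concurrently.
  def add(v: T): Unit = synchronized { elems = v :: elems }

  // Fold another accumulator's partial result into this one, which is
  // what Spark does with per-task copies back at the driver.
  def merge(other: ListAccumulator[T]): Unit =
    synchronized { elems = other.value ::: elems }

  // Return to the empty ("zero") state.
  def reset(): Unit = synchronized { elems = Nil }

  def isZero: Boolean = synchronized { elems.isEmpty }

  def value: List[T] = synchronized { elems }
}
```

With the real class you would instead construct `new CollectionAccumulator[String]()` and register it with `sc.register(slist, "Myslist")` before use, as in the question.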
