Running a Scala jar fails with NoSuchMethodError: scala.Predef$.refArrayOps



My code runs fine in local mode, but after I package it into a jar and upload it to the Spark server where it is deployed, NoSuchMethodError: scala.Predef$.refArrayOps appears. The failing line looks like this:

val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1)

where expectVertex is a Scala Map whose key type is graphx.VertexId and whose value type is Int.

I also ran into this problem with a simple Spark program; the error occurs as soon as a line uses an Array method. The code is as follows:

package org.example

import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.{SparkConf, SparkContext}
import java.util.logging.{Level, Logger}
/**
* Hello world!
*
*/
class App {
  def run(): Unit = {
    Logger.getLogger("org.apache.spark").setLevel(Level.WARNING)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)
    val conf = new SparkConf().setAppName("AXU test")
      .setMaster("local")
    val sc = new SparkContext(conf)
    val vertices = sc.parallelize(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D")))
    val edges = sc.parallelize(Array(Edge(1L, 2L, "friend"), Edge(2L, 3L, "follow"), Edge(3L, 4L, "friend")))
    val graph = Graph(vertices, edges)
    val inDegrees = graph.inDegrees
    inDegrees.collect().foreach(println)
    val deg = inDegrees.collect()
    for (i <- 0 to deg.length - 1) {
      print("this is no." + (i + 1) + " point indegree:")
      println("id: " + deg(i)._1 + " value: " + deg(i)._2)
    }
    sc.stop()
  }
}

The log is:

Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:65)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
at org.example.App.run(App.scala:23)
at org.example.Main$.main(Main.scala:6)
at org.example.Main.main(Main.scala)

If I delete the code at line 23 (App.scala:23), which is inDegrees.collect().foreach(println), it works fine. The Scala version I compile and run with is 2.12.7. It looks like I cannot use methods such as Array[T].foreach or Array[T].sortBy(it => it._1) inside the jar (I package the jar with Maven). The Maven configuration is as follows:

<properties>
  <scala.version>2.12.7</scala.version>
  <spark.version>2.4.4</spark.version>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <id>compile-scala</id>
          <phase>compile</phase>
          <goals>
            <goal>add-source</goal>
            <goal>compile</goal>
          </goals>
        </execution>
        <execution>
          <id>test-compile-scala</id>
          <phase>test-compile</phase>
          <goals>
            <goal>add-source</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.0</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
    <plugin>
      <artifactId>maven-assembly-plugin</artifactId>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
          <manifest>
            <mainClass>org.example.Main</mainClass>
          </manifest>
        </archive>
      </configuration>
      <executions>
        <execution>
          <id>make-assembly</id>
          <phase>package</phase>
          <goals>
            <goal>assembly</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.6.0</version>
      <executions>
        <execution>
          <goals>
            <goal>exec</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <executable>java</executable>
        <includeProjectDependencies>true</includeProjectDependencies>
        <includePluginDependencies>false</includePluginDependencies>
        <classpathScope>compile</classpathScope>
        <mainClass>org.example.Main</mainClass>
      </configuration>
    </plugin>
  </plugins>
</build>
</project>
Can anyone tell me why this problem occurs? Thanks in advance.

Most likely you compiled the code locally against Scala 2.12, but the server is running Scala 2.13 or 2.11.

Try recompiling the code with the Scala version used on the server.
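For example, if spark-submit --version on the server reports Scala 2.11, the POM's Scala properties and the artifact suffixes would have to be aligned with it. A sketch (the version numbers and the scala.compat.version property are illustrative, not taken from your cluster):

```xml
<properties>
  <!-- must match the Scala minor version the Spark cluster runs on -->
  <scala.version>2.11.12</scala.version>
  <scala.compat.version>2.11</scala.compat.version>
  <spark.version>2.4.4</spark.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- the artifact suffix must match scala.compat.version -->
    <artifactId>spark-graphx_${scala.compat.version}</artifactId>
    <version>${spark.version}</version>
    <!-- provided: the cluster supplies Spark and scala-library at runtime,
         so they should not be bundled into the jar-with-dependencies -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```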

Scala 2.11, 2.12 and 2.13 are binary-incompatible.

The signature of refArrayOps differs between them (this is the binary incompatibility):

  • In Scala 2.13:

def refArrayOps(scala.Array[scala.Any]): scala.Any (scalap)

public <T> T[] refArrayOps(T[]) (javap)

@inline implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] [api] [source]

(Note that scalap and javap show different method signatures.)

  • In Scala 2.12:

def refArrayOps(scala.Array[scala.Any]): scala.Array[scala.Any] (scalap)

public <T> T[] refArrayOps(T[]) (javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps.ofRef[T] [api] [source]

  • In Scala 2.11 and earlier:

def refArrayOps(scala.Array[scala.Any]): scala.collection.mutable.ArrayOps (scalap)

public <T> scala.collection.mutable.ArrayOps<T> refArrayOps(T[]) (javap)

implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] [api 2.11 2.10] [source 2.11 2.10 2.9]
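Your failing line exercises exactly this method: Array[T] itself has no sortBy or foreach, so the compiler inserts a call to Predef.refArrayOps, and the erased signature of whichever Scala you compiled against is baked into the bytecode. A minimal sketch of the pattern (using a plain Map[Long, Int] in place of the graphx VertexId map):

```scala
object RefArrayOpsDemo {
  // Mirrors the failing line from the question. The compiler rewrites
  // `.sortBy` into refArrayOps(m.take(2).toArray).sortBy(...), so the jar
  // hard-codes the refArrayOps signature of the Scala it was built with.
  def sortedPairs(m: Map[Long, Int]): Array[(Long, Int)] =
    m.take(2).toArray.sortBy(it => it._1)

  def main(args: Array[String]): Unit = {
    val expectVertex = Map(2L -> 1, 1L -> 2)
    sortedPairs(expectVertex).foreach(println) // foreach also goes through refArrayOps
  }
}
```

Run this jar on a cluster whose scala-library has a different binary version and the very first Array operation throws the NoSuchMethodError from your log.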


See also:

Kafka fails to start on macOS, something related to Java and Scala ... NoSuchMethodError: scala.Predef$.refArrayOps

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps

How do I fix a NoSuchMethodError?

java.lang.NoSuchMethodError: org.apache.hadoop.hive.common.FileUtils.mkdir when trying to save a table to Hive

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps in a Spark job


You can run

import java.net.URLClassLoader
import java.util.Arrays
//  List.unfold(getClass.getClassLoader) { cl =>
//    val urls = s"classloader: ${cl.getClass.getName}" :: 
//      (cl match {
//        case cl: URLClassLoader =>
//          "classloader urls:" :: 
//            cl.getURLs.map(_.toString).toList
//        case _ => List("not URLClassLoader")
//      })
//    Option.when(cl != null)((urls, cl.getParent))
//  }.flatten.foreach(println)
var cl = getClass.getClassLoader
while (cl != null) {
  println(s"classloader: ${cl.getClass.getName}")
  cl match {
    case cl: URLClassLoader =>
      println("classloader urls:")
      // cl.getURLs.foreach(println) // uses Scala refArrayOps again
      println(Arrays.toString(cl.getURLs.asInstanceOf[Array[Object]])) // pure Java
    case _ =>
      println("not URLClassLoader")
  }
  cl = cl.getParent
}

println(
System.getProperty("java.class.path")
)

(What is the difference between System.getProperty("java.class.path") and getClassLoader.getURLs()?)

in the actual Spark environment you are using. Then you will see your classpath: whether a different scala-library is present on it, and whether dependencies with mixed _2.11, _2.12, _2.13 suffixes exist.
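As a rough standalone illustration of the parenthetical above (a sketch; on Java 9+ the application classloader is no longer a URLClassLoader, so the match may fall through): java.class.path is the classpath the JVM process was launched with, while a classloader's getURLs may additionally contain jars added after launch, e.g. by Spark for user jars.

```scala
import java.net.URLClassLoader
import java.util.Arrays

object ClasspathDiff {
  def main(args: Array[String]): Unit = {
    // The classpath the JVM was launched with (-cp / CLASSPATH)
    println("java.class.path = " + System.getProperty("java.class.path"))

    // URLs of the current classloader, which may also include jars
    // added at runtime rather than at JVM startup
    getClass.getClassLoader match {
      case u: URLClassLoader =>
        println("loader urls = " + Arrays.toString(u.getURLs.asInstanceOf[Array[Object]]))
      case other =>
        println("not a URLClassLoader: " + other.getClass.getName)
    }
  }
}
```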

https://www.scala-sbt.org/1.x/docs/Howto-Classpaths.html

scalacOptions += "-Ylog-classpath"


scala -version shows the Scala installed on the system. The Scala on the classpath can be a different one.
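A related check from inside the JVM: scala.util.Properties reports the version of the scala-library jar actually loaded, which is the one that matters for refArrayOps (a minimal sketch):

```scala
object ScalaLibVersion {
  def main(args: Array[String]): Unit = {
    // Version of the scala-library on the classpath, not of the
    // `scala` launcher installed on the system
    println(scala.util.Properties.versionString)       // e.g. "version 2.12.7"
    println(scala.util.Properties.versionNumberString) // e.g. "2.12.7"
  }
}
```

Printing this from the deployed driver quickly reveals whether the cluster's scala-library matches the version you compiled with.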

Why does sbt show a different Scala version than the one installed on my system?

build.sbt does not work across different Scala versions
