Databricks error java.lang.NoSuchMethodError: scala.Predef$.refArrayOps



I am trying to run some example code from this link: https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/5537430417240233/312903576646278/3506802399907740/latest.html

I am running it in a Databricks notebook on a cluster with runtime 6.3 (includes Apache Spark 2.4.4 and Scala 2.11). I initially create a dataframe using

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder.getOrCreate
import spark.implicits._
val df = Seq(
("one", 2.0),
("two", 1.5),
("three", 8.0)
).toDF("id", "val")

I then try to run df.select("id").map(_.getString(0)).collect.toList

and I get the error below:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;

	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:2)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:53)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:55)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:57)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw$$iw.<init>(command-1275538363433250:59)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw$$iw.<init>(command-1275538363433250:61)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw$$iw.<init>(command-1275538363433250:63)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$$iw.<init>(command-1275538363433250:65)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read.<init>(command-1275538363433250:67)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$.<init>(command-1275538363433250:71)
	at line3700fe51392b4be9744f6b3a059dbfa46.$read$.<clinit>(command-1275538363433250)
	at line3700fe51392b4be9744f6b3a059dbfa46.$eval$.$print$lzycompute(<notebook>:7)
	at line3700fe51392b4be9744f6b3a059dbfa46.$eval$.$print(<notebook>:6)
	at line3700fe51392b4be9744f6b3a059dbfa46.$eval.$print(<notebook>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
	at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:699)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:652)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
	at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:385)
	at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:362)
	at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:251)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:246)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
	at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:288)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
	at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:362)
	at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
	at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
	at scala.util.Try$.apply(Try.scala:192)
	at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
	at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
	at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
	at java.lang.Thread.run(Thread.java:748)

I get the same error when running df.select("id").collect().map(_(0)).toList

but not when running df.select("id").rdd.map(_(0)).collect.toList

The command above that runs successfully returns a List[Any], but I need a List[String].
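
For context on where the Any comes from, as I understand the Row API: indexing a Row with apply returns Any, while the typed accessors return concrete types. A minimal illustration:

import org.apache.spark.sql.Row

val rows: Array[Row] = df.select("id").collect()
val untyped: Any = rows(0)(0)            // Row.apply(0) returns Any
val typed: String = rows(0).getString(0) // Row.getString(0) returns String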

Can anyone advise? I suspect a Spark/Scala version mismatch, but I can't tell what's wrong.

The exception

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;

usually occurs when you mix different Scala versions. Did you load any dependencies built for Scala 2.12?
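
One quick sanity check is to print the versions from the notebook itself and then look over the cluster's installed libraries for any artifact with a _2.12 suffix. A minimal sketch (scala.util.Properties and spark.version are standard APIs; the expected values assume runtime 6.3):

import scala.util.Properties

println(Properties.versionNumberString) // Scala version of the REPL, expected 2.11.x
println(spark.version)                  // expected 2.4.4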

EDIT: I just tested your code in Databricks on the same runtime and it runs fine...

In the .map, use .toString to convert each element to a String, and the result will be a List[String].

Example:

df.select("id").collect().map(x => x(0).toString).toList
List[String] = List(one, two, three)
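
Alternatively, once the classpath issue is sorted out, a typed Dataset avoids going through Any at all (a sketch assuming import spark.implicits._ is still in scope, as in your setup code):

df.select("id").as[String].collect().toList
// List[String] = List(one, two, three)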
