Java can't run H2O Sparkling Water: java.lang.NumberFormatException: Not a version: 9

I'm trying to run a Spark + H2O Sparkling Water project, using Maven as my build tool.

Here are the versions of each package:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
    <scala.version>2.11</scala.version>
    <spark.version>2.3.1</spark.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>

I'm getting the following error:

Exception in thread "main" java.lang.NumberFormatException: Not a version: 9
at scala.util.PropertiesTrait$class.parts$1(Properties.scala:184)
at scala.util.PropertiesTrait$class.isJavaAtLeast(Properties.scala:187)
at scala.util.Properties$.isJavaAtLeast(Properties.scala:17)

I'm using Java 8, so I don't understand why it is giving me this error.
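The "Not a version: 9" message comes from an older scala-library failing to parse a Java 9+ version string in scala.util.Properties, so one thing worth double-checking is that Maven itself is really running on a Java 8 JDK. Below is a sketch of an enforcer rule that would fail the build on any other JDK; maven-enforcer-plugin and its requireJavaVersion rule are standard, but this snippet is not currently part of my POM:

<!-- sketch: fail the build if Maven is not running on a Java 8 JDK -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <executions>
        <execution>
            <id>enforce-java-8</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <requireJavaVersion>
                        <!-- accept only 1.8.x; Java 9+ JVMs report "9", "11", ... -->
                        <version>[1.8,1.9)</version>
                    </requireJavaVersion>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>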

Edit: the rest of the dependencies in my POM file are as follows:

<dependencies>
    <dependency>
        <groupId>org.apache.opennlp</groupId>
        <artifactId>opennlp-tools</artifactId>
        <version>1.8.1</version>
    </dependency>
    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-simple</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-simple</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <!-- Sparkling Water -->
    <dependency>
        <groupId>ai.h2o</groupId>
        <artifactId>sparkling-water-package_2.11</artifactId>
        <version>3.28.0.3-1-2.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-repl_2.11</artifactId>
        <version>2.4.4</version>
    </dependency>
</dependencies>

Edit (solved): the problem went away when I switched to Spark version 2.4.5.

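For reference, the fix amounts to pinning every Spark artifact (including spark-repl, which was on 2.4.4 while the rest was on 2.3.1) to the same 2.4.5 release; Spark 2.4.5 for Scala 2.11 brings in scala-library 2.11.12, which can parse Java 9+ version strings, which is likely why the exception disappears. A rough sketch of the relevant parts of the updated POM; the Sparkling Water version string for Spark 2.4 (3.28.0.3-1-2.4) follows H2O's naming scheme but should be checked against the actual release list:

<properties>
    <scala.version>2.11</scala.version>
    <spark.version>2.4.5</spark.version>
</properties>

<!-- all Spark modules share the same ${spark.version} -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-repl_${scala.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
<!-- Sparkling Water built against Spark 2.4; exact version string is an assumption -->
<dependency>
    <groupId>ai.h2o</groupId>
    <artifactId>sparkling-water-package_2.11</artifactId>
    <version>3.28.0.3-1-2.4</version>
</dependency>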
