Maven package error when trying to build a simple Spark self-contained Java application



I am trying to build a simple self-contained Spark Java application, following the "Self-Contained Applications" section of the Spark Quick Start guide.

/* SimpleApp.java */
import org.apache.spark.sql.SparkSession;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "YOUR_SPARK_HOME/README.md"; // Should be some file on your system
        SparkSession spark = SparkSession.builder().appName("Simple Application").getOrCreate();
        Dataset<String> logData = spark.read.textFile(logFile).cache();
        long numAs = logData.filter(s -> s.contains("a")).count();
        long numBs = logData.filter(s -> s.contains("b")).count();
        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        spark.stop();
    }
}

The project layout is as follows:

./pom.xml
./src
./src/main
./src/main/java
./src/main/java/SimpleApp.java

Here is the pom.xml:

<project>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
  </dependencies>
</project>

If I run mvn package, I get the following errors:

[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /Users/fengyich/Dev/Sandbox/SimpleApp/src/main/java/SimpleApp.java:[8,9] cannot find symbol
symbol:   class Dataset
location: class SimpleApp
[ERROR] /Users/fengyich/Dev/Sandbox/SimpleApp/src/main/java/SimpleApp.java:[8,40] cannot find symbol
symbol:   variable read
location: variable spark of type org.apache.spark.sql.SparkSession

Add an extra import line:

import org.apache.spark.sql.Dataset;

and change

spark.read.textFile(logFile).cache();

to

spark.read().textFile(logFile).cache();

In the Java API, read is a method on SparkSession (it returns a DataFrameReader), so it must be called with parentheses; the field-style spark.read only works in Scala.
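With both changes applied, the full file looks like this (a sketch against the Spark 2.2 Java API; YOUR_SPARK_HOME is the same placeholder as in the original and must point at a real file to actually run):

```java
/* SimpleApp.java */
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SparkSession;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "YOUR_SPARK_HOME/README.md"; // Should be some file on your system
        SparkSession spark = SparkSession.builder().appName("Simple Application").getOrCreate();
        // read() is a method in the Java API, hence the parentheses
        Dataset<String> logData = spark.read().textFile(logFile).cache();
        long numAs = logData.filter(s -> s.contains("a")).count();
        long numBs = logData.filter(s -> s.contains("b")).count();
        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        spark.stop();
    }
}
```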

My pom.xml now looks like this:

<project>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
  </dependencies>
</project>

This should solve your problem.
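With the fixed source and pom in place, the Quick Start guide builds and runs the application like this (YOUR_SPARK_HOME is a placeholder for your Spark installation directory):

```shell
# Package the application into target/simple-project-1.0.jar
mvn package

# Submit it to a local Spark instance with 4 worker threads
YOUR_SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master "local[4]" \
  target/simple-project-1.0.jar
```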

Perhaps you also need: import org.apache.spark.sql.Dataset

You can try adding the following plugin:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.6.1</version>
  <configuration>
    <source>${java-version}</source>
    <target>${java-version}</target>
  </configuration>
</plugin>
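Note that ${java-version} is not a built-in Maven property; it is a user-defined property that must be declared in the POM, for example (the property name is just the one the plugin snippet above assumes):

```xml
<properties>
  <java-version>1.8</java-version>
</properties>
```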
