I get the error below when I package (jar) and run my DefaultHadoopJob.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Tool
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 12 more
Could not find the main class: DefaultHadoopJobDriver. Program will exit.
Commands used to build the jar:
# jar -cvf dhj.jar
# hadoop -jar dhj.jar DefaultHadoopJobDriver
The above command gave me the error "Failed to load Main-Class manifest attribute from dhj.jar", so I rebuilt the jar with a manifest using the command below:
jar -cvfe dhj.jar DefaultHadoopJobDriver .
hadoop -jar dhj.jar DefaultHadoopJobDriver -- this returned the original error message reported above.
My Hadoop job has a single class, "DefaultHadoopJobDriver", which extends Configured and implements Tool; its run method is the only code, and it just creates the Job and sets the input and output paths. Also, I'm using the new API.
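For illustration, a minimal driver of the shape described might look like the sketch below (an assumption of what such a class looks like, not the actual code from the question; the job name and path arguments are placeholders):

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class DefaultHadoopJobDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // New (org.apache.hadoop.mapreduce) API job creation;
        // Job(Configuration, String) is the standard constructor in Hadoop 1.x.
        Job job = new Job(getConf(), "DefaultHadoopJob");
        job.setJarByClass(DefaultHadoopJobDriver.class);

        // Only the input and output paths are set, as described above.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new DefaultHadoopJobDriver(), args));
    }
}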
I'm running Hadoop 1.2.1, and the job works fine from Eclipse.
This might have something to do with the classpath. Please help.
To execute the jar, you don't use hadoop -jar. The command looks like this:
hadoop jar <jar> [mainClass] args...
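For the jar in this question, that would be, for example (the input and output paths are placeholders):
hadoop jar dhj.jar DefaultHadoopJobDriver /user/me/input /user/me/output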
If running the jar again throws a java.lang.ClassNotFoundException, you can use:
hadoop classpath
to check whether hadoop-core-1.2.1.jar is present in your Hadoop installation's classpath.
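For example, one quick way to scan that output (the classpath is colon-separated; this pipe is just an illustration):
hadoop classpath | tr ':' '\n' | grep hadoop-core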
FYI, if it is not in that list, you will have to add the jar to the Hadoop lib directory.
Build your Hadoop Java code against all the Hadoop jars provided in the lib folder. In this case you are missing the Hadoop util classes, which live in hadoop-core-*.jar.
The classpath can be specified while building the code into the jar, or you can externalize it using the following command:
hadoop -cp <path_containing_hadoop_jars> -jar <jar_name>
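As a side note, another common way to put extra jars on the client classpath in Hadoop 1.x is the HADOOP_CLASSPATH environment variable; the jar path below is a placeholder:
export HADOOP_CLASSPATH=/path/to/hadoop-core-1.2.1.jar:$HADOOP_CLASSPATH
hadoop jar dhj.jar DefaultHadoopJobDriver ...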
If anyone using Maven lands here: the dependency problem can be solved by asking Maven to include any jars it needs inside the parent project's jar itself. That way, Hadoop doesn't have to look elsewhere for the dependencies; it can find them all right there. Here's how to do it:
1. Go to pom.xml.
2. Add a <build> section to your <project> tag.
3. Add the following to the <build></build> section (the plugin must sit inside a <plugins> element):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.7.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <artifactSet>
          <excludes>
            <exclude>org.slf4j:slf4j-api</exclude>
            <exclude>junit:junit</exclude>
            <exclude>jmock:jmock</exclude>
            <exclude>xml-apis:xml-apis</exclude>
            <exclude>org.testng:testng</exclude>
            <exclude>org.mortbay.jetty:jetty</exclude>
            <exclude>org.mortbay.jetty:jetty-util</exclude>
            <exclude>org.mortbay.jetty:servlet-api-2.5</exclude>
            <exclude>tomcat:jasper-runtime</exclude>
            <exclude>tomcat:jasper-compiler</exclude>
            <exclude>org.apache.hadoop:hadoop-core</exclude>
            <exclude>org.apache.mahout:mahout-math</exclude>
            <exclude>commons-logging:commons-logging</exclude>
            <exclude>org.mortbay.jetty:jsp-api-2.1</exclude>
            <exclude>org.mortbay.jetty:jsp-2.1</exclude>
            <exclude>org.eclipse.jdt:core</exclude>
            <exclude>ant:ant</exclude>
            <exclude>org.apache.hadoop:avro</exclude>
            <exclude>jline:jline</exclude>
            <exclude>log4j:log4j</exclude>
            <exclude>org.yaml:snakeyaml</exclude>
            <exclude>javax.ws.rs:jsr311-api</exclude>
            <exclude>org.slf4j:jcl-over-slf4j</exclude>
            <exclude>javax.servlet:servlet-api</exclude>
          </excludes>
        </artifactSet>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/jruby.home</exclude>
              <exclude>META-INF/license</exclude>
              <exclude>META-INF/maven</exclude>
              <exclude>META-INF/services</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
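For reference, the shade goal above is bound to Maven's package phase, so a normal build will produce the bundled jar under target/:
mvn clean package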
Now build the project again and run it with the normal hadoop jar my.jar ... command. It shouldn't complain about the dependencies anymore. Hope this helps!