Accumulo init - [start.Main] ERROR: Problem initializing the class loader



I'm new to Accumulo and am trying to install v1.7 on a Cloudera VM.

I have Java 1.7 and HDP 2.2, and ZooKeeper is currently running. I mostly followed INSTALL.md without incident and configured Accumulo, but I get the following error when trying to initialize:

./bin/accumulo init
2016-02-23 09:24:07,999 [start.Main] ERROR: Problem initializing the class loader
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.accumulo.start.Main.getClassLoader(Main.java:68)
        at org.apache.accumulo.start.Main.main(Main.java:52)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
        at org.apache.commons.vfs2.impl.DefaultFileSystemManager.<init>(DefaultFileSystemManager.java:120)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.generateVfs(AccumuloVFSClassLoader.java:246)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.getClassLoader(AccumuloVFSClassLoader.java:204)
        ... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 9 more
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: org/apache/commons/io/FileUtils
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.close(AccumuloVFSClassLoader.java:406)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader$AccumuloVFSClassLoaderShutdownThread.run(AccumuloVFSClassLoader.java:74)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.io.FileUtils
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 3 more

I've read other posts where this came down to incorrect settings in accumulo-env.sh, but as shown below, I can't see what I'm missing:

if [[ -z $HADOOP_HOME ]] ; then
   test -z "$HADOOP_PREFIX"      && export HADOOP_PREFIX=/usr/lib/hadoop
else
   HADOOP_PREFIX="$HADOOP_HOME"
   unset HADOOP_HOME
fi
# hadoop-2.0:
test -z "$HADOOP_CONF_DIR"       && export HADOOP_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
test -z "$ACCUMULO_HOME"         && export ACCUMULO_HOME="/etc/accumulo/accumulo-1.7.0"
test -z "$JAVA_HOME"             && export JAVA_HOME="/usr/java/jdk1.7.0_67-cloudera"
test -z "$ZOOKEEPER_HOME"        && export ZOOKEEPER_HOME=/usr/lib/zookeeper
test -z "$ACCUMULO_LOG_DIR"      && export ACCUMULO_LOG_DIR=$ACCUMULO_HOME/logs
if [[ -f ${ACCUMULO_CONF_DIR}/accumulo.policy ]]
then
   POLICY="-Djava.security.manager -Djava.security.policy=${ACCUMULO_CONF_DIR}/accumulo.policy"
fi
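The HADOOP_PREFIX fallback at the top of that file can be sanity-checked in isolation. A minimal sketch that mirrors the same logic with both variables cleared, to confirm what HADOOP_PREFIX resolves to:

```shell
# Mirror the accumulo-env.sh fallback logic with both variables
# cleared, to see which Hadoop prefix the script would end up using.
unset HADOOP_HOME HADOOP_PREFIX
if [[ -z $HADOOP_HOME ]] ; then
   test -z "$HADOOP_PREFIX" && export HADOOP_PREFIX=/usr/lib/hadoop
else
   HADOOP_PREFIX="$HADOOP_HOME"
   unset HADOOP_HOME
fi
echo "$HADOOP_PREFIX"   # falls back to /usr/lib/hadoop
```

If HADOOP_HOME is set in your shell (as it often is on a Cloudera VM), the second branch takes over instead, so it is worth echoing the variable in the same shell that launches Accumulo.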

Additionally, I have the following in my general classpaths:

<property>
<name>general.classpaths</name>
<value>
  <!-- Accumulo requirements -->
  $ACCUMULO_HOME/lib/accumulo-server.jar,
  $ACCUMULO_HOME/lib/accumulo-core.jar,
  $ACCUMULO_HOME/lib/accumulo-start.jar,
  $ACCUMULO_HOME/lib/accumulo-fate.jar,
  $ACCUMULO_HOME/lib/accumulo-proxy.jar,
  $ACCUMULO_HOME/lib/[^.].*.jar,
  <!-- ZooKeeper requirements -->
  $ZOOKEEPER_HOME/zookeeper[^.].*.jar,
  <!-- Common Hadoop requirements -->
  $HADOOP_CONF_DIR,
  <!-- Hadoop 2 requirements --><!--
  $HADOOP_PREFIX/share/hadoop/common/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/common/lib/(?!slf4j)[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/hdfs/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/mapreduce/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/yarn/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/yarn/lib/jersey.*.jar,
  --><!-- End Hadoop 2 requirements -->
  <!-- HDP 2.0 requirements --><!--
  /usr/lib/hadoop/[^.].*.jar,
  /usr/lib/hadoop/lib/[^.].*.jar,
  /usr/lib/hadoop-hdfs/[^.].*.jar,
  /usr/lib/hadoop-mapreduce/[^.].*.jar,
  /usr/lib/hadoop-yarn/[^.].*.jar,
  /usr/lib/hadoop-yarn/lib/jersey.*.jar,
  --><!-- End HDP 2.0 requirements -->
  <!-- HDP 2.2 requirements -->
  /usr/hdp/current/hadoop-client/[^.].*.jar,
  /usr/hdp/current/hadoop-client/lib/(?!slf4j)[^.].*.jar,
  /usr/hdp/current/hadoop-hdfs-client/[^.].*.jar,
  /usr/hdp/current/hadoop-mapreduce-client/[^.].*.jar,
  /usr/hdp/current/hadoop-yarn-client/[^.].*.jar,
  /usr/hdp/current/hadoop-yarn-client/lib/jersey.*.jar,
  /usr/hdp/current/hive-client/lib/hive-accumulo-handler.jar
  /usr/lib/hadoop/lib/commons-io-2.4.jar
  <!-- End HDP 2.2 requirements -->
</value>
<description>Classpaths that accumulo checks for updates and class files.</description>
</property>
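Note that entries like `$ACCUMULO_HOME/lib/[^.].*.jar` are treated as regular expressions matched against file names, not shell globs (which is why Java-only constructs like the `(?!slf4j)` lookahead appear above). An illustrative check of what the simple pattern admits, using grep's ERE engine (which, as an assumption here, stands in for Java's regex; it cannot evaluate the lookahead form):

```shell
# "[^.].*.jar" matches any jar whose name does not start with a dot;
# hidden files and non-jar files are skipped.
for f in commons-io-2.4.jar .hidden.jar readme.txt; do
  if echo "$f" | grep -Eq '^[^.].*\.jar$'; then
    echo "$f matches"
  fi
done
```

Only `commons-io-2.4.jar` matches in this sample, so a jar sitting in a directory that no pattern covers will silently stay off the classpath.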

Any help would be greatly appreciated. Interestingly, I get the same result when trying to run ./bin/accumulo classpath.

Accumulo expects to pull commons-io-2.4.jar from the Hadoop installation. I'm not sure whether CDH packages this jar, or whether your configuration file just isn't pointing at it correctly.

You can try inspecting the output of accumulo classpath to see what the entries on the classpath actually expand to. The general.classpaths property in accumulo-site.xml is what you want to check/modify.
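For example, you can grep that output for the jar the stack trace says is missing. A sketch, with hypothetical sample lines standing in for the real output (in practice, pipe `./bin/accumulo classpath` into the same grep):

```shell
# Hypothetical sample of expanded-classpath output (illustrative only);
# substitute: ./bin/accumulo classpath | grep 'commons-io'
classpath_output='/usr/lib/hadoop/lib/commons-lang-2.6.jar
/usr/lib/hadoop/lib/commons-configuration-1.6.jar'

if echo "$classpath_output" | grep -q 'commons-io-[0-9][0-9.]*\.jar'; then
  echo "commons-io found"
else
  echo "commons-io MISSING - add its jar to general.classpaths"
fi
```

If the jar never shows up in the expanded list, the fix belongs in general.classpaths rather than in accumulo-env.sh.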

unset CLASSPATH

I had the same problem; it took me hours to figure out.
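The idea behind this answer: a stale CLASSPATH exported in the shell can shadow the jars Accumulo resolves on its own, so clearing it before init is worth a try. A minimal sketch:

```shell
# Simulate a leftover CLASSPATH from another tool, then clear it
# before launching Accumulo (path below is purely illustrative).
export CLASSPATH=/some/stale/path.jar
unset CLASSPATH
echo "${CLASSPATH:-unset}"   # prints "unset": the variable is gone
# then re-run:  ./bin/accumulo init
```

Running `unset CLASSPATH` only affects the current shell, so put it in the same session (or the same startup script) that invokes the init command.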
