Hive installation issue while following the Hive Apache wiki step by step



After installing Hive step by step according to the instructions on the Hive Apache wiki, I started the Hive shell and typed "CREATE TABLE pokes (foo INT, bar STRING);". I then got the error below; the log is included as well.

I'm new to Hive. Any suggestions or comments would be much appreciated! I googled the problem but didn't find a solution.

I'm running Hadoop in standalone mode on a Mac.

hive> CREATE TABLE pokes (foo INT, bar STRING);
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables: java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Log file:

Last login: Tue Jun 14 00:27:51 on ttys001
Xie Zhiyong's MacBook Pro:~ hadoop$ cat /tmp/*/hive.log
2011-06-14 00:31:54,834 ERROR metastore.HiveMetaStore (HiveMetaStore.java:executeWithRetry(334)) - JDO datastore error. Retrying metastore command after 1000ms (attempt 1 of 1)
2011-06-14 00:31:56,012 ERROR exec.DDLTask (SessionState.java:printError(374)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:491)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3233)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:221)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:132)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1238)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1050)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:885)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:224)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:358)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:593)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:237)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:266)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:199)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:174)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:369)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:321)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:466)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:240)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:203)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:107)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2010)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2020)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:485)
    ... 15 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    ... 32 more
Caused by: java.lang.NullPointerException
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:443)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:355)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:215)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:156)
    at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:156)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:137)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    ... 40 more
2011-06-14 00:31:56,014 ERROR ql.Driver (SessionState.java:printError(374)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
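
For anyone staring at a similar wall of text: only the innermost "Caused by" block matters. A quick way to pull the root-cause blocks out of a log like this (assuming the default per-user log location under /tmp, as in the cat command above) is:

    # Print each "Caused by" line plus a few following stack frames;
    # the last block shown is the real root cause.
    grep -A 4 "Caused by" /tmp/*/hive.log

Here the root cause is a NullPointerException thrown from org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle, i.e. DataNucleus failing while registering plugin bundles it finds on the Hive/Hadoop classpath.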

http://getsatisfaction.com/cloudera/topics/hive_error_error_in_metadata_javax_jdo_jdofatalinternalexception

Deleting the $HADOOP_HOME/build directory worked like a charm.

Deleting the $HADOOP_HOME/build directory (in fact I just renamed it) also worked for me with hive-0.9.0 and hadoop-1.0.4.
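
In case it helps the next reader, here is a minimal sketch of that fix, assuming HADOOP_HOME is set and that you prefer renaming over deleting (the build.bak name is arbitrary):

    # The build/ directory (left over from building Hadoop from source) appears to
    # confuse DataNucleus' classpath scan; move it out of the way instead of deleting it.
    mv "$HADOOP_HOME/build" "$HADOOP_HOME/build.bak"

    # Retry the statement from the wiki.
    hive -e "CREATE TABLE pokes (foo INT, bar STRING);"

Renaming rather than deleting keeps the option of restoring the directory later if anything else turns out to depend on it.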
