PrestoDB Hive SQL query error



Dear friends, I have been able to configure Presto with Hive.
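For context, the Hive catalog configuration is just a properties file; a minimal sketch (the metastore host and port below are assumptions, not copied from my setup) looks like this:

# etc/catalog/hive.properties -- minimal sketch, values are placeholders
# Presto 0.52 ships a CDH4-based Hive/HDFS client under this connector name
connector.name=hive-cdh4
# Thrift URI of the Hive metastore (assumed default host/port)
hive.metastore.uri=thrift://localhost:9083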

I can see the result of "show tables", and the "books" table appears in the result.

"describe books" also shows all the column details.

So I do have a "books" table, and I can query it through Hive and see the results.

Example: hive> select * from books;
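In other words, all the basic checks succeed when run directly against Hive; reconstructing the session from the steps above (a sketch, not a verbatim transcript):

hive> show tables;
hive> describe books;
hive> select * from books;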

But when I try the same query through Presto, I get the following error.

Please guide me.

Error:

presto:default> select * from books;
Query 20131121_025845_00004_qqe25, FAILED, 1 node
Splits: 1 total, 0 done (0.00%)
0:00 [0 rows, 0B] [0 rows/s, 0B/s]
Query 20131121_025845_00004_qqe25 failed: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
presto:default>

Exception on the server:


45_00004_qqe25.1
java.lang.RuntimeException: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
        at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-15.0.jar:na]
        at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:433) ~[na:na]
        at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:392) ~[na:na]
        at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-15.0.jar:na]
        at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-15.0.jar:na]
        at com.facebook.presto.execution.SqlStageExecution.startTasks(SqlStageExecution.java:463) [presto-main-0.52.jar:0.52]
        at com.facebook.presto.execution.SqlStageExecution.access$300(SqlStageExecution.java:80) [presto-main-0.52.jar:0.52]
        at com.facebook.presto.execution.SqlStageExecution$5.run(SqlStageExecution.java:435) [presto-main-0.52.jar:0.52]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_45]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_45]
        at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763) ~[na:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1229) ~[na:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202) ~[na:na]
        at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164) ~[na:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83) ~[na:na]
        at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:441) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1526) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1509) ~[na:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:406) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1462) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1502) ~[na:na]
        at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
        at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
        at com.facebook.presto.hive.FileSystemWrapper$3.listStatus(FileSystemWrapper.java:146) ~[na:na]
        at org.apache.hadoop.fs.FileSystem$4.<init>(FileSystem.java:1778) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1777) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1760) ~[na:na]
        at com.facebook.presto.hive.util.AsyncRecursiveWalker$1.run(AsyncRecursiveWalker.java:58) ~[na:na]
        at com.facebook.presto.hive.util.SuspendingExecutor$1.run(SuspendingExecutor.java:67) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor.executeOrMerge(BoundedExecutor.java:82) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor.access$000(BoundedExecutor.java:41) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor$1.run(BoundedExecutor.java:53) ~[na:na]
        ... 3 common frames omitted
Caused by: java.io.IOException: Broken pipe
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method) ~[na:1.7.0_45]
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) ~[na:1.7.0_45]
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.7.0_45]
        at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.7.0_45]
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487) ~[na:1.7.0_45]
        at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:62) ~[na:na]
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:143) ~[na:na]
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:na]
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:114) ~[na:na]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_45]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_45]
        at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[na:1.7.0_45]
        at org.apache.hadoop.ipc.Client$Connection$3.run(Client.java:897) ~[na:na]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
        ... 3 common frames omitted
2013-11-20T21:58:45.915-0500    DEBUG   task-notification-1     com.facebook.presto.execution.TaskStateMachine  Task 20131121_025845_00004_qqe25.0.0 is CANCELED

I found a partial answer in the Presto Google group: https://groups.google.com/forum/#!topic/presto-users/lVLvMGP1sKE

Dain Sundstrom (Nov 8):

This is an error in the HDFS client (at org.apache.hadoop.ipc.Client:941 in my copy of the code), and after a quick look at that code it appears to mean that the client could not parse the server's response. My guess is that the client we bundle with the presto-hive-cdh4 plugin is not compatible with your Hadoop version. That code includes Cloudera Hadoop version 2.0.0-cdh4.3.0. What version are you using?
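To answer the version question above, one way to check is to print the running Hadoop version and the NameNode RPC address that clients such as Presto connect to (a rough sketch; it assumes the standard Hadoop CLI is on the PATH, and the core-site.xml path is a placeholder to adjust for your install):

# Print the Hadoop distribution and version in use
hadoop version

# Show the NameNode RPC address clients connect to
# (fs.default.name is the Hadoop 1.x-style key; newer setups use fs.defaultFS)
grep -A 1 -E 'fs.default.name|fs.defaultFS' /usr/local/hadoop/conf/core-site.xml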
