Socket permission error when trying to start Hive

I have set up Hive and Hadoop on my local development machine and configured Hive to use MySQL as its metastore. When I try to start Hive, I get the following exception:

```
C:\Users\<USER>>hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/data-engineering/hadoop331/share/hadoop/common/lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/data-engineering/hive3/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2023-01-02 18:55:44,239 INFO conf.HiveConf: Found configuration file file:/C:/data-engineering/hive3/conf/hive-site.xml
Hive Session ID = 228b9d9a-c6aa-4d61-adaa-ade6f0b1d878
2023-01-02 18:55:49,590 INFO SessionState: Hive Session ID = 228b9d9a-c6aa-4d61-adaa-ade6f0b1d878
Logging initialized using configuration in jar:file:/C:/data-engineering/hive3/lib/hive-common-3.0.0.jar!/hive-log4j2.properties Async: true
2023-01-02 18:55:49,834 INFO SessionState:
Logging initialized using configuration in jar:file:/C:/data-engineering/hive3/lib/hive-common-3.0.0.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: java.net.SocketException: Call From HHC12368/172.21.160.1 to 0.0.0.0:19000 failed on socket exception: java.net.SocketException: Permission denied: no further information; For more details see:  http://wiki.apache.org/hadoop/SocketException
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:644)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:585)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.net.SocketException: Call From HHC12368/172.21.160.1 to 0.0.0.0:19000 failed on socket exception: java.net.SocketException: Permission denied: no further information; For more details see:  http://wiki.apache.org/hadoop/SocketException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:871)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1577)
at org.apache.hadoop.ipc.Client.call(Client.java:1519)
at org.apache.hadoop.ipc.Client.call(Client.java:1416)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129)
at com.sun.proxy.$Proxy28.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:965)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1731)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1752)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1749)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1764)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1760)
at org.apache.hadoop.hive.ql.exec.Utilities.ensurePathIsWritable(Utilities.java:4483)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:753)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:694)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:620)
... 9 more
Caused by: java.net.SocketException: Permission denied: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:701)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:822)
at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:414)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1647)
at org.apache.hadoop.ipc.Client.call(Client.java:1463)
```

Given the nature of the exception message, I assume it is related to Hive talking to Hadoop, since my Hadoop core-site.xml specifies:

```
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://0.0.0.0:19000</value>
  </property>
</configuration>
```
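
For what it's worth, a quick way to check what is actually listening on the NameNode RPC port (on Windows) is to filter netstat output for the port from core-site.xml:

```
:: Show any listener bound to the NameNode RPC port (19000 in my config)
netstat -ano | findstr :19000
```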

Any ideas? Honestly, this setup feels like one step forward, two steps back!

The fix, as @OneCricketeer pointed out, is to give the fs.default.name property in core-site.xml a proper value. 0.0.0.0 is a wildcard address that is only meaningful for a server to listen on; a client cannot connect to it, and on Windows such a connect attempt fails with exactly the "Permission denied" socket exception shown above. Point the property at the actual address of the NameNode instead:

```
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://{HDFS_IP_ADDRESS}:9000</value>
  </property>
</configuration>
```
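
As an aside, fs.default.name has been deprecated since Hadoop 2.x in favor of fs.defaultFS; the old key still works as an alias, but new configurations should prefer the newer name. After editing core-site.xml, HDFS has to be restarted for the change to take effect. A minimal sketch on Windows, assuming Hadoop's sbin scripts are on the PATH:

```
:: Restart HDFS so the updated fs.default.name value is picked up
stop-dfs.cmd
start-dfs.cmd

:: Sanity check: the HDFS client should now be able to reach the NameNode
hdfs dfs -ls /
```

Once `hdfs dfs -ls /` succeeds, Hive can create its session directories on HDFS (the createRootHDFSDir call in the stack trace above) and should start normally.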
