Hadoop with openjdk: error at start-dfs.sh (SSH?)



I ran into a problem while setting up a 4-node Hadoop cluster following this tutorial. I have the following 4 machines (virtualized):

  • master node
  • node1
  • node2
  • node3

I set up all the conf files on the master node and copied them to the other machines with scp. The master node can reach the slave nodes over SSH. I set JAVA_HOME in the .bashrc on every machine. However, this is what I get:

hadoop@master-node:~$ start-dfs.sh
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
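
For reference, this is roughly how I copied the conf files out and checked SSH from the master node (just a sketch; the install path and hostnames are my assumptions, adjust as needed):

for h in node1 node2 node3; do
    # push the whole Hadoop config directory to each worker
    scp -r ~/hadoop/etc/hadoop/* hadoop@"$h":~/hadoop/etc/hadoop/
    # confirm the master can reach the worker over SSH
    ssh hadoop@"$h" hostname
done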

[3 possibilities] There seems to be an issue with using OpenJDK 11, although I'm not really sure that's what is causing this mess. The errors point to an SSH problem, but (i) I copied my conf files over without any trouble, and (ii) I can reach every node from the master node. Could it be related to the way the JAVA_HOME path is set? Here is the end of my .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=PATH:$PATH/bin

Thanks in advance for any clue (I don't use Java much, and I'm feeling a bit lost here).
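
One more thing I'm not sure how to interpret: the start scripts reach the workers over non-interactive SSH, and I don't know whether that kind of session actually reads the exports at the end of ~/.bashrc. A quick check I can run from the master (node1 is just an example):

# an interactive login picks up ~/.bashrc, a plain "ssh host command" often does not
ssh hadoop@node1 'echo "JAVA_HOME=$JAVA_HOME"'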

[EDIT] Same thing with Oracle JDK 8:

hadoop@master-node:~$  readlink -f /usr/bin/java
/usr/lib/jvm/java-8-oracle/jre/bin/java
hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
hadoop@master-node:~$ start-dfs.sh
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.

Could you try exporting the paths like this instead:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then you have to make sure your PATH actually picks up the JAVA_HOME value. After appending the JAVA_HOME and PATH variables to your .bashrc file, run:

source ~/.bashrc

Then check echo $PATH; if its value contains the JAVA_HOME path, it should work.
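
For example (using the JDK path from the question; replace it with whatever readlink -f /usr/bin/java points to on your machines):

source ~/.bashrc
echo $JAVA_HOME    # should print /usr/lib/jvm/java-11-openjdk-amd64
echo $PATH         # should now contain /usr/lib/jvm/java-11-openjdk-amd64/bin
java -version      # should run without complaining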

Found it!!!!! It turns out that JAVA_HOME gets lost over the SSH connection (why, I don't know; this is what led me to the answer).

To get around it, the line

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

must also be added to

hadoop/etc/hadoop/hadoop-env.sh
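
Concretely, I appended the export to hadoop-env.sh and copied that file to every node; a rough sketch (my paths and hostnames, adjust to yours):

# in ~/hadoop/etc/hadoop/hadoop-env.sh, set the JDK path explicitly
# so the daemons do not depend on whatever the SSH session inherits
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

# then push the updated file to the workers
for h in node1 node2 node3; do
    scp ~/hadoop/etc/hadoop/hadoop-env.sh hadoop@"$h":~/hadoop/etc/hadoop/
done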
