How to configure Hadoop to use a non-default SSH port: "0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused"

When I run start-dfs.sh I get the error below. It looks like I need to tell Hadoop to use a different port, since that is what I need when I ssh to localhost myself. In other words, the following works: ssh -p 2020 localhost.

[Wed Jan 06 16:57:34 root@~]# start-dfs.sh
16/01/06 16:57:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: namenode running as process 85236. Stop it first.
localhost: datanode running as process 85397. Stop it first.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
16/01/06 16:57:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

core-site.xml:

<configuration>
    <property>
        <name>fs.default.name</name>
            <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
            <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:///hadoop/hdfs/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:///hadoop/hdfs/datanode</value>
    </property>
</configuration>

If the Hadoop cluster nodes run sshd listening on a non-standard port, you can tell the Hadoop scripts to open their ssh connections on that port. In fact, you can customize any option passed to the ssh command.

This is controlled by an environment variable named HADOOP_SSH_OPTS. It is undefined by default; you can define it by editing your hadoop-env.sh file.

For example:

export HADOOP_SSH_OPTS="-p 2020"
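Concretely, the line above has to end up in hadoop-env.sh under your Hadoop configuration directory. A minimal sketch, assuming a typical layout where the config lives in `$HADOOP_CONF_DIR` (the fallback to a temporary directory below is only so the sketch runs anywhere; point the variable at your real conf dir):

```shell
# HADOOP_CONF_DIR is an assumption — set it to your real conf dir,
# e.g. /usr/local/hadoop/etc/hadoop. The mktemp fallback just lets
# this sketch run on a machine without Hadoop installed.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-$(mktemp -d)}"
touch "$HADOOP_CONF_DIR/hadoop-env.sh"

# Tell the start/stop scripts to ssh on port 2020 instead of 22.
echo 'export HADOOP_SSH_OPTS="-p 2020"' >> "$HADOOP_CONF_DIR/hadoop-env.sh"

# Confirm the setting landed in the file.
grep HADOOP_SSH_OPTS "$HADOOP_CONF_DIR/hadoop-env.sh"
```

After adding the line, restart HDFS (stop-dfs.sh, then start-dfs.sh) so the scripts pick up the new ssh options; they apply to every ssh invocation the start/stop scripts make, including the one to the secondary namenode host.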
