I have installed Apache Hadoop MapReduce 2.6.1 on an Amazon EC2 host and configured the runtime with the host's private IP address. You can see my configuration in [2-5] below.
To make debugging easier, I have allowed all inbound traffic in the security group [6].
The problem is that when I start MapReduce, I get Permission denied (publickey) errors from SSH [1].
- I noticed that I cannot even ssh from the EC2 host to itself (ubuntu@ip-XXX-XX-XX-XX: ssh ubuntu@ip-XXX-XX). How can I fix this error?
- I want to copy data between HDFS instances running at different sites in EC2. Is it correct to use the hosts' private IP addresses?
[1] The error I get
ubuntu@ip-XXX-XX-XX-XX:~/Programs/medusa-2.0$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
15/12/16 10:52:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [ip-XXX-XX-XX-XX]
ip-XXX-XX-XX-XX: Permission denied (publickey).
ip-XXX-XX-XX-XX: Permission denied (publickey).
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Permission denied (publickey).
15/12/16 10:52:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /home/ubuntu/Programs/hadoop-2.6.2/logs/yarn-ubuntu-resourcemanager-ip-XXX-XX-XX-XX.out
ip-XXX-XX-XX-XX: Permission denied (publickey).
[2] yarn-site.xml
ubuntu@ip-XXX-XX-XX-XX:~/Programs$ cat ./hadoop/etc/hadoop/yarn-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property> <name>yarn.log-aggregation-enable</name> <value>true</value> </property>
<property> <name>yarn.nodemanager.aux-services</name> <value>mapreduce_shuffle</value> </property>
<property> <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name> <value>org.apache.hadoop.mapred.ShuffleHandler</value> </property>
<property> <name>yarn.resourcemanager.resource-tracker.address</name> <value>ip-XXX-XX-XX-XX:8025</value> </property>
<property> <name>yarn.resourcemanager.scheduler.address</name> <value>ip-XXX-XX-XX-XX:8030</value> </property>
<property> <name>yarn.resourcemanager.address</name> <value>ip-XXX-XX-XX-XX:8040</value> </property>
</configuration>
[3] core-site.xml
ubuntu@ip-XXX-XX-XX-XX:~/Programs$ cat ./hadoop/etc/hadoop/core-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property> <name>fs.default.name</name> <value>hdfs://ip-XXX-XX-XX-XX:9000</value> </property>
<property> <name>hadoop.tmp.dir</name> <value>/tmp/hadoop-temp</value> </property>
</configuration>
[4] slaves
ubuntu@ip-XXX-XX-XX-XX:~/Programs$ cat ./hadoop/etc/hadoop/slaves
ip-XXX-XX-XX-XX
[5] SSH files
ubuntu@ip-XXX-XX-XX-XX:~/Programs$ ls -alrt ~/.ssh/
total 24
-rw------- 1 ubuntu ubuntu 392 Dec 16 09:49 authorized_keys
-rw-r--r-- 1 ubuntu ubuntu 666 Dec 16 10:33 known_hosts
-rw-r--r-- 1 ubuntu ubuntu 404 Dec 16 10:33 id_rsa.pub
-rw------- 1 ubuntu ubuntu 1675 Dec 16 10:33 id_rsa
drwx------ 2 ubuntu ubuntu 4096 Dec 16 10:33 .
drwxr-xr-x 7 ubuntu ubuntu 4096 Dec 16 10:34 ..
[6] Security group
Type         Protocol  Port range  Source
All traffic  All       All         0.0.0.0/0
SSH          TCP       22          0.0.0.0/0
This is an SSH problem: SSH is not configured correctly on your host.
Before running start-all.sh, test that ssh ubuntu@ip**** works. If it does not, fix the SSH setup first and then try again.
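As a minimal sketch of that fix, assuming the existing ~/.ssh/id_rsa key pair shown in [5] simply is not yet authorized for logins to the same account, something like the following should let the loopback SSH connections made by start-dfs.sh and start-yarn.sh succeed:

# Authorize the host's own public key for logins to this account
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
chmod 700 ~/.ssh

# Verify passwordless login to every address the Hadoop scripts connect to
# (the private hostname from the slaves file, plus 0.0.0.0 for the secondary namenode);
# accepting the host keys now avoids interactive prompts later
ssh -o StrictHostKeyChecking=no ubuntu@ip-XXX-XX-XX-XX 'echo ok'
ssh -o StrictHostKeyChecking=no ubuntu@0.0.0.0 'echo ok'

If the key was generated with a passphrase, either load it into ssh-agent first or generate a passphrase-less key and authorize that one instead, so the start scripts are not blocked by passphrase prompts.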