Hadoop: error when running start-all.sh



When I run start-all.sh, the following error occurs:

yunweiguo@172.16.192.134's password: 
172.16.192.135: bash: line 0: cd: /Users/yunweiguo/hadoop/hadoop-1.2.1/libexec/..: No such file or directory
172.16.192.135: bash: /Users/yunweiguo/hadoop/hadoop-1.2.1/bin/hadoop-daemon.sh: No such file or directory
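The two messages say that the remote node 172.16.192.135 cannot find the Hadoop install at the path the master uses. A minimal check (the path is copied from the error output; the same test can be run on 172.16.192.135 over ssh) would be:

```shell
# Check whether the daemon script exists at the path start-all.sh expects;
# HADOOP_DIR is taken from the error message above.
HADOOP_DIR=/Users/yunweiguo/hadoop/hadoop-1.2.1
if [ -x "$HADOOP_DIR/bin/hadoop-daemon.sh" ]; then
    echo "hadoop-daemon.sh found"
else
    echo "hadoop-daemon.sh missing"
fi
```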

I guess you have not set up your .bashrc correctly. Try the following:

Run `vi $HOME/.bashrc` and add the following lines at the end of the file (change the Hadoop home to yours):

 # Set Hadoop-related environment variables
 export HADOOP_HOME=/usr/local/hadoop

 # Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
 export JAVA_HOME=/usr/lib/jvm/java-6-sun

 # Some convenient aliases and functions for running Hadoop-related commands
 unalias fs &> /dev/null
 alias fs="hadoop fs"
 unalias hls &> /dev/null
 alias hls="fs -ls"

 # If you have LZO compression enabled in your Hadoop cluster and
 # compress job outputs with LZOP (not covered in this tutorial):
 # Conveniently inspect an LZOP compressed file from the command
 # line; run via:
 #
 # $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
 #
 # Requires installed 'lzop' command.
 lzohead () {
     hadoop fs -cat $1 | lzop -dc | head -1000 | less
 }

 # Add Hadoop bin/ directory to PATH
 export PATH=$PATH:$HADOOP_HOME/bin
