On Mac OS X, I build Spark from the sources with the following command:
jacek:~/oss/spark
$ SPARK_HADOOP_VERSION=2.4.0 SPARK_YARN=true SPARK_HIVE=true SPARK_GANGLIA_LGPL=true xsbt
...
[info] Set current project to root (in build file:/Users/jacek/oss/spark/)
> ; clean ; assembly
...
[info] Packaging /Users/jacek/oss/spark/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop2.4.0.jar ...
[info] Done packaging.
[info] Done packaging.
[success] Total time: 1964 s, completed May 9, 2014 5:07:45 AM
When I launch ./bin/spark-shell, I notice the following WARN message:

WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

What might be the issue?
jacek:~/oss/spark
$ ./bin/spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
14/05/09 21:11:17 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/05/09 21:11:17 INFO SecurityManager: Changing view acls to: jacek
14/05/09 21:11:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek)
14/05/09 21:11:17 INFO HttpServer: Starting HTTP Server
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.0.0-SNAPSHOT
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0)
Type in expressions to have them evaluated.
Type :help for more information.
...
14/05/09 21:11:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
...
The Native Libraries Guide in the Apache Hadoop documentation says:

The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.

The native hadoop library is mainly used on the GNU/Linux platform and has been tested on these distributions:
- RHEL4/Fedora
- Ubuntu
- Gentoo

On all of the above distributions a 32/64-bit native hadoop library will work with a respective 32/64-bit JVM.
The WARN message can safely be ignored on Mac OS X, since the native library simply does not exist for that platform.
In my experience, it works if you cd into /sparkDir/conf, rename spark-env.sh.template to spark-env.sh, and then set JAVA_OPTS and HADOOP_DIR in it.
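As a sketch of those steps (the /sparkDir path is the answer's placeholder, and the Hadoop location and JVM flags below are assumptions for a local install, not verified values):

```shell
# Rename the template so the launch scripts pick the file up
cd /sparkDir/conf
mv spark-env.sh.template spark-env.sh

# Then, inside spark-env.sh, add lines along these lines:
# export HADOOP_DIR=/usr/local/hadoop                             # hypothetical Hadoop location
# export JAVA_OPTS="-Djava.library.path=$HADOOP_DIR/lib/native"   # point the JVM at the native libs
```

The exact values depend on where your Hadoop distribution is installed.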
You also have to add the following line to /etc/profile:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH
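Assuming $HADOOP_HOME points at a Hadoop 2.4.x install, you can re-source the profile and check whether the native library now resolves (checknative is available from Hadoop 2.4.0 onward):

```shell
# Reload the profile in the current shell
source /etc/profile

# List each native component and whether it loaded
hadoop checknative -a
```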