After installing the official pre-built Hadoop 2.4.0 distribution (133 MB), every hadoop command prints a warning like this:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
After digging around online, here is the fix.
First, raise the logger level to see the actual cause:
export HADOOP_ROOT_LOGGER=DEBUG,console
Then run any hadoop-related command again and the underlying error shows up.
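An illustrative excerpt of the debug output (the library path depends on your installation):

DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS32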
wrong ELFCLASS32? Could it be that the .so being loaded was built for the wrong architecture?
So run:
cd $HADOOP_HOME/lib/native
file libhadoop.so.1.0.0
The result:
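Something along these lines (illustrative; the exact fields depend on the build):

libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped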
Sure enough: Intel 80386, i.e. a 32-bit binary, while my Hadoop environment is a 64-bit OS.
It turns out the native libraries in the pre-built Hadoop packages downloaded from the Apache mirrors are all 32-bit. To get 64-bit support you have to recompile them yourself, which is rather painful given that almost every production environment runs a 64-bit OS.
The official Hadoop documentation on native libraries confirms this (see: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html#Download):
“The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory”
So check out the source code again:
svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.4.0/
Build with the native profile enabled; the build then generates native libraries matching the architecture of the current operating system (see: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html#Build):
mvn package -Pdist,native -DskipTests -Dtar
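Note that the -Pnative profile needs a few extra tools beyond a plain Java build; per Hadoop's BUILDING.txt these include CMake, zlib and OpenSSL development headers, and ProtocolBuffer (protoc) 2.5.0. A rough example for an apt-based system (package names assumed; protobuf 2.5.0 may have to be built from source if your distribution ships a different version):

sudo apt-get install build-essential cmake zlib1g-dev libssl-dev
protoc --version    # should report libprotoc 2.5.0 for Hadoop 2.4.0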
Once the build finishes, go to release-2.4.0/hadoop-dist/target/hadoop-2.4.0/lib/native, find libhadoop.so.1.0.0, and check its file type again:
file libhadoop.so.1.0.0
This time the result:
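Something like this (again illustrative; the exact fields depend on the build):

libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped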
Replace $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 with this freshly built file.
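A rough sketch of the swap, assuming the source tree was checked out alongside $HADOOP_HOME (adjust the paths to your layout):

cd release-2.4.0/hadoop-dist/target/hadoop-2.4.0/lib/native
cp $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 $HADOOP_HOME/lib/native/libhadoop.so.1.0.0.bak32    # keep a backup of the 32-bit library
cp libhadoop.so.1.0.0 $HADOOP_HOME/lib/native/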
Then run any Hadoop command again:

unset HADOOP_ROOT_LOGGER
hadoop fs -ls /
And check the output:
The WARN message is gone; problem solved.
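As a further check, recent Hadoop 2.x releases ship a native-library checker (described on the same NativeLibraries page referenced above); assuming it is available in your build, it can be run as:

hadoop checknative -a

It prints one line per native component (hadoop, zlib, and the optional compression codecs) with true/false, so the hadoop entry should now point at the 64-bit libhadoop.so.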
References:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html