Compiling 64-bit Hadoop on Linux (e.g. Ubuntu 14.04 and Hadoop 2.3.0)

The prebuilt hadoop-2.3.0.tar.gz binary package from the Hadoop website was compiled on a 32-bit system. Running it on a 64-bit system produces errors such as:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
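You can confirm that this warning is the 32-bit/64-bit mismatch with a quick check. A minimal sketch; the `~/hadoop` path below is an assumption, so substitute wherever you unpacked Hadoop:

```shell
# Compare the OS architecture with the bundled native library.
uname -m   # x86_64 means a 64-bit OS
# Assumed install location - adjust to your own setup.
LIB=~/hadoop/lib/native/libhadoop.so.1.0.0
if [ -f "$LIB" ]; then
  # The stock tarball reports "ELF 32-bit", which a 64-bit JVM cannot load.
  file "$LIB"
else
  echo "libhadoop not found at $LIB"
fi
```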

In that case you need to compile the Hadoop 2.3.0 source yourself. The Hadoop 2.3.0 binary package I built has been verified to install correctly and run the WordCount example. You can download my binary package here:

http://pan.baidu.com/s/1eQrgsWa

The build steps are as follows.

1. Update the package list

apt-get update

2. Install the software the build needs: the JDK, Protocol Buffers, Maven, CMake, and the native C/C++ toolchain

apt-get install -y openjdk-7-jdk libprotobuf-dev protobuf-compiler maven cmake build-essential pkg-config libssl-dev zlib1g-dev llvm-gcc automake autoconf make
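Before kicking off the long Maven build, it is worth confirming that the toolchain actually installed. A small sketch; the tool list simply mirrors the packages from step 2:

```shell
# Report which build tools from step 2 are on the PATH.
for tool in java mvn protoc cmake gcc make; do
  if command -v "$tool" >/dev/null 2>&1; then
    printf '%s: found\n' "$tool"
  else
    printf '%s: MISSING - rerun step 2\n' "$tool"
  fi
done
```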

3. Download the Hadoop 2.3.0 source tarball

wget http://archive.apache.org/dist/hadoop/core/hadoop-2.3.0/hadoop-2.3.0-src.tar.gz

4. Unpack the Hadoop 2.3.0 source tarball

tar -xzvf hadoop-2.3.0-src.tar.gz

5. Enter the Hadoop 2.3.0 source directory

cd hadoop-2.3.0-src

6. Build Hadoop 2.3.0 from source

mvn package -Pdist,native -DskipTests -Dtar

If the build succeeds, the output ends like this:

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main ................................ SUCCESS [1:11.968s]

[INFO] Apache Hadoop Project POM ......................... SUCCESS [30.393s]

[INFO] Apache Hadoop Annotations ......................... SUCCESS [18.398s]

[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.246s]

[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [20.372s]

[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [23.721s]

[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1:41.836s]

[INFO] Apache Hadoop Auth ................................ SUCCESS [22.303s]

[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [7.052s]

[INFO] Apache Hadoop Common .............................. SUCCESS [2:29.466s]

[INFO] Apache Hadoop NFS ................................. SUCCESS [11.604s]

[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.073s]

[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:30.230s]

[INFO] Apache Hadoop HttpFS .............................. SUCCESS [17.976s]

[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [19.927s]

[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.304s]

[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.032s]

[INFO] hadoop-yarn ....................................... SUCCESS [0.033s]

[INFO] hadoop-yarn-api ................................... SUCCESS [36.284s]

[INFO] hadoop-yarn-common ................................ SUCCESS [33.912s]

[INFO] hadoop-yarn-server ................................ SUCCESS [0.213s]

[INFO] hadoop-yarn-server-common ......................... SUCCESS [8.193s]

[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [41.181s]

[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.768s]

[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [13.923s]

[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.904s]

[INFO] hadoop-yarn-client ................................ SUCCESS [4.363s]

[INFO] hadoop-yarn-applications .......................... SUCCESS [0.120s]

[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.262s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [1.615s]

[INFO] hadoop-yarn-site .................................. SUCCESS [0.086s]

[INFO] hadoop-yarn-project ............................... SUCCESS [2.703s]

[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.132s]

[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [18.951s]

[INFO] hadoop-mapreduce-client-common .................... SUCCESS [14.320s]

[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.330s]

[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [9.664s]

[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.678s]

[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [9.263s]

[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.549s]

[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.748s]

[INFO] hadoop-mapreduce .................................. SUCCESS [2.880s]

[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.080s]

[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [14.648s]

[INFO] Apache Hadoop Archives ............................ SUCCESS [2.602s]

[INFO] Apache Hadoop Rumen ............................... SUCCESS [5.706s]

[INFO] Apache Hadoop Gridmix ............................. SUCCESS [3.649s]

[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.483s]

[INFO] Apache Hadoop Extras .............................. SUCCESS [2.678s]

[INFO] Apache Hadoop Pipes ............................... SUCCESS [6.359s]

[INFO] Apache Hadoop OpenStack support ................... SUCCESS [5.088s]

[INFO] Apache Hadoop Client .............................. SUCCESS [4.534s]

[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.433s]

[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [7.757s]

[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [4.099s]

[INFO] Apache Hadoop Tools ............................... SUCCESS [0.428s]

[INFO] Apache Hadoop Distribution ........................ SUCCESS [18.045s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 14:59.240s

[INFO] Finished at: Thu Jan 15 18:51:59 JST 2015

[INFO] Final Memory: 168M/435M

[INFO] ------------------------------------------------------------------------

The compiled binary package is located at

hadoop-2.3.0-src/hadoop-dist/target/hadoop-2.3.0.tar.gz
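To make sure the rebuild actually fixed the problem, you can unpack the fresh tarball and inspect its native library. A sketch, assuming you are still inside hadoop-2.3.0-src:

```shell
# Verify the newly built native library is 64-bit.
TARBALL=hadoop-dist/target/hadoop-2.3.0.tar.gz   # produced by the mvn step above
if [ -f "$TARBALL" ]; then
  tar -xzf "$TARBALL" -C /tmp
  # Should now report "ELF 64-bit" instead of "ELF 32-bit".
  file /tmp/hadoop-2.3.0/lib/native/libhadoop.so.1.0.0
else
  echo "tarball not found - run the mvn build first"
fi
```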

PS: When installing Hadoop 2.3.0 from the self-compiled package, make sure the following two lines are removed from .bashrc and hadoop-env.sh (they are not there by default, but you may have added them while trying to fix the warning above):

export HADOOP_COMMON_LIB_NATIVE_DIR="~/hadoop/lib/"

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=~/hadoop/lib/"
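The reason those two lines cause trouble: inside double quotes the shell does not expand `~`, so the JVM receives the literal path `~/hadoop/lib/` and still fails to load the native library. A minimal demonstration:

```shell
# Tilde expansion only happens when ~ is unquoted.
quoted="~/hadoop/lib/"    # stays a literal two-character prefix
unquoted=~/hadoop/lib/    # expands to $HOME/hadoop/lib/
echo "$quoted"
echo "$unquoted"
# If you do need these variables, use an absolute path such as $HOME/hadoop/lib/ instead.
```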

Posted: 2024-12-23 00:13:46
