HBase 0.98 Installation, Part One: Building Hadoop 2.2.0 for x64

1. Install the JDK

Download JDK 7u55 and install it.
Both the JDK and the JRE are needed; the JDK ships lib/tools.jar, which the build requires.
Install it under /java.
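Since the build fails in non-obvious ways when only a JRE is present, a quick sanity check for tools.jar helps. This is a sketch; the check_tools_jar helper is a hypothetical name, and the /java prefix is the install location chosen above:

```shell
# Hypothetical helper: verify the install at the given prefix is a full JDK
# (i.e. it ships lib/tools.jar) rather than a bare JRE.
check_tools_jar() {
  if [ -f "$1/lib/tools.jar" ]; then
    echo "full JDK"
  else
    echo "tools.jar missing"
  fi
}

# Check the install location used in this guide:
check_tools_jar /java
```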

2. Download the Hadoop 2.2.0 source code

wget http://apache.dataguru.cn/hadoop/common/stable/hadoop-2.2.0-src.tar.gz

Extract it:

tar zxvf hadoop-2.2.0-src.tar.gz -C /tmp

3. Install dependency packages

yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++ openssl-devel

4. Install the build tools

Build and install Protobuf:

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install
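Hadoop 2.2.0's build expects protoc 2.5.0 specifically, so it is worth confirming the version before starting Maven. The require_protoc_250 helper below is a hypothetical sketch, not part of protobuf itself:

```shell
# Hypothetical check: pass in the output of `protoc --version` and confirm
# it is the 2.5.0 release that Hadoop 2.2.0's build expects.
require_protoc_250() {
  case "$1" in
    *"libprotoc 2.5.0"*) echo "ok" ;;
    *) echo "wrong protoc version" ;;
  esac
}

# Typical usage after the install above:
require_protoc_250 "$(/usr/local/protobuf/bin/protoc --version 2>/dev/null)"
```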

Install Ant:

tar -zxvf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /usr/local/ant

Install Maven:

tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /usr/local/maven

Install Findbugs:

tar -zxvf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /usr/local/findbugs

Build and install cmake:

tar -zxvf cmake-2.8.8.tar.gz
cd cmake-2.8.8
./bootstrap
make
make install

5. Configure environment paths

vim /etc/profile

#java
export JAVA_HOME=/java
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

#maven
export MAVEN_HOME=/usr/local/maven
export MAVEN_OPTS="-Xms256m -Xmx512m"
export CLASSPATH=.:$CLASSPATH:$MAVEN_HOME/lib
export PATH=$PATH:$MAVEN_HOME/bin

#protobuf
export PROTOBUF_HOME=/usr/local/protobuf
export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib
export PATH=$PATH:$PROTOBUF_HOME/bin

#ant
export ANT_HOME=/usr/local/ant
export CLASSPATH=.:$CLASSPATH:$ANT_HOME/lib
export PATH=$PATH:$ANT_HOME/bin

#findbugs
export FINDBUGS_HOME=/usr/local/findbugs
export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib
export PATH=$PATH:$FINDBUGS_HOME/bin

source /etc/profile

The changes take effect immediately.
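After sourcing /etc/profile, a quick loop can confirm that every tool the build will invoke resolves on the new PATH. The check_tool helper is an illustrative sketch, not a standard utility:

```shell
# Hypothetical helper: report whether a command is reachable on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: ok"
  else
    echo "$1: MISSING"
  fi
}

# The tools the Hadoop build will invoke:
for t in java mvn ant protoc cmake findbugs; do
  check_tool "$t"
done
```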

6. Fix a dependency bug

vim hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml

Add to the dependencies section:

    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty-util</artifactId>
      <scope>test</scope>
    </dependency>

7. Build

cd hadoop-2.2.0-src

mvn clean package -Pdist,native -DskipTests -Dtar

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [10.796s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [8.171s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [18.306s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [1.704s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [8.222s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [17.120s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [15.952s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [12.085s]
[INFO] Apache Hadoop Common .............................. SUCCESS [4:57.617s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [25.393s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.231s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:51.635s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:27.220s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [59.011s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [11.979s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.195s]
[INFO] hadoop-yarn ....................................... SUCCESS [1:41.292s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:53.028s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:47.889s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.712s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [38.517s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [53.352s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [13.733s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [49.935s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [3.230s]
[INFO] hadoop-yarn-client ................................ SUCCESS [23.036s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.690s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [7.623s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.581s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:26.644s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [8.783s]
[INFO] hadoop-yarn-site .................................. SUCCESS [1.217s]
[INFO] hadoop-yarn-project ............................... SUCCESS [30.587s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:19.185s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [17.693s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [41.929s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [18.209s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [24.663s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [7.631s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [22.663s]
[INFO] hadoop-mapreduce .................................. SUCCESS [10.093s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [19.489s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.046s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.621s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [20.543s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [15.156s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.968s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [9.504s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [15.708s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [5.261s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.268s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:15.418s]
[INFO] Apache Hadoop Client .............................. SUCCESS [29.025s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.735s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:15.365s
[INFO] Finished at: Fri May 16 16:15:37 CST 2014
[INFO] Final Memory: 101M/385M
[INFO] ------------------------------------------------------------------------

When the build completes, the directory

hadoop-2.2.0-src/hadoop-dist/target/

will contain the package

hadoop-2.2.0.tar.gz

This tarball is the final, deployable Hadoop distribution.
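Since the whole point of this build is a 64-bit native library, the result is worth checking. The is_64bit helper below is a sketch wrapping the standard `file` utility; the library path shown is the layout of a typical Hadoop 2.2.0 tarball:

```shell
# Hypothetical helper: report whether a binary is an x86-64 object,
# based on the output of the standard `file` utility.
is_64bit() {
  if file "$1" 2>/dev/null | grep -q 'x86-64'; then
    echo "64-bit"
  else
    echo "not 64-bit"
  fi
}

# After extracting hadoop-2.2.0.tar.gz, check the native library:
# is_64bit hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
```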

The build may fail partway through for many reasons; a common one is a failed dependency download from a remote repository. If that happens, simply re-run:

mvn clean package -Pdist,native -DskipTests -Dtar

A few attempts are usually enough.
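Re-running by hand works, but the retries can also be scripted. The retry helper below is a small sketch (not part of Maven) that re-runs any command up to three times, stopping at the first success:

```shell
# Hypothetical helper: run a command, retrying up to 3 attempts total,
# to ride out transient dependency-download failures.
retry() {
  n=1
  while ! "$@"; do
    if [ "$n" -ge 3 ]; then
      echo "failed after $n attempts"
      return 1
    fi
    n=$((n + 1))
    echo "retrying (attempt $n)..."
  done
  echo "succeeded on attempt $n"
}

# Usage with the Hadoop build:
# retry mvn clean package -Pdist,native -DskipTests -Dtar
```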

Date: 2024-10-07 07:37:11
