Compiling and Installing Hadoop 2.4.1 on RedHat Enterprise Linux 6.5 (64-bit)

Thanks to the original post: http://blog.csdn.net/w13770269691/article/details/16883663/

step 1. Configure yum (only needed on unregistered RedHat Enterprise systems; registered users can skip this step)

Reference: http://blog.csdn.net/zhngjan/article/details/20843465
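
One common workaround for unregistered systems is to point yum at a public CentOS 6 repository, since CentOS 6 packages are compatible with RHEL 6. A minimal sketch follows; the repo file name and mirror URL are only examples and should be replaced with a mirror reachable from your network:

cat > /etc/yum.repos.d/centos6-base.repo <<'EOF'
# example repo definition; substitute a CentOS 6 mirror of your choice
[centos6-base]
name=CentOS 6 Base (x86_64)
baseurl=http://mirror.centos.org/centos/6/os/x86_64/
gpgcheck=0
enabled=1
EOF
yum clean all
yum makecache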

step 2. Download the source package from: http://mirrors.hust.edu.cn/apache/hadoop/common/stable2/

wget http://mirrors.hust.edu.cn/apache/hadoop/common/stable2/hadoop-2.4.1-src.tar.gz

step 3. Set up Maven as the build tool for Hadoop

a. Download the prebuilt Maven binary package:

wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.tar.gz
tar -zxvf apache-maven-3.1.1-bin.tar.gz -C /opt/

b. Configure the Maven environment variables by appending the following lines to the end of /etc/profile:

export MAVEN_HOME=/opt/apache-maven-3.1.1
export PATH=$PATH:${MAVEN_HOME}/bin
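
The Hadoop build also needs a JDK (1.7 works well, matching the jdk-1.7 profile configured below). If JAVA_HOME is not set yet, it can be exported in /etc/profile alongside the Maven variables; the install path here is only an example and must point at wherever your JDK is actually installed:

export JAVA_HOME=/opt/jdk1.7.0_79    # example path; adjust to your JDK location
export PATH=$PATH:${JAVA_HOME}/bin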

c. Run the following command to make the changes take effect:

source /etc/profile

d. Test Maven:

mvn -version
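
If everything is wired up, the output reports both the Maven home and the JDK that Maven found, roughly along these lines (exact version strings will differ on your machine):

Apache Maven 3.1.1
Maven home: /opt/apache-maven-3.1.1
Java version: 1.7.0_xx, vendor: Oracle Corporation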

e. Since Maven's default overseas repositories may be unreachable, configure a domestic (China) mirror for Maven first.

In conf/settings.xml under the Maven directory, add the following inside the <mirrors></mirrors> element (take care not to place it inside a commented-out block):

<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexusosc</name>
  <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>

In the same conf/settings.xml, add the following inside the <profiles></profiles> element (again, not inside a commented-out block):

<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>

step 4. Building Hadoop 2.4.1 requires protoc 2.5.0, so Protocol Buffers 2.5.0 has to be downloaded and installed as well.

Official download page: https://code.google.com/p/protobuf/downloads/list

Baidu Pan mirror: http://pan.baidu.com/s/1pJlZubT

a. Before compiling and installing protoc, install a few build dependencies: gcc, gcc-c++, and make (skip any that are already installed):

yum install gcc
yum install gcc-c++
yum install make 

b. Build and install protoc:

tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure --prefix=/opt/protoc/
make && make install

The --prefix value (/opt/protoc/ above) is a custom install directory for protoc and can be changed as you like. Once the commands above complete, configure the protoc environment variable, again by appending to the end of /etc/profile:

export PATH=/opt/protoc/bin:$PATH
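
After another source /etc/profile, a quick check confirms that the freshly built protoc is the one on the PATH:

source /etc/profile
protoc --version    # should print: libprotoc 2.5.0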

step 5. Don't rush into the Hadoop build just yet, or you will run into all sorts of errors; install the cmake, openssl-devel, and ncurses-devel dependencies first:

yum install cmake
yum install openssl-devel
yum install ncurses-devel  
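
If you prefer, the three packages can be installed in a single command; -y simply answers the confirmation prompts automatically:

yum install -y cmake openssl-devel ncurses-devel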

step 6. Unpack and compile the Hadoop source (make sure to cd into the unpacked source directory before running the mvn build):

tar -zxvf hadoop-2.4.1-src.tar.gz -C /opt/
cd /opt/hadoop-2.4.1-src
mvn package -Pdist,native -DskipTests -Dtar
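
If the build aborts partway with a Maven or javac out-of-memory error, a common remedy is to give Maven a larger heap and rerun the same goals with clean; the heap sizes below are only a starting point:

export MAVEN_OPTS="-Xms256m -Xmx512m"    # raise -Xmx further if the error persists
mvn clean package -Pdist,native -DskipTests -Dtar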

The build takes quite a while, so be patient. On success, the tail of the output looks something like this:

[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [33.648s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.303s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [21.288s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [14.611s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [8.334s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [10.737s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [18.321s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [17.136s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.563s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.254s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [17.245s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [14.478s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.084s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [41.979s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:24.464s
[INFO] Finished at: Thu Aug 07 14:25:51 CST 2014
[INFO] Final Memory: 159M/814M
[INFO] ------------------------------------------------------------------------

The compiled distribution is located at: hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
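
Because the build was invoked with -Dtar, the same hadoop-dist/target directory should also contain a packaged hadoop-2.4.1.tar.gz, which is convenient for copying the build to other machines; a small example (the destination directory is arbitrary):

cp /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1.tar.gz /usr/local/
tar -zxvf /usr/local/hadoop-2.4.1.tar.gz -C /usr/local/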

Enter the hadoop-2.4.1 directory and verify that the build succeeded:

[root@... bin]# cd /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/bin
[root@... bin]# ./hadoop version
Hadoop 2.4.1
Subversion Unknown -r Unknown
Compiled by root on 2014-08-07T05:52Z
Compiled with protoc 2.5.0
From source with checksum bb7ac0a3c73dc131f4844b873c74b630
This command was run using
/opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar
[root@... bin]# cd ..
[root@... hadoop-2.4.1]# file lib/native/*
lib/native/libhadoop.a: current ar archive
lib/native/libhadooppipes.a: current ar archive
lib/native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib/native/libhadooputils.a: current ar archive
lib/native/libhdfs.a: current ar archive
lib/native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
lib/native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
[root@... hadoop-2.4.1]#
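
As an extra sanity check, Hadoop 2.x also ships a checknative subcommand that reports whether the native hadoop library and the optional compression codecs can be loaded at runtime; with this build, hadoop and zlib should come back true, while codecs such as snappy or bzip2 depend on which -devel packages were present at build time:

cd /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
./bin/hadoop checknative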
