Compiling and Installing Hadoop 2.6.3

1. Environment Setup

1.1 Java

Install Java 1.7.

Download JDK 1.7:

[root@node1 ~]# wget http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz?AuthParam=1452765180_64b65bb908cae46ab9a9e492c842d7c7

Set the Java environment variables:

   JAVA_HOME=/usr/local/jdk1.7.0_79
   JRE_HOME=$JAVA_HOME/jre
   CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib:$CLASSPATH
   PATH=$PATH:$HOME/bin:/usr/local/mongodb-linux-x86_64-3.2.0/bin
   PATH=$JAVA_HOME/bin:$PATH
   export JAVA_HOME JRE_HOME CLASSPATH PATH
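These variables can be persisted in a small profile script. A minimal sketch, assuming the JDK path above; the script location /tmp/java_env.sh is arbitrary (on a real system it would typically live under /etc/profile.d/ or in ~/.bash_profile):

```shell
# Sketch: persist the Java variables in a standalone profile script.
# Quoting 'EOF' keeps the $-references literal in the file, so they
# expand when the script is sourced, not when it is written.
cat > /tmp/java_env.sh <<'EOF'
export JAVA_HOME=/usr/local/jdk1.7.0_79
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$PATH
EOF

# Load it into the current shell and confirm the JDK bin dir leads PATH:
. /tmp/java_env.sh
echo "$PATH" | grep -q '/usr/local/jdk1.7.0_79/bin' && echo "Java environment set"
```

Sourcing the script in a fresh shell should print "Java environment set".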

1.2 Maven

Download and install Maven:

[root@node1 ~]# wget http://mirrors.cnnic.cn/apache/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz

Then extract it to /usr/local/ and add it to PATH:

PATH=$JAVA_HOME/bin:/usr/local/apache-maven-3.3.9/bin:$PATH

1.3 FindBugs

Optional; skipped here.

1.4 protobuf

Reference: http://blog.csdn.net/huguoping830623/article/details/45482725

Install (Hadoop 2.6.x requires protoc 2.5.0):

   ./configure --prefix=/usr/local/protobuf2.5/
   make && make install

The build may fail with:

configure: error: C++ preprocessor "/lib/cpp" fails sanity check

This happens when the gcc-related packages are missing:

  # yum install glibc-headers
  # yum install gcc-c++

After installation, add /usr/local/protobuf2.5/bin (where the --prefix above puts protoc) to PATH.

1.5 Other Packages

yum -y install  lzo-devel  zlib-devel  gcc autoconf automake libtool openssl-devel fuse-devel cmake 

* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac

* Zlib devel (if compiling native code)

* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )

* Jansson C JSON parsing library ( if compiling libwebhdfs )

Jansson can be built from source if your distribution does not package it:

   ./configure
   make && make install

* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )

[root@node1 ~]# wget https://github.com/libfuse/libfuse/releases/download/fuse_2_9_4/fuse-2.8.6.tar.gz

   tar -zxvf fuse-2.8.6.tar.gz
   cd fuse-2.8.6
   ./configure
   make -j8
   make install

* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
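Before kicking off the long Maven build, it can save time to confirm the prerequisites above are actually on PATH. A small sketch; the tool list mirrors the requirements listed above:

```shell
# Sketch: report which build prerequisites are present before starting
# the Hadoop build. Prints one OK/MISSING line per tool.
for tool in gcc g++ make cmake autoconf automake libtool protoc mvn; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK      $tool"
  else
    echo "MISSING $tool"
  fi
done
```

Any MISSING line points back at the yum packages or tarballs in the sections above.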

2. Compiling and Installing

2.1 Download the Hadoop Source

[root@node1 ~]# wget http://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.6.3/hadoop-2.6.3-src.tar.gz

Then extract it.

2.2 Compile Hadoop

Enter the extracted source directory and run:

$ mvn package -Pdist,native -DskipTests -Dtar

An Internet connection is required, because Maven downloads all build dependencies.

The build takes quite a while. When it finishes, the output is placed under hadoop-2.6.3-src/hadoop-dist/target:

[root@node1 target]# ls -l

total 530484

drwxr-xr-x. 2 root root      4096 Jan 19 13:13 antrun

-rw-r--r--. 1 root root      1867 Jan 19 13:13 dist-layout-stitching.sh

-rw-r--r--. 1 root root       640 Jan 19 13:13 dist-tar-stitching.sh

drwxr-xr-x. 9 root root      4096 Jan 19 13:13 hadoop-2.6.3        (the extracted Hadoop build)

-rw-r--r--. 1 root root 180792661 Jan 19 13:13 hadoop-2.6.3.tar.gz (the Hadoop install tarball)

-rw-r--r--. 1 root root      2778 Jan 19 13:13 hadoop-dist-2.6.3.jar

-rw-r--r--. 1 root root 362386511 Jan 19 13:13 hadoop-dist-2.6.3-javadoc.jar

drwxr-xr-x. 2 root root      4096 Jan 19 13:13 javadoc-bundle-options

drwxr-xr-x. 2 root root      4096 Jan 19 13:13 maven-archiver

drwxr-xr-x. 2 root root      4096 Jan 19 13:13 test-dir

Copy it to the install directory:

[root@node1 target]# cp -r hadoop-2.6.3 /usr/local/

[root@node1 target]# cd /usr/local/hadoop-2.6.3

Edit the hadoop-env.sh file:

[root@node1 hadoop-2.6.3]# vi etc/hadoop/hadoop-env.sh

Change:

export JAVA_HOME=/usr/local/jdk1.7.0_79

Add:

export HADOOP_PREFIX=/usr/local/hadoop-2.6.3
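The same two hadoop-env.sh changes can be made non-interactively with sed. A sketch that works on a stand-in file so it is safe to run anywhere; on a real install, point it at etc/hadoop/hadoop-env.sh instead:

```shell
# Sketch: apply the JAVA_HOME fix and the HADOOP_PREFIX addition with sed.
# A minimal stand-in file is created here so the snippet is self-contained.
f=/tmp/hadoop-env.sh
printf 'export JAVA_HOME=${JAVA_HOME}\n' > "$f"   # the default line shipped in hadoop-env.sh

# Rewrite the JAVA_HOME line in place, then append HADOOP_PREFIX:
sed -i 's|^export JAVA_HOME=.*|export JAVA_HOME=/usr/local/jdk1.7.0_79|' "$f"
echo 'export HADOOP_PREFIX=/usr/local/hadoop-2.6.3' >> "$f"

cat "$f"
```

The result should contain exactly the two export lines shown above.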

Then verify:

[root@node1 hadoop-2.6.3]# bin/hadoop version

Hadoop 2.6.3

Subversion Unknown -r Unknown

Compiled by root on 2016-01-19T04:56Z

Compiled with protoc 2.5.0

From source with checksum 722f77f825e326e13a86ff62b34ada

This command was run using /usr/local/hadoop-2.6.3/share/hadoop/common/hadoop-common-2.6.3.jar

The check succeeds!

3. Testing

  $ mkdir input
  $ cp etc/hadoop/*.xml input
   $ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.3.jar grep input output 'dfs[a-z.]+'
  $ cat output/*
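What the example job computes can be approximated locally with plain grep, which is a quick way to sanity-check the regex before running it through MapReduce. The sample XML content below is made up for illustration:

```shell
# Sketch: approximate the MapReduce grep example with plain grep.
# The example job counts occurrences of the dfs[a-z.]+ pattern in the inputs.
mkdir -p /tmp/input
printf '<name>dfs.replication</name>\n<name>dfs.permissions</name>\n' > /tmp/input/sample.xml

# -o prints only the matched text; -h drops filenames; then count duplicates.
grep -ohE 'dfs[a-z.]+' /tmp/input/*.xml | sort | uniq -c
```

Each line of the output pairs a count with a matched property name, which is the same shape as the job's output/* files.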

If the job reports INFO metrics.MetricsUtil: Unable to obtain hostName: node1, edit /etc/hosts and add node1 as an alias for this machine. Change

   127.0.0.1 localhost localhost.localdomain localhost

to

   127.0.0.1 localhost localhost.localdomain node1
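The hosts change can also be scripted with sed. A sketch against a stand-in copy of the file, so it is safe to run as-is; keeping the existing localhost aliases on the line is generally safer than removing them:

```shell
# Sketch: add the node1 alias to the loopback entry, working on a
# stand-in file; apply the same sed to /etc/hosts once it looks right.
printf '127.0.0.1 localhost localhost.localdomain localhost\n' > /tmp/hosts
sed -i 's/^127\.0\.0\.1.*/127.0.0.1 localhost localhost.localdomain node1/' /tmp/hosts
cat /tmp/hosts
```

After the edit, `hostname node1` (or a matching HOSTNAME in /etc/sysconfig/network) keeps the alias and the actual hostname in sync.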

From WizNote (Wiz)

Posted: 2024-10-11 00:22:11
