Compiling and installing Hadoop 2.7 from source

0. Environment:

The operating system used for the build:

[[email protected] hadoop-2.7.1-src]# cat /etc/redhat-release

CentOS release 6.5 (Final)

The Hadoop version is 2.7.1.

1. Install the dependency packages:

yum install svn autoconf automake libtool cmake ncurses-devel openssl-devel gcc*

2. Set up the Java and Maven environment:

wget -O jdk-8u60-linux-x64.tar.gz http://download.oracle.com/otn-pub/java/jdk/8u60-b27/jdk-8u60-linux-x64.tar.gz?AuthParam=1443446776_174368b9ab1a6a92468aba5cd4d092d0

tar -zxvf jdk-8u60-linux-x64.tar.gz -C /usr/local

cd /usr/local

ln -s jdk1.8.0_60 jdk

echo 'export JAVA_HOME=/usr/local/jdk' >>/etc/profile

echo 'export PATH=$JAVA_HOME/bin:$PATH' >>/etc/profile

wget http://mirrors.hust.edu.cn/apache/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.tar.gz

tar -zxvf apache-maven-3.3.3-bin.tar.gz -C /usr/local

cd /usr/local

ln -s apache-maven-3.3.3 maven

echo 'export PATH=/usr/local/maven/bin:$PATH' >/etc/profile.d/maven.sh
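Before moving on, it is worth confirming that both tools are actually picked up from the new profile entries. A quick sanity check, assuming a fresh login shell or after sourcing the profile files manually:

source /etc/profile
source /etc/profile.d/maven.sh
echo $JAVA_HOME      # should print /usr/local/jdk
java -version        # should report java version "1.8.0_60"
mvn -version         # should report Apache Maven 3.3.3 with a Java home under /usr/local/jdk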

3. Download and install protobuf (version 2.5 is required):

wget -O protobuf-2.5.0.zip https://codeload.github.com/google/protobuf/zip/v2.5.0

unzip protobuf-2.5.0.zip

wget http://googletest.googlecode.com/files/gtest-1.5.0.tar.bz2

tar -jxvf gtest-1.5.0.tar.bz2

mv gtest-1.5.0 ./protobuf-2.5.0/gtest

cd protobuf-2.5.0

./autogen.sh

./configure

make

make check

make install

which protoc

[[email protected] protobuf-master]# which protoc

/usr/local/bin/protoc
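Because the Hadoop build checks the exact protoc version (see the troubleshooting section below), verify the version string as well, not just the path. If protoc complains about a missing libprotobuf shared library, adding /usr/local/lib to the dynamic loader path is the usual fix (a suggestion; adjust the path if protobuf was installed elsewhere):

protoc --version                                          # expected: libprotoc 2.5.0
echo '/usr/local/lib' >/etc/ld.so.conf.d/protobuf.conf    # only needed if protoc cannot find libprotobuf
ldconfig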

4. Download and install Ant:

wget http://mirrors.cnnic.cn/apache//ant/binaries/apache-ant-1.9.6-bin.zip

unzip apache-ant-1.9.6-bin.zip

mv apache-ant-1.9.6 /usr/local/ant

echo 'export PATH=/usr/local/ant/bin:$PATH' >/etc/profile.d/ant.sh
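As with Maven, a quick check that Ant is on the PATH after sourcing the new profile script:

source /etc/profile.d/ant.sh
ant -version        # should report Apache Ant(TM) version 1.9.6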

5. Build Hadoop:

tar -zxvf hadoop-2.7.1-src.tar.gz

cd hadoop-2.7.1-src

mvn package -Pdist,native -DskipTests -Dtar
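The build downloads a large number of dependencies and compiles native code, so expect it to take well over an hour (see the timing in the log below). If Maven runs out of heap, exporting a larger heap before running the package goal is a common remedy; the values below are only a suggestion:

export MAVEN_OPTS="-Xms256m -Xmx1536m"
mvn package -Pdist,native -DskipTests -Dtar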

6. Troubleshooting:

First build failure:

[ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.7.1: The following artifacts could not be resolved: org.mockito:mockito-all:jar:1.8.5, org.mortbay.jetty:jetty-util:jar:6.1.26, org.mortbay.jetty:jetty:jar:6.1.26, org.apache.tomcat.embed:tomcat-embed-core:jar:7.0.55, org.apache.httpcomponents:httpclient:jar:4.2.5, org.apache.zookeeper:zookeeper:jar:3.4.6: Could not transfer artifact org.mockito:mockito-all:jar:1.8.5 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar from central failed: SSL peer shut down incorrectly -> [Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions, please read the following articles:

[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <goals> -rf :hadoop-auth

Solution:

This situation is very common: it happens because the dependency downloads did not finish. Simply re-run the following command a few more times until everything has been fetched.

mvn package -Pdist,native -DskipTests -Dtar
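If you prefer not to restart the build by hand, a simple retry loop does the same thing (a sketch; interrupt it with Ctrl-C if the failure turns out not to be a download problem):

until mvn package -Pdist,native -DskipTests -Dtar; do
    echo "Build failed, retrying so Maven can resume the dependency downloads..."
    sleep 10
done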

Second build failure:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 3.0.0', expected version is '2.5.0' -> [Help 1]

The installed protobuf is too new; version 2.5 must be used.
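In that case, check which protoc binary the build is picking up and remove the newer copy before reinstalling 2.5.0. A rough sketch, assuming the newer protoc was installed under /usr/local (adjust the paths to your system):

which protoc                  # locate the binary the build is using
protoc --version              # prints libprotoc 3.0.0 here; it must be 2.5.0
rm -f /usr/local/bin/protoc   # remove the newer build (adjust if it lives elsewhere)
# then rebuild and reinstall protobuf 2.5.0 as described in step 3 and re-check:
protoc --version              # should now print libprotoc 2.5.0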

7. Log of a successful build:

[INFO] Apache Hadoop Main ................................. SUCCESS [  7.502 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [  4.844 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 10.274 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.477 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  4.568 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 11.000 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  9.870 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [  9.003 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  9.321 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [03:21 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 20.029 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [ 21.350 s]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.079 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [10:57 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:15 min]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 46.255 s]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 21.495 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.242 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.137 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [01:34 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [01:31 min]

[INFO] hadoop-yarn-server ................................. SUCCESS [  0.291 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 35.037 s]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 44.224 s]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  4.315 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 17.461 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 46.435 s]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 10.698 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [  8.976 s]

[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 10.343 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.113 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  7.395 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  4.006 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.108 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [ 12.317 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [ 18.781 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.396 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 46.350 s]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 34.772 s]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  8.779 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 22.440 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 12.865 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [01:45 min]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  6.051 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  8.077 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [ 12.782 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 24.680 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 50.965 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  6.861 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 12.928 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  6.784 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  3.629 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  7.135 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  6.233 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 31.548 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 10.084 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [35:23 min]

[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 36.126 s]

[INFO] Apache Hadoop Client ............................... SUCCESS [ 24.463 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.353 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 12.506 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 34.475 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.159 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [02:37 min]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 01:12 h

[INFO] Finished at: 2015-10-03T03:54:29+08:00

[INFO] Final Memory: 91M/237M

[INFO] ------------------------------------------------------------------------

[[email protected] hadoop-2.7.1-src]#

8. Check the generated packages:

cd /tmp/hadoop-2.7.1-src/hadoop-dist/target;

[[email protected] target]# ls -ld hadoop*

drwxr-xr-x 9 root root      4096 Oct  3 03:51 hadoop-2.7.1

-rw-r--r-- 1 root root 194796372 Oct  3 03:52 hadoop-2.7.1.tar.gz

-rw-r--r-- 1 root root      2823 Oct  3 03:52 hadoop-dist-2.7.1.jar

-rw-r--r-- 1 root root 390430395 Oct  3 03:54 hadoop-dist-2.7.1-javadoc.jar
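Since the build was run with the native profile, it may also be worth confirming that the native libraries were actually produced. One quick check, run from the freshly built distribution directory:

cd hadoop-2.7.1
ls lib/native/                 # should contain libhadoop.so and related libraries
bin/hadoop checknative -a      # reports which native codecs (zlib, snappy, openssl, ...) are available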

9. At this point the build has finished successfully.

