Configuring LZO and lzop for Hadoop

When using Flume to collect logs and write them to HDFS, the LZO compression algorithm is used, which shrinks the stored files to roughly one third of their original size. Since this codec is not bundled with Hadoop by default, it has to be installed separately. The process is recorded below, largely following the blog post referenced at the end of this article.
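For context, this is roughly how the Flume HDFS sink is pointed at the lzop codec once everything below is installed. This is a hypothetical sketch: the agent and sink names (a1, k1) and the HDFS path are assumptions, not taken from the original setup.

```properties
# Hypothetical Flume agent fragment; the names a1/k1 and the path are placeholders
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://node01:8020/flume/logs
# Write a compressed stream and select the lzop codec (requires hadoop-lzo)
a1.sinks.k1.hdfs.fileType = CompressedStream
a1.sinks.k1.hdfs.codeC = lzop
```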

Compiling and installing LZO and lzop

Note: however many nodes the cluster has, that is how many times this must be installed — do it on every single node!

lzo

Compile and install LZO first.

(1) Compiling requires gcc and g++, which must be installed in advance (they are also needed when installing Ruby).

# Install via yum
[[email protected] /kkb/soft]# yum -y install gcc-c++ lzo-devel zlib-devel
...omitted
Installed:
  lzo-devel.x86_64 0:2.06-8.el7                   zlib-devel.x86_64 0:1.2.7-18.el7

Dependency Installed:
  lzo-minilzo.x86_64 0:2.06-8.el7

Complete!
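Before moving on, it is worth confirming the toolchain is actually visible on the PATH; a small check like the following works for any list of commands:

```shell
# Report which build tools are available before starting the build
for tool in gcc g++ make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool found"
  else
    echo "$tool missing - install it before continuing"
  fi
done
```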

(2) Download LZO with wget; version 2.10 is used here, which currently works with Hadoop 2.6.0-cdh5.14.2.

# Download
[[email protected] /kkb/install]# wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.10.tar.gz
--2020-01-20 06:46:29--  http://www.oberhumer.com/opensource/lzo/download/lzo-2.10.tar.gz
Resolving www.oberhumer.com (www.oberhumer.com)... 193.170.194.40
Connecting to www.oberhumer.com (www.oberhumer.com)|193.170.194.40|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 600622 (587K) [application/x-gzip]
Saving to: ‘lzo-2.10.tar.gz’

100%[==========================================================>] 600,622     52.2KB/s   in 11s

2020-01-20 06:46:41 (52.0 KB/s) - ‘lzo-2.10.tar.gz’ saved [600622/600622]


(3) Extract, compile, and install; parts of the log are omitted below.

# 1. Extract the archive
[[email protected] /kkb/install]# tar -zxvf lzo-2.10.tar.gz
lzo-2.10/
lzo-2.10/AUTHORS
lzo-2.10/B/
...omitted
# Enter the extracted directory
[[email protected] /kkb/install]# cd lzo-2.10/
# --prefix sets the install directory; it is referenced later by the temporary environment variables
[[email protected] /kkb/install/lzo-2.10]# ./configure --prefix=/usr/local/hadoop/lzo/
configure: Configuring LZO 2.10
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking target system type... x86_64-pc-linux-gnu
checking whether to enable maintainer-specific portions of Makefiles... no
checking for gcc... gcc

...omitted

   LZO configuration summary
   -------------------------
   LZO version                : 2.10
   configured for host        : x86_64-pc-linux-gnu
   source code location       : .
   compiler                   : gcc
   preprocessor definitions   : -DLZO_HAVE_CONFIG_H=1
   preprocessor flags         :
   compiler flags             : -g -O2
   build static library       : yes
   build shared library       : no
   enable i386 assembly code  : no

   LZO 2.10 configured.

   Copyright (C) 1996-2017 Markus Franz Xaver Johannes Oberhumer
   All Rights Reserved.

   The LZO library is free software; you can redistribute it and/or
   modify it under the terms of the GNU General Public License as
   published by the Free Software Foundation; either version 2 of
   the License, or (at your option) any later version.

   The LZO library is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
   GNU General Public License for more details.

   Markus F.X.J. Oberhumer
   <[email protected]>
   http://www.oberhumer.com/opensource/lzo/

Type ‘make‘ to build LZO.
Type ‘make check‘ and ‘make test‘ to test LZO.
Type ‘make install‘ to install LZO.
After installing LZO, please have a look at ‘examples/simple.c‘.

# 2. Compile
[[email protected] /kkb/install/lzo-2.10]# make
make  all-am
make[1]: Entering directory `/kkb/install/lzo-2.10‘
  CC       src/lzo1.lo
  CC       src/lzo1_99.lo
  CC       src/lzo1a.lo
  CC       src/lzo1a_99.lo
  ...略
make[1]: Leaving directory `/kkb/install/lzo-2.10‘
# 3. Install
[[email protected] /kkb/install/lzo-2.10]# make install
make[1]: Entering directory `/kkb/install/lzo-2.10‘
 /usr/bin/mkdir -p ‘/usr/local/hadoop/lzo/lib‘
 /bin/sh ./libtool   --mode=install /usr/bin/install -c   src/liblzo2.la ‘/usr/local/hadoop/lzo/lib‘
libtool: install: /usr/bin/install -c src/.libs/liblzo2.lai /usr/local/hadoop/lzo/lib/liblzo2.la
libtool: install: /usr/bin/install -c src/.libs/liblzo2.a /usr/local/hadoop/lzo/lib/liblzo2.a
libtool: install: chmod 644 /usr/local/hadoop/lzo/lib/liblzo2.a
libtool: install: ranlib /usr/local/hadoop/lzo/lib/liblzo2.a
libtool: finish: PATH="/kkb/install/spark/bin:/kkb/install/spark/sbin:/kkb/install/hbase-1.2.0-cdh5.14.2/bin:/kkb/install/zookeeper-3.4.5-cdh5.14.2/bin:/kkb/install/hadoop-2.6.0-cdh5.14.2/bin:/kkb/install/jdk1.8.0_181/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/sbin" ldconfig -n /usr/local/hadoop/lzo/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /usr/local/hadoop/lzo/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the ‘-LLIBDIR‘
flag during linking and do at least one of the following:
   - add LIBDIR to the ‘LD_LIBRARY_PATH‘ environment variable
     during execution
   - add LIBDIR to the ‘LD_RUN_PATH‘ environment variable
     during linking
   - use the ‘-Wl,-rpath -Wl,LIBDIR‘ linker flag
   - have your system administrator add LIBDIR to ‘/etc/ld.so.conf‘

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
 /usr/bin/mkdir -p ‘/usr/local/hadoop/lzo/share/doc/lzo‘
 /usr/bin/install -c -m 644 AUTHORS COPYING NEWS THANKS doc/LZO.FAQ doc/LZO.TXT doc/LZOAPI.TXT ‘/usr/local/hadoop/lzo/share/doc/lzo‘
 /usr/bin/mkdir -p ‘/usr/local/hadoop/lzo/lib/pkgconfig‘
 /usr/bin/install -c -m 644 lzo2.pc ‘/usr/local/hadoop/lzo/lib/pkgconfig‘
 /usr/bin/mkdir -p ‘/usr/local/hadoop/lzo/include/lzo‘
 /usr/bin/install -c -m 644 include/lzo/lzo1.h include/lzo/lzo1a.h include/lzo/lzo1b.h include/lzo/lzo1c.h include/lzo/lzo1f.h include/lzo/lzo1x.h include/lzo/lzo1y.h include/lzo/lzo1z.h include/lzo/lzo2a.h include/lzo/lzo_asm.h include/lzo/lzoconf.h include/lzo/lzodefs.h include/lzo/lzoutil.h ‘/usr/local/hadoop/lzo/include/lzo‘
make[1]: Leaving directory `/kkb/install/lzo-2.10‘
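Before moving on to lzop, a quick sanity check that make install actually put the library and headers under the chosen prefix can save debugging later. The helper below is a small sketch; the file names (liblzo2.a, lzoconf.h) match the install log above.

```shell
# check_lzo_install PREFIX: verify the LZO static library and headers exist
check_lzo_install() {
  prefix=$1
  [ -f "$prefix/lib/liblzo2.a" ] && echo "library: ok" || echo "library: missing"
  [ -f "$prefix/include/lzo/lzoconf.h" ] && echo "headers: ok" || echo "headers: missing"
}

# Prefix chosen at configure time in this article
check_lzo_install /usr/local/hadoop/lzo
```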

lzop

With LZO installed, install lzop next.

(1) Download lzop with wget; following the referenced blog post, version 1.04 was chosen.

# Download
[[email protected] /kkb/install]# wget http://www.lzop.org/download/lzop-1.04.tar.gz
--2020-01-20 07:48:13--  http://www.lzop.org/download/lzop-1.04.tar.gz
Resolving www.lzop.org (www.lzop.org)... 193.170.194.40
Connecting to www.lzop.org (www.lzop.org)|193.170.194.40|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 393483 (384K) [application/x-gzip]
Saving to: ‘lzop-1.04.tar.gz’

100%[==========================================================>] 393,483      266KB/s   in 1.4s

2020-01-20 07:48:16 (266 KB/s) - ‘lzop-1.04.tar.gz’ saved [393483/393483]

(2) Extract, compile, and install; parts of the log are again omitted.

# Extract
[[email protected] /kkb/install]# tar -zxvf lzop-1.04.tar.gz
lzop-1.04/
lzop-1.04/AUTHORS
lzop-1.04/B/
lzop-1.04/B/00README.TXT
# Enter the extracted directory
[[email protected] /kkb/install]# cd lzop-1.04/
# No install prefix is specified here; the default is fine
[[email protected] /kkb/install/lzop-1.04]# ./configure
configure: Configuring lzop 1.04
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu

...omitted

   lzop configuration summary
   --------------------------
   lzop version               : 1.04
   configured for host        : x86_64-pc-linux-gnu
   source code location       : .
   compiler                   : gcc
   preprocessor definitions   : -DLZOP_HAVE_CONFIG_H=1
   preprocessor flags         :
   compiler flags             : -g -O2
   linker flags               :
   link libraries             : -llzo2

   lzop 1.04 configured.

   Copyright (C) 1996-2017 Markus Franz Xaver Johannes Oberhumer
   All Rights Reserved.

   lzop and the LZO library are free software; you can redistribute them
   and/or modify them under the terms of the GNU General Public License as
   published by the Free Software Foundation; either version 2 of
   the License, or (at your option) any later version.

   This program is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
   GNU General Public License for more details.

   Markus F.X.J. Oberhumer
   <[email protected]>
   http://www.oberhumer.com/opensource/lzop/

Type `make‘ to build lzop. Type `make install‘ to install lzop.
After installing lzop, please read the accompanied documentation.

# Compile; -j speeds up the build, and 10 allows up to 10 compile jobs to run in parallel
[[email protected] /kkb/install/lzop-1.04]# make -j 10
make  all-am
make[1]: Entering directory `/kkb/install/lzop-1.04‘
  CC       src/c_ansic.o
  CC       src/c_ansim.o
  CC       src/c_init.o
  CC       src/c_none.o
  CC       src/c_screen.o
  CC       src/compress.o
  CC       src/djgpp2.o
  CC       src/filter.o
  CC       src/frames.o
  CC       src/help.o
  CC       src/lzop.o
  CC       src/mblock.o
  CC       src/p_lzo.o
  CC       src/s_curses.o
  CC       src/s_djgpp2.o
  CC       src/s_object.o
  CC       src/s_vcsa.o
  CC       src/util.o
  CCLD     src/lzop
make[1]: Leaving directory `/kkb/install/lzop-1.04‘
# Install
[[email protected] /kkb/install/lzop-1.04]# make install
make[1]: Entering directory `/kkb/install/lzop-1.04‘
 /usr/bin/mkdir -p ‘/usr/local/bin‘
  /usr/bin/install -c src/lzop ‘/usr/local/bin‘
 /usr/bin/mkdir -p ‘/usr/local/share/doc/lzop‘
 /usr/bin/install -c -m 644 AUTHORS COPYING NEWS README THANKS doc/lzop.html doc/lzop.man doc/lzop.ps doc/lzop.tex doc/lzop.txt doc/lzop.pod ‘/usr/local/share/doc/lzop‘
 /usr/bin/mkdir -p ‘/usr/local/share/man/man1‘
 /usr/bin/install -c -m 644 doc/lzop.1 ‘/usr/local/share/man/man1‘
make[1]: Leaving directory `/kkb/install/lzop-1.04‘
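The -j 10 used above is a fixed guess; a more portable choice is to size the job count from the machine itself (nproc is part of GNU coreutils):

```shell
# One make job per available CPU core instead of a hard-coded count
jobs=$(nproc)
echo "building with $jobs parallel jobs"
# Inside the source tree you would then run: make -j "$jobs"
```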

That completes the installation of LZO and lzop; next, the hadoop-lzo source needs to be compiled.

Compiling and installing hadoop-lzo-master

The hadoop-lzo source package needs to be compiled, and the packaged jar then distributed to the {HADOOP_HOME}/share/hadoop/common directory on every node.
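For reference, after the jar is distributed Hadoop still has to be told about the codec, typically via core-site.xml. The fragment below uses the standard property names registered for hadoop-lzo; treat it as a sketch and verify against your own distribution's defaults.

```xml
<!-- core-site.xml: register the LZO codecs alongside the defaults -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.BZip2Codec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```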

(1) Download the source package.

# Download
[[email protected] /kkb/install] wget https://github.com/twitter/hadoop-lzo/archive/master.zip

(2) Extract into the current directory.

# Extract with unzip; -d can specify a target directory
[[email protected] /kkb/install] unzip master.zip

(3) Declare two temporary environment variables.

# This directory is the install prefix specified earlier
[[email protected] /kkb/install] export C_INCLUDE_PATH=/usr/local/hadoop/lzo/include
[[email protected] /kkb/install] export LIBRARY_PATH=/usr/local/hadoop/lzo/lib 

(4) Compile and package with Maven, then distribute the hadoop-lzo jar. Note that Maven must be installed on Linux in advance; here it is installed on just one machine, from which the jar is distributed. Installing Maven is similar to doing so on Windows; see https://www.cnblogs.com/youngchaolin/p/11825510.html

[[email protected] /kkb/install/hadoop-lzo-master]# mvn clean package -Dmaven.test.skip=true
[INFO] Scanning for projects...
[INFO]
[INFO] ----------------< com.hadoop.gplcompression:hadoop-lzo >----------------
[INFO] Building hadoop-lzo 0.4.21-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
Downloading from alimaven: https://maven.aliyun.com/repository/central/org/apache/maven/plugins/maven-clean-plugin/2.5/maven-clean-plugin-2.5.pom
Downloaded from alimaven: https://maven.aliyun.com/repository/central/org/apache/maven/plugins/maven-clean-plugin/2.5/maven-clean-plugin-2.5.pom (3.9 kB at 4.1 kB/s)
Downloading from alimaven: https://maven.aliyun.com/repository/central/org/apache/maven/plugins/maven-clean-plugin/2.5/maven-clean-plugin-2.5.jar
Downloaded from alimaven: https://maven.aliyun.com/repository/central/org/apache/maven/plugins/maven-clean-plugin/2.5/maven-clean-plugin-2.5.jar (25 kB at 60 kB/s)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-lzo ---
[INFO] Deleting /kkb/install/hadoop-lzo-master/target
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (check-platform) @ hadoop-lzo ---
[INFO] Executing tasks

check-platform:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (set-props-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

set-props-non-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (set-props-win) @ hadoop-lzo ---
[INFO] Executing tasks

set-props-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-lzo ---
[INFO] Using ‘UTF-8‘ encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /kkb/install/hadoop-lzo-master/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-lzo ---
[INFO] Compiling 25 source files to /kkb/install/hadoop-lzo-master/target/classes
[WARNING] bootstrap class path not set in conjunction with -source 1.6
/kkb/install/hadoop-lzo-master/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:[82,18] [deprecation] isDir() in FileStatus has been deprecated
[WARNING] /kkb/install/hadoop-lzo-master/src/main/java/com/hadoop/compression/lzo/DistributedLzoIndexer.java:[52,20] [deprecation] isDir() in FileStatus has been deprecated
[WARNING] /kkb/install/hadoop-lzo-master/src/main/java/com/hadoop/compression/lzo/DistributedLzoIndexer.java:[112,14] [deprecation] Job(Configuration) in Job has been deprecated
[WARNING] /kkb/install/hadoop-lzo-master/src/main/java/com/hadoop/mapreduce/LzoIndexOutputFormat.java:[31,28] [deprecation] cleanupJob(JobContext) in OutputCommitter has been deprecated
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (build-info-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

build-info-non-win:
[propertyfile] Creating new property file: /kkb/install/hadoop-lzo-master/target/classes/hadoop-lzo-build.properties
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (build-info-win) @ hadoop-lzo ---
[INFO] Executing tasks

build-info-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (check-native-uptodate-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

check-native-uptodate-non-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (check-native-uptodate-win) @ hadoop-lzo ---
[INFO] Executing tasks

check-native-uptodate-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (build-native-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

build-native-non-win:
    [mkdir] Created dir: /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib
    [mkdir] Created dir: /kkb/install/hadoop-lzo-master/target/classes/native/Linux-amd64-64/lib
    [mkdir] Created dir: /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo
    [javah] [Forcefully writing file RegularFileObject[/kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor_CompressionStrategy.h]]
    [javah] [Forcefully writing file RegularFileObject[/kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor_CompressionStrategy.h]]
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
     [exec] checking for gawk... gawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking whether to enable maintainer-specific portions of Makefiles... no
     [exec] checking for style of include used by make... GNU
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... yes
     [exec] checking for C compiler default output file name... a.out
     [exec] checking for suffix of executables...
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /usr/bin/grep
     [exec] checking for egrep... /usr/bin/grep -E
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking minix/config.h usability... no
     [exec] checking minix/config.h presence... no
     [exec] checking for minix/config.h... no
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for gcc... (cached) gcc
     [exec] checking whether we are using the GNU C compiler... (cached) yes
     [exec] checking whether gcc accepts -g... (cached) yes
     [exec] checking for gcc option to accept ISO C89... (cached) none needed
     [exec] checking dependency style of gcc... (cached) gcc3
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /usr/bin/sed
     [exec] checking for fgrep... /usr/bin/grep -F
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
     [exec] checking the name lister (/usr/bin/nm -B) interface... BSD nm
     [exec] checking whether ln -s works... yes
     [exec] checking the maximum length of command line arguments... 1572864
     [exec] checking whether the shell understands some XSI constructs... yes
     [exec] checking whether the shell understands "+="... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for objdump... objdump
     [exec] checking how to recognize dependent libraries... pass_all
     [exec] checking for ar... ar
     [exec] checking for strip... strip
     [exec] checking for ranlib... ranlib
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for dlfcn.h... yes
     [exec] checking for objdir... .libs
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC -DPIC
     [exec] checking if gcc PIC flag -fPIC -DPIC works... yes
     [exec] checking if gcc static flag -static works... no
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking if gcc supports -c -o file.o... (cached) yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] checking for dlopen in -ldl... yes
     [exec] checking for unistd.h... (cached) yes
     [exec] checking stdio.h usability... yes
     [exec] checking stdio.h presence... yes
     [exec] checking for stdio.h... yes
     [exec] checking stddef.h usability... yes
     [exec] checking stddef.h presence... yes
     [exec] checking for stddef.h... yes
     [exec] checking lzo/lzo2a.h usability... yes
     [exec] which: no otool in (/kkb/install/maven-3.6.3/apache-maven-3.6.3/bin:/kkb/install/spark/bin:/kkb/install/spark/sbin:/kkb/install/hbase-1.2.0-cdh5.14.2/bin:/kkb/install/zookeeper-3.4.5-cdh5.14.2/bin:/kkb/install/hadoop-2.6.0-cdh5.14.2/sbin:/kkb/install/hadoop-2.6.0-cdh5.14.2/bin:/kkb/install/jdk1.8.0_181/bin:/kkb/install/spark/bin:/kkb/install/spark/sbin:/kkb/install/hbase-1.2.0-cdh5.14.2/bin:/kkb/install/zookeeper-3.4.5-cdh5.14.2/bin:/kkb/install/hadoop-2.6.0-cdh5.14.2/sbin:/kkb/install/hadoop-2.6.0-cdh5.14.2/bin:/kkb/install/jdk1.8.0_181/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin)
     [exec] checking lzo/lzo2a.h presence... yes
     [exec] checking for lzo/lzo2a.h... yes
     [exec] checking Checking for the ‘actual‘ dynamic-library for ‘-llzo2‘... "liblzo2.so.2"
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... yes
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for memset... yes
     [exec] checking for JNI_GetCreatedJavaVMs in -ljvm... yes
     [exec] checking jni.h usability... yes
     [exec] checking jni.h presence... yes
     [exec] checking for jni.h... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: executing depfiles commands
     [exec] config.status: executing libtool commands
     [exec] depbase=`echo impl/lzo/LzoCompressor.lo | sed ‘s|[^/]*$|.deps/&|;s|\.lo$||‘`;     [exec] /bin/sh ./libtool --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl  -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo  -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF $depbase.Tpo -c -o impl/lzo/LzoCompressor.lo /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoCompressor.c &&     [exec] mv -f $depbase.Tpo $depbase.Plo
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF impl/lzo/.deps/LzoCompressor.Tpo -c /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoCompressor.c  -fPIC -DPIC -o impl/lzo/.libs/LzoCompressor.o
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF impl/lzo/.deps/LzoCompressor.Tpo -c /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoCompressor.c -o impl/lzo/LzoCompressor.o >/dev/null 2>&1
     [exec] depbase=`echo impl/lzo/LzoDecompressor.lo | sed ‘s|[^/]*$|.deps/&|;s|\.lo$||‘`;     [exec] /bin/sh ./libtool --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl  -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo  -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF $depbase.Tpo -c -o impl/lzo/LzoDecompressor.lo /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoDecompressor.c &&     [exec] mv -f $depbase.Tpo $depbase.Plo
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF impl/lzo/.deps/LzoDecompressor.Tpo -c /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoDecompressor.c  -fPIC -DPIC -o impl/lzo/.libs/LzoDecompressor.o
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/kkb/install/hadoop-lzo-master/src/main/native -I./impl -I/kkb/install/jdk1.8.0_181/include -I/kkb/install/jdk1.8.0_181/include/linux -I/kkb/install/hadoop-lzo-master/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF impl/lzo/.deps/LzoDecompressor.Tpo -c /kkb/install/hadoop-lzo-master/src/main/native/impl/lzo/LzoDecompressor.c -o impl/lzo/LzoDecompressor.o >/dev/null 2>&1
     [exec] /bin/sh ./libtool --tag=CC   --mode=link gcc -g -Wall -fPIC -O2 -m64 -g -O2 -L/kkb/install/jdk1.8.0_181/jre/lib/amd64/server -Wl,--no-as-needed -o libgplcompression.la -rpath /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/../install/lib impl/lzo/LzoCompressor.lo impl/lzo/LzoDecompressor.lo  -ljvm -ldl
     [exec] libtool: link: gcc -shared  impl/lzo/.libs/LzoCompressor.o impl/lzo/.libs/LzoDecompressor.o   -L/kkb/install/jdk1.8.0_181/jre/lib/amd64/server -ljvm -ldl  -m64 -Wl,--no-as-needed   -Wl,-soname -Wl,libgplcompression.so.0 -o .libs/libgplcompression.so.0.0.0
     [exec] libtool: link: (cd ".libs" && rm -f "libgplcompression.so.0" && ln -s "libgplcompression.so.0.0.0" "libgplcompression.so.0")
     [exec] libtool: link: (cd ".libs" && rm -f "libgplcompression.so" && ln -s "libgplcompression.so.0.0.0" "libgplcompression.so")
     [exec] libtool: link: ar cru .libs/libgplcompression.a  impl/lzo/LzoCompressor.o impl/lzo/LzoDecompressor.o
     [exec] libtool: link: ranlib .libs/libgplcompression.a
     [exec] libtool: link: ( cd ".libs" && rm -f "libgplcompression.la" && ln -s "../libgplcompression.la" "libgplcompression.la" )
     [exec] libtool: install: cp /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/.libs/libgplcompression.so.0.0.0 /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib/libgplcompression.so.0.0.0
     [exec] libtool: install: (cd /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib && { ln -s -f libgplcompression.so.0.0.0 libgplcompression.so.0 || { rm -f libgplcompression.so.0 && ln -s libgplcompression.so.0.0.0 libgplcompression.so.0; }; })libtool: install: warning: remember to run `libtool --finish /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/../install/lib‘
     [exec]
     [exec] libtool: install: (cd /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib && { ln -s -f libgplcompression.so.0.0.0 libgplcompression.so || { rm -f libgplcompression.so && ln -s libgplcompression.so.0.0.0 libgplcompression.so; }; })
     [exec] libtool: install: cp /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/.libs/libgplcompression.lai /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib/libgplcompression.la
     [exec] libtool: install: cp /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/.libs/libgplcompression.a /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib/libgplcompression.a
     [exec] libtool: install: chmod 644 /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib/libgplcompression.a
     [exec] libtool: install: ranlib /kkb/install/hadoop-lzo-master/target/native/Linux-amd64-64/lib/libgplcompression.a
     [copy] Copying 5 files to /kkb/install/hadoop-lzo-master/target/classes/native/Linux-amd64-64/lib
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (build-native-win) @ hadoop-lzo ---
[INFO] Executing tasks

build-native-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-lzo ---
[INFO] Not copying test resources
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-lzo ---
[INFO] Not compiling test sources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (prep-test) @ hadoop-lzo ---
[INFO] Executing tasks

prep-test:
    [mkdir] Created dir: /kkb/install/hadoop-lzo-master/target/test-classes/logs
[INFO] Executed tasks
[INFO]
[INFO] --- maven-surefire-plugin:2.14.1:test (default-test) @ hadoop-lzo ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ hadoop-lzo ---
[INFO] Building jar: /kkb/install/hadoop-lzo-master/target/hadoop-lzo-0.4.21-SNAPSHOT.jar
[INFO]
[INFO] >>> maven-source-plugin:2.2.1:jar (attach-sources) > generate-sources @ hadoop-lzo >>>
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (check-platform) @ hadoop-lzo ---
[INFO] Executing tasks

check-platform:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (set-props-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

set-props-non-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (set-props-win) @ hadoop-lzo ---
[INFO] Executing tasks

set-props-win:
[INFO] Executed tasks
[INFO]
[INFO] <<< maven-source-plugin:2.2.1:jar (attach-sources) < generate-sources @ hadoop-lzo <<<
[INFO]
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar (attach-sources) @ hadoop-lzo ---
[INFO] Building jar: /kkb/install/hadoop-lzo-master/target/hadoop-lzo-0.4.21-SNAPSHOT-sources.jar
[INFO]
[INFO] --- maven-javadoc-plugin:2.9:jar (attach-javadocs) @ hadoop-lzo ---
[INFO]
Loading source files for package org.apache.hadoop.io.compress...
Loading source files for package com.quicklz...
Loading source files for package com.hadoop.mapred...
Loading source files for package com.hadoop.compression.lzo...
Loading source files for package com.hadoop.compression.lzo.util...
Loading source files for package com.hadoop.mapreduce...
Constructing Javadoc information...
Standard Doclet version 1.8.0_181
Building tree for all the packages and classes...
...(javadoc generation log omitted)
[INFO] Building jar: /kkb/install/hadoop-lzo-master/target/hadoop-lzo-0.4.21-SNAPSHOT-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  19.171 s
[INFO] Finished at: 2020-01-20T07:54:51+08:00
[INFO] ------------------------------------------------------------------------
You have new mail in /var/spool/mail/root

The BUILD SUCCESS at the end shows the build succeeded.

(5) Copy the hadoop-lzo jar into Hadoop's ${HADOOP_HOME}/share/hadoop/common directory.

# the build produces a new target directory
[[email protected] /kkb/install/hadoop-lzo-master]# ll
total 80
-rw-r--r--  1 root root 35151 Jun  7  2019 COPYING
-rw-r--r--  1 root root 19758 Jan 20 07:00 pom.xml
-rw-r--r--  1 root root 10179 Jun  7  2019 README.md
drwxr-xr-x  2 root root  4096 Jun  7  2019 scripts
drwxr-xr-x  4 root root  4096 Jun  7  2019 src
drwxr-xr-x 10 root root  4096 Jan 20 07:54 target
[[email protected] /kkb/install/hadoop-lzo-master]# cd target/
# copy the hadoop-lzo jar into Hadoop's ${HADOOP_HOME}/share/hadoop/common directory
[[email protected] /kkb/install/hadoop-lzo-master/target]# cp hadoop-lzo-0.4.21-SNAPSHOT.jar /kkb/install/hadoop-2.6.0-cdh5.14.2/share/hadoop/common/

(6) Distribute the jar above to the other nodes with scp. It must reach every node — do not skip the distribution step!
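The distribution step can be scripted. A minimal sketch, assuming the worker nodes are reachable as node02 and node03 (hypothetical names — substitute your own) and passwordless ssh is already configured; the echo makes it a dry run so you can inspect the commands first:

```shell
# node02/node03 are hypothetical hostnames; replace with your worker nodes.
NODES="node02 node03"
JAR=hadoop-lzo-0.4.21-SNAPSHOT.jar
DEST=/kkb/install/hadoop-2.6.0-cdh5.14.2/share/hadoop/common/

for node in $NODES; do
  # dry run: prints the scp command; remove the echo to actually copy
  echo scp "$DEST$JAR" "$node:$DEST"
done
```

Remove the `echo` once the paths and hostnames are verified.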

Adding settings to hadoop's core-site.xml and mapred-site.xml

The following settings need to be added.

core-site.xml

Add the lzo and lzop compression codec settings.

<!-- compression codec configuration -->
<property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            org.apache.hadoop.io.compress.BZip2Codec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec
    </value>
</property>
<property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

mapred-site.xml

Following the referenced blog post, add the configuration below. Note that these are the old Hadoop 1.x property names; in Hadoop 2.x they are deprecated in favor of mapreduce.map.output.compress and mapreduce.map.output.compress.codec, though the old names are still honored.

<!-- compress mapreduce map output with lzo -->
<property>
    <name>mapred.compress.map.output</name>
    <value>true</value>
</property>
<property>
    <name>mapred.map.output.compression.codec</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
</property> 
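With both config files in place and the jar distributed, one quick end-to-end smoke test is to compress a file locally with lzop, upload it to HDFS, and build a split index for it with the LzoIndexer class shipped in the jar built above. This is only a sketch: `access.log` and the HDFS path are made-up names for illustration, and the jar path assumes the version built earlier.

```shell
# compress a sample log file; produces access.log.lzo alongside it
lzop access.log

# upload to HDFS (hypothetical test path)
hdfs dfs -mkdir -p /test/lzo
hdfs dfs -put access.log.lzo /test/lzo/

# build an .index file so MapReduce can split the .lzo file in parallel
hadoop jar /kkb/install/hadoop-2.6.0-cdh5.14.2/share/hadoop/common/hadoop-lzo-0.4.21-SNAPSHOT.jar \
    com.hadoop.compression.lzo.LzoIndexer /test/lzo/access.log.lzo
```

Without the index, a large .lzo file is processed by a single mapper; the index is what makes lzo worthwhile over gzip for MapReduce input.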

The referenced blog post also tested this in Hive; I skipped that and only tested with Flume. Before installing and configuring lzo, the Flume agent reported that the lzo classes could not be found; after installing, the error disappeared and the native library loaded successfully.

# after restarting the cluster, start the flume agent; the native lzo library loads successfully
20/01/20 10:41:06 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 5dbdddb8cfb544e58b4e0b9664b9d1b66657faf5]
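For reference, the relevant portion of a Flume hdfs sink configuration that writes lzop-compressed files looks like the fragment below. The agent/sink names `a1`/`k1` and the HDFS path are placeholders; `hdfs.codeC` and `hdfs.fileType` are standard Flume hdfs-sink properties, and `lzop` resolves to LzopCodec only because it is registered in io.compression.codecs above.

```properties
# hypothetical agent/sink names; the last two lines enable lzop compression
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /flume/logs/%Y%m%d
a1.sinks.k1.hdfs.fileType = CompressedStream
a1.sinks.k1.hdfs.codeC = lzop
```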

That is the whole process of configuring lzo and lzop for Hadoop. There are plenty of posts about this online; these are my own notes for later use.

References:

(1)https://www.cnblogs.com/qingfengyiran-top1/p/11308251.html

(2)https://www.cnblogs.com/xjh713/p/10063370.html

Original post: https://www.cnblogs.com/youngchaolin/p/12217121.html
