Installing and Configuring Hive 0.13.1

Hive Installation

References:

http://blog.yidooo.net/archives/apache-hive-installation.html

http://www.cnblogs.com/linjiqin/archive/2013/03/04/2942402.html

HBase in Action and HBase: The Definitive Guide both come with plenty of introductory sample code worth checking out, at https://github.com/HBaseinaction and https://github.com/larsgeorge/hbase-book respectively.

Extract the Archive

$ tar -xzvf hive-x.y.z.tar.gz

For this version:

$ tar -xzvf hive-0.13.1.tar.gz
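As a runnable sketch of the unpack step (a scratch directory and a stand-in tarball are used here as placeholders for the real download location):

```shell
# Sketch of the unpack step; the scratch dir and stand-in tarball are
# placeholders for wherever the real hive-0.13.1.tar.gz was downloaded.
work=$(mktemp -d)
mkdir -p "$work/hive-0.13.1/bin"
printf 'placeholder\n' > "$work/hive-0.13.1/bin/hive"
tar -C "$work" -czf "$work/hive-0.13.1.tar.gz" hive-0.13.1  # stand-in archive
rm -r "$work/hive-0.13.1"                                   # pretend only the tarball exists
tar -C "$work" -xzf "$work/hive-0.13.1.tar.gz"              # the actual extraction
export HIVE_HOME="$work/hive-0.13.1"                        # point HIVE_HOME at the result
```

In practice you would extract under something like /usr/local and export HIVE_HOME there.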

Hive Configuration

Copy the template files:

cd conf

cp hive-default.xml.template hive-site.xml

cp hive-env.sh.template hive-env.sh

cp hive-log4j.properties.template hive-log4j.properties

cp hive-exec-log4j.properties.template hive-exec-log4j.properties
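The four cp commands above can be generalized: copy every *.template file in conf/ to its active name. A sketch using a scratch conf directory (in practice, run the loop inside Hive's real conf/):

```shell
# Copy every *.template in a conf dir to its active (suffix-stripped) name.
# A scratch dir stands in for Hive's real conf/ directory.
conf=$(mktemp -d)
touch "$conf/hive-site.xml.template" "$conf/hive-env.sh.template" \
      "$conf/hive-log4j.properties.template" "$conf/hive-exec-log4j.properties.template"
for f in "$conf"/*.template; do
  cp "$f" "${f%.template}"   # strip the .template suffix
done
ls "$conf"
```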

Install the MySQL JDBC Connector

The metastore is kept in an external MySQL database, so download mysql-connector-java-5.1.26-bin.jar and place it in Hive's lib directory:

cp mysql-connector-java-5.1.26-bin.jar hive/lib

Edit the Configuration

hive-site.xml


<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <!--
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://localhost:9000/hive/warehousedir</value>
  </property>
  -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <!--
  <property>
    <name>hive.exec.scratchdir</name>
    <value>hdfs://localhost:9000/hive/scratchdir</value>
  </property>
  -->
  <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive-${user.name}</value>
    <description>Scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.querylog.location</name>
    <value>/local/usr/hive/logs</value>
  </property>
  <!--
  <property>
    <name>hive.querylog.location</name>
    <value>/tmp/${user.name}</value>
    <description>Location of Hive run time structured log file</description>
  </property>
  -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.0.177:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>yunho201311</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>file:///usr/local/hive/lib/hive-hbase-handler-0.13.1.jar,file:///usr/local/hive/lib/protobuf-java-2.5.0.jar,file:///usr/local/hive/lib/hbase-client-0.98.6.1-hadoop2.jar,file:///usr/local/hive/lib/hbase-common-0.98.6.1-hadoop2.jar,file:///usr/local/hive/lib/zookeeper-3.4.5.jar,file:///usr/local/hive/lib/guava-11.0.2.jar</value>
  </property>
  <!--
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://192.168.0.177:9083</value>
  </property>
  -->
  <property>
    <name>hive.zookeeper.quorum</name>
    <value>192.168.0.177</value>
    <description>The list of ZooKeeper servers to talk to. This is only needed for read/write locks.</description>
  </property>
</configuration>


The value of hive.aux.jars.path must not contain any spaces, carriage returns, or line breaks; write it all on a single line, or you will hit all sorts of errors. These jars also need to be copied into Hive's lib directory.
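One way to avoid hand-editing that long single-line value is to generate it. A sketch that builds the comma-separated file:// list from a lib directory (a scratch dir with stand-in jars here; point lib at the real /usr/local/hive/lib in practice):

```shell
# Build a hive.aux.jars.path value as one whitespace-free comma-separated line.
# A scratch dir with empty stand-in jars substitutes for the real hive/lib.
lib=$(mktemp -d)
touch "$lib/hive-hbase-handler-0.13.1.jar" "$lib/zookeeper-3.4.5.jar"
aux=$(printf 'file://%s\n' "$lib"/*.jar | paste -sd, -)  # prefix each jar, join with commas
echo "$aux"
```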

hive-env.sh


# Set   HADOOP_HOME to point to a specific hadoop install directory

HADOOP_HOME=/usr/local/hadoop

# Hive   Configuration Directory can be controlled by:

export HIVE_CONF_DIR=/usr/local/hive/conf

Start Hive:

./hive -hiveconf hive.root.logger=DEBUG,console

Errors Encountered and Their Fixes

Error:

ERROR exec.DDLTask: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase

Missing: hbase-server-0.98.6.1-hadoop2.jar

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingInterface

Missing: hbase-protocol-0.98.6.1-hadoop2.jar

14/11/04 13:34:57 [main]: ERROR exec.DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:MetaException(message:java.io.IOException: java.lang.reflect.InvocationTargetException

Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace

Missing: htrace-core-2.04.jar

Specified key was too long; max key length is 767 bytes

This one comes from the MySQL metastore; several write-ups online suggest the same fix:

alter database hive character set latin1;

Alternatively, drop the hive database and recreate it with latin1 from the start.

Database character set:

latin1 -- cp1252 West European

latin1_swedish_ci
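Concretely, the charset fix can be applied in the MySQL client before Hive first creates its schema (a sketch; hive is the metastore database name from the JDBC URL above):

```sql
-- Convert the existing metastore database to latin1 so index keys
-- stay within MySQL's 767-byte key limit...
ALTER DATABASE hive CHARACTER SET latin1;

-- ...or drop and recreate it with latin1 from the start
-- (this discards any existing metastore contents):
DROP DATABASE hive;
CREATE DATABASE hive DEFAULT CHARACTER SET latin1;
```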

Startup

[[email protected] bin]# sh hive

Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties

hive> show tables;

OK

dev_opt

dev_opt1

Time taken: 0.489 seconds, Fetched: 2 row(s)

Integrating with HBase

We only want to query HBase data from Hive, not modify it, so an external table is sufficient.

The HBase table dev_opt (enabled: true):

'dev_opt', {NAME => 'opt', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'ROW', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '1', TTL => 'FOREVER', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}

dev_opt has a single column family, opt, with two columns: opt:dvid and opt:value.

Create the external table:

CREATE EXTERNAL TABLE dev_opt(key string, opt map<string,string>)

STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

WITH SERDEPROPERTIES ("hbase.columns.mapping" = "opt:")

TBLPROPERTIES("hbase.table.name" = "dev_opt");

Query it in Hive:

hive> select * from dev_opt;

Result:

bd98db741dfa471f8fbf413841da4b7e-test_yz-2014-10-29 15:37:11    {"dvid":"7","value":"1"}

c95808d07d83430d919b3766cafc3ff3-username-2014-10-22 09:51:13   {"dvid":"5","value":"commandvaluestr"}

Time taken: 0.138 seconds, Fetched: 38 row(s)
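Because dev_opt exposes the whole opt column family as a Hive map, individual qualifiers can also be pulled out with map-index syntax (a sketch; the values come back as strings):

```sql
-- Read specific qualifiers out of the opt map column.
SELECT key, opt['dvid'] AS dvid, opt['value'] AS value
FROM dev_opt
WHERE opt['dvid'] = '5';
```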

CREATE EXTERNAL TABLE dev_opt1(key string, dvid int, value string)

STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

WITH SERDEPROPERTIES ("hbase.columns.mapping" = "opt:dvid,opt:value")

TBLPROPERTIES("hbase.table.name" = "dev_opt");

Query it in Hive:

hive> select * from dev_opt1;

bd98db741dfa471f8fbf413841da4b7e-test_yz-2014-10-29 15:36:34    3       1

bd98db741dfa471f8fbf413841da4b7e-test_yz-2014-10-29 15:37:11    7       1

c95808d07d83430d919b3766cafc3ff3-username-2014-10-22 09:51:13   5       commandvaluestr

Time taken: 0.986 seconds, Fetched: 38 row(s)

hive> select * from dev_opt1 where dvid = 5;

Total jobs = 1

Launching Job 1 out of 1

Number of reduce tasks is set to 0 since there's no reduce operator

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/CompatibilityFactory

at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:707)

at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:752)

at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:392)

at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobConf(PlanUtils.java:849)

at org.apache.hadoop.hive.ql.plan.MapWork.configureJobConf(MapWork.java:503)

at org.apache.hadoop.hive.ql.plan.MapredWork.configureJobConf(MapredWork.java:68)

at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:368)

at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)

at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)

at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)

at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)

at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)

at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)

at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)

at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)

at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)

at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)

at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)

at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.CompatibilityFactory

at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

at java.lang.ClassLoader.loadClass(ClassLoader.java:425)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

... 26 more

FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. org/apache/hadoop/hbase/CompatibilityFactory

Missing: hbase-hadoop2-compat-0.98.6.1-hadoop2.jar

Retrying the same query failed with the identical NoClassDefFoundError: org/apache/hadoop/hbase/CompatibilityFactory; it turns out both compatibility jars are needed:

hbase-hadoop2-compat-0.98.6.1-hadoop2.jar

hbase-hadoop-compat-0.98.6.1-hadoop2.jar

java.lang.NoClassDefFoundError: org/cliffc/high_scale_lib/Counter

at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:707)

at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:752)

at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:392)

at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobConf(PlanUtils.java:849)

at org.apache.hadoop.hive.ql.plan.MapWork.configureJobConf(MapWork.java:503)

at org.apache.hadoop.hive.ql.plan.MapredWork.configureJobConf(MapredWork.java:68)

at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:368)

at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)

at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)

at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)

at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)

at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)

at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)

at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)

at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)

at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)

at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)

at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)

at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Caused by: java.lang.ClassNotFoundException: org.cliffc.high_scale_lib.Counter

at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

at java.lang.ClassLoader.loadClass(ClassLoader.java:425)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

... 26 more

FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. org/cliffc/high_scale_lib/Counter

Missing: high-scale-lib-1.1.1.jar
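All three ClassNotFoundError fixes above follow the same pattern: copy the missing jar from HBase's lib directory into Hive's. A sketch using scratch directories with empty stand-in jars (substitute your real $HBASE_HOME/lib and $HIVE_HOME/lib):

```shell
# Copy the HBase jars that Hive's HBase integration needs into Hive's lib.
# Scratch dirs stand in for the real $HBASE_HOME/lib and $HIVE_HOME/lib.
hbase_lib=$(mktemp -d)
hive_lib=$(mktemp -d)
touch "$hbase_lib/hbase-hadoop2-compat-0.98.6.1-hadoop2.jar" \
      "$hbase_lib/hbase-hadoop-compat-0.98.6.1-hadoop2.jar" \
      "$hbase_lib/high-scale-lib-1.1.1.jar"
for j in hbase-hadoop2-compat-0.98.6.1-hadoop2.jar \
         hbase-hadoop-compat-0.98.6.1-hadoop2.jar \
         high-scale-lib-1.1.1.jar; do
  cp "$hbase_lib/$j" "$hive_lib/"   # same pattern for any further missing class
done
```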

hive> select * from dev_opt1 where dvid = 5;

Total jobs = 1

Launching Job 1 out of 1

Number of reduce tasks is set to 0 since there's no reduce operator

Starting Job = job_1413857279729_0001, Tracking URL = http://iiot-test-server1:8088/proxy/application_1413857279729_0001/

Kill Command = /usr/local/hadoop/bin/hadoop job  -kill job_1413857279729_0001

Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0

2014-11-06 13:42:31,721 Stage-1 map = 0%,  reduce = 0%

2014-11-06 13:42:40,043 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 2.91 sec

MapReduce Total cumulative CPU time: 2 seconds 910 msec

Ended Job = job_1413857279729_0001

MapReduce Jobs Launched:

Job 0: Map: 1   Cumulative CPU: 2.91 sec   HDFS Read: 259 HDFS Write: 134 SUCCESS

Total MapReduce CPU Time Spent: 2 seconds 910 msec

OK

100WJL001M11000000000001-lmm-2014-11-01 20:00:31        5       12

c95808d07d83430d919b3766cafc3ff3-username-2014-10-22 09:51:13   5       commandvaluestr

Time taken: 34.5 seconds, Fetched: 2 row(s)

Posted: 2024-08-24 11:14:53
