Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode

The problem of too many dynamic partitions: [Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.

hive> insert into table sogouq_test partition(query_time) select user_id,query_word,query_order,click_order,url,query_time from sogouq_test_tmp;

Total MapReduce jobs = 3

Launching Job 1 out of 3

Number of reduce tasks is set to 0 since there's no reduce operator

Starting Job = job_1409113942738_0026, Tracking URL = http://centos1:8088/proxy/application_1409113942738_0026/

Kill Command = /home/hadoop-2.2/bin/hadoop job  -kill job_1409113942738_0026

Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0

2014-08-27 03:55:16,868 Stage-1 map = 0%,  reduce = 0%

[Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.

Sample of 100 partitions created under hdfs://centos1:8020/hive/scratchdir/hive_2014-08-27_03-55-09_118_348369539322185503-1/_tmp.-ext-10002:

.../query_time=20111230000005

.../query_time=20111230000007

.../query_time=20111230000008

.../query_time=20111230000009

.../query_time=20111230000010

.../query_time=20111230000011

Check the current per-node maximum number of dynamic partitions:

hive> set hive.exec.max.dynamic.partitions.pernode;

hive.exec.max.dynamic.partitions.pernode=100

Setting this parameter to a larger value resolves the problem.
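For example, the limits can be raised in the session before re-running the INSERT. The values below are illustrative, not prescriptive; size them to exceed the number of distinct `query_time` values in the source table:

```sql
-- Enable dynamic partitioning (required for partition(query_time) with no static value)
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
-- Raise the per-node limit that triggered the Fatal Error (default is 100)
SET hive.exec.max.dynamic.partitions.pernode=10000;
-- Also raise the job-wide limit, which must be >= the per-node limit
SET hive.exec.max.dynamic.partitions=10000;
```

Note that raising the limits only treats the symptom here: `query_time` has one-second granularity, so this schema creates one partition per distinct second. If the data volume is large, it is usually better to partition on a coarser key (for example a derived date column) than to keep increasing these limits.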

