Hive reports an error when executing a query: org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.io.IOException:

hive> select product_id, track_time from trackinfo limit 5;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.io.IOException: The number of tasks for this job 156028 exceeds the configured limit 5000
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3943)
        at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
Caused by: java.io.IOException: The number of tasks for this job 156028 exceeds the configured limit 5000
        at org.apache.hadoop.mapred.JobInProgress.checkTaskLimits(JobInProgress.java:509)
        at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:485)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3941)
        ... 10 more

        at org.apache.hadoop.ipc.Client.call(Client.java:1066)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at org.apache.hadoop.mapred.$Proxy11.submitJob(Unknown Source)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:921)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
        at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:447)
        at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1336)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1122)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:935)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Job Submission failed with exception 'org.apache.hadoop.ipc.RemoteException(java.io.IOException: java.io.IOException: The number of tasks for this job 156028 exceeds the configured limit 5000
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3943)
        at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
Caused by: java.io.IOException: The number of tasks for this job 156028 exceeds the configured limit 5000
        at org.apache.hadoop.mapred.JobInProgress.checkTaskLimits(JobInProgress.java:509)
        at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:485)
        at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3941)
        ... 10 more
)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask

Cause of the error:

The trackinfo table holds a huge amount of data, and the SQL statement

select product_id, track_time from trackinfo limit 5

has no partition filter, so Hive has to scan the whole table. That schedules far too many map tasks and exceeds the per-job task limit.
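The 5000 ceiling in the message is enforced on the JobTracker side. In Hadoop 1.x it is controlled by mapred.jobtracker.maxtasks.per.job (default -1, i.e. unlimited), so this cluster has set it explicitly. The following mapred-site.xml fragment is only a sketch of how such a limit would be configured, not a suggestion to raise it, since the real fix is to scan less data:

<property>
    <!-- Maximum number of tasks a single job may have; -1 means no limit. -->
    <!-- The value 5000 is assumed here to match the error message. -->
    <name>mapred.jobtracker.maxtasks.per.job</name>
    <value>5000</value>
</property>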

Let's see how much data the trackinfo table actually holds:

-bash-3.2$ hadoop fs -dus /data/share/trackinfo
Warning: $HADOOP_HOME is deprecated.

hdfs://yhd-hadoop06.int.yihaodian.com:9000/data/share/trackinfo 19387740988708

As you can see, trackinfo holds roughly 19 TB of data.
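That size also explains the task count in the error: 19,387,740,988,708 bytes spread over the 156,028 map tasks Hive tried to launch works out to roughly 124 MB per task, i.e. about one map per input split (assuming a split size close to the common 128 MB HDFS block size on this cluster).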

Rewriting the SQL statement with a WHERE condition on the partition column restores reasonable performance:

select product_id, track_time from trackinfo where ds='2014-5-13' limit 5

With the partition filter in place, the job stays under the task limit and the error goes away.
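As a quick sanity check, it is worth confirming that ds really is a partition column of trackinfo (assumed here from the where clause above), for example with:

hive> show partitions trackinfo;

If the Hive version in use supports it, setting hive.mapred.mode=strict is also worth considering: in strict mode Hive rejects a query against a partitioned table that has no partition filter, so a full-table scan like the original query fails immediately instead of trying to submit more than 150,000 map tasks.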
