Hadoop报错 " Message missing required fields: callId, status"解决方案

Today, while using the Hadoop HDFS client to list a file directory, the following error came up:

17:31:14,503 ERROR [STDERR] java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "****/10.201.**.**"; destination host is: "***.ap.acxiom.net":8020;

17:31:14,506 ERROR [STDERR]     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)

17:31:14,509 ERROR [STDERR]     at org.apache.hadoop.ipc.Client.call(Client.java:1241)

17:31:14,510 ERROR [STDERR]     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)

17:31:14,510 ERROR [STDERR]     at $Proxy153.getBlockLocations(Unknown Source)

17:31:14,510 ERROR [STDERR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

17:31:14,512 ERROR [STDERR]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

17:31:14,513 ERROR [STDERR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

17:31:14,514 ERROR [STDERR]     at java.lang.reflect.Method.invoke(Method.java:597)

17:31:14,514 ERROR [STDERR]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)

17:31:14,516 ERROR [STDERR]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)

17:31:14,518 ERROR [STDERR]     at $Proxy153.getBlockLocations(Unknown Source)

17:31:14,519 ERROR [STDERR]     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.jav
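
For context, this is the kind of client code that exercises the failing RPC path. A minimal sketch, assuming a Hadoop 2.x client on the classpath; the NameNode URI and directory path are placeholders, since the real hosts in the post are masked:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class ListHdfsDirectory {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode address; replace with your cluster's fs.defaultFS.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf);
        // listStatus goes through the same protobuf-based ClientProtocol RPC
        // layer that fails in the stack trace above (getBlockLocations is the
        // corresponding call on the file-read path).
        for (FileStatus status : fs.listStatus(new Path("/some/dir"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}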

Errors like this are almost always caused by a version mismatch, so I checked the relevant environment information:

Hadoop cluster version: Hadoop 2.5.0-cdh5.3.2

Local Hadoop client jar version: 2.0.0-cdh4.4.0, declared in the project's pom.xml as:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.0-cdh4.4.0</version>
    <exclusions>
        <exclusion>
            <artifactId>jdk.tools</artifactId>
            <groupId>jdk.tools</groupId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.0.0-cdh4.4.0</version>
    <scope>provided</scope>
</dependency>

The local client libraries are simply too old: the CDH4-era 2.0.0 client and the 2.5.0 cluster use incompatible protobuf RPC message formats, which is exactly what the InvalidProtocolBufferException about the missing callId and status fields is complaining about. Changing the version of every Hadoop dependency to 2.2.0 resolved the problem, since the 2.x RPC wire format lets a 2.2.0 client talk to a 2.5.0 cluster.
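
For reference, the fixed dependency declarations would look like this. This is a sketch of the stated fix: 2.2.0 here refers to the Apache artifact from Maven Central rather than a CDH build, and any other Hadoop artifacts in the POM should be bumped the same way:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
    <exclusions>
        <exclusion>
            <artifactId>jdk.tools</artifactId>
            <groupId>jdk.tools</groupId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>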


Hadoop报错 " Message missing required fields: callId, status"解决方案的相关文章

eclipse连接远程Hadoop报错,Caused by: java.io.IOException: 远程主机强迫关闭了一个现有的连接。

eclipse连接远程Hadoop报错,Caused by: java.io.IOException: 远程主机强迫关闭了一个现有的连接.全部报错信息如下: Exception in thread "main" java.io.IOException: Call to hadoopmaster/192.168.1.180:9000 failed on local exception: java.io.IOException: 远程主机强迫关闭了一个现有的连接. at org.apach

解决eclipse下maven工程报错:Missing artifact jdk.tools:jdk

1.进入jdk/lib目录,执行: mvn install:install-file -DgroupId=jdk.tools -DartifactId=jdk.tools -Dpackaging=jar -Dversion=1.7 -Dfile=tools.jar -DgeneratePom=true 2.加入依赖 <dependency> <groupId>jdk.tools</groupId> <artifactId>jdk.tools</arti

mysql报错Multi-statement transaction required more than &#39;max_binlog_cache_size&#39; bytes of storage

mysql报错Multi-statement transaction required more than 'max_binlog_cache_size' bytes of storage 在执行create table  xx  as  select xx的时候 或者在执行 tpcc-mysql的tpcc_load 的时候 都会遇到这个错误 1534, HY000, Writing one row to the row-based binary log failedRetrying ... 1

SQLSERVER 创建ODBC 报错的解决办法 SQLState:&#39;01000&#39;的解决方案

错误详情如下: SQLState:'01000' SQL Server 错误:14 [Microsoft][ODBC SQL Server Driver][DBNETLIB] ConnectionOpen (Invalid Instance()). 连接失败: SQLState:'08001' SQL Server 错误:14 [Microsoft][ODBC SQL Server Driver][DBNETLIB] 无效的连接. 解决办法: 在创建ODBC数据源的步骤中,点击“客户端配置”,勾

hadoop报错:Does not contain a valid host:port authority

今天用sbin/start-yarn.sh启动yarn的时候,遇到下面的错误 java.lang.IllegalArgumentException: Does not contain a valid host:port authority: master at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:211) at org.apache.hadoop.net.NetUtils.createSocketAddr(N

解决hiveserver2报错:java.io.IOException: Job status not available - Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

用户使用的sql: select count( distinct patient_id ) from argus.table_aa000612_641cd8ce_ceff_4ea0_9b27_0a3a743f0fe3; 下面做不同的测试: 1.beeline -u jdbc:hive2://0.0.0.0:10000 -e "select count( distinct patient_id ) from argus.table_aa000612_641cd8ce_ceff_4ea0_9b27_

Windows下Hadoop报错Failed to locate the winutils

首先我这里只是一个简单的hdfs查询程序,并没有搭建Hadoop环境,搭建环境还需去看详细教程. 报错:Failed to locate the winutils binary in the hadoop binary path java.io.IOException: Could not locate executable null \bin\winutils.exe in the Hadoop binaries. 前辈说这是Window系统的原因,其实并没有影响,回头要放到服务器上,这个问题

HADOOP报错,随笔

1.新版本2.4.1中,profile/hadoop-env.sh中均己设置 JAVA_HOME,  java -version也正常. 启动时报错: [[email protected] ~]# start-all.sh Starting namenodes on [] localhost: Error: JAVA_HOME is not set and could not be found. localhost: Error: JAVA_HOME is not set and could n

gdb调试报错:Missing separate debuginfos

在centos7上面gdb程序时候,报错信息是:Missing separate debuginfos, use: debuginfo-install glibc-2.17-157.el7_3.5.x86_64 解决方案:1 先修改"/etc/yum.repos.d/CentOS-Debuginfo.repo"文件的?enable=1:有时候该文件不存在,则需要手工创建此文件并加入以下内容: [debug] name=CentOS-7 - Debuginfo baseurl=http: