Today, while connecting to HDFS with the Hadoop client to list a file directory, the following error came up:
17:31:14,503 ERROR [STDERR] java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "****/10.201.**.**"; destination host is: "***.ap.acxiom.net":8020;
17:31:14,506 ERROR [STDERR] at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
17:31:14,509 ERROR [STDERR] at org.apache.hadoop.ipc.Client.call(Client.java:1241)
17:31:14,510 ERROR [STDERR] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
17:31:14,510 ERROR [STDERR] at $Proxy153.getBlockLocations(Unknown Source)
17:31:14,510 ERROR [STDERR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
17:31:14,512 ERROR [STDERR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
17:31:14,513 ERROR [STDERR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
17:31:14,514 ERROR [STDERR] at java.lang.reflect.Method.invoke(Method.java:597)
17:31:14,514 ERROR [STDERR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
17:31:14,516 ERROR [STDERR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
17:31:14,518 ERROR [STDERR] at $Proxy153.getBlockLocations(Unknown Source)
17:31:14,519 ERROR [STDERR] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.jav
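For context, the failure surfaces on the very first RPC the HDFS client sends to the NameNode. The code that triggered it was an ordinary file/directory access along these lines (a minimal hypothetical sketch, not the actual application code; the NameNode URI and path are placeholders):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsDir {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder NameNode address (the real host in the log is masked)
            conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

            // When the client and server RPC wire formats disagree, any RPC made
            // through this FileSystem fails with the InvalidProtocolBufferException
            // shown above; the trace shows getBlockLocations, which is issued when
            // a file is opened for reading.
            FileSystem fs = FileSystem.get(conf);
            for (FileStatus status : fs.listStatus(new Path("/some/dir"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }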
Problems like this are almost always caused by a version mismatch, so the first step is to check the relevant environment information:
Hadoop cluster (server-side) version: Hadoop 2.5.0-cdh5.3.2
Local Hadoop client jar version: 2.0.0-cdh4.4.0, as declared in the project's pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.0-cdh4.4.0</version>
    <exclusions>
        <exclusion>
            <artifactId>jdk.tools</artifactId>
            <groupId>jdk.tools</groupId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.0.0-cdh4.4.0</version>
    <scope>provided</scope>
</dependency>
The local client libraries are simply too old: the Hadoop RPC wire format changed between the 2.0.x-alpha line (which CDH4 is based on) and Hadoop 2.2+, so the old client can no longer parse the NameNode's protobuf responses; that is exactly what "Message missing required fields: callId, status" in the exception points to. Raising every Hadoop dependency in the pom to 2.2.0, which speaks the same wire format as the CDH5 cluster, resolved the problem, for example as shown below.
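A sketch of the corrected dependency section, using the plain Apache 2.2.0 artifacts (a CDH5 version matching the cluster, e.g. 2.5.0-cdh5.3.2 from the Cloudera Maven repository, should work just as well):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
    <exclusions>
        <exclusion>
            <artifactId>jdk.tools</artifactId>
            <groupId>jdk.tools</groupId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>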