Hadoop error notes

1. In the new 2.4.1 release, JAVA_HOME is already set in both the shell profile and hadoop-env.sh, and java -version runs normally.
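To double-check those settings before starting the daemons, something like the following can be run (the profile and install paths are illustrative and depend on the actual environment):

    echo "$JAVA_HOME"       # should print the JDK directory
    java -version           # should report the installed JDK
    grep JAVA_HOME /etc/profile hadoop-2.4.1/etc/hadoop/hadoop-env.sh   # confirm both files set it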

But starting the cluster reports errors:

[[email protected] ~]# start-all.sh

Starting namenodes on []

localhost: Error: JAVA_HOME is not set and could not be found.

localhost: Error: JAVA_HOME is not set and could not be found.

...

starting yarn daemons
starting resourcemanager, logging to /home/lihanhui/open-source/hadoop-2.1.0-beta/logs/yarn-admin-resourcemanager-localhost.out
localhost: Error: JAVA_HOME is not set and could not be found

Running export JAVA_HOME=/PATH/TO/JDK directly on the command line does not solve the problem either.
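A likely reason: start-all.sh launches each daemon over ssh (note the "localhost:" prefix on the error lines), and an ssh session does not inherit variables exported in the local interactive shell. A quick check, with a placeholder JDK path:

    export JAVA_HOME=/usr/java/jdk1.7.0_79   # placeholder path for illustration
    echo $JAVA_HOME                          # visible in the current shell
    ssh localhost 'echo $JAVA_HOME'          # usually empty in the non-interactive ssh shell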

Eventually I found the message "JAVA_HOME is not set and could not be found" by searching in the script hadoop-2.4.1/etc/hadoop/libexec/hadoop-config.sh.

So I added export JAVA_HOME=/PATH/TO/JDK directly in that script.

That solved the problem.
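For reference, a minimal sketch of the change described above; the JDK path below is a placeholder and must point at the real installation:

    # Preferred fix: hard-code the JDK path in etc/hadoop/hadoop-env.sh instead of
    # relying on ${JAVA_HOME} being inherited from the environment:
    export JAVA_HOME=/usr/java/jdk1.7.0_79   # placeholder; use the actual JDK directory

    # If the error persists, add the same export near the top of the
    # hadoop-config.sh script that prints the error message, then restart:
    stop-all.sh
    start-all.sh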

Posted: 2024-11-06 17:49:40
