Fixing "No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory)" in Hadoop

The following problem appears when running a Hadoop program in Eclipse:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
    at hadoop.in.action.WordCount.run(WordCount.java:89)
    at hadoop.in.action.WordCount.main(WordCount.java:98)

Solution: copy ${HADOOP_HOME}/etc/hadoop/log4j.properties into the src folder of the Eclipse project. Eclipse copies everything under src to the build output folder, which puts the file on the runtime classpath where log4j looks for its configuration. Note that the log4j warnings do not themselves fail the job; they mean logging was never initialized, so the log messages that would explain the IOException are being swallowed. Once log4j is configured, rerun the job to see the actual cause of the failure.
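If the Hadoop copy of the file is not at hand, a minimal log4j.properties like the one below also satisfies log4j. This is a sketch assuming console output at INFO level; the appender name "console" and the conversion pattern are arbitrary choices, not required values.

# Minimal log4j 1.2 configuration: route all logging to the console at INFO level
log4j.rootLogger=INFO, console

# ConsoleAppender and PatternLayout are standard log4j 1.2 classes;
# "console" is just the appender name referenced by rootLogger above
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n

For a quick one-off run you can instead call org.apache.log4j.BasicConfigurator.configure() at the top of main(), which installs a default console appender programmatically without any properties file.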

Posted: 2024-10-03 03:48:54
