FileAlreadyExistsException: Output directory hdfs://ubuntu:9000/output09 already exists

14/07/21 17:49:59 ERROR security.UserGroupInformation: PriviledgedActionException as:chenlongquan cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://ubuntu:9000/output09 already exists
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://ubuntu:9000/output09 already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
        at com.pzoom.mapred.WordCount.main(WordCount.java:41)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

The gist of this error is that the output directory already exists. Hadoop only writes a job's output to a given directory once; it refuses to overwrite an existing one. To run the job again there are two fixes: 1. delete the existing output directory, or 2. point the job at a new output directory that does not exist yet.
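The old directory can be removed from the command line, or the driver can delete it itself before submitting the job. Below is a minimal sketch of the latter, assuming the job's Configuration object is available as conf and using the output path from the error message:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Delete the old output directory, if any, before submitting the job,
// so that FileOutputFormat.checkOutputSpecs no longer rejects it.
Path outPath = new Path("hdfs://ubuntu:9000/output09");
FileSystem fs = outPath.getFileSystem(conf);
if (fs.exists(outPath)) {
    fs.delete(outPath, true); // true = delete recursively
}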

Time: 2024-08-25 22:59:10

Related articles on FileAlreadyExistsException: Output directory hdfs://ubuntu:9000/output09 already exists

FileAlreadyExistsException: Output directory output already exists (how to fix it)

When the wordcount program is run in a Hadoop pseudo-distributed environment after it has already been run once, uploading the contents of the input folder to HDFS again produces duplicates, but the upload still succeeds. Running wordcount again, however, fails. The cause of the error is that the output folder already exists in HDFS, so it has to be deleted (the output folder is generated automatically by Hadoop, which is why the job complains). Check HDFS, delete the output folder, and the problem is solved.
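From the shell this looks roughly like the following, assuming the folder is named output at the HDFS root (adjust the path to your setup):

hdfs dfs -ls /            # see what is currently on HDFS
hdfs dfs -rm -r /output   # delete the old output folder (hadoop fs -rmr /output on 1.x releases)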

Hadoop problem: Input path does not exist: hdfs://Master:9000/user/hadoop/input

Problem description: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://Master:9000/user/hadoop/input at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:323) at org.apac
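The usual fix is to create the input directory on HDFS and upload the input files before running the job; a sketch, assuming the files to be processed sit in a local folder named ./input (hypothetical):

hdfs dfs -mkdir -p /user/hadoop/input       # create the input directory the job expects
hdfs dfs -put ./input/* /user/hadoop/input  # upload the local input files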

Fixing the packaging error after mvn clean: Cannot create resource output directory

Today I ran into a strange problem: after running mvn clean, the following error appears when packaging. [ERROR] Failed to execute goal org.apache.maven.plugins:maven-resources-plugin:2.6:resources (default-resources) on project inventory: Cannot create resource output directory: E:\work\lanhusoft\cms\target\classes ->

Visual Studio - File Properties (Build Action, Copy to Output Directory)

Ref: MSDN (https://docs.microsoft.com/en-us/previous-versions/visualstudio/visual-studio-2010/0c6xyb66(v=vs.100) ) Difference between Build action content and 'Copy to output directory' in Visual Studio Question: In my project in Visual Studio, I hav

org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost:9000/usr/local/spark/zytdemo

This means that the corresponding file cannot be found at hdfs://localhost:9000/usr/local/spark/zytdemo. From this we can tell that Spark is not loading a local file but looking for it on HDFS. That is because we previously modified core-site.xml under /usr/local/hadoop/etc/hadoop, so the path that Spark reads from has to be changed to a path on HDFS. Original article: https://www.cnblogs.com/zyt-bg/p/11477449.html
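A minimal Java sketch of the two ways to spell the path once core-site.xml points the default filesystem at HDFS; the class name, app name and master setting are placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class PathDemo {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName("PathDemo").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        // With the default filesystem set to HDFS, a bare path resolves to HDFS,
        // so the file must first be uploaded there (hdfs dfs -put) ...
        JavaRDD<String> fromHdfs = sc.textFile("hdfs://localhost:9000/usr/local/spark/zytdemo");
        // ... or an explicit file:// scheme forces a local read instead.
        JavaRDD<String> fromLocal = sc.textFile("file:///usr/local/spark/zytdemo");
        System.out.println(fromHdfs.count() + " lines from HDFS, " + fromLocal.count() + " lines locally");
        sc.close();
    }
}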

bash no such file or directory in ubuntu 1404

Today I installed Ubuntu 14.04 in a virtual machine on my MacBook Pro Retina. When I tried to run Cadence ncverilog, the Ubuntu terminal reported "bash: no such file or directory". It turned out that the Ubuntu I installed is 64-bit while ncverilog is a 32-bit binary, so installing the 32-bit libc with the following command fixes it: sudo apt-get install libc6:i386

Fixing the Output directory already exists error in code, so the output folder does not have to be deleted by hand on every debug run

// Check whether the output folder exists; if it does, delete it
Path outPath = new Path("hdfs://192.168.9.13:8020/meanwhileFind"); // output path
FileSystem fs = outPath.getFileSystem(conf); // FileSystem for that path, built from the job configuration
if (fs.exists(outPath)) {
    fs.delete(outPath, true); // true = delete recursively
}

Hadoop issue notes: Wrong FS: hdfs://hp5-249:9000/, expected: file:///

This problem generally appears when operating on files: it may occur when opening a file or when traversing a directory. It usually shows up when running Hadoop from Eclipse; switching to a shell and issuing the commands directly may not trigger it. Assuming you are in the Eclipse project directory, the fix is as follows: cp $HADOOP_HOME/etc/hadoop/core-site.xml ./bin  cp $HADOOP_HOME/etc/hadoop/hdfs-site.xml ./bin  Then, in the Eclipse project, click
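An alternative to copying the XML files is to load the cluster configuration explicitly in the driver; a sketch, with the installation paths assumed (adjust them to your environment):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration conf = new Configuration();
// Pull in the real cluster settings instead of the local-filesystem defaults.
conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
// Or set the default filesystem directly
// (the property is fs.defaultFS on Hadoop 2.x, fs.default.name on older releases):
// conf.set("fs.defaultFS", "hdfs://hp5-249:9000");
FileSystem fs = FileSystem.get(conf);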

Packaging error: make directory ....app/WeiboSDK.bundle/others: File exists

An error that showed up while packaging today. My guess was that it meant something like the WeiboSDK already existing, so I searched around and found an answer on Stack Overflow: It looks like there are two en.lproj files being included in the build: one from SMBox and one from ApplRater. I had a similar build problem when my project contained two en.lproj f