Compiling and Installing the Hadoop 2.6.0 Eclipse Plugin

1. Compiling the Hadoop 2.6.0 Eclipse Plugin

Download the source:

git clone https://github.com/winghc/hadoop2x-eclipse-plugin.git

Build the plugin:

cd src/contrib/eclipse-plugin

ant jar -Dversion=2.6.0 -Declipse.home=/opt/eclipse -Dhadoop.home=/opt/hadoop-2.6.0

Set eclipse.home and hadoop.home to the paths for your own environment.

Running the build from the command line produces 8 warnings, which can safely be ignored:

compile:

[echo] contrib: eclipse-plugin

[javac] /software/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

[javac] Compiling 45 source files to /software/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/classes

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/Path.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate': class file for org.apache.hadoop.classification.InterfaceAudience not found

[javac] /opt/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar(org/apache/hadoop/hdfs/DistributedFileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FSDataInputStream.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FSDataOutputStream.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'

[javac] Note: Some input files use or override a deprecated API.

[javac] Note: Recompile with -Xlint:deprecation for details.

[javac] Note: Some input files use unchecked or unsafe operations.

[javac] Note: Recompile with -Xlint:unchecked for details.

[javac] 8 warnings

The jar is generated at:

[jar] Building jar: /software/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.6.0.jar

2. Installing the Plugin

The desktop user who will launch Eclipse should ideally be the Hadoop administrator, i.e. the user configured when Hadoop was installed; otherwise you will run into permission-denied errors when reading and writing HDFS.

Copy the compiled jar into Eclipse's plugins directory, then restart Eclipse.
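A minimal sketch of this install step, assuming the plugin was built under /software/hadoop2x-eclipse-plugin and Eclipse lives in /opt/eclipse (adjust both paths for your machine; the -clean flag makes Eclipse rescan its plugins directory on startup):

```shell
PLUGIN_JAR=/software/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.6.0.jar
ECLIPSE_HOME=/opt/eclipse

if [ -f "$PLUGIN_JAR" ] && [ -d "$ECLIPSE_HOME/plugins" ]; then
    # install the plugin and restart Eclipse so it is picked up
    cp "$PLUGIN_JAR" "$ECLIPSE_HOME/plugins/"
    "$ECLIPSE_HOME/eclipse" -clean &
else
    echo "Adjust PLUGIN_JAR and ECLIPSE_HOME for your environment"
fi
```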

Configure the Hadoop installation directory:

Window -> Preferences -> Hadoop Map/Reduce -> Hadoop installation directory

Configure the Map/Reduce views:

Window -> Open Perspective -> Other -> Map/Reduce -> click "OK"

Window -> Show View -> Other -> Map/Reduce Locations -> click "OK"

A "Map/Reduce Locations" tab will appear in the console area.

In the "Map/Reduce Locations" tab, click the elephant+ icon, or right-click the blank area and choose "New Hadoop location…". In the "New Hadoop location…" dialog that opens, configure the following:

Note: the MR Master and DFS Master settings must match what is in mapred-site.xml, core-site.xml, and related configuration files.
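As an illustration (host names and ports below are example values, not prescriptions): if core-site.xml declares the default file system as shown, the DFS Master host/port in the dialog must be localhost:9000, and the MR Master must likewise match the JobTracker address from mapred-site.xml.

```xml
<!-- core-site.xml (example values) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml (example values) -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```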

Open the Project Explorer to browse the HDFS file system.

3. Creating a Map/Reduce Project

File -> New -> Project -> Map/Reduce Project -> Next

Write the WordCount class:

package mytest;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;

public class WordCount {

    // Mapper: emit (word, 1) for every token in each input line
    public static class Map extends MapReduceBase implements
            Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    // Reducer: sum the counts collected for each word
    public static class Reduce extends MapReduceBase implements
            Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        // args[0] = input directory, args[1] = output directory (must not exist)
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}

Configure the runtime arguments: right-click -> Run As -> Run Configurations

Here "in" is an HDFS directory (which you create yourself) containing the files to process, and "out" is where the output is written.
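The input directory can be created and populated from the command line, for example as follows (the /user/hadoop/in path is an example; note that the output directory must not exist before the job runs):

```shell
HDFS_IN=/user/hadoop/in

if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p "$HDFS_IN"      # create the input directory
    hdfs dfs -put ./*.txt "$HDFS_IN"   # upload local text files to process
    hdfs dfs -ls "$HDFS_IN"            # verify the upload
else
    echo "hdfs command not found; run this on a machine with Hadoop installed"
fi
```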

To run the program on the Hadoop cluster: right-click -> Run As -> Run on Hadoop. The final output appears under the corresponding HDFS directory. This completes the hadoop-2.6.0 Eclipse plugin setup on Linux.
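Equivalently, the job can be launched outside Eclipse by exporting the project as a jar (wordcount.jar is a hypothetical name here) and invoking it directly; the old mapred API writes its result to part-00000 files:

```shell
JOB_CLASS=mytest.WordCount

if command -v hadoop >/dev/null 2>&1; then
    hadoop jar wordcount.jar "$JOB_CLASS" in out   # args[0]=input dir, args[1]=output dir
    hdfs dfs -cat out/part-00000                   # inspect the word counts
else
    echo "hadoop command not found; run this on a cluster node"
fi
```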

Problems encountered during setup:

Eclipse could not write to the HDFS file system, which directly prevents programs written in Eclipse from running on Hadoop.

Open conf/hdfs-site.xml and set the dfs.permissions property to false (the default is true). In Hadoop 2.x this property is formally named dfs.permissions.enabled, though the old name is still honored; note that disabling permission checking like this is only appropriate for development environments.

<property>

<name>dfs.permissions</name>

<value>false</value>

</property>

Restart HDFS after making this change.
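HDFS can be restarted with the standard scripts, assuming Hadoop's sbin directory is on the administrator's PATH:

```shell
if command -v stop-dfs.sh >/dev/null 2>&1; then
    stop-dfs.sh     # stop the NameNode and DataNodes
    start-dfs.sh    # start them again with the updated hdfs-site.xml
else
    echo "run \$HADOOP_HOME/sbin/stop-dfs.sh and start-dfs.sh as the Hadoop user"
fi
```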

The simplest fix is the one mentioned earlier: log in to the desktop and launch Eclipse as the user who administers Hadoop.
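If you would rather keep permission checking enabled, another option on clusters using simple authentication (the 2.6.0 default) is to make Eclipse act as the Hadoop user by setting HADOOP_USER_NAME before launching it; the user name "hadoop" below is an example matching a typical install:

```shell
# make HDFS operations from this session run as the Hadoop admin user
export HADOOP_USER_NAME=hadoop

# alternatively, widen permissions only on the working directories, e.g.:
# hdfs dfs -chmod -R 777 /user/hadoop/in
```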
