Preparation: copy the existing example template

mkdir oozie-apps
cd oozie-apps/
cp -r ../examples/apps/map-reduce .
mv map-reduce mr-wordcount-wf
Modify the configuration files
workflow.xml:
<!--
  Licensed to the Apache Software Foundation (ASF) under one
  or more contributor license agreements.  See the NOTICE file
  distributed with this work for additional information
  regarding copyright ownership.  The ASF licenses this file
  to you under the Apache License, Version 2.0 (the
  "License"); you may not use this file except in compliance
  with the License.  You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
-->
<workflow-app xmlns="uri:oozie:workflow:0.5" name="mr-wordcount-wf">
    <start to="mr-node-wordcount"/>
    <action name="mr-node-wordcount">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/${oozieDataRoot}/${outputDir}"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.mapper.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.reducer.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapreduce.job.queuename</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>mapreduce.job.map.class</name>
                    <value>com.ibeifeng.hadoop.senior.mapreduce.WordCount$WordCountMapper</value>
                </property>
                <property>
                    <name>mapreduce.job.reduce.class</name>
                    <value>com.ibeifeng.hadoop.senior.mapreduce.WordCount$WordCountReducer</value>
                </property>
                <property>
                    <name>mapreduce.map.output.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapreduce.map.output.value.class</name>
                    <value>org.apache.hadoop.io.IntWritable</value>
                </property>
                <property>
                    <name>mapreduce.job.output.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapreduce.job.output.value.class</name>
                    <value>org.apache.hadoop.io.IntWritable</value>
                </property>
                <property>
                    <name>mapreduce.input.fileinputformat.inputdir</name>
                    <value>${nameNode}/${oozieDataRoot}/${inputDir}</value>
                </property>
                <property>
                    <name>mapreduce.output.fileoutputformat.outputdir</name>
                    <value>${nameNode}/${oozieDataRoot}/${outputDir}</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
job.properties:
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

nameNode=hdfs://hadoop:8020
jobTracker=hadoop:8032
queueName=default
oozieAppsRoot=user/root/oozie-apps
oozieDataRoot=user/root/oozie/datas

oozie.wf.application.path=${nameNode}/${oozieAppsRoot}/mr-wordcount-wf/workflow.xml

inputDir=mr-wordcount-wf/input
outputDir=mr-wordcount-wf/output
Add the MapReduce jar you wrote yourself (see the sketch below)
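The post does not show where the jar goes. As a minimal sketch, assuming the compiled WordCount classes are packaged as mr-wordcount.jar (the jar name and source path are placeholders), Oozie picks up dependency jars from the workflow application's lib/ directory, so it can be dropped there before the whole directory is uploaded in the next step:

# create the lib/ directory of the workflow application if it does not exist yet
mkdir -p oozie-apps/mr-wordcount-wf/lib
# copy your compiled MapReduce jar into it (jar name and path are illustrative)
cp /path/to/mr-wordcount.jar oozie-apps/mr-wordcount-wf/lib/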
Upload the modified files to HDFS
/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/bin/hadoop dfs -put oozie-apps/ oozie-apps
Prepare the test data

Create the input directory in HDFS and upload the test file created just now.
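If test-mapred.txt has not been created yet, a small sample file is enough; the lines below are arbitrary example content:

# create a small local test file with sample words (content is illustrative)
cat > test-mapred.txt <<EOF
hadoop oozie hadoop
mapreduce wordcount oozie
hadoop mapreduce
EOF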
/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/bin/hadoop dfs -mkdir -p oozie/datas/mr-wordcount-wf/input
/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/bin/hadoop dfs -put test-mapred.txt oozie/datas/mr-wordcount-wf/input
Run the Oozie job
export OOZIE_URL=http://localhost:11000/oozie
bin/oozie job -config oozie-apps/mr-wordcount-wf/job.properties -run
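The -run command prints a workflow job ID. Progress can then be watched in the Oozie web console at http://localhost:11000/oozie, or from the command line by passing that ID (written here as a placeholder) to -info:

# query the status of the submitted workflow; replace <job-id> with the ID printed by -run
bin/oozie job -info <job-id>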
Check the output
(Screenshots: the original input file and the resulting word-count output.)
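The result can also be read straight from HDFS; the part file name below is the usual default for a single reducer and may differ in your run:

# list and print the reducer output of the workflow
/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/bin/hadoop dfs -ls oozie/datas/mr-wordcount-wf/output
/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/bin/hadoop dfs -cat oozie/datas/mr-wordcount-wf/output/part-r-00000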
Reference: http://blog.csdn.net/wiborgite/article/details/78585689