Setting Up a Local Hadoop Runtime Environment on Windows

Many people like to develop Hadoop programs locally on Windows. This is a tutorial for configuring Hadoop on Windows.

First, download Hadoop from the official Apache site. You also need the winutils tool: a set of native Windows helper binaries that Hadoop requires at runtime on Windows and that the Apache binary release does not ship. After downloading, extract the Hadoop archive and copy winutils.exe (and hadoop.dll, if your winutils package provides it) into Hadoop's bin directory.
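The original post does not set them explicitly, but Hadoop's Windows scripts expect HADOOP_HOME to point at the extracted directory, with its bin and sbin folders on PATH. A minimal sketch from a command prompt; the install path below is an assumption, so adjust it to wherever you actually extracted Hadoop (you can also edit the variables in the System Properties dialog instead):

rem The install path is an example; use your actual Hadoop directory.
setx HADOOP_HOME "D:\hadoop-2.7.3-win64"
setx PATH "%PATH%;D:\hadoop-2.7.3-win64\bin;D:\hadoop-2.7.3-win64\sbin"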

Then edit the following files under hadoop/etc/hadoop:

core-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>io.native.lib.available</name>
    <value>false</value>
  </property>
  <property>
    <name>hadoop.native.lib</name>
    <value>false</value>
  </property>
  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,
           org.apache.hadoop.io.compress.DefaultCodec,
           com.hadoop.compression.lzo.LzoCodec,
           com.hadoop.compression.lzo.LzopCodec,
           org.apache.hadoop.io.compress.BZip2Codec,
           org.apache.hadoop.io.compress.SnappyCodec
    </value>
  </property>
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>

</configuration>
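Note that the com.hadoop.compression.lzo.* entries come from the third-party hadoop-lzo library; if that library is not installed, you may see "Compression codec ... not found" errors, so either install hadoop-lzo or remove those two class names. Once the environment variables are set, a quick sanity check that Hadoop is picking up this file (it should print hdfs://localhost:9000/):

hdfs getconf -confKey fs.defaultFS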

hdfs-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///D:/Hadoop/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///D:/Hadoop/datanode</value>
  </property>
</configuration>
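Hadoop will normally create the NameNode and DataNode storage directories itself (at format time and at startup), but pre-creating them makes permission problems easier to spot; for example, assuming the same D:\Hadoop paths as in the file above:

mkdir D:\Hadoop\namenode
mkdir D:\Hadoop\datanode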

mapred-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapred.compress.map.output</name>
        <value>true</value>
    </property>
    <property>
        <name>mapred.map.output.compression.codec</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
    <property>
        <name>mapred.child.env</name>
        <value>LD_LIBRARY_PATH=D:\hadoop-2.7.3-win64\lib</value>
    </property>
</configuration>
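The original post does not include yarn-site.xml, but with mapreduce.framework.name set to yarn, the NodeManager normally also needs the MapReduce shuffle auxiliary service. A minimal sketch of hadoop/etc/hadoop/yarn-site.xml commonly used for a pseudo-distributed setup:

<?xml version="1.0"?>
<configuration>
    <property>
        <!-- Lets reducers fetch map output from the NodeManager. -->
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>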

Then open a command prompt (cmd), change to Hadoop's bin directory, and run:

hdfs namenode -format

Then, from the sbin directory, run:

start-all.cmd
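Each daemon opens in its own console window. You can also start HDFS and YARN separately, and confirm the daemons are running with the JDK's jps tool:

rem Alternative to start-all.cmd:
start-dfs.cmd
start-yarn.cmd

rem Should list NameNode, DataNode, ResourceManager and NodeManager.
jps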

Then open http://localhost:8088 in a browser; this is the YARN ResourceManager web UI (in Hadoop 2.x the NameNode web UI is at http://localhost:50070).

Run a Hadoop command: hadoop fs -ls /

It is empty, so create a directory: hadoop fs -mkdir /data

Then check again: hadoop fs -ls /
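To further confirm that reads and writes work end to end, you can copy a local file into the new directory and read it back; the file name here is just an example:

echo hello hadoop > D:\test.txt
hadoop fs -put D:\test.txt /data/
hadoop fs -cat /data/test.txt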

With that, the local pseudo-distributed Hadoop environment is set up.

Original article: https://www.cnblogs.com/Kaivenblog/p/9311328.html

