recap basic commands (Hadoop)

  • file operations

hdfs dfs -ls  list files and directories

hdfs dfs -lsr  recursive listing, including files in subdirectories

hdfs dfs -rmr  recursively delete a directory and its contents
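
A quick sketch with hypothetical paths: list a user directory, then recursively remove an old output directory.

hdfs dfs -ls /user/hadoop
hdfs dfs -rmr /user/hadoop/old_output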

bin/hadoop fs -put path1 path2  upload local path1 to HDFS path2

bin/hadoop fs -get path1 path2  download HDFS path1 to local path2
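
For example (paths are made up), push a local log file into HDFS, then pull it back out:

bin/hadoop fs -put /tmp/access.log /user/hadoop/logs/access.log
bin/hadoop fs -get /user/hadoop/logs/access.log /tmp/access_copy.log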

bin/hadoop dfs -copyFromLocal <source path> <destination path>  copy a local file to HDFS (similar to -put, but the source must be local)

bin/hadoop dfs -getmerge  merge the files under an HDFS directory into a single local file
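
-getmerge takes an HDFS source directory and a local destination file; an example with hypothetical paths:

bin/hadoop dfs -getmerge /user/hadoop/output /tmp/output_merged.txt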

hadoop fs -du URI [URI …]  display the size of the files in a directory

hadoop fs -expunge  empty the trash
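
For instance (hypothetical URI), check sizes under an output directory before cleaning up, then empty the trash:

hadoop fs -du /user/hadoop/output
hadoop fs -expunge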

  • management and update

hdfs dfsadmin -report  show basic HDFS statistics and cluster status

hdfs dfsadmin -safemode leave  exit safe mode; files cannot be modified while the NameNode is in safe mode
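
The -safemode switch also accepts enter and get, so you can check the current state before leaving:

hdfs dfsadmin -safemode get
hdfs dfsadmin -safemode leave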

start-balancer.sh  rebalance data blocks (load) across DataNodes
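
The balancer accepts an optional -threshold argument (allowed disk-usage deviation, in percent), for example:

start-balancer.sh -threshold 10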

  • mr job

bin/hadoop jar *.jar [jobMainClass] [jobArgs]  run an MR job
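
For example, running the bundled WordCount driver (the examples jar name varies by release, and the input/output paths here are hypothetical):

bin/hadoop jar hadoop-examples-*.jar wordcount /user/hadoop/input /user/hadoop/output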

bin/hadoop job -kill job_201005310937_0053  kill a running job by its job ID
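
The job ID can be looked up first with -list, then killed, e.g.:

bin/hadoop job -list
bin/hadoop job -kill job_201005310937_0053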

http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_shell.html

bin/hadoop  run without arguments to list more commands

Most commands print help when invoked w/o parameters.

Date: 2024-08-09 09:54:57
