Developing an HBase Java Client with Eclipse on Windows 7

This article shows how to build an HBase client application with Eclipse on Windows.

  1. Set up an HBase cluster first; for the cluster build, see the companion article on configuring an HBase 0.98.10-hadoop2 cluster on CentOS.
  2. Create a Maven project in Eclipse.
  3. Put the cluster's hbase-site.xml on the project classpath (in a Maven project this means src/main/resources); alternatively, the connection settings can be supplied in code, as sketched after this list.

  4. Edit the hosts file at C:\Windows\System32\drivers\etc\hosts, adding the IP addresses and hostnames of the HBase cluster nodes:
    192.168.40.108   hadoop108
    192.168.40.148   hadoop148
    192.168.40.104   hadoop104
    192.168.40.107   hadoop107
    192.168.40.105   hadoop105
  5. Write the Maven pom.xml; the dependencies are listed below. Note that the cluster from step 1 is 0.98.10-hadoop2, so the hbase-client artifact should ideally be a matching -hadoop2 version from the same release line:
    	<dependencies>
    
    		<dependency>
    			<groupId>org.apache.avro</groupId>
    			<artifactId>avro</artifactId>
    			<version>1.7.7</version>
    		</dependency>
    
    		<dependency>
    			<groupId>org.apache.avro</groupId>
    			<artifactId>avro-tools</artifactId>
    			<version>1.7.7</version>
    		</dependency>
    
    		<dependency>
    			<groupId>org.apache.avro</groupId>
    			<artifactId>avro-maven-plugin</artifactId>
    			<version>1.7.7</version>
    		</dependency>
    		<dependency>
    			<groupId>org.apache.avro</groupId>
    			<artifactId>avro-compiler</artifactId>
    			<version>1.7.7</version>
    		</dependency>
    
    		<dependency>
    			<groupId>org.apache.hbase</groupId>
    			<artifactId>hbase-client</artifactId>
    			<version>0.98.8-hadoop1</version>
    		</dependency>
    
    		<dependency>
    			<groupId>org.apache.hbase</groupId>
    			<artifactId>hbase</artifactId>
    			<version>0.90.2</version>
    		</dependency>
    		<dependency>
    			<groupId>org.apache.hadoop</groupId>
    			<artifactId>hadoop-core</artifactId>
    			<version>1.2.1</version>
    		</dependency>
    
    		<dependency>
    			<groupId>junit</groupId>
    			<artifactId>junit</artifactId>
    			<version>3.8.1</version>
    			<scope>test</scope>
    		</dependency>
    	</dependencies>
  6. Write the Java source code (a note on closing the client handles follows this list):
    package com.eric.hbase;
    
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.KeyValue;
    import org.apache.hadoop.hbase.MasterNotRunningException;
    import org.apache.hadoop.hbase.ZooKeeperConnectionException;
    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HBaseAdmin;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;
    
    public class BaseOperation {
    
    	private static final String TABLE_NAME = "demo_table";
    
    	public static Configuration conf = null;
    	public HTable table = null;
    	public HBaseAdmin admin = null;
    
    	static {
    		conf = HBaseConfiguration.create();
    		System.out.println(conf.get("hbase.zookeeper.quorum"));
    	}
    
    	/**
    	 * Create a table.
    	 */
    	public static void createTable(String tableName, String[] familys)
    			throws Exception {
    		HBaseAdmin admin = new HBaseAdmin(conf);
    		if (admin.tableExists(tableName)) {
    			System.out.println("table already exists!");
    		} else {
    			HTableDescriptor tableDesc = new HTableDescriptor(tableName);
    			for (int i = 0; i < familys.length; i++) {
    				tableDesc.addFamily(new HColumnDescriptor(familys[i]));
    			}
    			admin.createTable(tableDesc);
    			System.out.println("create table " + tableName + " ok.");
    		}
    	}
    
    	/**
    	 * Delete a table.
    	 */
    	public static void deleteTable(String tableName) throws Exception {
    		try {
    			HBaseAdmin admin = new HBaseAdmin(conf);
    			admin.disableTable(tableName);
    			admin.deleteTable(tableName);
    			System.out.println("delete table " + tableName + " ok.");
    		} catch (MasterNotRunningException e) {
    			e.printStackTrace();
    		} catch (ZooKeeperConnectionException e) {
    			e.printStackTrace();
    		}
    	}
    
    	/**
    	 * Insert a single row.
    	 */
    	public static void addRecord(String tableName, String rowKey,
    			String family, String qualifier, String value) throws Exception {
    		try {
    			HTable table = new HTable(conf, tableName);
    			Put put = new Put(Bytes.toBytes(rowKey));
    			put.add(Bytes.toBytes(family), Bytes.toBytes(qualifier),
    					Bytes.toBytes(value));
    			table.put(put);
    			System.out.println("insert recored " + rowKey + " to table "
    					+ tableName + " ok.");
    		} catch (IOException e) {
    			e.printStackTrace();
    		}
    	}
    
    	/**
    	 * Delete a single row.
    	 */
    	public static void delRecord(String tableName, String rowKey)
    			throws IOException {
    		HTable table = new HTable(conf, tableName);
    		List<Delete> list = new ArrayList<Delete>();
    		Delete del = new Delete(rowKey.getBytes());
    		list.add(del);
    		table.delete(list);
    		System.out.println("del recored " + rowKey + " ok.");
    	}
    
    	/**
    	 * Get a single row.
    	 */
    	public static void getOneRecord(String tableName, String rowKey)
    			throws IOException {
    		HTable table = new HTable(conf, tableName);
    		Get get = new Get(rowKey.getBytes());
    		Result rs = table.get(get);
    		for (KeyValue kv : rs.raw()) {
    			System.out.print(new String(kv.getRow()) + " ");
    			System.out.print(new String(kv.getFamily()) + ":");
    			System.out.print(new String(kv.getQualifier()) + " ");
    			System.out.print(kv.getTimestamp() + " ");
    			System.out.println(new String(kv.getValue()));
    		}
    	}
    
    	/**
    	 * Print all rows.
    	 */
    	public static void getAllRecord(String tableName) {
    		try {
    			HTable table = new HTable(conf, tableName);
    			Scan s = new Scan();
    			ResultScanner ss = table.getScanner(s);
    			for (Result r : ss) {
    				for (KeyValue kv : r.raw()) {
    					System.out.print(new String(kv.getRow()) + " ");
    					System.out.print(new String(kv.getFamily()) + ":");
    					System.out.print(new String(kv.getQualifier()) + " ");
    					System.out.print(kv.getTimestamp() + " ");
    					System.out.println(new String(kv.getValue()));
    				}
    			}
    		} catch (IOException e) {
    			e.printStackTrace();
    		}
    	}
    
    	public static void main(String[] args) {
    		try {
    			String tablename = "scores";
    			String[] familys = { "grade", "course" };
    			BaseOperation.createTable(tablename, familys);
    
    			// add record zkb
    			BaseOperation.addRecord(tablename, "zkb", "grade", "", "5");
    			BaseOperation.addRecord(tablename, "zkb", "course", "", "90");
    			BaseOperation.addRecord(tablename, "zkb", "course", "math", "97");
    			BaseOperation.addRecord(tablename, "zkb", "course", "art", "87");
    			// add record baoniu
    			BaseOperation.addRecord(tablename, "baoniu", "grade", "", "4");
    			BaseOperation
    					.addRecord(tablename, "baoniu", "course", "math", "89");
    
    			System.out.println("===========get one record========");
    			BaseOperation.getOneRecord(tablename, "zkb");
    
    			System.out.println("===========show all record========");
    			BaseOperation.getAllRecord(tablename);
    
    			System.out.println("===========del one record========");
    			BaseOperation.delRecord(tablename, "baoniu");
    			BaseOperation.getAllRecord(tablename);
    
    			System.out.println("===========show all record========");
    			BaseOperation.getAllRecord(tablename);
    		} catch (Exception e) {
    			e.printStackTrace();
    		}
    	}
    
    }
    
  7. Run the program; the output is as follows (a small connectivity check that can be run beforehand is sketched after this list):
    hadoop107,hadoop108,hadoop104
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
    table already exists!
    insert record zkb to table scores ok.
    insert record zkb to table scores ok.
    insert record zkb to table scores ok.
    insert record zkb to table scores ok.
    insert record baoniu to table scores ok.
    insert record baoniu to table scores ok.
    ===========get one record========
    zkb course: 1425258910718 90
    zkb course:art 1425258910727 87
    zkb course:math 1425258910722 97
    zkb grade: 1425258910705 5
    ===========show all record========
    baoniu course:math 1425258910734 89
    baoniu grade: 1425258910730 4
    zkb course: 1425258910718 90
    zkb course:art 1425258910727 87
    zkb course:math 1425258910722 97
    zkb grade: 1425258910705 5
    ===========del one record========
    del record baoniu ok.
    zkb course: 1425258910718 90
    zkb course:art 1425258910727 87
    zkb course:math 1425258910722 97
    zkb grade: 1425258910705 5
    ===========show all record========
    zkb course: 1425258910718 90
    zkb course:art 1425258910727 87
    zkb course:math 1425258910722 97
    zkb grade: 1425258910705 5
    
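If copying hbase-site.xml onto the classpath (step 3) is inconvenient, the same connection settings can be supplied in code. The following is a minimal sketch, assuming the hostnames from the hosts file in step 4 and the default ZooKeeper client port 2181; the class name is illustrative, and both values should be adjusted to your cluster.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class ProgrammaticConfig {
    	/** Builds a client Configuration without relying on hbase-site.xml. */
    	public static Configuration create() {
    		Configuration conf = HBaseConfiguration.create();
    		// Hostnames are assumed to match the hosts-file entries from step 4.
    		conf.set("hbase.zookeeper.quorum", "hadoop107,hadoop108,hadoop104");
    		// 2181 is the ZooKeeper default client port; change it if your cluster differs.
    		conf.set("hbase.zookeeper.property.clientPort", "2181");
    		return conf;
    	}
    }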
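The listing in step 6 never closes its HTable and HBaseAdmin handles, which leaks connections in anything longer-lived than a demo. Below is a minimal sketch of addRecord with explicit cleanup, using the same 0.98 client API; the class name is illustrative, and the Configuration is passed in as a parameter instead of read from the static field.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SafeAddRecord {
    	/** Inserts one cell and always releases the table handle. */
    	public static void addRecord(Configuration conf, String tableName, String rowKey,
    			String family, String qualifier, String value) throws IOException {
    		HTable table = new HTable(conf, tableName);
    		try {
    			Put put = new Put(Bytes.toBytes(rowKey));
    			put.add(Bytes.toBytes(family), Bytes.toBytes(qualifier), Bytes.toBytes(value));
    			table.put(put);
    		} finally {
    			// Close the table so its underlying connection resources are released.
    			table.close();
    		}
    	}
    }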
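Before running the full program in step 7, it can help to confirm that the client can reach the cluster at all. The sketch below (class name illustrative) uses HBaseAdmin.checkHBaseAvailable, which is present in the 0.98 client and throws an exception when the master or the ZooKeeper quorum is unreachable; a failure here usually points at the hosts entries from step 4 or the hbase-site.xml from step 3.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class ConnectivityCheck {
    	public static void main(String[] args) {
    		Configuration conf = HBaseConfiguration.create();
    		try {
    			// Throws if the HBase master or ZooKeeper quorum cannot be reached.
    			HBaseAdmin.checkHBaseAvailable(conf);
    			System.out.println("HBase cluster is reachable.");
    		} catch (Exception e) {
    			System.out.println("Cannot reach the HBase cluster: " + e.getMessage());
    		}
    	}
    }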