ETL is short for Extraction-Transformation-Loading: extracting, transforming, and loading data.
In this walkthrough, data is extracted from Oracle, analyzed and transformed in Hive, and finally loaded back into Oracle.
This case is purely demo-level, for practice only.
I. Requirements
Analyze the data in the emp and dept tables and store the final result in a result table.
Both emp and dept ship with Oracle (the SCOTT sample schema); their structures are as follows:
emp table
EMPNO    | NUMBER(4)
ENAME    | VARCHAR2(10)
JOB      | VARCHAR2(9)
MGR      | NUMBER(4)
HIREDATE | DATE
SAL      | NUMBER(7,2)
COMM     | NUMBER(7,2)
DEPTNO   | NUMBER(2)
dept table
DEPTNO | NUMBER(2)
DNAME  | VARCHAR2(14)
LOC    | VARCHAR2(13)
result table
EMPNO | employee number
ENAME | employee name
COMM  | commission
DNAME | department name
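To double-check the source structures, you can describe the tables from SQL*Plus as the SCOTT user (a quick sanity check, assuming the standard sample schema is installed on the host used throughout this post):

sqlplus SCOTT/TIGER@//192.168.1.107:1521/ORCL <<'EOF'
DESC emp
DESC dept
EOF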
II. Data Preparation
Create the Hive tables
create table emp_etl(
    empno int,
    ename string,
    job string,
    mgr int,
    hiredate string,
    sal double,
    comm double,
    deptno int
)
row format delimited fields terminated by '\t'
lines terminated by '\n'
stored as textfile;

create table dept_etl(
    deptno int,
    dname string,
    loc string
)
row format delimited fields terminated by '\t'
lines terminated by '\n'
stored as textfile;

create table tmp_result_etl(
    empno int,
    ename string,
    comm double,
    dname string
)
row format delimited fields terminated by '\t'
lines terminated by '\n'
stored as textfile;

create table result_etl(
    empno int,
    ename string,
    comm double,
    dname string
)
row format delimited fields terminated by '\t'
lines terminated by '\n'
stored as textfile;
Import the data into Hive
sqoop import --connect jdbc:oracle:thin:@192.168.1.107:1521/ORCL \
  --username SCOTT --password TIGER \
  --table EMP \
  --hive-overwrite --hive-import --hive-table emp_etl \
  --null-string '' --null-non-string '0' \
  --fields-terminated-by '\t' --lines-terminated-by '\n' -m 3;

sqoop import --connect jdbc:oracle:thin:@192.168.1.107:1521/ORCL \
  --username SCOTT --password TIGER \
  --table DEPT \
  --hive-overwrite --hive-import --hive-table dept_etl \
  --null-string '' --null-non-string '0' \
  --fields-terminated-by '\t' --lines-terminated-by '\n' -m 3;
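As a quick sanity check after the import (assuming the hive CLI is on the PATH), the row counts should match the stock SCOTT data, which has 14 rows in emp and 4 in dept:

hive -e "SELECT COUNT(*) FROM emp_etl"   # expect 14 with the stock emp data
hive -e "SELECT COUNT(*) FROM dept_etl"  # expect 4 with the stock dept data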
III. Implementation
Analyze and process the data in Hive, export the result to HDFS, then use Sqoop to load the HDFS result into the Oracle database.
1) Extract: pull the Oracle data into Hive. See the two steps above.
2) Transform: insert the joined query result into the Hive result table
INSERT OVERWRITE TABLE result_etl
SELECT a.empno, a.ename, a.comm, b.dname
FROM emp_etl a
JOIN dept_etl b ON (a.deptno = b.deptno);
3) Transform: write the data out to the HDFS file system
INSERT OVERWRITE DIRECTORY 'RESULT_ETL_HIVE' SELECT * FROM result_etl;
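A note on paths and delimiters: a relative directory in INSERT OVERWRITE DIRECTORY is resolved under the current user's HDFS home, i.e. /user/hadoop/RESULT_ETL_HIVE here, and Hive writes the files with the default \001 field delimiter, which is why the export below passes --fields-terminated-by '\001'. To eyeball the output (file names like 000000_0 are typical but not guaranteed):

hadoop fs -ls /user/hadoop/RESULT_ETL_HIVE
# \001 is unprintable; cat -v renders it visibly as ^A
hadoop fs -cat /user/hadoop/RESULT_ETL_HIVE/000000_0 | cat -v | head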
4) Load: move the data from HDFS into Oracle (the result table must be created by hand)
Create the Oracle table that will hold the ETL result. Note that Oracle has no bare DOUBLE type, so COMM is declared as NUMBER(7,2) to match the source column:
CREATE TABLE RESULT_ETL2(
    EMPNO NUMBER(4),
    ENAME VARCHAR2(10),
    COMM  NUMBER(7,2),
    DNAME VARCHAR2(14)
);
sqoop export --connect jdbc:oracle:thin:@192.168.1.107:1521/ORCL \
  --username SCOTT --password TIGER \
  --table RESULT_ETL2 \
  --export-dir /user/hadoop/RESULT_ETL_HIVE \
  --fields-terminated-by '\001' \
  -m 2;
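To confirm the load without leaving the shell, sqoop eval can run an ad-hoc query against Oracle (a convenience check, not part of the pipeline itself):

sqoop eval --connect jdbc:oracle:thin:@192.168.1.107:1521/ORCL \
  --username SCOTT --password TIGER \
  --query "SELECT COUNT(*) FROM RESULT_ETL2"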
Alternatively, put all of the steps (apart from creating the Oracle table) into a shell script and run them in one go:
#!/bin/sh
. /etc/profile
set -x

# Populate the Hive result table, then write it out to the HDFS directory
# that the sqoop export below reads from
hql="INSERT OVERWRITE TABLE result_etl
SELECT a.empno, a.ename, a.comm, b.dname
FROM emp_etl a
JOIN dept_etl b ON (a.deptno = b.deptno);
INSERT OVERWRITE DIRECTORY 'RESULT_ETL_HIVE' SELECT * FROM result_etl;"

hive -e "$hql"

# Export the HDFS result into the Oracle table created above
sqoop export --connect jdbc:oracle:thin:@192.168.1.107:1521/ORCL \
  --username SCOTT --password TIGER \
  --table RESULT_ETL2 \
  --export-dir /user/hadoop/RESULT_ETL_HIVE \
  --fields-terminated-by '\001' \
  -m 2;
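Save the script under any name, for example etl.sh (a name chosen here for illustration), then make it executable and run it; set -x traces each command as it runs:

chmod +x etl.sh
./etl.sh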