【Sqoop】Installing, Configuring, and Testing Sqoop 1

1.1.1 Download Sqoop 1:
sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
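
One common source is the Apache release archive; the exact mirror URL below is an assumption and may need adjusting if the archive layout changes:

wget https://archive.apache.org/dist/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz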

1.1.2 Extract the archive and inspect the directory:

[[email protected] ~]$ tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz    # extract
[[email protected] ~]$ cd sqoop-1.4.7.bin__hadoop-2.6.0
[[email protected] sqoop-1.4.7.bin__hadoop-2.6.0]$ ls -ll    # list the contents
total 2020
drwxr-xr-x. 2 hadoop hadoop    4096 Dec 19  2017 bin
-rw-rw-r--. 1 hadoop hadoop   55089 Dec 19  2017 build.xml
-rw-rw-r--. 1 hadoop hadoop   47426 Dec 19  2017 CHANGELOG.txt
-rw-rw-r--. 1 hadoop hadoop    9880 Dec 19  2017 COMPILING.txt
drwxr-xr-x. 2 hadoop hadoop     150 Dec 19  2017 conf
drwxr-xr-x. 5 hadoop hadoop     169 Dec 19  2017 docs
drwxr-xr-x. 2 hadoop hadoop      96 Dec 19  2017 ivy
-rw-rw-r--. 1 hadoop hadoop   11163 Dec 19  2017 ivy.xml
drwxr-xr-x. 2 hadoop hadoop    4096 Dec 19  2017 lib
-rw-rw-r--. 1 hadoop hadoop   15419 Dec 19  2017 LICENSE.txt
-rw-rw-r--. 1 hadoop hadoop     505 Dec 19  2017 NOTICE.txt
-rw-rw-r--. 1 hadoop hadoop   18772 Dec 19  2017 pom-old.xml
-rw-rw-r--. 1 hadoop hadoop    1096 Dec 19  2017 README.txt
-rw-rw-r--. 1 hadoop hadoop 1108073 Dec 19  2017 sqoop-1.4.7.jar
-rw-rw-r--. 1 hadoop hadoop    6554 Dec 19  2017 sqoop-patch-review.py
-rw-rw-r--. 1 hadoop hadoop  765184 Dec 19  2017 sqoop-test-1.4.7.jar
drwxr-xr-x. 7 hadoop hadoop      73 Dec 19  2017 src
drwxr-xr-x. 4 hadoop hadoop     114 Dec 19  2017 testdata

1.2 Configure the Sqoop-MySQL connector:
Download mysql-connector-java-8.0.16.jar and copy it into the lib folder under the Sqoop installation directory, as sketched below.
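
A minimal sketch of this step, assuming the driver jar was downloaded into the current directory (adjust the source path to wherever you saved it):

cp mysql-connector-java-8.0.16.jar /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/lib/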

1.3 Configure the Sqoop environment:

[[email protected] sqoop-1.4.7.bin__hadoop-2.6.0]$ cd conf
[[email protected] conf]$ ls -ll
total 28
-rw-rw-r--. 1 hadoop hadoop 3895 Dec 19  2017 oraoop-site-template.xml
-rw-rw-r--. 1 hadoop hadoop 1404 Dec 19  2017 sqoop-env-template.cmd
-rwxr-xr-x. 1 hadoop hadoop 1345 Dec 19  2017 sqoop-env-template.sh
-rw-rw-r--. 1 hadoop hadoop 6044 Dec 19  2017 sqoop-site-template.xml
-rw-rw-r--. 1 hadoop hadoop 6044 Dec 19  2017 sqoop-site.xml

1.3.1 Copy the sqoop-env.sh template and add the installation directories for Hadoop, HBase, Hive, and ZooKeeper (note: leave out any component you have not installed)

[[email protected] conf]$ cp sqoop-env-template.sh sqoop-env.sh
[[email protected] conf]$ gedit sqoop-env.sh
Entries to modify:
#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME=/home/hadoop/hadoop-3.2.0

#Set path to where hadoop-*-core.jar is available
export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.2.0

#set the path to where bin/hbase is available
export HBASE_HOME=/home/hadoop/hbase-2.2.1

#Set the path to where bin/hive is available
export HIVE_HOME=/home/hadoop/apache-hive-3.1.2-bin

#Set the path for where zookeper config dir is
export ZOOCFGDIR=/home/hadoop/apache-zookeeper-3.5.5
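
Before moving on, it is worth checking that every path entered above actually exists; the paths here are the ones used in this article, so substitute your own layout:

ls /home/hadoop/hadoop-3.2.0/bin/hadoop
ls /home/hadoop/hbase-2.2.1/bin/hbase
ls /home/hadoop/apache-hive-3.1.2-bin/bin/hive
ls /home/hadoop/apache-zookeeper-3.5.5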

1.3.2 Configure the Linux environment variables

[[email protected] conf]$ gedit ~/.bash_profile
Add the following:
#sqoop
export SQOOP_HOME=/home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH=$PATH:$SQOOP_HOME/bin

Then reload the profile:
[[email protected] conf]$ source ~/.bash_profile
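
A quick check (not part of the original steps) that the new variable and PATH entry took effect:

echo $SQOOP_HOME
which sqoop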

1.4 Verify that Sqoop is installed correctly

[[email protected] sqoop-1.4.7.bin__hadoop-2.6.0]$ bin/sqoop help    # run this command; output like the following indicates success
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
Error: Could not find or load main class org.apache.hadoop.hbase.util.GetJavaProperty
2019-09-29 23:38:28,571 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
[[email protected] sqoop-1.4.7.bin__hadoop-2.6.0]$
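
The HCatalog, Accumulo, and ZooKeeper warnings are harmless as long as those components are not used; the launcher only checks whether the corresponding directories exist. A common way to silence them, shown here as a sketch rather than an official recommendation, is to point the variables at any existing directory (or to comment out the matching checks in $SQOOP_HOME/bin/configure-sqoop):

# placeholders just to silence the warnings; use real installation paths if you actually need HCatalog or Accumulo
export HCAT_HOME=$SQOOP_HOME
export ACCUMULO_HOME=$SQOOP_HOME
export ZOOKEEPER_HOME=$SQOOP_HOME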

1.5 Test the Sqoop-MySQL connection
sqoop list-tables --username User --password 'User_123456' --connect jdbc:mysql://localhost:3306/hahive    # method 1: password on the command line
sqoop list-tables --username User -P --connect jdbc:mysql://localhost:3306/hahive    # method 2: prompt for the password
Enter password:

If these commands run and return the list of tables, Sqoop can connect to MySQL. A small import test is sketched below as a further check.
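
A small import into HDFS exercises both the JDBC connection and MapReduce. The table name and target directory below are hypothetical placeholders, not from the original article:

sqoop import --connect jdbc:mysql://localhost:3306/hahive \
  --username User -P \
  --table example_table \
  --target-dir /user/hadoop/example_table \
  -m 1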

Original article: https://www.cnblogs.com/CQ-LQJ/p/11617097.html
