Contents
1. Using Sqoop import (importing from MySQL into Hadoop)
2. Using Sqoop export (exporting from Hadoop to MySQL)

1. Using Sqoop import (importing from MySQL into Hadoop)

First, create a test table in MySQL and seed it with a couple of rows:
create table user1(name varchar(40),age int);
insert into user1 values("张三",18);
insert into user1 values("李华",21);
Start the Hadoop cluster and confirm the daemons are running:

start-all.sh
[hadoop@HadoopMaster ~]$ jps
3329 SecondaryNameNode
3572 ResourceManager
3884 Jps
3085 NameNode
[hadoop@HadoopMaster ~]$
Run the import into HDFS:

sqoop import \
--connect jdbc:mysql://192.168.43.1:3306/dbtest \
--username root \
--password 123456 \
--table user1 \
--target-dir /mydata/ \
-m 1
Check the corresponding path on HDFS:
[hadoop@HadoopMaster sqoop-1.4.5]$ hdfs dfs -ls /mydata
2023-06-26 23:07:44,159 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
-rw-r--r-- 2 hadoop supergroup 0 2023-06-26 23:06 /mydata/_SUCCESS
-rw-r--r-- 2 hadoop supergroup 20 2023-06-26 23:06 /mydata/part-m-00000
Success! Now view the file contents:
[hadoop@HadoopMaster sqoop-1.4.5]$ hdfs dfs -cat /mydata/part-m-00000
2023-06-26 23:09:44,593 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
张三,18
李华,21
To import directly into Hive instead, add --hive-import and name the target table with --hive-table:

sqoop import \
--connect jdbc:mysql://192.168.43.1:3306/dbtest \
--username root \
--password 123456 \
--table user1 \
--hive-import \
--hive-table hive_db03.user1 \
-m 1
Alternatively, you can point --target-dir straight at the Hive table's warehouse directory:
sqoop import \
--connect jdbc:mysql://192.168.43.1:3306/dbtest \
--username root \
--password 123456 \
--table user1 \
--target-dir /user/hive/warehouse/hive_db03.db/user1 \
--delete-target-dir \
--fields-terminated-by ',' \
-m 1
Verify in Hive:

hive> use hive_db03;
OK
Time taken: 0.019 seconds
hive> select * from user1;
OK
张三	18
李华 21
Time taken: 1.048 seconds, Fetched: 2 row(s)
The backing file on HDFS shows the same rows:

[hadoop@HadoopMaster ~]$ hdfs dfs -cat /user/hive/warehouse/hive_db03.db/user1/part-m-00000
2023-06-26 23:58:24,695 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
张三,18
李华,21
2. Using Sqoop export (exporting from Hadoop to MySQL)

First, prepare a data file on the local filesystem:

[hadoop@HadoopMaster ~]$ vim stu
[hadoop@HadoopMaster ~]$ cat stu
001,小明,20
002,小红,22
003,小王,23
Upload it to HDFS:
hdfs dfs -put ./stu /mydata
[hadoop@HadoopMaster ~]$ hdfs dfs -cat /mydata/stu
2023-06-27 00:05:07,450 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
001,小明,20
002,小红,22
003,小王,23
Create the target table in MySQL (its columns must match the file's comma-separated fields):

create table stu(id char(20),name varchar(40),age int);
Then run the Sqoop export:

sqoop export \
--connect jdbc:mysql://192.168.43.1:3306/dbtest \
--username root \
--password 123456 \
--table stu \
--export-dir /mydata/stu \
--fields-terminated-by ',' \
-m 1
Verify the rows arrived in MySQL:

mysql> select * from stu;
+------+--------+------+
| id | name | age |
+------+--------+------+
| 001 | 小明 | 20 |
| 002 | 小红 | 22 |
| 003 | 小王 | 23 |
+------+--------+------+
3 rows in set (0.00 sec)
(To export a Hive table to MySQL, the same command works; just change --export-dir to point at the Hive table's warehouse directory.)
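As a sketch of that note, exporting the Hive table used earlier might look like the following; the warehouse path and the assumption that a matching user1 table already exists in MySQL are carried over from the examples above, not verified here:

```shell
# Sketch: export the files backing the Hive table hive_db03.user1 to MySQL.
# Assumes the default Hive warehouse layout and a pre-created MySQL table
# user1 whose columns match the comma-separated fields in the data files.
sqoop export \
--connect jdbc:mysql://192.168.43.1:3306/dbtest \
--username root \
--password 123456 \
--table user1 \
--export-dir /user/hive/warehouse/hive_db03.db/user1 \
--fields-terminated-by ',' \
-m 1
```

Note that Sqoop export reads the raw files under --export-dir, so the --fields-terminated-by value must match the delimiter the Hive table was stored with.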