An export from Hive to MySQL using Sqoop (hands-on example with explanation)

---------------------------------------
In the MySQL command line:
mysql> create table sales(custid int, prodno int, quantity int, datestring varchar(20), salesid int);

Check that the table was created:
mysql> describe sales;
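
For reference, the Hive side of this example is a matching table in the customersales database whose data sits under /user/hive/warehouse/customersales.db/sales. The exact Hive DDL is not part of the original steps, so the following is only a sketch of what it could look like:

hive> create database if not exists customersales;
hive> create table customersales.sales(custid int, prodno int, quantity int, datestring string, salesid int)
      row format delimited fields terminated by ',' lines terminated by '\n'
      stored as textfile;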
---------------------------
Go to the $SQOOP_HOME/bin directory:

$SQOOP_HOME/bin> ./sqoop export --connect jdbc:mysql://192.x.x.x/test \
  --table sales --username biadmin --password biadmin \
  --export-dir /user/hive/warehouse/customersales.db/sales/Sales.csv \
  --input-fields-terminated-by ',' --input-lines-terminated-by '\n' -m 1



---------------------------------------
1) test is the database in MySQL; --table sales is the sales table in MySQL.
2) --export-dir is the location of the data file in HDFS.
3) --username biadmin --password biadmin are the username and password used to connect to the MySQL database.
4) --input-fields-terminated-by ',' --input-lines-terminated-by '\n' : in the data file Sales.csv, fields are terminated by , and each line ends with \n (a sample row is shown below).
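
For example, a row in Sales.csv matching the table above might look like this (the values are purely hypothetical):

1001,2001,3,2013-01-15,5001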
------------
Once the export has finished, go to MySQL and check the rows:
mysql> select * from sales;
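
To confirm nothing was dropped, you can also compare the row count in MySQL with the number of lines in Sales.csv (a quick check, assuming the file has no header line):

mysql> select count(*) from sales;
$HADOOP_HOME/bin> hadoop fs -cat /user/hive/warehouse/customersales.db/sales/Sales.csv | wc -l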
------
Even when the MySQL table "sales" has fewer columns than Sales.csv in HDFS, the statement above still works: it loads only the columns you defined, matching fields from left to right and leaving the rest out.
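
If you prefer to make the mapping explicit rather than rely on left-to-right matching, Sqoop export also accepts a --columns option listing the target table columns that the input fields should populate, in input order. A sketch, assuming (hypothetically) that the file listed salesid first instead of last:

$SQOOP_HOME/bin> ./sqoop export --connect jdbc:mysql://192.x.x.x/test \
  --table sales --username biadmin --password biadmin \
  --export-dir /user/hive/warehouse/customersales.db/sales/Sales.csv \
  --columns "salesid,custid,prodno,quantity,datestring" \
  --input-fields-terminated-by ',' --input-lines-terminated-by '\n' -m 1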



