In Sqoop Commands 1, we went through some of the basic commands used in Sqoop. In this blog post, we will cover more commands and how they are used effectively in Sqoop.
Import all Tables :- This command imports all of the tables from the MySQL database into an HDFS directory.
Syntax :- sqoop import-all-tables --connect jdbc:mysql://localhost/database --username root -P --warehouse-dir dirname
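For illustration, assuming a hypothetical MySQL database named retail_db and an HDFS warehouse directory /user/cloudera/sqoop_import (both are only placeholder names), the call might look like this :
Example :- sqoop import-all-tables --connect jdbc:mysql://localhost/retail_db --username root -P --warehouse-dir /user/cloudera/sqoop_import
Each table in the database lands in its own sub-directory under the given warehouse directory.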
Exclude tables while importing :- This command excludes the tables that are not needed during the import.
Syntax :- sqoop import-all-tables --connect jdbc:mysql://localhost/database --exclude-tables tablename --username student -P --warehouse-dir dirname
This command will import all the tables except the ones mentioned in the --exclude-tables option.
Note :- If the table does not have a primary key, we cannot achieve parallelism, and in that case we need to set the number of mappers to one with -m 1.
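For example, assuming the same hypothetical retail_db database and that the tables categories and departments (placeholder names) are not needed, the command might be :
Example :- sqoop import-all-tables --connect jdbc:mysql://localhost/retail_db --exclude-tables categories,departments --username root -P --warehouse-dir /user/cloudera/sqoop_import
Multiple tables can be excluded by listing them comma-separated after --exclude-tables.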
Filter while importing data :- In Sqoop, we can filter the data while importing a table into the HDFS directory.
Syntax :- sqoop import --connect jdbc:mysql://localhost/database --username root -P --table tablename --target-dir dirname -m 1 --where "col='xx'"
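As a sketch, assuming a hypothetical table customers with a column city, importing only the rows for one city could look like this :
Example :- sqoop import --connect jdbc:mysql://localhost/retail_db --username root -P --table customers --target-dir /user/cloudera/customers_ny -m 1 --where "city = 'New York'"
The condition given to --where is added to the generated SQL query, so only the matching rows are transferred.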
Sqoop Hive Import :- We can import data directly into Hive through Sqoop.
Syntax :- sqoop import --connect jdbc:mysql://localhost/database --table tablename --hive-import --username root -P -m 1
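For instance, assuming the hypothetical customers table again, the following would create (if it does not already exist) and load a Hive table with the same name in the default Hive database :
Example :- sqoop import --connect jdbc:mysql://localhost/retail_db --table customers --hive-import --username root -P -m 1
If a different Hive table name is required, the --hive-table option can be added.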