Created 05-19-2016 08:14 AM
Hi,
for table in "$@"
do
  sqoop import \
    --connect "jdbc:jtds:sqlserver://xxxxxxxxx:xxxxxx;databaseName=xxxxxx;user=xxxxxx;password=xxxxxxxx;instance=xxxxxx" \
    --driver net.sourceforge.jtds.jdbc.Driver \
    --username xxxx \
    --table "$table" \
    --hive-import \
    --hive-table xxxxxxxxx."$table" \
    -m 1
done
If I want to import views instead of tables, how should I change my script, please?
Created 05-19-2016 10:09 AM
You can use the same script to import data from your views: Sqoop will fetch the data from each view and store it in Hive/HDFS, just as it does for a table.
If you don't want to import the data but just want to create a view in Hive, take the definition (DDL) of your view in SQL Server and create the same view in Hive (a few adaptations might be needed; check the documentation: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/Al... ).
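To illustrate the second option, here is a hedged sketch of porting a view definition. The database, view, table, and column names below are hypothetical, not from this thread; the general idea is that the Hive DDL is usually a near copy of the SQL Server DDL once T-SQL-specific syntax (the dbo. schema prefix, bracketed identifiers, TOP, etc.) is removed:

```shell
# Hypothetical SQL Server view definition:
#   CREATE VIEW dbo.active_customers AS
#   SELECT id, name FROM dbo.customers WHERE active = 1;
# Roughly the same view recreated in Hive (dbo. prefix dropped,
# target database chosen explicitly):
hive -e "CREATE VIEW mydb.active_customers AS
         SELECT id, name FROM mydb.customers WHERE active = 1;"
```

This assumes the underlying table (mydb.customers here) has already been imported into Hive, since a Hive view can only reference objects that exist in Hive.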
I would also recommend running the Sqoop commands in parallel. Otherwise, if you have many tables and use "-m 1", it will take a long time. You can check the script I wrote for that:
https://community.hortonworks.com/articles/23602/sqoop-fetching-lot-of-tables-in-parallel.html
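As a minimal sketch of that parallel pattern, the table list can be fed to xargs so several imports run at once. The table names below are placeholders, and the echo stands in for the actual sqoop import command from the loop above (which needs a live cluster to run):

```shell
#!/bin/sh
# Sketch: run one import per table name, up to four at a time.
# Replace the echo with the real "sqoop import ... --table $0" command.
printf '%s\n' customers orders invoices \
  | xargs -n 1 -P 4 sh -c 'echo "importing table: $0"'
# -n 1: pass one table name per invocation; -P 4: run up to four in parallel.
```

Because each table gets its own process, slow tables no longer serialize the whole job the way a single "-m 1" loop does.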
Created 05-23-2016 09:15 AM
Thank you for your suggestions.
Created 08-13-2019 12:44 PM
@alain TSAFACK Can you share the script showing how you imported a view to HDFS using Sqoop?