Created 11-22-2016 07:12 AM
1)sqoop import --connect jdbc:mysql://xxx.xxx.x.xxx/seema --driver com.mysql.jdbc.Driver --username hadoop --password big_data --table st --hive-import --hive-table st -m 1 --incremental lastmodified --check-column ts --last-value '2016-11-21 10:38:12' -m 1 --merge-key id
Query 1) works.
2)sqoop import --connect jdbc:mysql://192.168.1.254/saleema --driver com.mysql.jdbc.Driver --username big_data --password big_data --table st --hive-import --hive-table st -m 1 --incremental lastmodified --check-column ts --last-value ' >= 2016-11-21 10:38:12' -m 1 --merge-key id
3)sqoop import --connect jdbc:mysql://192.168.1.254/saleema --driver com.mysql.jdbc.Driver --username big_data --password big_data --table st --hive-import --hive-table st -m 1 --incremental lastmodified --check-column ts --last-value ' <= 2016-11-21 10:38:12' -m 1 --merge-key id
Queries 2) and 3) are not working.
How can I get these to execute?
Please suggest a solution.
Thanks in advance,
swathi.T
Created 01-05-2017 02:43 PM
Instead of using the "--incremental" approach, you could use "--query" or "--where" and bake in your own incremental logic.
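For example, a minimal sketch of the "--where" approach, re-using the connection details from the question (the >= boundary is just illustrative; the <= case is analogous):
sqoop import --connect jdbc:mysql://192.168.1.254/saleema --driver com.mysql.jdbc.Driver --username big_data --password big_data --table st --where "ts >= '2016-11-21 10:38:12'" --hive-import --hive-table st -m 1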
Created 01-05-2017 06:47 PM
Thanks clukasik,
I solved it and got it running with the --where/--query approach.
Eg: sqoop import --connect $connectors//$origServer/$origDatabase --driver $drivers --username $username --password $password --query "select a.* from $origTable a where CAST(ts as DATE)>='$startdate' and CAST(ts as DATE)<='$enddate' AND \$CONDITIONS" --hive-import --hive-database $hiveDatabase --hive-table $myTable -m 1 --fields-terminated-by '\t' --incremental lastmodified --check-column ts --merge-key id --target-dir $targetTmpHdfsDir/$myTable
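For reference, the shell variables in the command above could be set in a small wrapper script along these lines (all values below are hypothetical placeholders, not the actual environment):
#!/bin/bash
# Hypothetical connection and table settings -- adjust for your environment.
connectors="jdbc:mysql:"
origServer="192.168.1.254"
origDatabase="saleema"
drivers="com.mysql.jdbc.Driver"
username="big_data"
password="big_data"
origTable="st"
hiveDatabase="default"
myTable="st"
targetTmpHdfsDir="/tmp/sqoop"
# Date window for the incremental pull, e.g. passed in as script arguments.
startdate="$1"
enddate="$2"
sqoop import --connect $connectors//$origServer/$origDatabase --driver $drivers --username $username --password $password --query "select a.* from $origTable a where CAST(ts as DATE)>='$startdate' and CAST(ts as DATE)<='$enddate' AND \$CONDITIONS" --hive-import --hive-database $hiveDatabase --hive-table $myTable -m 1 --fields-terminated-by '\t' --incremental lastmodified --check-column ts --merge-key id --target-dir $targetTmpHdfsDir/$myTable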
Thanks and regards,
swathi.T
Created 03-14-2017 04:16 PM
Hi - could you share your entire UNIX script that uses the Sqoop statement?