
Can I do sqoop incremental loads directly into a hive table ?


I am trying the following command:

sqoop import \
  --connect "jdbc:jtds:sqlserver://jxx/tmwus;useNTLMv2=true;domain=CROWLEY" \
  --table expedite_audit_tbl \
  --username xx --password xx \
  --incremental lastmodified \
  --check-column updated_dt \
  --last-value '2018-05-22 00:00:00' \
  --hcatalog-database tmwus \
  --hcatalog-table expedite_audit_tbl \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orc" \
  --split-by ord_hdrnumber \
  --num-mappers 1

However, I receive the following error:

18/05/23 15:26:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [expedite_audit_tbl] AS t WHERE 1=0
18/05/23 15:26:12 ERROR tool.ImportTool: Imported Failed: There is no column found in the target table expedite_audit_tbl. Please ensure that your table name is correct.

I know the table exists, as I am able to do the following:

sqoop import \
  --connect "jdbc:jtds:sqlserver://xx;useNTLMv2=true;domain=CROWLEY" \
  --query "select * from tmwus.dbo.expedite_audit_tbl WHERE updated_dt<='2018-05-22 00:00:00' AND \$CONDITIONS" \
  --username xx --password xx \
  --hcatalog-database tmwus \
  --hcatalog-table expedite_audit_tbl \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orc" \
  --split-by ord_hdrnumber \
  --num-mappers 12
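
One difference I notice between the two commands: the working --query import fully qualifies the table as tmwus.dbo.expedite_audit_tbl, while the failing --table import passes only expedite_audit_tbl, so Sqoop's metadata probe (SELECT t.* FROM [expedite_audit_tbl] AS t WHERE 1=0) may be resolving against the wrong schema. A sketch of a variant that qualifies the schema (assuming the table lives under dbo; untested):

```shell
# Untested sketch: same incremental import, but with the source table
# schema-qualified so Sqoop's column-lookup query can resolve it.
sqoop import \
  --connect "jdbc:jtds:sqlserver://jxx/tmwus;useNTLMv2=true;domain=CROWLEY" \
  --table dbo.expedite_audit_tbl \
  --username xx --password xx \
  --incremental lastmodified \
  --check-column updated_dt \
  --last-value '2018-05-22 00:00:00' \
  --hcatalog-database tmwus \
  --hcatalog-table expedite_audit_tbl \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orc" \
  --split-by ord_hdrnumber \
  --num-mappers 1
```

I have not verified whether the jTDS-based generic connection manager accepts a schema-qualified --table value; this is only a guess based on the contrast between the two commands above.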