So I found that if I force a query using --query it will work, but if I let Sqoop generate the query it does not.
Example --query "select * from \"BI0/TCUSTOMER\"" will work, but referring to the table in the same way by using the --table \"BI0/TCUSTOMER\" argument does not work. So the problem must be when sqoop generates the SQL query on its own, it is not syntactically correct.
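For reference, the two invocations differed roughly as follows (host, port, user, and target directory here are placeholders, not the real values):

```shell
# Works: Sqoop passes the hand-written query through verbatim.
# $CONDITIONS must appear in the WHERE clause when using --query.
sqoop import \
  --connect jdbc:sap://hana-host:30015 \
  --username SOMEUSER -P \
  --query "select * from \"BI0/TCUSTOMER\" where \$CONDITIONS" \
  --target-dir /tmp/tcustomer -m 1

# Fails: Sqoop wraps the table name in a SELECT it generates itself,
# and that generated SQL is not valid for a name containing '/'.
sqoop import \
  --connect jdbc:sap://hana-host:30015 \
  --username SOMEUSER -P \
  --table "\"BI0/TCUSTOMER\"" \
  --target-dir /tmp/tcustomer -m 1
```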
If there were a way to see the query it generates, that would be very helpful for troubleshooting, but as far as I know there is not.
Yes, I tried the above. That partly fixed the problem: the syntax is now accepted, but I believe Sqoop is now interpreting the quotes literally as part of the table name, so when it goes to SAP to look up the table, it can't find it, because the quotes aren't actually part of the SAP table name:
16/06/08 15:10:31 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: /tmp/sqoop-hdfs/compile/d26f755016cd4c734711de4a2550ca46/SCHEMA."/BI0/TCUSTOMER".jar (No such file or directory)
I can confirm what @Josh Persinger is saying. The only way to get tables with forward slashes ('/') (and colons too, for that matter) in the table name from SAP into Hadoop HDFS/Hive is by using the --query argument.
Some other things I found out when importing from SAP HANA:
- A table name can be something like 'MSG\TABLENAME' or, even worse, '[SCHEMA]::database.[TABLENAME]'. Just make sure you put the complete table name between escaped double quotes: \"/SOMETHING/TABLENAME\" or \"[SCHEMA]::database.[TABLENAME]\".
- We needed to add the WHERE clause '\$CONDITIONS' even though we did a SELECT * without any filters.
- When limiting the result with a WHERE clause, the values have to be between single quotes, e.g. WHERE DDLANGUAGE='E'.
- SAP columns can contain empty values called SPACE (not the same as NULL; shown as a '?' in the Web IDE). If you want to exclude them, use <>'' in the WHERE clause (just two single quotes following each other).
- When making the command more readable with line continuations, I had to keep one extra parameter on the same line after the --query parameter. When I moved --hive-import to the next line, the command failed (I think due to the closing quotes of the query).
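Putting those points together, a working import looked roughly like the sketch below. The host, schema, table, column SOMECOL, and Hive table name are placeholders for illustration, not our actual values; note that --hive-import stays on the same line as the --query argument:

```shell
sqoop import \
  --connect jdbc:sap://hana-host:30015 \
  --username SOMEUSER -P \
  --query "select * from SCHEMA.\"/SOMETHING/TABLENAME\" where DDLANGUAGE='E' and SOMECOL<>'' and \$CONDITIONS" --hive-import \
  --hive-table default.sometable \
  --target-dir /tmp/sometable \
  -m 1
```

The WHERE clause shows all three conventions at once: single-quoted values (DDLANGUAGE='E'), excluding SPACE values (SOMECOL<>''), and the mandatory \$CONDITIONS token.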