
Problems importing tables from SAP that have / characters in the table name

Rising Star

Is there a way to make sqoop accept / characters in table and field names?

I have tried the following:

leaving the name as-is: --table /BI0/TCUSTOMER

wrapping in quotes: --table "/BI0/TCUSTOMER"

escaping each /: --table "\/BI0\/TCUSTOMER"

Each of the three options above produces this error:

com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [257]: sql syntax error: incorrect syntax near "\": line 1 col 17 (at pos 17)

16 REPLIES

Super Guru

So it seems that the SAP JDBC error is gone, but somehow sqoop is not resolving the double quotes. I'll do some searching locally and get back to you.

Rising Star

Any updates on this?

Super Guru

Sorry, I didn't get cycles to work on this. I will try it in a couple of days. Also, let's wait and see if other experts comment on this.

Rising Star

That still doesn't work.

Here's my script:

sqoop --options-file options.txt \
--table "SCHEMA\".\"/BI0/TCUSTOMER" \
--hive-import \
--hive-overwrite \
--split-by CUSTOMER \
--hive-database DATABASE

Produces this error:

com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [257]: sql syntax error: incorrect syntax near "/": line 1 col 26 (at pos 26)

Contributor

Can you try it with backquotes? Something like this: \`tablename\`
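
For example, an untested sketch using the table name from your first post (the connection settings are assumed to come from your options file):

sqoop --options-file options.txt \
--table \`/BI0/TCUSTOMER\` \
--hive-import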

Rising Star

I tried that and got this error:

ERROR manager.SqlManager: Error executing statement: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [257]: sql syntax error: incorrect syntax near "AS": line 1 col 19 (at pos 19)

Rising Star

I can confirm what @Josh Persinger is saying. The only way to get tables with forward slashes ('/') (and colons too, for that matter) in the table name from SAP into Hadoop HDFS/Hive is by using the --query statement.
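
A minimal sketch of that workaround (placeholders in brackets; the table name is the one from the original question, and --target-dir is needed for a free-form query import that doesn't go straight into Hive):

sqoop import --connect "jdbc:sap://[SERVER]:30015" \
--username [USERNAME] \
--password-file file:///[PATH]/.pwd_file \
--driver com.sap.db.jdbc.Driver \
--query "select * from \"/BI0/TCUSTOMER\" where \$CONDITIONS" \
--target-dir /tmp/tcustomer \
--num-mappers 1

The full command I ended up with is at the end of this post.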

Some other things I found out when importing from SAP HANA:

  1. a table name can be something like 'MSG\TABLENAME' or even worse: '[SCHEMA]::database.[TABLENAME]'. Just make sure you put the complete table name between escaped double quotes:

    eg: \"/SOMETHING/TABLENAME\" or \"[SCHEMA]::database.[TABLENAME]\"

  2. we needed to add the where clause token '\$CONDITIONS' even though we did a select * without any filters.
  3. when limiting the result with a where clause, the values have to be between single quotes:

    eg. WHERE DDLANGUAGE='E'

  4. SAP columns can contain empty values called SPACE (not the same as NULL; shown as a '?' in the Web IDE). If you want to exclude them, use the where clause <>'' (just two single quotes following each other):

    WHERE DDLANGUAGE<>''

  5. When making the command more readable, I had to keep one extra parameter on the same line as the --query parameter. When I moved --hive-import to the next line, the command would fail (I think due to the closing quote of the query).

The result should look something like this:

sqoop import --connect "jdbc:sap://[SERVER]:30015/?currentschema=[SCHEMA]" \
--username [USERNAME] \
--password-file file:///[PATH]/.pwd_file \
--driver com.sap.db.jdbc.Driver \
--query "select * from \"[/TABLENAME/WITH/FORWARDSLASHES]\" WHERE DDLANGUAGE='E' and [COLUMNNAME]<>'' and \$CONDITIONS" --hive-import \
--hive-database [HIVEDATABASE] \
--hive-table [TABLENAME] \
--hive-delims-replacement ' ' \
--fields-terminated-by '|' \
--lines-terminated-by '\n' \
--hive-overwrite --num-mappers 1
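
If you need more than one mapper, Sqoop also needs a --split-by column so it can replace \$CONDITIONS with per-mapper range predicates. A hedged variant (the CUSTOMER split column comes from an earlier post in this thread, four mappers is an arbitrary choice, and --target-dir may be required for free-form query imports depending on your Sqoop version):

sqoop import --connect "jdbc:sap://[SERVER]:30015/?currentschema=[SCHEMA]" \
--username [USERNAME] \
--password-file file:///[PATH]/.pwd_file \
--driver com.sap.db.jdbc.Driver \
--query "select * from \"[/TABLENAME/WITH/FORWARDSLASHES]\" WHERE \$CONDITIONS" --hive-import \
--hive-database [HIVEDATABASE] \
--hive-table [TABLENAME] \
--target-dir /tmp/[TABLENAME] \
--split-by CUSTOMER \
--num-mappers 4 \
--hive-overwrite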