spark jdbc oracle long string fields and truncate

New Contributor

I want to use Spark's JDBC connection to write a data frame to Oracle. My data frame has a string column which is very long, at least longer than the default 255 characters that Spark allocates when it creates the schema. How can I still write to the Oracle table? Does it work if I manually create the schema first with CLOB columns? If yes, how can I get Spark to only `TRUNCATE` the table instead of overwriting it?


Re: spark jdbc oracle long string fields and truncate

Super Collaborator

Hi @Georg Heiler,

This is possible by specifying the data types of the JDBC columns, so that when the table is created it is created with the appropriate types.

This has been addressed in Spark 2.2, where you can specify target column data types for JDBC writes (the `createTableColumnTypes` option).
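
For example, here is a minimal sketch assuming Spark 2.2+, a DataFrame `df` with a long string column named `comments`, and placeholder Oracle connection details:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("oracle-jdbc-write").getOrCreate()
// Assumed: a source DataFrame with a long string column called "comments".
val df = spark.table("source_table")

df.write
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")   // placeholder connection URL
  .option("dbtable", "TARGET_TABLE")                       // placeholder table name
  .option("user", "scott")                                 // placeholder credentials
  .option("password", "tiger")
  // Override the default 255-character string mapping when Spark creates the table.
  // The types must be valid Spark SQL DDL types, e.g. VARCHAR(4000).
  .option("createTableColumnTypes", "comments VARCHAR(4000)")
  .mode(SaveMode.Overwrite)
  .save()
```

Note that `createTableColumnTypes` only takes effect when Spark creates the table itself, and it does not accept Oracle-specific types such as CLOB, so for strings beyond Oracle's VARCHAR2 limit you would still need to create the table manually.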

In earlier versions this is still not possible, as stated in the JIRA (https://issues.apache.org/jira/browse/SPARK-10849).
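
For the second part of your question (pre-creating the table with CLOB columns and only truncating it), Spark 2.1+ has a `truncate` option on the JDBC writer: combined with `SaveMode.Overwrite`, it issues `TRUNCATE TABLE` instead of dropping and recreating the table, so a manually created schema is preserved. A sketch with the same placeholder connection details and the `df` from the previous example:

```scala
// Assumes TARGET_TABLE was created manually in Oracle (e.g. with a CLOB column).
// With truncate=true, Overwrite mode truncates the existing table instead of
// dropping it, so the hand-made schema survives the write.
df.write
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")   // placeholder connection URL
  .option("dbtable", "TARGET_TABLE")
  .option("user", "scott")
  .option("password", "tiger")
  .option("truncate", "true")
  .mode(SaveMode.Overwrite)
  .save()
```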