I want to use Spark's JDBC connection to write a DataFrame to Oracle. My DataFrame has a string column that is very long, longer than the 255 characters Spark allocates by default when it creates the table schema.
How can I still write to the Oracle table? Does it work if I manually create the table first with CLOB columns? If so, how can I get Spark to only `TRUNCATE` the table instead of overwriting (dropping and recreating) it, which would throw away my manual schema?
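For context, here is a minimal sketch of the write settings I'm hoping will work, based on the JDBC data source options `createTableColumnTypes` (to override the default column type) and `truncate` (to keep the existing table on overwrite). The URL, table name, and column name are placeholders:

```python
# Hypothetical JDBC settings; url, dbtable, and the column name are placeholders.
jdbc_options = {
    "url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
    "dbtable": "MY_SCHEMA.MY_TABLE",
    "driver": "oracle.jdbc.OracleDriver",
    # Ask Spark to create this column as CLOB instead of the 255-char default:
    "createTableColumnTypes": "long_text CLOB",
    # With mode("overwrite"), truncate=true should issue TRUNCATE TABLE
    # instead of DROP + CREATE, preserving a manually created schema:
    "truncate": "true",
}

# The actual write (commented out, requires a live Oracle instance and the
# Oracle JDBC driver on the classpath):
# df.write.format("jdbc").options(**jdbc_options).mode("overwrite").save()
```

Is this combination of options the right approach, or does `truncate` get ignored in some cases?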