Member since: 06-26-2018
Posts: 8
Kudos Received: 0
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5741 | 07-31-2018 03:36 PM
08-01-2018 07:18 AM
This validation was intentionally added to Spark in SPARK-15279: it doesn't make sense to specify delimiters for ORC or Parquet files, since those formats are self-describing.
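For illustration, here is a minimal sketch of the behavior in a Spark 2.x spark-shell, assuming the question involved a Hive-style CREATE TABLE (the table and column names are hypothetical):

// Delimiters only apply to text formats; ORC and Parquet carry their own
// schema and encoding, so Spark rejects ROW FORMAT DELIMITED for them.
spark.sql("""
  CREATE TABLE demo_orc (id INT, name STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  STORED AS ORC
""")
// => fails with a ParseException along the lines of:
//    "ROW FORMAT DELIMITED is only compatible with 'textfile', not 'orc'"

// Dropping the delimiter clause works as expected.
spark.sql("""
  CREATE TABLE demo_orc (id INT, name STRING)
  STORED AS ORC
""")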
07-31-2018 03:36 PM
The solution was to use the Maven Shade plugin, as in the original setup, together with its class-relocation option:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <!-- Bind the shade goal to the package phase -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <relocations>
      <!-- Rewrite the conflicting okio packages into a private namespace -->
      <relocation>
        <pattern>okio</pattern>
        <shadedPattern>com.shaded.okio</shadedPattern>
      </relocation>
    </relocations>
    <filters>
      <!-- Strip signature files from signed dependencies; stale signatures
           would make the shaded uber-jar fail verification at runtime -->
      <filter>
        <artifact>*:*</artifact>
        <excludes>
          <exclude>META-INF/*.SF</exclude>
          <exclude>META-INF/*.DSA</exclude>
          <exclude>META-INF/*.RSA</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
Make sure to apply the relocation option to the package names of the classes that conflict with those shipped in Spark's /jars directory. This creates a 'private copy' of the dependency for your application, with no potential for interference with the underlying Spark dependencies. Just watch out that your Spark application doesn't grow too large if you shade many dependencies this way.
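As a quick sanity check, after rebuilding with `mvn package` you can list the contents of the shaded jar (e.g. `jar tf target/<your-app>.jar`, path being a placeholder for your artifact); the okio classes should now appear under `com/shaded/okio/` rather than `okio/`, confirming the relocation took effect.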