10-05-2018 09:17 AM
@slachterman I am facing some issues with my PySpark code, and in a few places I see there may be compatibility issues, so I wanted to check whether that is the likely cause. Even otherwise, it seems better to check for these compatibility problems upfront. So I wanted to know a few things. I am on Spark 2.3.1 and Python 3.6.5; do we know if there is a compatibility issue between these versions? Should I upgrade to Python 3.7.0 (which I am planning) or downgrade to <3.6? Which option is more sensible in your opinion?

Info on versions: Spark is spark-2.3.1-bin-hadoop2.7, all installed according to the instructions in the Python Spark course.

venkatesh@venkatesh-VirtualBox:~$ java -version
openjdk version "10.0.1" 2018-04-17
OpenJDK Runtime Environment (build 10.0.1+10-Ubuntu-3ubuntu1)
OpenJDK 64-Bit Server VM (build 10.0.1+10-Ubuntu-3ubuntu1, mixed mode)

I work on macOS and Linux.