05-14-2015 09:23 AM
sowen, thanks for your response. When I go to the Spark UI and look at the classpath entries under the Environment tab, everything there has "System Classpath" as its source. I set JAVA_HOME in my ~/.bash_profile, but that's it for customizations that I can think of. Are there any logs that can show me more about what it's doing when starting the shell? The console messages don't tell me where it looks for configuration or for these initial commands that it seems to be running, and I can't identify anything like that in the web UI.
The basic problem seems to be that it can't find the HiveConf class. Why would that be and how do I address it?
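One quick way to confirm the symptom is a classpath probe from inside spark-shell. This is just a diagnostic sketch (the fully qualified class name is the standard Hive one, but verify it against your CDH version's jars):

```scala
// Paste into spark-shell: checks whether the Hive configuration class
// the shell needs is actually visible on the current classpath.
try {
  Class.forName("org.apache.hadoop.hive.conf.HiveConf")
  println("HiveConf is on the classpath")
} catch {
  case _: ClassNotFoundException =>
    println("HiveConf is NOT on the classpath")
}
```

If it prints the NOT line, the shell's classpath is missing the Hive jars, which points at how the shell is launched rather than at your code.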
05-15-2015 02:30 AM
It's strange, because I have 2 CDH 5.4 clusters: in one I don't have the issue, and in the other I have the Hive context issue everywhere except on the node where Hive is installed (the others are gateway nodes and still don't work).
That makes me think the issue is not so much about libraries as about configuration. I have seen a known issue about the Hive configuration in the documentation, but I haven't been able to make the workaround work for me:
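If it is a configuration problem, a simple probe is to ask the classloader where (or whether) hive-site.xml is being picked up. A minimal sketch, run from spark-shell on a gateway node (the assumption here is that HiveContext resolves hive-site.xml via the context classloader, which is the usual mechanism):

```scala
// Paste into spark-shell: reports where hive-site.xml is loaded from,
// or tells you it is missing from the classpath entirely.
val url = Thread.currentThread.getContextClassLoader.getResource("hive-site.xml")
println(
  if (url == null) "hive-site.xml NOT on classpath"
  else s"hive-site.xml loaded from: $url"
)
```

Comparing the output between the working node and a gateway node would show quickly whether the gateways are missing the Hive client configuration.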
05-21-2016 06:36 AM - edited 05-21-2016 06:38 AM
@TS "The spark.eventLog.dir parameter had to be reset to the proper value."
How do I reset that parameter? Where do I find it? Can you give the steps, please?
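Not the original poster, but in a plain (non-Cloudera-Manager) setup that property normally lives in spark-defaults.conf under Spark's conf directory; on a CM-managed cluster it is set through the Spark service configuration instead. A hedged sketch of the plain-file form (the HDFS path below is an illustrative placeholder, not necessarily your cluster's value):

```
# conf/spark-defaults.conf (path and directory value are assumptions;
# adjust to wherever your cluster actually keeps its event logs)
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs:///user/spark/applicationHistory
```

The directory must already exist and be writable by the user running the Spark application; restart the shell/application after changing it.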