Member since: 08-11-2014
Posts: 481
Kudos Received: 92
Solutions: 72
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3454 | 01-26-2018 04:02 AM |
| | 7090 | 12-22-2017 09:18 AM |
| | 3538 | 12-05-2017 06:13 AM |
| | 3858 | 10-16-2017 07:55 AM |
| | 11233 | 10-04-2017 08:08 PM |
07-26-2015
12:42 AM
Yes, that shows the problem directly. Your function holds a reference to the instance of the outer class cc, and that is not serializable. You'll probably have to find where your function uses the outer class and remove that reference. Otherwise, the outer class cc has to be made serializable.
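To make the mechanism concrete, here is a minimal, self-contained Java sketch of the same problem (the actual Spark code in question is presumably Scala, and the class and field names `Outer` and `offset` are hypothetical stand-ins, not from the original post): a function that reads an instance field implicitly captures `this`, so the whole non-serializable enclosing object gets dragged into the closure. Copying the field into a local variable first is the usual fix.

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class ClosureDemo {
    // Stand-in for the outer class "cc" in the question: it is NOT Serializable.
    static class Outer {
        int offset = 10;

        // BAD: the lambda reads the field through 'this', so the entire
        // non-serializable Outer instance is captured in the closure.
        Function<Integer, Integer> badFn() {
            return (Function<Integer, Integer> & Serializable) x -> x + offset;
        }

        // GOOD: copy the field into a local variable; only the int is captured.
        Function<Integer, Integer> goodFn() {
            final int localOffset = offset;
            return (Function<Integer, Integer> & Serializable) x -> x + localOffset;
        }
    }

    // Returns true if the object survives Java serialization.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Outer outer = new Outer();
        System.out.println(serializes(outer.badFn()));   // fails: drags in Outer
        System.out.println(serializes(outer.goodFn()));  // works: captures only an int
    }
}
```

Spark applies the same idea when it serializes your function to ship it to executors: anything reachable from the closure must be serializable, or you get a NotSerializableException naming the offending class.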
07-25-2015
08:32 AM
You would really have to show more of the error: what, exactly, is not serializable? Typically that will point you to the problem. Something in the closure of your function is being dragged along when it's serialized, and it is not serializable.
07-21-2015
04:42 AM
Ah, I think I'm mistaken. Try this; note the capitalization: import sqlContext.implicits._
07-21-2015
04:32 AM
I meant in the import; you're missing the implicits, I think. import org.apache.spark.sql.SQLContext.implicits._
07-21-2015
04:14 AM
I think you're missing the package name. org.apache.spark.sql.SQLContext...
07-10-2015
11:57 AM
It's already complete and in 1.4. This was the initial JIRA: https://issues.apache.org/jira/browse/SPARK-4924
07-06-2015
12:51 AM
Yes, that could also be a cause. Is it possible to run the process inside the firewall? Certainly the MapReduce jobs are intended to be managed by the Computation Layer from within the cluster.
07-03-2015
05:46 AM
2 Kudos
If it were me, I'd download the source for 1.4.0 and build against the exact CDH artifacts, to be safest. See http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html Then just try running the local copy of bin/spark-shell etc. from that distribution. You need to use YARN masters. I won't 100% guarantee that it works, but I see no reason it wouldn't. The build flags are probably like -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.3 -Pyarn
07-03-2015
05:34 AM
1 Kudo
PS: I should also say that you should be able to use 1.4 with CDH 5.4 and have it generally work. This requires a little understanding of how to get a build onto a machine and run from that build, but otherwise it's a YARN app and, modulo some dependency issues at the edges maybe, it should just work.
07-03-2015
05:31 AM
1 Kudo
Presumably CDH 5.5, since in general a new CDH minor release is needed to pick up a new minor release of a component. There aren't timeframes for this, but you can see CDH is typically on a 4-6 month minor release cycle, and 5.4 came out 2 months ago.