Member since: 12-08-2015
Posts: 7
Kudos Received: 5
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3183 | 12-28-2015 11:01 AM
 | 1980 | 12-08-2015 07:16 AM
01-18-2016 01:14 PM
2 Kudos
I think this should make it work:

export SPARK_CLASSPATH=/usr/lib/hbase/lib/hbase-protocol.jar:/etc/hbase/conf:$(hbase classpath)

Alternatively, you can add the HBase dependency with --driver-class-path hbase-dependency at spark-submit.
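As a concrete sketch, a spark-submit invocation along those lines could look like the following; the application class, jar name, and master setting are hypothetical placeholders, and the hbase-protocol.jar path assumes a default HDP layout:

export SPARK_CLASSPATH=/usr/lib/hbase/lib/hbase-protocol.jar:/etc/hbase/conf:$(hbase classpath)

# com.example.MyHBaseApp and my-hbase-app.jar are placeholders for your own job
spark-submit \
  --class com.example.MyHBaseApp \
  --master yarn-client \
  --driver-class-path /usr/lib/hbase/lib/hbase-protocol.jar:/etc/hbase/conf \
  my-hbase-app.jar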
12-28-2015 11:01 AM
Hi, I had the same problem: it means that Spark cannot find the R directory. The Hortonworks Spark version comes without the R directory:

drwxr-xr-x 2 root root 4096 Dec  9 13:33 lib
drwxr-xr-x 6 root root 4096 Dec  3 21:30 python
drwxr-xr-x 3 root root 4096 Jul  8 22:58 R

My solution was to download the standalone Spark distribution and copy its R directory to my Hortonworks Spark client.
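A minimal sketch of that fix; the Spark version, download URL, and the /usr/hdp/current/spark-client path are assumptions, so adjust them to your cluster:

# download a standalone Spark build, which includes the R directory
wget https://archive.apache.org/dist/spark/spark-1.5.2/spark-1.5.2-bin-hadoop2.6.tgz
tar xzf spark-1.5.2-bin-hadoop2.6.tgz

# copy the missing R directory into the Hortonworks Spark client
cp -r spark-1.5.2-bin-hadoop2.6/R /usr/hdp/current/spark-client/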
12-08-2015 07:16 AM
1 Kudo
Hi, yes, you are right, hbck is for offline meta repair. I guess you already know the hbck tool: to check whether your HBase cluster has corruptions, run hbck against it:

$ ./bin/hbase hbck
At the end of the command's output it either prints OK or tells you the number of INCONSISTENCIES present. As you said, after the deletion you no longer have the option to do an offline meta repair, but in my opinion I would still keep that file and delete some other stuff instead.
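For reference, here are the two commands discussed above; naming the OfflineMetaRepair utility class is my assumption about which offline meta repair tool is being referenced:

# check for corruptions; the summary at the end prints OK or an INCONSISTENCIES count
$ ./bin/hbase hbck

# offline meta repair; only run this while HBase is stopped
$ ./bin/hbase org.apache.hadoop.hbase.util.hbck.OfflineMetaRepair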