Member since: 09-08-2016
Posts: 21
Kudos Received: 2
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3332 | 09-27-2016 07:18 AM |
02-09-2017 05:52 AM
I tried both methods, and only the first one, with %spark.dep or %dep, is working; not the latter one you mentioned, which would of course be much more interesting for me. For the latter method I declared a repository with ID "hortonwork" and URL http://repo.hortonworks.com/content/groups/public/, then edited the Spark interpreter and added this line to the dependencies: com.hortonworks:shc-core:1.0.1-1.6-s_2.10 (I checked manually that the jar is indeed there.)
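For reference, the working %dep variant looks roughly like this (a minimal sketch of my paragraph, reusing the repository ID and URL quoted above; note that a %dep paragraph only takes effect if it runs before the Spark interpreter starts):

```
%dep
z.reset()
z.addRepo("hortonwork").url("http://repo.hortonworks.com/content/groups/public/")
z.load("com.hortonworks:shc-core:1.0.1-1.6-s_2.10")
```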
02-08-2017 01:34 PM
It works! Thanks. Do you have any idea about the cause, though?
02-08-2017 12:27 PM
Hello, I am working on a cluster with an installation of HDP-2.5. I am trying to load a dependency in Zeppelin the traditional way and to execute some code at the interpreter level (I also tried to register the repo in Zeppelin and to add the dependency in the Spark interpreter, but that does not work any better than this method). The execution of the %dep paragraph that includes the dependency, and the execution of the code, fail systematically, and I am left with a non-working Zeppelin. Note: I reinstalled it but that did not change anything. Note 2: the Zeppelin logs are attached.
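The kind of paragraph I mean is of this shape (a sketch, not the exact notebook content; the shc-core artifact from my follow-up in this thread serves as the example):

```
%dep
z.reset()
z.load("com.hortonworks:shc-core:1.0.1-1.6-s_2.10")
```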
Labels:
- Apache Zeppelin
02-07-2017 12:01 PM
Thanks for your advice; it seems this is indeed the problem. As a test I ran the example of the shc connector here with --master yarn-cluster and with --master yarn-client, and this confirmed it: the ZooKeeper quorum is found in the first case and not found in the second. So Spark does not have the file on its classpath when the driver runs in client mode.
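A quick way to see the difference is to print the quorum the driver actually resolves; a minimal Scala check run in spark-shell (HBaseConfiguration silently falls back to its built-in default when hbase-site.xml is not on the classpath):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

// Prints the ZooKeeper quorum the driver sees. Without hbase-site.xml
// on the classpath this shows HBase's built-in default (localhost)
// instead of the cluster's real quorum.
val hbaseConf = HBaseConfiguration.create()
println(hbaseConf.get("hbase.zookeeper.quorum"))
```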
02-07-2017 11:56 AM
No, this is absolutely not the problem. It does not guarantee in any manner that the Spark job will take the files into account; see my answer to @anatva for a proper explanation. Furthermore, my post indicates that the --files option is used, with the correct files passed.
02-05-2017 01:46 PM
I am going to do it. Until now I had production problems that kept me away from this one. It is not closed, and I will try your advice. Thanks.
02-02-2017 06:01 AM
I don't see any script element in these files. What do you mean?
02-01-2017 11:22 AM
My ZooKeeper is running green in Ambari, and I am able to run hbase shell from the node where I launch spark-shell, no problem. Here they are: hbase-site.xml, hive-site.xml.
02-01-2017 11:16 AM
I thought that's what I did 😉 What is the purpose of adding the files to spark-shell with the --files option, if not to add them to the Spark classpath? You said: "can you add the hbase-site.xml, hive-site.xml to SPARK_CLASSPATH and retry ?" How do you do this? Note: please see the next post for hive-site.xml and hbase-site.xml. Many thanks for your answer.
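For reference, one way this is usually done is to pass the config directories on the driver classpath in addition to --files (a sketch, assuming the typical HDP locations /etc/hbase/conf and /etc/hive/conf; adjust to your layout):

```
# --files ships the XMLs to the YARN containers' working directories;
# --driver-class-path makes the local driver see them in yarn-client mode.
spark-shell --master yarn-client \
  --files /etc/hbase/conf/hbase-site.xml,/etc/hive/conf/hive-site.xml \
  --driver-class-path /etc/hbase/conf:/etc/hive/conf
```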