Member since: 09-29-2015
Posts: 58
Kudos Received: 34
Solutions: 8
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 972 | 09-29-2016 01:39 PM |
| | 2183 | 06-21-2016 03:10 PM |
| | 6647 | 05-16-2016 07:12 PM |
| | 8739 | 04-08-2016 02:06 PM |
| | 1165 | 04-08-2016 01:56 PM |
01-12-2017
02:40 PM
yarn-client is not supported per the new guidelines; only yarn-cluster mode is supported by the Livy interpreter in Zeppelin.
01-12-2017
02:39 PM
1 Kudo
yarn-client is not supported; only yarn-cluster is supported.
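For reference, this is controlled by the interpreter's master setting in Zeppelin's Livy interpreter configuration (a sketch; the property name is from the Livy interpreter settings):

```properties
# Zeppelin Livy interpreter settings: only yarn-cluster is supported
livy.spark.master = yarn-cluster
```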
01-11-2017
02:33 PM
@elkan li which user are you logged in as when launching? Can you try `su hive` and then launch the Hive CLI? It seems the issue is with the launching user.
01-11-2017
02:30 PM
1 Kudo
@sagar pavan the diagnostic message indicates the user's AM resource limit is exceeded. Please review the Capacity Scheduler's AM resource limit and raise it from the default of 20%; this should allow the AM container to be launched.
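The setting in question is the Capacity Scheduler's AM resource percentage. A sketch of what raising it could look like in capacity-scheduler.xml (the 0.5 value is illustrative, not a recommendation; verify the default for your version):

```xml
<!-- Fraction of cluster resources that can be used to run ApplicationMasters -->
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.5</value> <!-- raised from 0.2 (20%); illustrative value -->
</property>
```

The same limit can also be set per queue via `yarn.scheduler.capacity.<queue-path>.maximum-am-resource-percent`, which is often preferable to raising it cluster-wide.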
12-21-2016
03:24 PM
3 Kudos
The Livy documentation states that jars are added with the interpreter key/value livy.spark.jars. This applies only when the Livy interpreter runs in yarn-cluster mode (livy.spark.master). In local or yarn-client mode, jars can still be added by placing them in the /usr/hdp/<version>/livy/repl-jars directory; Livy then includes them in the spark-submit command it runs when a notebook executes in modes other than yarn-cluster.
12-13-2016
05:27 PM
This does not look like an issue with the jar being included, but rather an issue with the import statement. I briefly searched Google and see similar descriptions suggesting to try org.mongodb. I would focus on the import statement more than on the inclusion of the jar for Livy.
12-13-2016
04:25 PM
@Mickaël GERVAIS check to make sure the Livy interpreter is listed in the interpreter bindings for the notebook. Also, set DEBUG on the Livy server and check the Livy out file produced on the server. Finally, make sure you have restarted Livy and Zeppelin to pick up the changes. I tested and it did work for me.
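Enabling DEBUG on the Livy server is typically done through its log4j configuration; a sketch, assuming a standard log4j.properties (the exact file location varies by install):

```properties
# conf/log4j.properties on the Livy server (path varies by install)
# Raise the root logger to DEBUG so interpreter activity appears in the out file
log4j.rootLogger = DEBUG, console
```

Restart the Livy server after changing this so the new log level takes effect.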
12-13-2016
02:22 PM
It should be possible to add the jars by setting the parameter key livy.spark.jars to an HDFS location in the Livy interpreter settings, but this does not seem to work. I had to place the needed jar in the following directory on the Livy server: /usr/hdp/2.5.3.0-37/livy/repl-jars
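A sketch of the two approaches described above, with the intended interpreter setting alongside the directory workaround (the jar name and HDFS path are hypothetical; the repl-jars path is the one from this post):

```properties
# Intended approach (reportedly not working here): point the Livy
# interpreter at jars on HDFS (hypothetical path)
livy.spark.jars = hdfs:///apps/libs/my-library.jar

# Workaround: copy the jar directly onto the Livy server instead, into
#   /usr/hdp/2.5.3.0-37/livy/repl-jars/
# then restart Livy so it is picked up.
```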
09-29-2016
01:39 PM
@ARUN Yes, you can use node labels and queues together. Here is some documentation regarding that: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.6/bk_yarn_resource_mgt/content/configuring_node_labels.html
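A sketch of how node labels and queues combine in capacity-scheduler.xml, following the linked documentation (the queue name "batch" and label name "gpu" are hypothetical):

```xml
<!-- Allow the hypothetical queue "batch" to access nodes labeled "gpu" -->
<property>
  <name>yarn.scheduler.capacity.root.batch.accessible-node-labels</name>
  <value>gpu</value>
</property>
<!-- Share of the "gpu" label's capacity granted to this queue -->
<property>
  <name>yarn.scheduler.capacity.root.batch.accessible-node-labels.gpu.capacity</name>
  <value>100</value>
</property>
```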
09-27-2016
11:53 AM
2 Kudos
That is not a good solution, as it effectively tells YARN it does not need any healthy disks to function; you are essentially disabling the health check. There is an underlying problem with the disks, and that is why the node is being marked unhealthy.
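Rather than disabling the check, these are the NodeManager health-check properties worth reviewing in yarn-site.xml (values shown are the usual Hadoop defaults; verify against your version):

```xml
<!-- Minimum fraction of local/log dirs that must be healthy
     for the NodeManager to keep serving containers -->
<property>
  <name>yarn.nodemanager.disk-health-checker.min-healthy-disks</name>
  <value>0.25</value>
</property>
<!-- A disk above this utilization percentage is marked unhealthy -->
<property>
  <name>yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage</name>
  <value>90.0</value>
</property>
```

If nodes are flapping unhealthy because disks are nearly full, freeing space (or raising the utilization threshold slightly) addresses the actual cause; setting min-healthy-disks to 0 merely hides it.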