Member since: 10-24-2015
Posts: 171
Kudos Received: 379
Solutions: 23

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2641 | 06-26-2018 11:35 PM
 | 4348 | 06-12-2018 09:19 PM
 | 2874 | 02-01-2018 08:55 PM
 | 1443 | 01-02-2018 09:02 PM
 | 6754 | 09-06-2017 06:29 PM
06-02-2017
07:39 PM
6 Kudos
@Rajesh Reddy, per the stack trace, the Spark job cannot find the required Hive jars. You can try a couple of solutions: 1) Copy hive-site.xml to the $SPARK_HOME/conf directory (or /etc/spark/conf). 2) Set spark.driver.extraClassPath to point at hive-exec-*.jar.
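The second option can be sketched as a small shell helper. `add_driver_classpath` is a hypothetical name, and the HDP-style paths in the usage comment are assumptions for illustration:

```shell
# Hypothetical helper: append a spark.driver.extraClassPath entry
# to spark-defaults.conf in the given Spark conf directory.
add_driver_classpath() {
  local conf_dir="$1" jar="$2"
  echo "spark.driver.extraClassPath $jar" >> "$conf_dir/spark-defaults.conf"
}

# Typical usage on an HDP node (paths illustrative -- match your Hive version):
#   cp /etc/hive/conf/hive-site.xml /etc/spark/conf/
#   add_driver_classpath /etc/spark/conf /usr/hdp/current/hive-client/lib/hive-exec.jar
```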
06-02-2017
07:16 PM
@Nilay Jain, in this case you will need to add a Python interpreter. Refer to the link below for adding a Python interpreter in Zeppelin: https://community.hortonworks.com/questions/77411/python-interpreter-not-configured-in-zeppelin-on-h.html
06-01-2017
10:31 PM
2 Kudos
@Nilay Jain, you can use the document below to set up the Python interpreter in Zeppelin: https://zeppelin.apache.org/docs/0.6.2/interpreter/python.html
05-30-2017
06:59 PM
1 Kudo
@zhoussen, if the application with the "livy-session-60-zahglq2y" tag is alive and running fine, you need to increase Livy's YARN application lookup timeout beyond 60 seconds. It appears Livy concluded that the YARN application had not started within 60 seconds. Set livy.server.yarn.app-lookup-timeout to, say, 300 seconds.
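For reference, a minimal livy.conf fragment; the 300-second value is just an example, and you should confirm the duration syntax for your Livy version:

```properties
# livy.conf -- give YARN more time to report the application as started
livy.server.yarn.app-lookup-timeout = 300s
```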
05-30-2017
06:29 PM
2 Kudos
@zhoussen, per the Livy logs, the Spark application did not start correctly. To find the root cause, check the Spark application logs. Steps to follow: 1) Check the status of the YARN cluster (list running applications). 2) Run the Livy paragraph as user2. 3) Check whether a new application is launched in YARN. If a new application is launched, check its status and application log for further debugging.
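Steps 1 and 3 map to standard YARN CLI commands; the application id below is a placeholder, not a real id from this cluster:

```shell
# 1) List running YARN applications
yarn application -list -appStates RUNNING

# 3) Once you have the application id, pull its aggregated logs
yarn logs -applicationId application_1490000000000_0001
```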
05-25-2017
06:58 PM
3 Kudos
@Sushant, there are two main ways to enable Dynamic Resource Allocation (DRA). 1) Spark level: enable DRA in Spark itself; every Spark application will then run with DRA enabled. Setup instructions: https:///content/supportkb/49510/how-to-enable-dynamic-resource-allocation-in-spark.html 2) Zeppelin level: enable DRA in the Zeppelin interpreter instead. You can use the Livy interpreter to run spark-shell or pyspark jobs: https://zeppelin.apache.org/docs/0.6.1/interpreter/livy.html
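As a sketch of the Spark-level option, the usual spark-defaults.conf settings look like this; the executor counts are illustrative, and the external shuffle service must also be enabled on the NodeManagers:

```properties
# spark-defaults.conf -- Dynamic Resource Allocation (example values)
spark.dynamicAllocation.enabled true
spark.shuffle.service.enabled true
spark.dynamicAllocation.minExecutors 1
spark.dynamicAllocation.maxExecutors 10
```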
05-12-2017
10:28 PM
1 Kudo
@Pradeep kumar, JAVA_HOME should be set in spark-env.sh. Can you please check what value JAVA_HOME has in spark-env.sh?
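For reference, spark-env.sh normally carries a line like the following; the JDK path is illustrative and should match the JDK actually installed on your nodes:

```shell
# spark-env.sh -- JAVA_HOME must point at a valid JDK (path illustrative)
export JAVA_HOME=/usr/jdk64/jdk1.8.0_112
```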
05-11-2017
08:27 PM
6 Kudos
@Sudeep Mishra, in a secure environment you need to add all HBase dependency jars to the Spark classpath. Add this configuration to spark-env.sh: export SPARK_CLASSPATH=<list of HBase jars separated by `:`>
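A minimal sketch of building that colon-separated value; `join_jars` is a hypothetical helper, and the HDP lib path in the usage comment is an assumption:

```shell
# Hypothetical helper: join its arguments with ':' to form a classpath
join_jars() {
  local IFS=':'
  echo "$*"
}

# Typical usage in spark-env.sh (glob path illustrative):
#   export SPARK_CLASSPATH=$(join_jars /usr/hdp/current/hbase-client/lib/*.jar)
```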
05-08-2017
06:12 PM
1 Kudo
@Param NC, 1. Can you try setting spark.yarn.stagingDir to hdfs:///user/tmp/? 2. Can you please share which Spark config you are trying to set that requires the ResourceManager address?
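Point 1 corresponds to a one-line spark-defaults.conf entry, using the HDFS path suggested above:

```properties
# spark-defaults.conf
spark.yarn.stagingDir hdfs:///user/tmp/
```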
04-18-2017
12:39 AM
1 Kudo
@Rukmini Iyer, can you please check the Spark history server logs?