Member since: 05-05-2016
Posts: 13
Kudos Received: 4
Solutions: 0
09-12-2017
05:36 AM
1 Kudo
Hi @msumbul, thanks for your reply. The issue was mainly that multiple versions of Spark were installed and not properly configured: spark-shell was being launched from Spark 1.5 while spark-submit was being launched from Spark 1.3. I installed Spark 2.2 and made 2.2 the major version, and the issue is resolved. Thanks again.
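A rough way to catch such a launcher mismatch up front (not from the thread; the helper names are mine, and this assumes the usual version banner that `spark-shell --version` / `spark-submit --version` print) is to parse each launcher's banner and compare:

```python
# Hypothetical sketch: parse the Spark version banner so a script can
# assert that spark-shell and spark-submit resolve to the same build.
import re
import subprocess

def parse_spark_version(banner: str) -> str:
    # Banners contain a line like ".../_/\_\   version 1.5.2"
    m = re.search(r"version\s+(\d+\.\d+\.\d+)", banner)
    return m.group(1) if m else "unknown"

def launcher_version(tool: str) -> str:
    # Spark prints its version banner on stderr.
    out = subprocess.run([tool, "--version"], capture_output=True, text=True)
    return parse_spark_version(out.stderr + out.stdout)
```

Comparing `launcher_version("spark-shell")` with `launcher_version("spark-submit")` before packaging a jar would have surfaced the 1.5-vs-1.3 split early.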
09-09-2017
11:27 AM
When I try to create a HiveContext from spark-shell it works, but the same statements packaged in a jar (Scala source) throw:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset

when I run it with spark-submit. I am running it with:

spark-submit --class com.mycom.MYApp --master local[4] --driver-class-path /usr/hdp/2.3.6.0-3796/spark/tmplib/spark-hive_2.10-1.5.2.jar /tmp/temp/test.jar

Code:

import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]) {
  val sparkConf = new SparkConf().setAppName("sparkSQL")
  val sparkContext = new SparkContext(sparkConf)
  val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
  val sqlContext = new org.apache.spark.sql.SQLContext(sparkContext)
  val hdf = hiveContext.sql("select * from contracts")
  hdf.printSchema()
  val mdf = sqlContext.read.format("com.mongodb.spark.sql.DataSource").option("uri", "mongodb://username:password@host:27008/DBname.collectionname").load()
  mdf.printSchema()
}

I already tried running spark-submit with --driver-class-path as well as with --jars and --files /path/to/hive-site.xml, but no luck 😞 I am running Spark 1.5.2 and Scala 2.10. Help please.
Labels:
- Apache Hive
- Apache Spark
06-15-2016
12:35 PM
Oops! my bad... Did it 🙂
06-15-2016
12:25 PM
@Rahul Pathak I already tried this, but after submitting the bootstrap request it never returns a failure or success response; I always get a running status. Still, I carried on and saw that the Ambari agent was installed successfully (despite never getting a failure/success response). So my job is done. 🙂 Thanks man
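The polling behavior described above could be sketched like this, assuming Ambari's `GET /api/v1/bootstrap/<requestId>` status endpoint with a top-level `status` field (verify the field names against your Ambari version; the base URL, credentials, and helper names are placeholders):

```python
# Hypothetical polling sketch for an Ambari bootstrap request. As noted
# above, the status may stay "RUNNING" even after the agent is installed,
# so a timeout with a fallback check on the host itself is prudent.
import json
import time
import urllib.request

def bootstrap_finished(status: dict) -> bool:
    # Treat SUCCESS/ERROR as terminal; anything else means keep polling.
    return status.get("status") in ("SUCCESS", "ERROR")

def poll_bootstrap(base_url, request_id, auth_header, timeout_s=300):
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        req = urllib.request.Request(
            f"{base_url}/api/v1/bootstrap/{request_id}",
            headers={"Authorization": auth_header})
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)
        if bootstrap_finished(status):
            return status
        time.sleep(10)
    return None  # still RUNNING; check the agent on the host directly
```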
06-15-2016
05:13 AM
@Sagar Shimpi Many thanks for your reply, but the page you referred to starts with "Ensure the host is registered properly", and I need a way to register a host with an SSH key and password so that the Ambari server can install the Ambari agent on the new host. Without the SSH key and password, the Ambari server fails to do so. The Ambari UI has an option to enter an SSH key and password, but how can I do that using a curl command?
06-14-2016
02:04 PM
1 Kudo
I am automating the recovery of a failed host in a cluster (using Python), and I need to add a host to the cluster before installing components on it. Ambari has an API to add a host, e.g.:

requests.post("http://<ambari-host-IP>:<port>/api/v1/clusters/<CLUSTERNAME>/hosts/<HOSTNAME>", auth=self.auth, headers=self.headers, proxies=self.proxyDict)

but it doesn't take an SSH key and password. I don't want to explicitly install the Ambari agent using YUM (and change the agent.ini config file), because the server and agent versions may not match. Is there a way to pass the SSH key and password to the above POST command? Or is there any other automated way to add and register a host to an existing cluster from a Python program?
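One candidate for the question above is Ambari's bootstrap endpoint (`POST /api/v1/bootstrap`), which does carry the SSH key; the body field names below (`verbose`, `sshKey`, `hosts`, `user`) mirror what the Ambari UI sends and should be treated as an assumption to verify against your Ambari version:

```python
# Hypothetical sketch (stdlib only): build the bootstrap request body that
# carries the SSH key and user, then POST it to /api/v1/bootstrap.
import json
import urllib.request

def build_bootstrap_payload(hosts, ssh_key, user="root"):
    # hosts: list of FQDNs Ambari should install the agent on.
    return {"verbose": True, "sshKey": ssh_key, "hosts": hosts, "user": user}

def submit_bootstrap(base_url, payload, auth_header):
    req = urllib.request.Request(
        f"{base_url}/api/v1/bootstrap",
        data=json.dumps(payload).encode(),
        headers={"Authorization": auth_header,
                 "X-Requested-By": "ambari",   # Ambari rejects writes without it
                 "Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response carries the requestId to poll
```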
Labels:
- Apache Ambari
- Apache Hadoop
05-18-2016
09:19 AM
@Sagar Shimpi Thanks man, it helped 🙂 For both NAMENODE hosts in my cluster I can initialize an ini file with the component and port number, and later, after a node crashes, I can use this ini file for the rest of the recovery process... Thanks again 🙂
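The ini-file idea described above could be sketched like this (the file layout and function names are illustrative, not from the thread): record each component's port while the hosts are healthy, then read the file back during recovery when the crashed host can no longer be queried.

```python
# Hypothetical sketch of the ini-file approach: one section per host,
# one key per component, value = port it was listening on.
import configparser

def save_ports(path, ports):
    # ports: {hostname: {component: port}}
    cfg = configparser.ConfigParser()
    for host, comps in ports.items():
        cfg[host] = {comp: str(p) for comp, p in comps.items()}
    with open(path, "w") as f:
        cfg.write(f)

def lookup_port(path, host, component):
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return int(cfg[host][component])
```

Note that `configparser` lowercases option names by default, so lookups with "NAMENODE" still match what `save_ports` wrote.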
05-18-2016
08:45 AM
Even after a host has crashed, can I find out which port the NAMENODE on that host was listening on?
Labels:
- Apache Ambari
- Apache Hadoop