Member since 08-04-2017 · 22 Posts · 2 Kudos Received · 0 Solutions
10-08-2017 04:57 PM

Hi, I am trying to submit the Spark Pi example in yarn-cluster mode and I am getting a weird error. HDP 2.4.2 and Ambari 2.2.2.

Command:
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g --executor-cores 4 --queue default lib/spark-examples*.jar 100

Error:
INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://clusername/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 1
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$setEnvFromInputString$1.apply(YarnSparkHadoopUtil.scala:440)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$setEnvFromInputString$1.apply(YarnSparkHadoopUtil.scala:438)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.setEnvFromInputString(YarnSparkHadoopUtil.scala:438)
at org.apache.spark.deploy.yarn.Client$$anonfun$setupLaunchEnv$6.apply(Client.scala:643)
at org.apache.spark.deploy.yarn.Client$$anonfun$setupLaunchEnv$6.apply(Client.scala:641)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:641)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:732)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:143)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1079)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1139)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I am also getting the same error when I try to restart the Spark Thrift Server from Ambari. Please help me fix this issue.
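For anyone hitting this: the frame at YarnSparkHadoopUtil.scala:440 is where setEnvFromInputString splits an environment string on "," and each entry on "=", and an entry without a value throws exactly this ArrayIndexOutOfBoundsException: 1. The Option.foreach frame at Client.scala:641 suggests the string being parsed is the SPARK_YARN_USER_ENV variable. A minimal sketch of what to check; paths and values below are illustrative, not from this cluster:

# Inspect the variable in the shell and in spark-env.sh
echo "$SPARK_YARN_USER_ENV"
grep -n SPARK_YARN_USER_ENV /etc/spark/conf/spark-env.sh
# Malformed (an entry with no "=VALUE", or a trailing comma) reproduces the error:
#   export SPARK_YARN_USER_ENV="JAVA_HOME"
# Well-formed values are comma-separated KEY=VALUE pairs:
#   export SPARK_YARN_USER_ENV="JAVA_HOME=/usr/jdk64/jdk1.8.0_60,HADOOP_CONF_DIR=/etc/hadoop/conf"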
Labels: Apache Spark
08-16-2017 03:47 AM

Well, I eventually fixed this issue by checking the hosts table (and one more table relating to hosts) in the Ambari database, where I found inconsistencies. I deleted the stale entries using curl commands and restarted Ambari, and the Hosts tab works now. Before doing this, I checked the browser developer tools and saw the response where the Hosts page was failing, which also helped me figure out this issue. Hope it helps someone.
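A sketch of the kind of curl cleanup described above, using Ambari's REST API to delete a stale host record; the host, cluster, and credential values are placeholders:

curl -k -u admin:PASSWORD -H "X-Requested-By: ambari" \
  -X DELETE "https://AMBARI_HOST:8443/api/v1/clusters/CLUSTER_NAME/hosts/STALE_HOSTNAME"
# then restart Ambari
ambari-server restart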
08-15-2017 04:09 PM

Yes, I did that, and when I click Hive View it asks for the LDAP password. My problem is that my LDAP password changes every 4 hours; once I log in to Hive View it does not ask for the password anymore. I guess Hive View does not refresh when the LDAP password changes. How do I fix this?
08-15-2017 03:51 PM

We are currently using Hive View version 1.0 and Ambari 2.2.2.0. I have integrated with LDAP, and whenever I click on Hive View it asks for credentials; I can successfully log in and run queries. But my LDAP password changes every 4 hours, and Hive View does not refresh or ask for my LDAP credentials again, which results in a 401 error. It works fine as long as the initial password is valid. I tried clearing the browser cache, but that did not work. Can someone please help me fix this issue? How exactly does Hive View authenticate?
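Not a fix, but for digging into where the credential is being held: Ambari's views API can show the Hive View instance and its persisted properties. A hedged sketch; the view name, version, and instance name below are assumptions, not confirmed from this cluster:

curl -k -u admin:PASSWORD "https://AMBARI_HOST:8443/api/v1/views/HIVE"
curl -k -u admin:PASSWORD "https://AMBARI_HOST:8443/api/v1/views/HIVE/versions/1.0.0/instances/INSTANCE_NAME"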
Labels: Apache Ambari, Apache Hive
08-09-2017 12:18 AM

@nshelke How can we rectify these inconsistencies in the Ambari DB?
08-09-2017 12:16 AM

@nshelke I ran select * from hosts and I have some unwanted hosts in the table, which definitely shows that it is inconsistent. Can you please help me with what I should do?
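A sketch of one way to cross-check the table, comparing what the REST API reports against the DB rows; the credentials, hostname, and the ambari schema owner are assumptions:

# Hosts the Ambari API knows about:
curl -k -u admin:PASSWORD "https://AMBARI_HOST:8443/api/v1/hosts" | grep host_name
# Hosts in the Oracle-backed Ambari DB:
sqlplus -s ambari/PASSWORD <<'SQL'
select host_name from hosts order by host_name;
SQL
# Rows present only in the DB are the candidates for cleanup.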
08-09-2017 12:14 AM

@nshelke I ran select * from hosts and found some hostnames which should not be present in the hosts table. How should I proceed in correcting them?
08-08-2017 11:51 PM

It did not help me, as my transport mode is HTTP; my HS2 doesn't run on 10000.
08-08-2017 09:22 PM

There is no Kerberos, and SSL is not enabled in my case. Please help me.
08-08-2017 07:20 PM
1 Kudo

Transport mode: HTTP. HiveServer2 is running on port 10001, and the Hive service check from Ambari passed successfully.

URL: jdbc:hive2://hostname:port/default;transportMode=http;httpPath=cliservice

I am able to log in through the Hive CLI. When I try with beeline, I get an error saying:

Error: Could not open client transport with JDBC Uri: jdbc:hive2://host:port/default;transportMode=http;httpPath=cliservice: Could not create http connection to jdbc:hive2://host:port/default;transportMode=http;httpPath=cliservice. HTTP Response code: 401 (state=08S01,code=0)

Can someone please shed some light on what is happening here and how I should solve it? I checked the HiveServer2 logs but there is nothing related to this.
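For what it's worth: a 401 comes back from the HS2 HTTP endpoint before any Thrift session is opened, which points at HTTP-level authentication. A minimal check, assuming LDAP (or similar) authentication is configured on HS2; the host and credentials are placeholders:

beeline -u "jdbc:hive2://HS2_HOST:10001/default;transportMode=http;httpPath=cliservice" \
  -n LDAP_USER -p 'LDAP_PASSWORD'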
Labels: Apache Ambari, Apache Hive
08-08-2017 02:39 PM

I can see from the other REST API calls that one of the decommissioned nodes is still present and its status is UNKNOWN. Does that mean the hosts table in the Ambari DB is stale? If yes, what needs to be done?
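A sketch of how to list each host's status from the same API to pin down the UNKNOWN one; names and credentials are placeholders:

curl -k -u admin:PASSWORD \
  "https://AMBARI_HOST:8443/api/v1/clusters/CLUSTER_NAME/hosts?fields=Hosts/host_status"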
08-08-2017 02:32 PM

{
  "Clusters" : {
    "cluster_name" : "prodHadoop01",
    "health_report" : {
      "Host/stale_config" : 1,
      "Host/maintenance_state" : 15,
      "Host/host_state/HEALTHY" : 15,
      "Host/host_state/UNHEALTHY" : 0,
      "Host/host_state/HEARTBEAT_LOST" : 1,
      "Host/host_state/INIT" : 0,
      "Host/host_status/HEALTHY" : 9,
      "Host/host_status/UNHEALTHY" : 0,
      "Host/host_status/UNKNOWN" : 1,
      "Host/host_status/ALERT" : 6
    },
    "total_hosts" : 16,
    "version" : "HDP-2.4"
  }
}
08-08-2017 02:10 PM

@Jay SenSharma @nshelke Any inputs to solve this would greatly help me. Thanks.
08-08-2017 02:09 PM

capture.png: these are the errors I get when I look into the Ambari metrics. capture.png: the error when I enable the developer tools in Chrome. My Ambari DB is an Oracle database. I will have a look into that and check the hosts table meanwhile.
08-08-2017 02:05 PM

Ambari is on HTTPS, so I guess I have to use $AMBARI_HOSTNAME = hostname:8443. Please let me know if the $clustername field value should equal the dfs.nameservices value in the HDFS config, so that I can post the REST API response.
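For reference, a sketch of the call in question. Note that the cluster name segment is Ambari's own cluster name, which is not necessarily the dfs.nameservices value; it can be listed from the API first. Credentials are placeholders:

# List the cluster name(s) Ambari knows about:
curl -k -u admin:PASSWORD "https://AMBARI_HOST:8443/api/v1/clusters"
# Then fetch the health report:
curl -k -u admin:PASSWORD \
  "https://AMBARI_HOST:8443/api/v1/clusters/CLUSTER_NAME?fields=Clusters/health_report"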
08-08-2017 12:38 AM

Hi, we are using Ambari 2.2.2.0 and we are unable to see the hosts when I click on the Hosts tab in the Ambari UI. It loads forever or says no hosts available. I have restarted the Ambari server and the Ambari agents (capture.png), but no luck. Please guide me in fixing this issue.
Labels: Apache Ambari
08-04-2017 06:30 PM

I do not know what went wrong, but I removed the Hive client and installed it again, and it worked this time. Thanks.
08-04-2017 05:16 PM

Yes @Deepesh, I launched both beeline sessions from the same machine. If there is a corrupted install of the Hive client, please suggest what steps need to be taken. Also, I am using Ambari 2.2.2.0 and HDP 2.4.2; when I click on the Hive service, I do not see the JDBC URL shown at the bottom, and the Quick Links option is also missing. Thanks.
08-04-2017 03:44 PM

I have given the URL correctly, but it doesn't work and still shows the same error.
08-04-2017 01:45 AM

What I usually do is: beeline > !connect jdbc:hive2://hostname:port/default, with quotes ("") and without. I tried both, but it did not work.
08-04-2017 12:25 AM
1 Kudo

ERROR: The URL I used to connect with beeline is jdbc:hive2://hostname:10000/default. I am using Oracle as my Hive metastore database; the ojdbc6 jar is downloaded and kept at /usr/hdp/2.../hive/lib. My transport mode is binary. After adding these jars, when I am logged in as root I can connect with beeline using my LDAP uid and password. But when I am logged in with my user id on the Linux machine and try to connect with beeline, I get the same error again.
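One hedged check, since it works as root but not as a regular user: file permissions on the jars that were added. The truncated version in the path above is kept as a glob here; adjust it to the real path:

ls -l /usr/hdp/2.*/hive/lib/ojdbc6.jar
# The jar should be readable by all users, e.g.:
chmod 644 /usr/hdp/2.*/hive/lib/ojdbc6.jar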
Labels: Apache Hive