Member since: 12-30-2015
Posts: 26
Kudos Received: 2
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 6688 | 02-02-2017 09:05 PM |
02-02-2017 09:05 PM · 1 Kudo
The lack of vcores caused the problem. I found one container pending in the YARN resource pool UI, so I increased the vcores per node. It works now.
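For reference, the vcores each NodeManager advertises are controlled by the standard YARN property `yarn.nodemanager.resource.cpu-vcores` (in Cloudera Manager it sits under the NodeManager configuration group). A minimal sketch of the raw yarn-site.xml setting, with an assumed example value of 4; size it to the node's actual core count:

```
<!-- yarn-site.xml (sketch): vcores this NodeManager offers to the ResourceManager.
     The value 4 is only an illustration; match it to the node's real core count. -->
<property>
  <name>yarn.nodemanager.resource.cpu-vcores</name>
  <value>4</value>
</property>
```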
01-22-2017 10:53 PM · 1 Kudo
Hi, I have a 10-node cluster for testing. The master node has 3 cores and 25 GB of memory; the others have 2 cores and 13 GB, so resources are tight. When I submit the Spark program from a terminal, it runs fine, but when I run the same Spark program from a HUE workflow, it fails. I have already tried increasing the maximum container memory to 8 GB per https://community.cloudera.com/t5/Batch-Processing-and-Workflow/Oozie-sqoop-action-in-CDH-5-2-Heart-beat-issue/m-p/22475#M833 , but it does not help. Here are the logs; does anyone have an idea?

```
>>> Invoking Spark class now >>>
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1485152326646_0002 finished with failed status
org.apache.spark.SparkException: Application application_1485152326646_0002 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1035)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1082)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:256)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:207)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:49)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:231)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Oozie Launcher failed, finishing Hadoop job gracefully
```
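The launcher output above only shows that the YARN application failed, not why. As a hedged first diagnostic step (the application id is taken from the log above), the full container logs can be pulled with the standard YARN CLI:

```
# Sketch: fetch the aggregated logs for the failed application reported by the
# Oozie launcher; run as the user that submitted the workflow.
yarn logs -applicationId application_1485152326646_0002 | less
```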
Labels:
- Apache Oozie
- Cloudera Hue
- Cloudera Manager
12-28-2016 05:57 PM
Deleting the directory is unsafe. After I restarted the HDFS cluster, the error message went away.
10-07-2016 10:52 PM
I had the same problem. I checked the catalog server log and found that the MySQL driver Hive was using was not consistent with the MySQL server. I fixed that, and everything is fine now.
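A quick, hedged way to spot this kind of mismatch (paths are assumptions and vary by installation) is to compare the MySQL server version with the JDBC connector JAR the services are loading:

```
# Sketch, assuming a typical CDH host layout; adjust paths and credentials.
mysql -u root -p -e 'SELECT VERSION();'          # MySQL server version
ls -l /usr/share/java/mysql-connector-java*.jar  # JDBC connector JAR(s) on the host
# If the connector is too old or too new for the server, replace the JAR and
# restart the affected services (Hive Metastore, Impala Catalog Server, etc.).
```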
09-13-2016 01:34 AM
One more issue: all of my Spark service's gateway roles show in gray on the Instances page, and the gateway instances cannot be enabled. Any ideas? [spark instance page]
09-13-2016 01:08 AM
Thanks. I got lost looking at the parcels list page; there is only a Spark 0.9 parcel for CDH 4.
09-12-2016 08:54 PM
I only see a Spark 0.9 parcel on the parcel manager page. How can I install the Spark 1.6 parcel? Does anybody have an idea?