<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: org.apache.oozie.action.ActionExecutorException: JA009: Cannot initialize Cluster. in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/org-apache-oozie-action-ActionExecutorException-JA009-Cannot/m-p/120442#M83210</link>
    <description>&lt;P&gt;You would have to make sure that mapreduce.framework.name is set correctly (yarn, I suppose) and that the mapred configuration files are there, but first please verify that your nameNode parameter is set correctly. HDFS is very exact about it and requires the hdfs:// prefix.&lt;/P&gt;&lt;P&gt;So hdfs://namenode:8020 instead of namenode:8020&lt;/P&gt;</description>
    <pubDate>Thu, 28 Apr 2016 23:57:03 GMT</pubDate>
    <dc:creator>bleonhardi</dc:creator>
    <dc:date>2016-04-28T23:57:03Z</dc:date>
    <item>
      <title>org.apache.oozie.action.ActionExecutorException: JA009: Cannot initialize Cluster.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-oozie-action-ActionExecutorException-JA009-Cannot/m-p/120441#M83209</link>
      <description>&lt;P&gt;I am using an Ambari installation and trying to run a coordinator Oozie job that imports data into Hive using Sqoop.&lt;/P&gt;&lt;P&gt;I have them installed, up, and running on the server.&lt;/P&gt;&lt;P&gt;My workflow.xml looks like this:&lt;/P&gt;&lt;PRE&gt;  &amp;lt;workflow-app name="once-a-day" xmlns="uri:oozie:workflow:0.1"&amp;gt;
&amp;lt;start to="sqoopAction"/&amp;gt;
        &amp;lt;action name="sqoopAction"&amp;gt;
                &amp;lt;sqoop xmlns="uri:oozie:sqoop-action:0.2"&amp;gt;
                        &amp;lt;job-tracker&amp;gt;${jobTracker}&amp;lt;/job-tracker&amp;gt;
                        &amp;lt;name-node&amp;gt;${nameNode}&amp;lt;/name-node&amp;gt;
                    &amp;lt;command&amp;gt;import-all-tables --connect jdbc:mysql://HOST_NAME/erp --username hiveusername --password hivepassword
                        --&amp;lt;/command&amp;gt;
                &amp;lt;/sqoop&amp;gt;
                &amp;lt;ok to="end"/&amp;gt;
                &amp;lt;error to="killJob"/&amp;gt;
        &amp;lt;/action&amp;gt;
&amp;lt;kill name="killJob"&amp;gt;
            &amp;lt;message&amp;gt;"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"&amp;lt;/message&amp;gt;
        &amp;lt;/kill&amp;gt;
&amp;lt;end name="end" /&amp;gt;
&amp;lt;/workflow-app&amp;gt;&lt;/PRE&gt;&lt;P&gt;I get this error:&lt;/P&gt;&lt;P&gt;How do I fix this? I have tried everything suggested on the Internet, but nothing fixes it.&lt;/P&gt;&lt;PRE&gt;[0001059-160427195624911-oozie-oozi-W] ACTION[0001059-160427195624911-oozie-oozi-W@sqoopAction] Error starting action [sqoopAction]. ErrorType [TRANSIENT], ErrorCode [JA009], Message [JA009: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.]
org.apache.oozie.action.ActionExecutorException: JA009: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.oozie.action.ActionExecutor.convertExceptionHelper(ActionExecutor.java:456)
at org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:436)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1139)
at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1293)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:250)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:64)
at org.apache.oozie.command.XCommand.call(XCommand.java:286)
at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:175)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.&amp;lt;init&amp;gt;(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.&amp;lt;init&amp;gt;(Cluster.java:75)
at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475)
at org.apache.hadoop.mapred.JobClient.&amp;lt;init&amp;gt;(JobClient.java:454)
at org.apache.oozie.service.HadoopAccessorService$3.run(HadoopAccessorService.java:462)
at org.apache.oozie.service.HadoopAccessorService$3.run(HadoopAccessorService.java:460)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.oozie.service.HadoopAccessorService.createJobClient(HadoopAccessorService.java:460)
at org.apache.oozie.action.hadoop.JavaActionExecutor.createJobClient(JavaActionExecutor.java:1336)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1087)
... 8 more&lt;/PRE&gt;</description>
      <pubDate>Thu, 28 Apr 2016 18:42:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-oozie-action-ActionExecutorException-JA009-Cannot/m-p/120441#M83209</guid>
      <dc:creator>sim6</dc:creator>
      <dc:date>2016-04-28T18:42:59Z</dc:date>
    </item>
    <item>
      <title>Re: org.apache.oozie.action.ActionExecutorException: JA009: Cannot initialize Cluster.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-oozie-action-ActionExecutorException-JA009-Cannot/m-p/120442#M83210</link>
      <description>&lt;P&gt;You would have to make sure that mapreduce.framework.name is set correctly (yarn, I suppose) and that the mapred configuration files are there, but first please verify that your nameNode parameter is set correctly. HDFS is very exact about it and requires the hdfs:// prefix.&lt;/P&gt;&lt;P&gt;So hdfs://namenode:8020 instead of namenode:8020&lt;/P&gt;</description>
      <pubDate>Thu, 28 Apr 2016 23:57:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-oozie-action-ActionExecutorException-JA009-Cannot/m-p/120442#M83210</guid>
      <dc:creator>bleonhardi</dc:creator>
      <dc:date>2016-04-28T23:57:03Z</dc:date>
    </item>
  </channel>
</rss>
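<!--
Editorial note (kept outside the feed as an XML comment so the RSS stays well-formed):
the reply in this thread names two concrete checks, a fully qualified nameNode URL and
mapreduce.framework.name set to yarn. The snippet below is a minimal sketch of a matching
Oozie job.properties; the host names, ports, and application path are assumptions for
illustration, not values taken from the thread.

nameNode=hdfs://namenode:8020
jobTracker=resourcemanager:8050
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/oozie/once-a-day

mapreduce.framework.name itself lives in mapred-site.xml on the Oozie host, typically as:

<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
-->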