Member since
10-30-2016
33
Posts
4
Kudos Received
0
Solutions
10-30-2016
05:27 PM
Hi All, I am new to Oozie and trying to run my simple word count Spark job through it. The job submits successfully, but it fails with this launcher exception:
Launcher exception: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SparkMain not found
Here are my workflow.xml and job.properties. I followed this link to run the Spark job: https://community.hortonworks.com/articles/48920/how-to-run-spark-action-in-oozie-of-hdp-230.html We ran a sample Hive job through Oozie and it executed successfully, but for this Spark job I get the error above. Please let me know the root cause of this exception. I have tried different test cases, and we copied the spark-assembly jar to usr/oozie/share/lib, but I still get the exception. Please find the error log attached (error.txt).
Workflow.xml:
<workflow-app xmlns='uri:oozie:workflow:0.5' name='Sparkjob'>
    <start to='spark-node' />
    <action name='spark-node'>
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>local[*]</master>
            <mode>client</mode>
            <name>Spark-FileCopy</name>
            <class>org.examples.WordCounte</class>
            <jar>${nameNode}/anji/oozie/lib/Spark_test.jar</jar>
            <arg>${input}</arg>
            <arg>${output}</arg>
        </spark>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end' />
</workflow-app>
job.properties:
nameNode=hdfs://nn:8020
jobTracker=jT:8032
queueName=default
input =${nameNode}/sample2.txt
output=${nameNode}/output3
oozie.system.lib.path = true
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.action.sharelib.for.spark=${namenode}/user/oozie/share/lib/spark/
dryrun=False
oozie.wf.application.path=hdfs://quickstart.cloudera:8020/anji/oozie/
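For reference, a corrected job.properties might look like the sketch below. The property that enables the Oozie system share library is oozie.use.system.libpath (the posted file has oozie.system.lib.path, which Oozie ignores), and oozie.action.sharelib.for.spark takes a sharelib name such as spark, not an HDFS path. Host names and paths are carried over from the question as-is.

```properties
nameNode=hdfs://nn:8020
jobTracker=jT:8032
queueName=default
input=${nameNode}/sample2.txt
output=${nameNode}/output3
# Correct property name for enabling the system sharelib:
oozie.use.system.libpath=true
# Sharelib selector takes a name, not an HDFS path:
oozie.action.sharelib.for.spark=spark
oozie.wf.application.path=hdfs://quickstart.cloudera:8020/anji/oozie/
```

If the sharelib was installed or updated after Oozie started, running `oozie admin -sharelibupdate` so the server picks up the spark sharelib (which contains SparkMain) is usually also required.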
Labels:
- Apache Hadoop
- Apache Oozie
- Apache Spark
09-19-2016
12:35 PM
Hi, I want to create and load data into a Hive table through Spark SQL using Scala code (I have to build a jar and execute it through spark-submit). Please help me; I would be very thankful.
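A minimal sketch of what such a job could look like, assuming a Spark 1.x cluster with Hive support (HiveContext); the table name, columns, and HDFS path here are made up for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveLoadExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveLoadExample")
    val sc = new SparkContext(conf)
    // HiveContext reads hive-site.xml from the classpath to find the metastore.
    val hiveContext = new HiveContext(sc)

    // Create the table if it does not exist yet (hypothetical schema).
    hiveContext.sql(
      "CREATE TABLE IF NOT EXISTS employees (id INT, name STRING) " +
      "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','")

    // Load a delimited file from HDFS into the table (hypothetical path).
    hiveContext.sql(
      "LOAD DATA INPATH '/data/employees.csv' INTO TABLE employees")

    hiveContext.sql("SELECT COUNT(*) FROM employees").show()
    sc.stop()
  }
}
```

Built into a jar, this would be submitted roughly as `spark-submit --class HiveLoadExample --master yarn-client your.jar`.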
Labels:
- Apache Hive
- Apache Spark
07-15-2016
04:07 PM
1 Kudo
Hi, I just want to know why Hive supports insert, update, and delete with the ORC file format only. I am using Hive 0.12, and I want to understand how ACID properties are supported with ORC alone.
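For context: Hive's ACID design keeps a base file plus delta files per bucket and merges them at read time, which is why it needs a format like ORC that supports this layout; full insert/update/delete arrived in Hive 0.14 (0.12 does not support update/delete at all). A typical transactional table declaration looks like this hedged sketch, with a made-up table name:

```sql
-- ACID tables must be bucketed, stored as ORC, and marked transactional
-- (requires Hive 0.14+ and the DbTxnManager).
SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

CREATE TABLE acid_demo (id INT, name STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

UPDATE acid_demo SET name = 'x' WHERE id = 1;
```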
Labels:
- Apache Hive
07-01-2016
06:29 AM
Please go through this link: http://hortonworks.com/blog/deploying-hadoop-cluster-amazon-ec2-hortonworks/
06-28-2016
05:28 PM
Hi All, I have a 3-node NiFi cluster (1 master and 2 slaves) and a 3-node Hadoop cluster. If I create a dataflow to send or receive data to HDFS:
1. How does NiFi work internally?
2. Does NiFi use MapReduce?
3. Are there any load-balancing algorithms it uses?
4. How do the master and slaves coordinate with each other?
5. Does NiFi store any data internally?
Please explain with a simple architecture.
Labels:
- Apache NiFi
06-28-2016
02:58 PM
Thank you @Bryan Bende
06-28-2016
02:57 PM
Thank you @Jobin George
06-28-2016
02:57 PM
Thank you @milind pandit
06-28-2016
02:56 PM
Thank you Zblaco
06-27-2016
09:44 AM
1 Kudo
Hi All, I have created a 2-node NiFi cluster and a 3-node Hadoop cluster (Hortonworks). Now I want to interact with HDFS from NiFi. What configuration properties do I have to set? Do I have to copy core-site.xml into the NiFi bin directory? And how do I create a processor to copy data from one HDFS directory to another? Please explain with a simple example.
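A hedged sketch of how this is usually wired up: NiFi's GetHDFS and PutHDFS processors expose a "Hadoop Configuration Resources" property that points at copies of core-site.xml (and optionally hdfs-site.xml); the files can live anywhere the NiFi user can read, so copying them into bin is not required. An HDFS-to-HDFS copy can then be a simple GetHDFS → PutHDFS flow. The paths below are examples, not defaults:

```
GetHDFS processor:
  Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
  Directory: /source/dir

PutHDFS processor:
  Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
  Directory: /target/dir
```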
Labels:
- Apache NiFi