Member since
02-01-2016
71
Posts
36
Kudos Received
5
Solutions
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 3444 | 06-27-2019 10:09 AM |
| 1493 | 01-27-2017 05:22 AM |
| 1703 | 01-06-2017 05:05 AM |
| 2498 | 11-17-2016 05:37 AM |
| 2936 | 03-03-2016 12:28 PM |
03-06-2017
09:57 AM
@Jay SenSharma Hi Jay, currently we have both Spark 1.6 and Spark 2.0 installed on the cluster, and we select Spark 2.0 for our operations with export SPARK_MAJOR_VERSION=2. After that, if we use the spark-submit command there is no issue and the job is successful; unfortunately, the issue is with Oozie only.
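For reference, the working manual path described above can be sketched as follows (the jar path and class name are placeholders matching the stack trace, not an actual job):

```shell
# On HDP, select Spark 2 for the current shell session before submitting
export SPARK_MAJOR_VERSION=2
echo "using Spark major version: $SPARK_MAJOR_VERSION"

# Submit as usual; spark-submit now resolves to the Spark 2 client.
# Guarded so the snippet is harmless on machines without a Spark client.
if command -v spark-submit >/dev/null 2>&1; then
    spark-submit --master yarn --deploy-mode cluster \
        --class comm.billlogic.BillLogic \
        /path/to/billlogic.jar
fi
```

The environment variable only affects the shell it is exported in, which is why a manual spark-submit can behave differently from the same command launched by Oozie.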
03-06-2017
09:23 AM
I see that the spark-assembly jar is no longer required for Spark 2.0, but when we use the Spark action we receive the error below:
ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.createDataFrame(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/Dataset;
java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.createDataFrame(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/Dataset;
at comm.common.Acct$.getAcctDF(Acct.scala:13)
at comm.billlogic.SparkExecute$.execute(SparkExecute.scala:45)
at comm.billlogic.BillLogic$.main(BillLogic.scala:26)
at comm.billlogic.BillLogic.main(BillLogic.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
03-06-2017
09:11 AM
Currently, we are using Spark 2.0 on HDP 2.5.3 and it works absolutely fine when I submit the job manually. We want to automate the same job using Oozie, and for that we require a spark-assembly jar for Spark 2.0; unfortunately we can't find one in /usr/hdp/current/spark2-client/jars/, and we are unable to find it in the Maven repository either.
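A common alternative to hunting for an assembly jar is to point the Oozie Spark action at a spark2 sharelib instead. This is a sketch of a job.properties fragment, assuming a spark2 sharelib has been installed on HDFS (verify the property names against your Oozie version):

```
# job.properties fragment (sketch; assumes a spark2 sharelib exists on HDFS)
oozie.use.system.libpath=true
oozie.action.sharelib.for.spark=spark2
```

With this in place, Oozie ships the Spark 2 client jars to the action container, so no spark-assembly jar is needed.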
Labels:
- Apache Oozie
- Apache Spark
01-27-2017
05:22 AM
Unfortunately, I don't know of any way to configure sqoop incremental jobs from Oozie directly, so I had to write a shell script for it.
#!/bin/bash
# Run a saved sqoop job and mail the outcome. Usage: Incr_jobs.sh <job-name>
if [ $# -ne 1 ]; then
    echo "Not enough arguments to start the job. Syntax is: Incr_jobs.sh arg1"
    exit 1
fi
sqoop job --exec "$1" > sqooplog.txt 2>&1
# grep exits 0 when the failure message is found in the log
if grep -i "Merge MapReduce job failed" sqooplog.txt; then
    echo "$1 job failed" | mailx -s "$1 job failed" -a sqooplog.txt <mailid>
else
    echo "$1 job completed successfully" | mailx -s "$1 job completed successfully" <mailid>
fi
01-24-2017
05:54 AM
My workflow.xml looks this way ...
<action name="sqoop-incr">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <command>sqoop job --exec IncLocations</command>
    </sqoop>
    <ok to="end"/>

Error: Workflow Failed. Failing node [sqoop-incr]
2017-01-23 23:10:14,977 WARN ParameterVerifier:523 - SERVER[] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] The application does not define formal parameters in its XML definition
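The "does not define formal parameters" line is only a warning: it means the workflow does not declare its variables in a <parameters> block. For completeness, a sketch of such a declaration for the queueName variable used above (the default value is a placeholder):

```
<parameters>
    <property>
        <name>queueName</name>
        <value>default</value>
    </property>
</parameters>
```

Declaring parameters this way lets Oozie validate the submission and supply defaults, but the warning itself does not cause the action failure.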
01-24-2017
05:53 AM
Can we schedule a sqoop incremental job (sqoop job --exec IncJob) using Oozie?
Labels:
- Apache Oozie
- Apache Sqoop
01-06-2017
05:08 AM
@Ed Berezitsky Thank you. Currently we are using '\001' as the delimiter in place of '||'.
01-06-2017
05:07 AM
We are not looking at HDFS as intermediate storage, as we will be processing the files using Spark SQL. @Venkat Ranganathan
01-06-2017
05:05 AM
1 Kudo
Currently, sqoop incremental import does not support a composite primary key, as per the design. An alternative is to concatenate the composite key columns while importing, e.g. concat(PK1,PK2) as UniqueId, and use UniqueId as the --merge-key instead of a single primary key.
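The workaround above can be sketched as a sqoop invocation. The connection string, table, and column names here are placeholders, and the exact incremental flags should be checked against your Sqoop version:

```
sqoop import \
    --connect jdbc:mysql://dbhost/mydb \
    --username user -P \
    --query 'SELECT concat(PK1, PK2) AS UniqueId, t.* FROM mytable t WHERE $CONDITIONS' \
    --target-dir /data/mytable \
    --split-by UniqueId \
    --incremental lastmodified \
    --check-column LAST_UPD \
    --merge-key UniqueId \
    -m 4
```

Because the concatenated UniqueId is computed in the query, every imported row carries a single-column key that the merge step can use, sidestepping the composite-key limitation.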
01-05-2017
10:04 AM
sqoop incremental import using composite primary key as merge-key
Labels:
- Apache Sqoop