Member since
02-18-2016
18
Posts
13
Kudos Received
3
Solutions
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 6897 | 04-05-2016 01:01 AM |
| | 3184 | 03-31-2016 02:39 AM |
| | 4339 | 02-19-2016 01:56 AM |
04-05-2016
01:01 AM
04-05-2016
01:01 AM
Thanks, I have solved this problem. Refer to: https://community.hortonworks.com/questions/25121/oozie-execute-sqoop-falls.html#answer-25290
03-31-2016
02:39 AM
2 Kudos
Hi, I got the right result! The reason is that my hive-site.xml contained dirty config that cannot be carried over into the workflow.xml, so we only need the basic config:
------------------------------------------
<property>
<name>ambari.hive.db.schema.name</name>
<value>hive</value>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift://cluster2.new:9083</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/apps/hive/warehouse</value>
</property>
<property>
<name>hive.zookeeper.quorum</name>
<value>cluster2.new:2181,cluster3.new:2181,cluster1.new:2181,cluster4.new:2181,cluster5.new:2181</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://cluster1.new/hive?createDatabaseIfNotExist=true</value>
</property>
------------------------------------------
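For reference, a sketch of how these properties sit in the trimmed hive-site.xml file itself: just the standard Hadoop <configuration> wrapper around the list above (only the first property is written out; the rest follow the same pattern).
------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<!-- Trimmed hive-site.xml: only the basic properties listed above,
     wrapped in the standard <configuration> root element. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://cluster2.new:9083</value>
  </property>
  <!-- ...the other five properties from the list above go here... -->
</configuration>
------------------------------------------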
03-31-2016
01:25 AM
I can run the import successfully through Sqoop directly. I ran Oozie on the command line: "oozie job -oozie http://xxxxxx:11000/oozie -config ./job.properties -run"
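For context, a minimal job.properties sketch matching the variables this workflow uses (${nameNode}, ${jobTracker}, ${queueName}); the host names, ports, and application path are assumptions for illustration:
-----------------------------------------------------
# Sketch of job.properties; host names, ports, and paths are assumptions.
nameNode=hdfs://cluster1.new:8020
jobTracker=cluster2.new:8050
queueName=default
# Pull Oozie's sharelib jars (e.g. for the Sqoop action) onto the classpath.
oozie.use.system.libpath=true
# HDFS directory that contains workflow.xml (assumed path).
oozie.wf.application.path=${nameNode}/user/root/sqoop-wf
-----------------------------------------------------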
03-30-2016
06:04 AM
2 Kudos
Hi: I used Oozie to execute a Sqoop job that imports data from MySQL into Hive, but the job was killed. The error log said:
----------------------------------------------
16313 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 1.0645 KB in 13.3807 seconds (81.4604 bytes/sec)
2016-03-30 10:54:21,743 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(184)) - Transferred 1.0645 KB in 13.3807 seconds (81.4604 bytes/sec)
16315 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 51 records.
2016-03-30 10:54:21,745 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(186)) - Retrieved 51 records.
16323 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
2016-03-30 10:54:21,753 INFO [main] manager.SqlManager (SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
16328 [main] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
2016-03-30 10:54:21,758 INFO [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
Intercepting System.exit(1)
-----------------------------------------------
I searched for this error on the internet; someone said "the hive-site.xml is missing, not in workflow.xml, or not correctly configured." I have uploaded hive-site.xml to HDFS under /tmp/ and added <file>/tmp/hive-site.xml#hive-site.xml</file>, but the error still exists. So what can I do next? Can someone help me?
---------------------------------------
<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
<start to="sqoop-node"/>
<action name="sqoop-node">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${nameNode}/user/${wf:user()}/area"/>
</prepare>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<command>import --connect jdbc:mysql://cluster1.new:3306/crmdemo --username root --password xxxxx --table area --hive-import --hive-table default.area</command>
<file>/tmp/hive-site.xml#hive-site.xml</file>
</sqoop>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
--------------------------------------- hive-site.xml
Labels:
- Apache Oozie
- Apache Sqoop
03-30-2016
01:20 AM
Oh, you gave me a URL that explains how to find the error log. But my question is: I found the error log, but I couldn't locate the problem from it. The log said:
--------------------------------------------------
14732 [main] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
2016-03-29 16:32:39,962 INFO [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
-------------------------------------------------------------------------
The job had finished and didn't report any error information, but at the end it said "Intercepting System.exit(1)". I don't know why the job failed.
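A note for anyone stuck at the same place: the Hive CLI's actual error usually lands in the Oozie launcher map task's stdout/stderr rather than in this summary, so pulling the launcher's aggregated YARN logs is one way to see it. The application ID below is illustrative.
--------------------------------------------------
# Fetch the aggregated logs of the Oozie launcher's YARN application;
# take the real application ID from the Oozie console or the RM UI.
yarn logs -applicationId application_1459235663418_0020
--------------------------------------------------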
03-30-2016
01:08 AM
Please see the comment I posted in reply to Kuldeep Kulkarni. I have added the hive-site.xml:
--------------------------------------------------
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<command>import --hive-import --connect jdbc:mysql://xxxx:3306/xxxx --username xx --password xxxx --table area</command>
<file>hive-site.xml#hive-site.xml</file>
</sqoop>
--------------------------------------------------
Now I encounter the problem "Intercepting System.exit(1)". Can you help me?
03-29-2016
08:47 AM
1 Kudo
Oh, this question has been solved. I forgot to put the required jars under $HIVE_HOME/lib. But I found another problem: Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]. The log said:
-------------------------------------------------------
2016-03-29 16:32:29,564 WARN [main] util.MRApps (MRApps.java:parseDistributedCacheArtifacts(610)) - cache file (mapreduce.job.cache.files) hdfs://cluster1.new/user/root/share/lib/sqoop/mysql-connector-java-5.1.38-bin.jar conflicts with cache file (mapreduce.job.cache.files) hdfs://cluster1.new/user/root/.staging/job_1459235663418_0020/libjars/mysql-connector-java-5.1.38-bin.jar This will be an error in Hadoop 2.0
2016-03-29 16:32:29,801 INFO [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(274)) - Submitted application application_1459235663418_0020
2016-03-29 16:32:29,823 INFO [main] mapreduce.Job (Job.java:submit(1294)) - The url to track the job: http://cluster2.new:8088/proxy/application_1459235663418_0020/
2016-03-29 16:32:29,824 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1339)) - Running job: job_1459235663418_0020
2016-03-29 16:32:33,869 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1360)) - Job job_1459235663418_0020 running in uber mode : false
2016-03-29 16:32:33,870 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 0% reduce 0%
2016-03-29 16:32:38,898 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 100% reduce 0%
2016-03-29 16:32:39,904 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1378)) - Job job_1459235663418_0020 completed successfully
2016-03-29 16:32:39,938 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1385)) - Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=1377552
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=545
HDFS: Number of bytes written=1090
HDFS: Number of read operations=16
HDFS: Number of large read operations=0
HDFS: Number of write operations=8
Job Counters
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=8075
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=8075
Total vcore-seconds taken by all map tasks=8075
Total megabyte-seconds taken by all map tasks=16537600
Map-Reduce Framework
Map input records=51
Map output records=51
Input split bytes=545
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=160
CPU time spent (ms)=3120
Physical memory (bytes) snapshot=923926528
Virtual memory (bytes) snapshot=14719983616
Total committed heap usage (bytes)=886046720
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=1090
14714 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 1.0645 KB in 11.9988 seconds (90.8424 bytes/sec)
2016-03-29 16:32:39,944 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(184)) - Transferred 1.0645 KB in 11.9988 seconds (90.8424 bytes/sec)
14716 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 51 records.
2016-03-29 16:32:39,946 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(186)) - Retrieved 51 records.
14727 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
2016-03-29 16:32:39,957 INFO [main] manager.SqlManager (SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
14732 [main] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
2016-03-29 16:32:39,962 INFO [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
------------------------------------------------------------
I don't know where to find the error. The log doesn't say anything except "Intercepting System.exit(1)". Has anyone encountered this problem?
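One detail worth noting in that log: the WARN at the top says mysql-connector-java-5.1.38-bin.jar is on the distributed cache twice, once from the Oozie sharelib and once from the job's own libjars. Keeping a single copy should silence that warning; a sketch, assuming the duplicate was uploaded into the workflow application's lib/ directory (the placeholder path is hypothetical):
------------------------------------------------------------
# Sketch: remove the duplicate JDBC driver jar so only the sharelib
# copy remains. The workflow app directory below is an assumption.
hadoop fs -rm /user/root/<workflow-app-dir>/lib/mysql-connector-java-5.1.38-bin.jar
------------------------------------------------------------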
03-29-2016
05:32 AM
I have added hive-site.xml to the same directory as workflow.xml and added <file>hive-site.xml#hive-site.xml</file> to workflow.xml, but I get the same errors. So what can I do next?
-----------------------------------------------------
<start to="sqoop-node"/>
<action name="sqoop-node">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${nameNode}/user/${wf:user()}/area"/>
</prepare>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<command>import --hive-import --connect jdbc:mysql://xxxx:3306/xxxx --username xx --password xxxx --table area</command>
<file>hive-site.xml#hive-site.xml</file>
</sqoop>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
-----------------------------------------------------
03-29-2016
02:19 AM
Thanks for your response. I use HDP 2.3 and haven't changed the config file; the hive-site.xml includes the HDFS settings. But I don't know how to add a reference file to my workflow. Can you tell me?
--------------------------------------------------
<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
<start to="sqoop-node"/>
<action name="sqoop-node">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<command>import --hive-import --connect jdbc:mysql://xxxxxx:3306/xxxx --username root --password xxxxx --table area</command>
</sqoop>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
--------------------------------------------------
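The reference file goes inside the <sqoop> action as a <file> element, right after <command>; a minimal sketch (the /tmp HDFS path is an assumption, and the part after '#' is the name the file gets in the task's working directory):
--------------------------------------------------
<!-- Ship hive-site.xml with the Sqoop action (sketch; path assumed). -->
<command>import --hive-import --connect jdbc:mysql://xxxxxx:3306/xxxx --username root --password xxxxx --table area</command>
<file>/tmp/hive-site.xml#hive-site.xml</file>
--------------------------------------------------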