Member since: 03-01-2017
Posts: 15
Kudos Received: 2
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5127 | 05-28-2017 04:48 PM
07-30-2017 03:20 PM
I am trying to import an Oracle table using the Sqoop command below. The column REPORT_XML_DATA is of type BLOB. When I run the import, I get:

Error: java.io.IOException: SQLException in nextKeyValue
Caused by: java.sql.SQLException: Invalid column type: getString not implemented for class oracle.jdbc.driver.T4CBlobAccessor
sqoop import \
  --connect jdbc:oracle:thin:@xxxx:3000:ABCD \
  --username TEST \
  --password-file /user/test/mypassword.pwd \
  --query "select REPORT_XML_DATA from mydb.table_name WHERE \$CONDITIONS" \
  --map-column-java REPORT_XML_DATA=String \
  --target-dir /user/test/sqoop/table_name \
  --m 1 \
  --hive-import \
  --hive-table table_name \
  --fields-terminated-by '\0174' \
  --hive-drop-import-delims \
  --null-string '\\N' \
  --null-non-string '\\N'
Error: 17/07/30 11:17:07 INFO mapreduce.Job: Task Id : attempt_1501271060971_7271_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.sql.SQLException: Invalid column type: getString not implemented for class oracle.jdbc.driver.T4CBlobAccessor
at oracle.jdbc.driver.Accessor.unimpl(Accessor.java:414)
at oracle.jdbc.driver.BlobAccessor.getString(BlobAccessor.java:335)
at oracle.jdbc.driver.OracleResultSetImpl.getString(OracleResultSetImpl.java:1297)
at org.apache.sqoop.lib.JdbcWritableBridge.readString(JdbcWritableBridge.java:71)
at com.cloudera.sqoop.lib.JdbcWritableBridge.readString(JdbcWritableBridge.java:61)
at QueryResult.readFields(QueryResult.java:90)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:244)
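The --map-column-java REPORT_XML_DATA=String mapping makes Sqoop read the column with getString(), which the Oracle JDBC driver does not implement for BLOBs, hence the T4CBlobAccessor error. A hedged sketch of one common workaround, assuming the BLOB actually holds character data and fits in a few KB (the 2000-byte cap and the trimmed-down flag list are illustrative, not a confirmed fix from this thread): convert the BLOB to text inside the query so the driver returns a plain VARCHAR2.

# Illustrative rework of the failing import: DBMS_LOB.SUBSTR on a BLOB
# returns RAW, and UTL_RAW.CAST_TO_VARCHAR2 turns that RAW into text,
# so getString() succeeds. Values longer than 2000 bytes are truncated.
sqoop import \
  --connect jdbc:oracle:thin:@xxxx:3000:ABCD \
  --username TEST \
  --password-file /user/test/mypassword.pwd \
  --query "SELECT UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(REPORT_XML_DATA, 2000, 1)) AS REPORT_XML_DATA FROM mydb.table_name WHERE \$CONDITIONS" \
  --target-dir /user/test/sqoop/table_name \
  --m 1

If the XML routinely exceeds that size, a table-level import (dropping --query and --map-column-java) may be the better route, since Sqoop then applies its own large-object handling instead of forcing a string read.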
Labels:
- Apache Sqoop
05-28-2017 04:48 PM
In case anyone is interested: this can be achieved by setting hive.cli.errors.ignore=true.

hive --hiveconf hive.cli.errors.ignore=true -f myscript.sql
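A minimal demo of the behavior (the table and file names are made up): with the flag set, the failing middle statement is reported but the statements after it still execute.

-- myscript.sql
CREATE TABLE IF NOT EXISTS demo_ok (id INT);
-- Fails (table does not exist); with hive.cli.errors.ignore=true the CLI
-- logs the error and moves on instead of exiting.
SELECT * FROM no_such_table;
-- Still runs.
INSERT INTO TABLE demo_ok VALUES (1);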
05-28-2017 12:45 PM
I have multiple queries in an HQL file (say 10, each ending with a semicolon) which I run from a shell script. When a query in the middle fails (say query #5), the queries after it do not execute and the Hive job finishes. How can I handle errors so that queries 6 through 10 still run even though query 5 fails?
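A hedged alternative from the shell side (the file name is illustrative; this naive split assumes no semicolons inside strings or comments, and it pays a JVM start per statement): run each query in its own hive call so one failure cannot stop the rest.

#!/usr/bin/env bash
# Read myscript.sql one ';'-terminated statement at a time and run each
# in a separate hive invocation, logging failures instead of aborting.
while IFS= read -r -d ';' stmt; do
  # Skip fragments that are only whitespace.
  [[ -z "${stmt//[[:space:]]/}" ]] && continue
  if ! hive -e "${stmt};"; then
    echo "statement failed, continuing: ${stmt}" >&2
  fi
done < myscript.sql

The accepted hive.cli.errors.ignore answer above is the lighter-weight option, since it keeps everything in a single Hive session.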
Labels:
- Apache Hadoop
- Apache Hive
05-24-2017 09:35 AM
Perfect!!! Worked like a charm... Thank you
... View more
05-23-2017 07:39 PM
Hi @Bala Vignesh N V, I am getting the error below:

Logging initialized using configuration in file:/etc/hive/2.4.3.0-227/0/hive-log4j.properties
NoViableAltException(307@[])
at org.apache.hadoop.hive.ql.parse.HiveParser.type(HiveParser.java:38618)
at org.apache.hadoop.hive.ql.parse.HiveParser.colType(HiveParser.java:38375)
at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameType(HiveParser.java:38059)
at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameTypeList(HiveParser.java:36183)
at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:5222)
at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2648)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1658)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1117)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:316)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1202)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1250)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1139)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1129)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:314)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:428)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:717)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:60 cannot recognize input near ')' 'stored' 'as' in column type
05-23-2017 05:30 PM
2 Kudos
Hi All, I am trying to create a Hive table from a txt file using a shell script. My t_cols.txt contains the column list:

id string, name string, city string, lpd timestamp

I want the Hive table's columns to come from this file. This is what my command looks like:

table_cols=`cat t_cols.txt`
hive --hiveconf t_name=${table_cols} -e 'create table leap_frog_snapshot.LINKED_OBJ_TRACKING (\${hiveconf:t_name}) stored as orc tblproperties ("orc.compress"="SNAPPY") ; '

Somehow this is not working. Can someone explain how to achieve this?
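A sketch of one way to make the substitution work (untested here, but consistent with the ParseException in the reply above, which suggests the variable reached Hive empty): double-quote the shell variable so the spaces in the column list survive word-splitting, and drop the backslash before ${hiveconf:t_name}, which otherwise stops Hive's own variable substitution.

# t_cols.txt: id string, name string, city string, lpd timestamp
table_cols=$(cat t_cols.txt)

# Quote the shell variable; keep ${hiveconf:t_name} inside single quotes
# and unescaped so Hive (not the shell) expands it.
hive --hiveconf t_name="${table_cols}" -e '
  create table leap_frog_snapshot.LINKED_OBJ_TRACKING (${hiveconf:t_name})
  stored as orc tblproperties ("orc.compress"="SNAPPY");
'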
Labels:
- Apache Hadoop
- Apache Hive
03-07-2017 07:07 PM
You are right, it was an error with Hive. Setting hive.exec.dynamic.partition.mode=nonstrict resolved the issue. Thank you.
03-06-2017 09:05 PM
I am trying to run an Oozie workflow from Hue. The workflow runs a Hive job, which seems to work, but the workflow itself fails with:

Error: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10096]

I have a JDBC driver in HDFS. HDP 2.5.3, Hue version 2.6.1-37.

Log:
2017-03-06 15:47:46,557 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@:start:] Start action [0000005-170304185753616-oozie-oozi-W@:start:] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-03-06 15:47:46,557 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@:start:] [***0000005-170304185753616-oozie-oozi-W@:start:***]Action status=DONE
2017-03-06 15:47:46,557 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@:start:] [***0000005-170304185753616-oozie-oozi-W@:start:***]Action updated in DB!
2017-03-06 15:47:46,611 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@:start:] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W@:start:
2017-03-06 15:47:46,611 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W
2017-03-06 15:47:46,627 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] Start action [0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-03-06 15:47:48,278 INFO HiveActionExecutor:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] checking action, hadoop job ID [job_1488578860595_0014] status [RUNNING]
2017-03-06 15:47:48,281 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] [***0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean***]Action status=RUNNING
2017-03-06 15:47:48,281 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] [***0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean***]Action updated in DB!
2017-03-06 15:47:48,287 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean
2017-03-06 15:48:51,019 INFO CallbackServlet:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] callback for action [0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean]
2017-03-06 15:48:51,151 INFO HiveActionExecutor:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] Trying to get job [job_1488578860595_0014], attempt [1]
2017-03-06 15:48:51,239 INFO HiveActionExecutor:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] Hadoop Jobs launched : [job_1488578860595_0015]
2017-03-06 15:48:51,241 INFO HiveActionExecutor:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] action completed, external ID [job_1488578860595_0014]
2017-03-06 15:48:51,255 WARN HiveActionExecutor:523 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10096]
2017-03-06 15:48:51,281 INFO ActionEndXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] ERROR is considered as FAILED for SLA
2017-03-06 15:48:51,320 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@kill] Start action [0000005-170304185753616-oozie-oozi-W@kill] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-03-06 15:48:51,320 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@kill] [***0000005-170304185753616-oozie-oozi-W@kill***]Action status=DONE
2017-03-06 15:48:51,320 INFO ActionStartXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[eedc] GROUP[-] TOKEN[] APP[test] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@kill] [***0000005-170304185753616-oozie-oozi-W@kill***]Action updated in DB!
2017-03-06 15:48:51,373 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@kill] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W@kill
2017-03-06 15:48:51,374 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W
2017-03-06 15:48:51,374 INFO WorkflowNotificationXCommand:520 - SERVER[phcv-hddev01.corp.incresearch.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000005-170304185753616-oozie-oozi-W] ACTION[0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean] No Notification URL is defined. Therefore nothing to notify for job 0000005-170304185753616-oozie-oozi-W@daily_snapshot_clean

workflow.xml:

<workflow-app name="test" xmlns="uri:oozie:workflow:0.4">
<start to="daily_snapshot_clean"/>
<action name="daily_snapshot_clean">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<script>/user/eedc/oozie/sbi/hive/daily/daily_snapshot_clean_sbi_master_start.hql</script>
</hive>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
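Per the 03-07 reply above, the failure traced back to Hive's strict dynamic-partition check rather than to Oozie itself. A minimal sketch of applying that fix (assuming the script performs dynamic-partition inserts; how the setting was actually applied is not shown in the thread): set the mode at the top of the script the action runs.

-- daily_snapshot_clean_sbi_master_start.hql
-- Session-level settings so dynamic-partition inserts pass the strict check.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
-- ... existing INSERT ... PARTITION (...) statements follow ...

The same property can alternatively be passed through the Hive action's <configuration> block in workflow.xml, which applies it without editing the script.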