
MapReduce job stops working with HBase after upgrade from 5.3 to 5.5.2

New Contributor

I ran into this error message:

 

Error: java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:163)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:512)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:755)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.IllegalStateException: The input format instance has not been properly initialized. Ensure you call initializeTable either in your constructor or initialize method
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getTable(TableInputFormatBase.java:389)
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:158)
    ... 8 more

2 REPLIES

Re: MapReduce job stops working with HBase after upgrade from 5.3 to 5.5.2

Master Guru
> Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.

As the error suggests, could you look at (or share with us) the full task log from the failing map attempt?

The caused-by message that follows is a secondary symptom: initialization failed for some other reason, but this message alone does not tell us why.
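
In case it helps narrow things down: this particular `IllegalStateException` points at a contract in the HBase 1.0 API line (shipped with CDH 5.5) where `TableInputFormatBase` subclasses must call `initializeTable` themselves before a record reader is created. The sketch below is a simplified, self-contained illustration of that contract (stub types with names mirroring the real API, not the actual HBase classes) to show why `getTable` throws:

```java
// Simplified, self-contained sketch of the initialization contract in
// HBase's TableInputFormatBase (HBase 1.0 line, as shipped in CDH 5.5).
// The names mirror the real API, but these are stub types, not HBase classes.
abstract class TableInputFormatBaseSketch {
    private String table;  // stands in for org.apache.hadoop.hbase.client.Table

    // Subclasses are expected to call this before any record reader is created.
    protected void initializeTable(String tableName) {
        this.table = tableName;  // the real method opens a Table via a Connection
    }

    public String getTable() {
        if (table == null) {
            // This is the IllegalStateException seen in the job log.
            throw new IllegalStateException(
                "The input format instance has not been properly initialized. "
                    + "Ensure you call initializeTable either in your constructor "
                    + "or initialize method");
        }
        return table;
    }
}

// A custom input format that happened to work on CDH 5.3 may never have
// called initializeTable; on CDH 5.5 the call is mandatory.
class MyTableInputFormat extends TableInputFormatBaseSketch {
    // In the real API this would be initialize(JobContext) or the constructor.
    public void initialize(String tableName) {
        initializeTable(tableName);
    }
}
```

So if the job uses a custom subclass of `TableInputFormatBase`, verifying that its `initialize` method (or constructor) calls `initializeTable` would be the first thing to check after the upgrade.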

Re: MapReduce job stops working with HBase after upgrade from 5.3 to 5.5.2

New Contributor

What I posted is almost the whole log. The code worked fine on CDH 5.3; this error only appeared when we tried to upgrade to 5.5.2. Here is the complete error log. Campaign is an HBase table, and the operation is trying to load data into it.

 

[info] application - [BEGIN][1456954066226] [jingding279.pivotlinkqa.com|Fake User] Data Load for Campaign
[info] application - [jingding279.pivotlinkqa.com|Fake User] Loading table [Campaign], rule [LoadRuleForCampaignTable][7fb4afbe-584c-4401-995b-bd1a068093de] with delimiter [,]
[info] application - [jingding279.pivotlinkqa.com|Fake User] Header Row - CampaignID, CampaignName, CampaignCost, BeginDate, EndDate
[debug] application - Column Definitions changed, persisting table definition for Campaign
[info] application - [BEGIN][1456954066342] [jingding279.pivotlinkqa.com|Fake User] Deleting records from table [Campaign] using Map Reduce
Job: jingding279.pivotlinkqa.com|Fake User: Delete from Table Campaign Job has 1 tasks
    Task: Delete Task, Table: Campaign
        will run after nothing
    Input: Input: From table: "Campaign"
    Output: Output: Table: Campaign
Setting input table to: Campaign
Initializing output table to: Campaign
[warn] application - Could not find BUILD_NUMBER as a jar resource, will look for a conf directory
[INFO] [03/02/2016 13:27:47.396] [application-akka.actor.default-dispatcher-3] [akka://application/user/pivotlink_jobManager/$a] Message [com.pivotlink.compute.mapreduce.JobPolling$PollForStatus$] from Actor[akka://application/user/pivotlink_jobManager#-670224327] to Actor[akka://application/user/pivotlink_jobManager/$a#-1900609835] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[warn] application - Could not find BUILD_NUMBER as a jar resource, will look for a conf directory
[warn] application - Could not find BUILD_NUMBER as a jar resource, will look for a conf directory
[warn] application - Could not find BUILD_NUMBER as a jar resource, will look for a conf directory
Error: java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:163)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:515)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:758)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.IllegalStateException: The input format instance has not been properly initialized. Ensure you call initializeTable either in your constructor or initialize method
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getTable(TableInputFormatBase.java:389)
    at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:158)
    ... 8 more

[info] application - [END][1456954066226] [SUCCESS] [jingding279.pivotlinkqa.com|Fake User] Running job : Delete from table output_lovs Duration: 00:00:23.834
[... same record-reader stack trace repeated for additional failed map attempts; snipped ...]

[INFO] [03/02/2016 13:28:15.055] [application-akka.actor.default-dispatcher-5] [akka://application/user/pivotlink_jobManager/$b] Message [com.pivotlink.compute.mapreduce.JobPolling$PollForStatus$] from Actor[akka://application/user/pivotlink_jobManager#-670224327] to Actor[akka://application/user/pivotlink_jobManager/$b#1794477460] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[error] application - [jingding279.pivotlinkqa.com|Fake User] Exception [assertion failed: Delete job failed for Campaign] thrown while loading, rolling back loaded data for table [Campaign] and site [jingding279.pivotlinkqa.com]
java.lang.AssertionError: assertion failed: Delete job failed for Campaign
    at scala.Predef$.assert(Predef.scala:179)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply$mcV$sp(BaseTable.scala:119)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:270)
    at utils.CPMLogger.infoTimed(LoggingUtils.scala:69)
    at models.BaseTable.deleteByPrefix(BaseTable.scala:113)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$3$$anonfun$apply$2.apply$mcV$sp(DataLoadServiceImpl.scala:287)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$3$$anonfun$apply$2.apply(DataLoadServiceImpl.scala:283)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$3$$anonfun$apply$2.apply(DataLoadServiceImpl.scala:283)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:270)
    at utils.CPMLogger.infoTimed(LoggingUtils.scala:69)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$3.apply(DataLoadServiceImpl.scala:283)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$3.apply(DataLoadServiceImpl.scala:278)
    at scala.Option.foreach(Option.scala:236)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply$mcV$sp(DataLoadServiceImpl.scala:278)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(DataLoadServiceImpl.scala:189)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(DataLoadServiceImpl.scala:189)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:270)
    at utils.CPMLogger.infoTimed(LoggingUtils.scala:69)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1.apply(DataLoadServiceImpl.scala:189)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2$$anonfun$apply$mcV$sp$1.apply(DataLoadServiceImpl.scala:180)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService.services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$updateTableStatusWithRollBack(DataLoadServiceImpl.scala:129)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply$mcV$sp(DataLoadServiceImpl.scala:180)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply(DataLoadServiceImpl.scala:175)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply(DataLoadServiceImpl.scala:175)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[info] application - [BEGIN][1456954107624] [jingding279.pivotlinkqa.com|Fake User] Deleting records from table [Campaign] using Map Reduce
Job: jingding279.pivotlinkqa.com|Fake User: Delete from Table Campaign Job has 1 tasks
    Task: Delete Task, Table: Campaign
        will run after nothing
    Input: Input: From table: "Campaign"
    Output: Output: Table: Campaign
Setting input table to: Campaign
Initializing output table to: Campaign
[... same record-reader stack traces as above for each map attempt of the retried job; snipped ...]

java.util.concurrent.ExecutionException: Boxed Error
    at scala.concurrent.impl.Promise$.resolver(Promise.scala:55)
    at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:47)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244)
    at scala.concurrent.Promise$class.complete(Promise.scala:55)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.AssertionError: assertion failed: Delete job failed for Campaign
    at scala.Predef$.assert(Predef.scala:179)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply$mcV$sp(BaseTable.scala:119)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113)
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:270)
    at utils.CPMLogger.infoTimed(LoggingUtils.scala:69)
    at models.BaseTable.deleteByPrefix(BaseTable.scala:113)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$updateTableStatusWithRollBack$2.apply$mcV$sp(DataLoadServiceImpl.scala:142)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$updateTableStatusWithRollBack$2.apply(DataLoadServiceImpl.scala:139)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$updateTableStatusWithRollBack$2.apply(DataLoadServiceImpl.scala:139)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288)
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:270)
    at utils.CPMLogger.infoTimed(LoggingUtils.scala:69)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService.services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$updateTableStatusWithRollBack(DataLoadServiceImpl.scala:139)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply$mcV$sp(DataLoadServiceImpl.scala:180)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply(DataLoadServiceImpl.scala:175)
    at services.impl.DefaultDataLoadServiceComponent$DefaultDataLoadService$$anonfun$services$impl$DefaultDataLoadServiceComponent$DefaultDataLoadService$$loadData$2$$anonfun$2.apply(DataLoadServiceImpl.scala:175)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    ... 6 more
[info] application - [END][1456954066226] [FAILURE] [jingding279.pivotlinkqa.com|Fake User] Data Load for Campaign Duration: 00:01:15.933
[... same Boxed Error stack trace as above; snipped ...]
[error] application - [jingding279.pivotlinkqa.com|Fake User] Data Load Failed for Campaign table with load rule LoadRuleForCampaignTable
()
java.util.concurrent.ExecutionException: Boxed Error
    at scala.concurrent.impl.Promise$.resolver(Promise.scala:55) ~[org.scala-lang.scala-library-2.10.5.jar:na]
    at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:47) ~[org.scala-lang.scala-library-2.10.5.jar:na]
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244) ~[org.scala-lang.scala-library-2.10.5.jar:na]
    at scala.concurrent.Promise$class.complete(Promise.scala:55) ~[org.scala-lang.scala-library-2.10.5.jar:na]
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153) ~[org.scala-lang.scala-library-2.10.5.jar:na]
Caused by: java.lang.AssertionError: assertion failed: Delete job failed for Campaign
    at scala.Predef$.assert(Predef.scala:179) ~[org.scala-lang.scala-library-2.10.5.jar:na]
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply$mcV$sp(BaseTable.scala:119) ~[cpm.cpm-2.6.9-DEV.jar:2.6.9-DEV]
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113) ~[cpm.cpm-2.6.9-DEV.jar:2.6.9-DEV]
    at models.BaseTable$$anonfun$deleteByPrefix$2.apply(BaseTable.scala:113) ~[cpm.cpm-2.6.9-DEV.jar:2.6.9-DEV]
    at utils.CPMLogger$LogLevel.timed(LoggingUtils.scala:288) ~[cpm.cpm-2.6.9-DEV.jar:2.6.9-DEV]