Support Questions


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Explorer

Hi everyone,

I am trying to create a Hive table on top of data from MongoDB.

I have installed MongoDB in the Cloudera QuickStart VM and loaded some JSON data. Here is a sample document:

 

{
	"_id" : ObjectId("59a47286cfa9a3a73e51e749"),
	"theaterId" : 1034,
	"location" : {
		"address" : {
			"street1" : "3404 W 13th St",
			"city" : "Grand Island",
			"state" : "NE",
			"zipcode" : "68803"
		},
		"geo" : {
			"type" : "Point",
			"coordinates" : [
				-98.382164,
				40.933624
			]
		}
	}
}

 

I have created the Hive table as below:

 

CREATE EXTERNAL TABLE IF NOT EXISTS THEATER 
(
id STRING,
theaterId INT,
loc STRUCT<address:STRUCT<street1:STRING,city:STRING,state:STRING,zipcode:STRING,street2:STRING>,
            geo:STRUCT<type:STRING,coordinates:ARRAY<DOUBLE>>
          >
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES ('mongo.columns.mapping'='{
"id":"_id",
"theaterId":"theaterId",
"loc":"location"
}')
TBLPROPERTIES('mongo.uri'='mongodb://localhost:27017/movie_flix.theater');

When I run a simple SELECT, the query runs successfully.

[Screenshot: result of the simple SELECT against the theater table]

But when I run select loc.address FROM theater;

I get the error below:

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
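For reference, a minimal sketch of the two cases (the exact text of the simple query is an assumption; only the second statement appears verbatim above). The simple select is most likely answered by Hive's client-side fetch task, where the MongoDB JARs are already on the classpath, while projecting the nested field compiles into a MapReduce job whose map tasks also need those JARs (see the ClassNotFoundException in the YARN log later in the thread):

-- works: likely served by the fetch task in the HiveServer2/CLI JVM
SELECT * FROM theater LIMIT 5;

-- fails with "return code 2 from ... MapRedTask": this statement launches a
-- MapReduce job, and the map tasks cannot load the MongoDB driver classes
SELECT loc.address FROM theater;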

 

Did anyone face the same issue?

 

Thanks

2 ACCEPTED SOLUTIONS

Explorer

@EricL 

Thanks for responding. I was able to resolve this error by placing the following .jar files in the /usr/lib/hadoop-yarn path and restarting the cluster:

 

1. bson-3.12.0.jar

2. mongodb-driver-3.12.0.jar

3. mongodb-driver-core-3.12.0.jar

4. mongo-hadoop-core-2.0.2.jar

5. mongo-hadoop-hive-2.0.2.jar

 

Thanks
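For readers who cannot copy JARs into /usr/lib/hadoop-yarn or restart the cluster, a commonly used per-session alternative is sketched below (the paths are placeholders, and this assumes the JARs are readable from the machine running the Hive CLI or Beeline/HiveServer2). ADD JAR puts them on the session classpath and also ships them to the MapReduce tasks via the distributed cache:

ADD JAR /path/to/bson-3.12.0.jar;
ADD JAR /path/to/mongodb-driver-3.12.0.jar;
ADD JAR /path/to/mongodb-driver-core-3.12.0.jar;
ADD JAR /path/to/mongo-hadoop-core-2.0.2.jar;
ADD JAR /path/to/mongo-hadoop-hive-2.0.2.jar;

-- confirm the session sees them before re-running the query
LIST JARS;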

View solution in original post

Super Guru
@sagittarian,

Thanks for sharing the finding and solution. Yes, the error means that the MongoDB driver and mongo-hadoop connector JAR files were missing from the task classpath, so adding them resolves the issue.

Cheers
Eric

View solution in original post

5 REPLIES

Super Guru
@sagittarian,

Can you please share the full stack trace of the error, from both the HS2 log and the YARN application logs?

Thanks
Eric

Explorer

@EricL 

Thanks for responding. I was able to resolve this error by placing the following .jar files in the /usr/lib/hadoop-yarn path and restarting the cluster:

 

1. bson-3.12.0.jar

2. mongodb-driver-3.12.0.jar

3. mongodb-driver-core-3.12.0.jar

4. mongo-hadoop-core-2.0.2.jar

5. mongo-hadoop-hive-2.0.2.jar

 

Thanks

Explorer

Hi @EricL,

 

If we are using a company laptop and are not allowed to place these files, is there any other solution?

Thanks.

Explorer

@EricL 

Here is the full log for this error.

Note: I am not sure whether this is the exact log.

2019-12-18 13:34:19,469 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1576701227741_0008_000001
2019-12-18 13:34:20,151 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
2019-12-18 13:34:20,151 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (org.apache.hadoop.yarn.security.AMRMTokenIdentifier@5d69b59e)
2019-12-18 13:34:20,530 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter
2019-12-18 13:34:20,532 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter is org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter
2019-12-18 13:34:21,308 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-12-18 13:34:21,482 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.jobhistory.EventType for class org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
2019-12-18 13:34:21,483 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher
2019-12-18 13:34:21,484 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskEventDispatcher
2019-12-18 13:34:21,486 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskAttemptEventDispatcher
2019-12-18 13:34:21,486 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventType for class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler
2019-12-18 13:34:21,492 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.speculate.Speculator$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$SpeculatorEventDispatcher
2019-12-18 13:34:21,493 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.rm.ContainerAllocator$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter
2019-12-18 13:34:21,494 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncher$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerLauncherRouter
2019-12-18 13:34:21,540 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://quickstart.cloudera:8020]
2019-12-18 13:34:21,561 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://quickstart.cloudera:8020]
2019-12-18 13:34:21,589 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://quickstart.cloudera:8020]
2019-12-18 13:34:21,603 INFO [main] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Emitting job history data to the timeline server is not enabled
2019-12-18 13:34:21,659 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobFinishEvent$Type for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobFinishEventHandler
2019-12-18 13:34:21,866 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2019-12-18 13:34:21,916 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2019-12-18 13:34:21,916 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MRAppMaster metrics system started
2019-12-18 13:34:21,928 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Adding job token for job_1576701227741_0008 to jobTokenSecretManager
2019-12-18 13:34:22,061 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Not uberizing job_1576701227741_0008 because: not enabled;
2019-12-18 13:34:22,106 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Input size for job job_1576701227741_0008 = 1. Number of splits = 1
2019-12-18 13:34:22,106 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Number of reduces for job job_1576701227741_0008 = 0
2019-12-18 13:34:22,106 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1576701227741_0008Job Transitioned from NEW to INITED
2019-12-18 13:34:22,107 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: MRAppMaster launching normal, non-uberized, multi-container job job_1576701227741_0008.
2019-12-18 13:34:22,143 INFO [main] org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 100
2019-12-18 13:34:22,156 INFO [Socket Reader #1 for port 43753] org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 43753
2019-12-18 13:34:22,199 INFO [main] org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB to the server
2019-12-18 13:34:22,200 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2019-12-18 13:34:22,200 INFO [IPC Server listener on 43753] org.apache.hadoop.ipc.Server: IPC Server listener on 43753: starting
2019-12-18 13:34:22,203 INFO [main] org.apache.hadoop.mapreduce.v2.app.client.MRClientService: Instantiated MRClientService at quickstart.cloudera/10.0.2.15:43753
2019-12-18 13:34:22,282 INFO [main] org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2019-12-18 13:34:22,292 INFO [main] org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2019-12-18 13:34:22,299 INFO [main] org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.mapreduce is not defined
2019-12-18 13:34:22,338 INFO [main] org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2019-12-18 13:34:22,345 INFO [main] org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to context mapreduce
2019-12-18 13:34:22,345 INFO [main] org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to context static
2019-12-18 13:34:22,348 INFO [main] org.apache.hadoop.http.HttpServer2: adding path spec: /mapreduce/*
2019-12-18 13:34:22,348 INFO [main] org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
2019-12-18 13:34:22,358 INFO [main] org.apache.hadoop.http.HttpServer2: Jetty bound to port 56166
2019-12-18 13:34:22,358 INFO [main] org.mortbay.log: jetty-6.1.26.cloudera.4
2019-12-18 13:34:22,388 INFO [main] org.mortbay.log: Extract jar:file:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.6.0-cdh5.13.0.jar!/webapps/mapreduce to /tmp/Jetty_0_0_0_0_56166_mapreduce____.rvyfuf/webapp
2019-12-18 13:34:22,766 INFO [main] org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:56166
2019-12-18 13:34:22,768 INFO [main] org.apache.hadoop.yarn.webapp.WebApps: Web app /mapreduce started at 56166
2019-12-18 13:34:23,062 INFO [main] org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
2019-12-18 13:34:23,066 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.speculate.DefaultSpeculator: JOB_CREATE job_1576701227741_0008
2019-12-18 13:34:23,067 INFO [main] org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 3000
2019-12-18 13:34:23,068 INFO [Socket Reader #1 for port 57335] org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57335
2019-12-18 13:34:23,073 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2019-12-18 13:34:23,075 INFO [IPC Server listener on 57335] org.apache.hadoop.ipc.Server: IPC Server listener on 57335: starting
2019-12-18 13:34:23,108 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: nodeBlacklistingEnabled:true
2019-12-18 13:34:23,108 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: maxTaskFailuresPerNode is 3
2019-12-18 13:34:23,108 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: blacklistDisablePercent is 33
2019-12-18 13:34:23,171 INFO [main] org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/10.0.2.15:8030
2019-12-18 13:34:23,257 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: maxContainerCapability: <memory:2816, vCores:2>
2019-12-18 13:34:23,257 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: queue: root.users.cloudera
2019-12-18 13:34:23,264 INFO [main] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Upper limit on the thread pool size is 500
2019-12-18 13:34:23,264 INFO [main] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: The thread pool initial size is 10
2019-12-18 13:34:23,272 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1576701227741_0008Job Transitioned from INITED to SETUP
2019-12-18 13:34:23,274 INFO [CommitterEvent Processor #0] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: JOB_SETUP
2019-12-18 13:34:23,282 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1576701227741_0008Job Transitioned from SETUP to RUNNING
2019-12-18 13:34:23,313 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1576701227741_0008_m_000000 Task Transitioned from NEW to SCHEDULED
2019-12-18 13:34:23,315 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_0 TaskAttempt Transitioned from NEW to UNASSIGNED
2019-12-18 13:34:23,316 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: mapResourceRequest:<memory:128, vCores:1>
2019-12-18 13:34:23,337 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event Writer setup for JobId: job_1576701227741_0008, File: hdfs://quickstart.cloudera:8020/user/cloudera/.staging/job_1576701227741_0008/job_1576701227741_0008_1.jhist
2019-12-18 13:34:23,772 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://quickstart.cloudera:8020]
2019-12-18 13:34:23,884 INFO [IPC Server handler 0 on 43753] org.apache.hadoop.mapreduce.v2.app.client.MRClientService: Getting task report for MAP   job_1576701227741_0008. Report-size will be 1
2019-12-18 13:34:23,941 INFO [IPC Server handler 0 on 43753] org.apache.hadoop.mapreduce.v2.app.client.MRClientService: Getting task report for REDUCE   job_1576701227741_0008. Report-size will be 0
2019-12-18 13:34:24,264 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:0 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:0 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:24,294 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2560, vCores:2> knownNMs=1
2019-12-18 13:34:26,313 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
2019-12-18 13:34:26,339 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_1576701227741_0008_01_000002 to attempt_1576701227741_0008_m_000000_0
2019-12-18 13:34:26,342 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:26,404 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-jar file on the remote FS is hdfs://quickstart.cloudera:8020/user/cloudera/.staging/job_1576701227741_0008/job.jar
2019-12-18 13:34:26,407 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-conf file on the remote FS is /user/cloudera/.staging/job_1576701227741_0008/job.xml
2019-12-18 13:34:26,419 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0 tokens and #1 secret keys for NM use for launching container
2019-12-18 13:34:26,419 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of containertokens_dob is 1
2019-12-18 13:34:26,419 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting shuffle token in serviceData
2019-12-18 13:34:26,457 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_0 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2019-12-18 13:34:26,462 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_1576701227741_0008_01_000002 taskAttempt attempt_1576701227741_0008_m_000000_0
2019-12-18 13:34:26,464 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1576701227741_0008_m_000000_0
2019-12-18 13:34:26,517 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1576701227741_0008_m_000000_0 : 13562
2019-12-18 13:34:26,518 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1576701227741_0008_m_000000_0] using containerId: [container_1576701227741_0008_01_000002 on NM: [quickstart.cloudera:8041]
2019-12-18 13:34:26,521 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_0 TaskAttempt Transitioned from ASSIGNED to RUNNING
2019-12-18 13:34:26,522 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1576701227741_0008_m_000000 Task Transitioned from SCHEDULED to RUNNING
2019-12-18 13:34:27,346 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:28,123 INFO [Socket Reader #1 for port 57335] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1576701227741_0008 (auth:SIMPLE)
2019-12-18 13:34:28,141 INFO [IPC Server handler 0 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1576701227741_0008_m_000002 asked for a task
2019-12-18 13:34:28,141 INFO [IPC Server handler 0 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1576701227741_0008_m_000002 given task: attempt_1576701227741_0008_m_000000_0
2019-12-18 13:34:29,250 FATAL [IPC Server handler 1 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1576701227741_0008_m_000000_0 - exited : java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:29,251 INFO [IPC Server handler 1 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_0: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:29,252 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_0: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:29,258 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_0 TaskAttempt Transitioned from RUNNING to FAIL_FINISHING_CONTAINER
2019-12-18 13:34:29,267 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures on node quickstart.cloudera
2019-12-18 13:34:29,269 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_1 TaskAttempt Transitioned from NEW to UNASSIGNED
2019-12-18 13:34:29,270 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1576701227741_0008_m_000000_1 to list of failed maps
2019-12-18 13:34:29,349 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:0 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:29,352 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:30,359 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_1576701227741_0008_01_000002
2019-12-18 13:34:30,360 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
2019-12-18 13:34:30,360 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_0: 
2019-12-18 13:34:30,360 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_0 TaskAttempt Transitioned from FAIL_FINISHING_CONTAINER to FAILED
2019-12-18 13:34:30,360 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_1576701227741_0008_01_000003, NodeId: quickstart.cloudera:8041, NodeHttpAddress: quickstart.cloudera:8042, Resource: <memory:512, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.0.2.15:8041 }, ] to fast fail map
2019-12-18 13:34:30,360 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
2019-12-18 13:34:30,360 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_1576701227741_0008_01_000003 to attempt_1576701227741_0008_m_000000_1
2019-12-18 13:34:30,361 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:30,361 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_1 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2019-12-18 13:34:30,362 INFO [ContainerLauncher #1] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_COMPLETED for container container_1576701227741_0008_01_000002 taskAttempt attempt_1576701227741_0008_m_000000_0
2019-12-18 13:34:30,365 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_1576701227741_0008_01_000003 taskAttempt attempt_1576701227741_0008_m_000000_1
2019-12-18 13:34:30,365 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1576701227741_0008_m_000000_1
2019-12-18 13:34:30,378 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1576701227741_0008_m_000000_1 : 13562
2019-12-18 13:34:30,378 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1576701227741_0008_m_000000_1] using containerId: [container_1576701227741_0008_01_000003 on NM: [quickstart.cloudera:8041]
2019-12-18 13:34:30,378 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_1 TaskAttempt Transitioned from ASSIGNED to RUNNING
2019-12-18 13:34:31,366 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:32,025 INFO [Socket Reader #1 for port 57335] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1576701227741_0008 (auth:SIMPLE)
2019-12-18 13:34:32,037 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1576701227741_0008_m_000003 asked for a task
2019-12-18 13:34:32,037 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1576701227741_0008_m_000003 given task: attempt_1576701227741_0008_m_000000_1
2019-12-18 13:34:33,199 FATAL [IPC Server handler 1 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1576701227741_0008_m_000000_1 - exited : java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:33,200 INFO [IPC Server handler 1 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_1: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:33,202 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_1: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:33,203 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_1 TaskAttempt Transitioned from RUNNING to FAIL_FINISHING_CONTAINER
2019-12-18 13:34:33,204 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 2 failures on node quickstart.cloudera
2019-12-18 13:34:33,204 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_2 TaskAttempt Transitioned from NEW to UNASSIGNED
2019-12-18 13:34:33,206 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1576701227741_0008_m_000000_2 to list of failed maps
2019-12-18 13:34:33,371 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:0 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:33,374 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_1576701227741_0008_01_000003
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_1576701227741_0008_01_000004, NodeId: quickstart.cloudera:8041, NodeHttpAddress: quickstart.cloudera:8042, Resource: <memory:512, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.0.2.15:8041 }, ] to fast fail map
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
2019-12-18 13:34:34,378 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_1: 
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_1576701227741_0008_01_000004 to attempt_1576701227741_0008_m_000000_2
2019-12-18 13:34:34,378 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_1 TaskAttempt Transitioned from FAIL_FINISHING_CONTAINER to FAILED
2019-12-18 13:34:34,378 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:34,379 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_2 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2019-12-18 13:34:34,380 INFO [ContainerLauncher #3] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_COMPLETED for container container_1576701227741_0008_01_000003 taskAttempt attempt_1576701227741_0008_m_000000_1
2019-12-18 13:34:34,384 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_1576701227741_0008_01_000004 taskAttempt attempt_1576701227741_0008_m_000000_2
2019-12-18 13:34:34,384 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1576701227741_0008_m_000000_2
2019-12-18 13:34:34,397 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1576701227741_0008_m_000000_2 : 13562
2019-12-18 13:34:34,398 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1576701227741_0008_m_000000_2] using containerId: [container_1576701227741_0008_01_000004 on NM: [quickstart.cloudera:8041]
2019-12-18 13:34:34,398 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_2 TaskAttempt Transitioned from ASSIGNED to RUNNING
2019-12-18 13:34:35,381 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:36,009 INFO [Socket Reader #1 for port 57335] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1576701227741_0008 (auth:SIMPLE)
2019-12-18 13:34:36,019 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1576701227741_0008_m_000004 asked for a task
2019-12-18 13:34:36,020 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1576701227741_0008_m_000004 given task: attempt_1576701227741_0008_m_000000_2
2019-12-18 13:34:37,087 FATAL [IPC Server handler 3 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1576701227741_0008_m_000000_2 - exited : java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:37,087 INFO [IPC Server handler 3 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_2: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:37,089 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_2: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:37,090 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_2 TaskAttempt Transitioned from RUNNING to FAIL_FINISHING_CONTAINER
2019-12-18 13:34:37,091 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_3 TaskAttempt Transitioned from NEW to UNASSIGNED
2019-12-18 13:34:37,091 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 3 failures on node quickstart.cloudera
2019-12-18 13:34:37,091 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: Blacklisted host quickstart.cloudera
2019-12-18 13:34:37,092 INFO [Thread-52] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1576701227741_0008_m_000000_3 to list of failed maps
2019-12-18 13:34:37,384 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:0 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:37,387 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:0, vCores:0> knownNMs=1
2019-12-18 13:34:37,387 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: Update the blacklist for application_1576701227741_0008: blacklistAdditions=1 blacklistRemovals=0
2019-12-18 13:34:37,387 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: Ignore blacklisting set to true. Known: 1, Blacklisted: 1, 100%
2019-12-18 13:34:38,390 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: Update the blacklist for application_1576701227741_0008: blacklistAdditions=0 blacklistRemovals=1
2019-12-18 13:34:38,390 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_1576701227741_0008_01_000004
2019-12-18 13:34:38,390 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:38,390 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_2: 
2019-12-18 13:34:38,390 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_2 TaskAttempt Transitioned from FAIL_FINISHING_CONTAINER to FAILED
2019-12-18 13:34:38,391 INFO [ContainerLauncher #5] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_COMPLETED for container container_1576701227741_0008_01_000004 taskAttempt attempt_1576701227741_0008_m_000000_2
2019-12-18 13:34:39,396 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
2019-12-18 13:34:39,396 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_1576701227741_0008_01_000005, NodeId: quickstart.cloudera:8041, NodeHttpAddress: quickstart.cloudera:8042, Resource: <memory:512, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.0.2.15:8041 }, ] to fast fail map
2019-12-18 13:34:39,396 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
2019-12-18 13:34:39,396 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_1576701227741_0008_01_000005 to attempt_1576701227741_0008_m_000000_3
2019-12-18 13:34:39,396 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:39,397 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_3 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2019-12-18 13:34:39,398 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_1576701227741_0008_01_000005 taskAttempt attempt_1576701227741_0008_m_000000_3
2019-12-18 13:34:39,398 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1576701227741_0008_m_000000_3
2019-12-18 13:34:39,409 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1576701227741_0008_m_000000_3 : 13562
2019-12-18 13:34:39,410 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1576701227741_0008_m_000000_3] using containerId: [container_1576701227741_0008_01_000005 on NM: [quickstart.cloudera:8041]
2019-12-18 13:34:39,410 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_3 TaskAttempt Transitioned from ASSIGNED to RUNNING
2019-12-18 13:34:40,399 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1576701227741_0008: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:2048, vCores:1> knownNMs=1
2019-12-18 13:34:40,954 INFO [Socket Reader #1 for port 57335] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1576701227741_0008 (auth:SIMPLE)
2019-12-18 13:34:40,966 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1576701227741_0008_m_000005 asked for a task
2019-12-18 13:34:40,966 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1576701227741_0008_m_000005 given task: attempt_1576701227741_0008_m_000000_3
2019-12-18 13:34:42,223 FATAL [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1576701227741_0008_m_000000_3 - exited : java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:42,223 INFO [IPC Server handler 4 on 57335] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_3: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:42,225 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1576701227741_0008_m_000000_3: Error: java.lang.ClassNotFoundException: com.mongodb.MongoClientURI
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at com.mongodb.hadoop.input.MongoInputSplit.readFields(MongoInputSplit.java:241)
	at com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit.readFields(HiveMongoInputFormat.java:311)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:172)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2019-12-18 13:34:42,226 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_3 TaskAttempt Transitioned from RUNNING to FAIL_FINISHING_CONTAINER
2019-12-18 13:34:42,229 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1576701227741_0008_m_000000 Task Transitioned from RUNNING to FAILED
2019-12-18 13:34:42,229 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed Tasks: 1
2019-12-18 13:34:42,229 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as tasks failed. failedMaps:1 failedReduces:0
2019-12-18 13:34:42,230 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1576701227741_0008Job Transitioned from RUNNING to FAIL_ABORT
2019-12-18 13:34:42,234 INFO [CommitterEvent Processor #1] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: JOB_ABORT
2019-12-18 13:34:42,240 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1576701227741_0008Job Transitioned from FAIL_ABORT to FAILED
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: We are finishing cleanly so this is the last retry
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify RMCommunicator isAMLastRetry: true
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: RMCommunicator notified that shouldUnregistered is: true
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify JHEH isAMLastRetry: true
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified that forceJobCompletion is true
2019-12-18 13:34:42,242 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Calling stop for all the services
2019-12-18 13:34:42,243 INFO [Thread-69] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size of the outstanding queue size is 0
2019-12-18 13:34:42,277 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://quickstart.cloudera:8020/user/cloudera/.staging/job_1576701227741_0008/job_1576701227741_0008_1.jhist to hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008-1576704856456-cloudera-select+loc.address+FROM+theater%28Stage%2D1%29-1576704882229-0-0-FAILED-root.users.cloudera-1576704863268.jhist_tmp
2019-12-18 13:34:42,303 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008-1576704856456-cloudera-select+loc.address+FROM+theater%28Stage%2D1%29-1576704882229-0-0-FAILED-root.users.cloudera-1576704863268.jhist_tmp
2019-12-18 13:34:42,307 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://quickstart.cloudera:8020/user/cloudera/.staging/job_1576701227741_0008/job_1576701227741_0008_1_conf.xml to hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008_conf.xml_tmp
2019-12-18 13:34:42,340 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008_conf.xml_tmp
2019-12-18 13:34:42,350 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008.summary_tmp to hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008.summary
2019-12-18 13:34:42,354 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008_conf.xml_tmp to hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008_conf.xml
2019-12-18 13:34:42,356 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008-1576704856456-cloudera-select+loc.address+FROM+theater%28Stage%2D1%29-1576704882229-0-0-FAILED-root.users.cloudera-1576704863268.jhist_tmp to hdfs://quickstart.cloudera:8020/user/history/done_intermediate/cloudera/job_1576701227741_0008-1576704856456-cloudera-select+loc.address+FROM+theater%28Stage%2D1%29-1576704882229-0-0-FAILED-root.users.cloudera-1576704863268.jhist
2019-12-18 13:34:42,358 INFO [Thread-69] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler. super.stop()
2019-12-18 13:34:42,359 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1576701227741_0008_m_000000_3
2019-12-18 13:34:42,373 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1576701227741_0008_m_000000_3 TaskAttempt Transitioned from FAIL_FINISHING_CONTAINER to FAILED
2019-12-18 13:34:42,379 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Setting job diagnostics to Task failed task_1576701227741_0008_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

2019-12-18 13:34:42,380 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: History url is http://quickstart.cloudera:19888/jobhistory/job/job_1576701227741_0008
2019-12-18 13:34:42,386 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Waiting for application to be successfully unregistered.
2019-12-18 13:34:43,392 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:0 RackLocal:0
2019-12-18 13:34:43,393 INFO [Thread-69] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Deleting staging directory hdfs://quickstart.cloudera:8020 /user/cloudera/.staging/job_1576701227741_0008
2019-12-18 13:34:43,401 INFO [Thread-69] org.apache.hadoop.ipc.Server: Stopping server on 57335
2019-12-18 13:34:43,408 INFO [IPC Server listener on 57335] org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57335
2019-12-18 13:34:43,412 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: Stopping IPC Server Responder
2019-12-18 13:34:43,412 INFO [TaskHeartbeatHandler PingChecker] org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler: TaskHeartbeatHandler thread interrupted
2019-12-18 13:34:43,413 INFO [Ping Checker] org.apache.hadoop.yarn.util.AbstractLivelinessMonitor: TaskAttemptFinishingMonitor thread interrupted
2019-12-18 13:34:48,414 INFO [Thread-69] org.apache.hadoop.ipc.Server: Stopping server on 43753
2019-12-18 13:34:48,415 INFO [IPC Server listener on 43753] org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 43753
2019-12-18 13:34:48,415 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: Stopping IPC Server Responder
2019-12-18 13:34:48,424 INFO [Thread-69] org.mortbay.log: Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:0
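
For reference, the repeated java.lang.ClassNotFoundException: com.mongodb.MongoClientURI above is thrown while the map task deserializes the MongoDB input split (MongoInputSplit.readFields), so the MongoDB Java driver is missing from the YARN task classpath rather than only from HiveServer2. A quick diagnostic sketch is to check which JARs the Hive session has registered before re-running the query:

-- Diagnostic sketch (Hive CLI / Beeline): list the JARs registered in the
-- current session. If the mongo-hadoop connector and MongoDB Java driver JARs
-- were never registered here and are not on the cluster-wide classpath, the
-- map tasks cannot load com.mongodb.MongoClientURI.
LIST JARS;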

avatar
Super Guru
@sagittarian,

Thanks for sharing the finding and the solution. Yes, the error message means that the MongoDB Java driver and mongo-hadoop connector JAR files were missing from the classpath, so adding them resolves the issue.
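
For example, a minimal sketch of the same idea from the Hive side (the paths and file names below are placeholders; point them at wherever the mongo-hadoop connector and MongoDB Java driver JARs actually live):

-- Register the JARs for the current session; Hive ships session JARs to the
-- MapReduce tasks via the distributed cache, which avoids copying them onto
-- every node and restarting the cluster. Paths and versions are placeholders.
ADD JAR /path/to/mongo-hadoop-core.jar;
ADD JAR /path/to/mongo-hadoop-hive.jar;
ADD JAR /path/to/mongodb-driver.jar;
ADD JAR /path/to/bson.jar;

-- Cluster-wide alternative: set hive.aux.jars.path (in hive-site.xml or the
-- HiveServer2 safety valve in Cloudera Manager) to a comma-separated list of
-- file:// URLs for the same JARs, then restart HiveServer2.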

Cheers
Eric