Member since: 11-22-2019
Posts: 16 | Kudos Received: 0 | Solutions: 0
01-14-2021
09:41 PM
Hi Aakulov,
Thanks for your advice. I checked, and it does appear to have been a connection error; after restarting my Hive services the error above was resolved, but now I observe a new error, shown below. Could you advise?
2021-01-15 13:40:44,503 WARN [Timer-Driven Process Thread-5] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=f821e7aa-0176-1000-9088-506b00a72e66] Administratively yielding PutHiveQL_Listings after rolling back due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=52458c75-cdb5-415d-968b-d7b65fe6e54b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610689175761-2338, container=default, section=290], offset=139, length=139],offset=0,name=52458c75-cdb5-415d-968b-d7b65fe6e54b,size=139] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.StatsTask:
2021-01-15 13:40:44,503 ERROR [Timer-Driven Process Thread-5] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=f821e7aa-0176-1000-9088-506b00a72e66] Failed to process session due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=52458c75-cdb5-415d-968b-d7b65fe6e54b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610689175761-2338, container=default, section=290], offset=139, length=139],offset=0,name=52458c75-cdb5-415d-968b-d7b65fe6e54b,size=139] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.StatsTask: org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=52458c75-cdb5-415d-968b-d7b65fe6e54b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610689175761-2338, container=default, section=290], offset=139, length=139],offset=0,name=52458c75-cdb5-415d-968b-d7b65fe6e54b,size=139] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.StatsTask
org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=52458c75-cdb5-415d-968b-d7b65fe6e54b,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610689175761-2338, container=default, section=290], offset=139, length=139],offset=0,name=52458c75-cdb5-415d-968b-d7b65fe6e54b,size=139] due to java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.StatsTask
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnGroupError$2(ExceptionHandler.java:226)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnError$1(ExceptionHandler.java:179)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:226)
at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:60)
at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:103)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295)
at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:120)
at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1174)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.StatsTask
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:98)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at sun.reflect.GeneratedMethodAccessor789.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.access$100(StandardControllerServiceInvocationHandler.java:38)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler$ProxiedReturnObjectInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:240)
at com.sun.proxy.$Proxy227.execute(Unknown Source)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$null$3(PutHiveQL.java:254)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
... 18 common frames omitted
2021-01-15 13:40:45,504 INFO [Timer-Driven Process Thread-8] o.a.nifi.dbcp.hive.HiveConnectionPool HiveConnectionPool[id=ffd816d0-0176-1000-e684-cec6f778cb48] Simple Authentication
2021-01-15 13:40:45,504 INFO [Timer-Driven Process Thread-8] hive.ql.parse.ParseDriver Parsing command: insert into transformed_db.tbl_airbnb_listing_transformed
select a.*, 20210113 partition_date_id from
staging_db.etbl_raw_airbnb_listing a
2021-01-15 13:40:45,505 INFO [Timer-Driven Process Thread-8] hive.ql.parse.ParseDriver Parse Completed
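Since the failure is raised by the post-insert statistics step (StatsTask) rather than by the INSERT itself, one way to narrow it down (a sketch only; not confirmed as the fix in this thread) is to rerun the same statement directly in beeline with automatic stats gathering turned off, and gather statistics explicitly afterwards if needed:
-- Diagnostic only: skip the automatic StatsTask that is returning code 1
SET hive.stats.autogather=false;
INSERT INTO transformed_db.tbl_airbnb_listing_transformed
SELECT a.*, 20210113 AS partition_date_id
FROM staging_db.etbl_raw_airbnb_listing a;
-- Statistics can still be collected explicitly once the insert succeeds
ANALYZE TABLE transformed_db.tbl_airbnb_listing_transformed COMPUTE STATISTICS;
If the statement succeeds with autogather off, the problem is in the Hive-side stats collection rather than in the NiFi PutHiveQL processor.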
01-14-2021
05:36 AM
Hi All,
I tried to connect a simple insert workflow to Hive, but I am encountering an error. Could someone help with this?
2021-01-14 21:30:14,245 ERROR [Timer-Driven Process Thread-4] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=f821e7aa-0176-1000-9088-506b00a72e66] org.apache.nifi.processors.hive.PutHiveQL$$Lambda$1066/699859650@1c9df9e2 failed to process due to org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610610213856-985, container=default, section=985], offset=89099, length=139],offset=0,name=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,size=139] due to java.sql.SQLException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed); rolling back session: org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610610213856-985, container=default, section=985], offset=89099, length=139],offset=0,name=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,size=139] due to java.sql.SQLException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
org.apache.nifi.processor.exception.ProcessException: Failed to process StandardFlowFileRecord[uuid=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1610610213856-985, container=default, section=985], offset=89099, length=139],offset=0,name=acf2bb1a-10fa-4dc5-8e5f-c12bf706f45f,size=139] due to java.sql.SQLException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnGroupError$2(ExceptionHandler.java:226)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.lambda$createOnError$1(ExceptionHandler.java:179)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:226)
at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:60)
at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:103)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295)
at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:120)
at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1174)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:262)
at org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:98)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
at sun.reflect.GeneratedMethodAccessor658.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.access$100(StandardControllerServiceInvocationHandler.java:38)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler$ProxiedReturnObjectInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:240)
at com.sun.proxy.$Proxy214.execute(Unknown Source)
at org.apache.nifi.processors.hive.PutHiveQL.lambda$null$3(PutHiveQL.java:254)
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
... 18 common frames omitted
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:147)
at org.apache.thrift.transport.TTransport.write(TTransport.java:107)
at org.apache.thrift.transport.TSaslTransport.writeLength(TSaslTransport.java:391)
at org.apache.thrift.transport.TSaslTransport.flush(TSaslTransport.java:499)
at org.apache.thrift.transport.TSaslClientTransport.flush(TSaslClientTransport.java:37)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65)
at org.apache.hive.service.cli.thrift.TCLIService$Client.send_ExecuteStatement(TCLIService.java:219)
at org.apache.hive.service.cli.thrift.TCLIService$Client.ExecuteStatement(TCLIService.java:211)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:253)
... 30 common frames omitted
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:145)
... 38 common frames omitted
Best Regards,
Choon Kiat
Labels:
- Apache Hive
- Apache NiFi
12-20-2020
06:46 AM
Additional Info: Best Regards, CK
12-20-2020
05:55 AM
Hi All, I am facing some issues running Hive from DBeaver; when executing the query "Select count(*)" I observe the error below:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335) ~[hive-service-3.1.2.jar:3.1.2]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.2.jar:3.1.2]
at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.2.jar:3.1.2]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:316) ~[hive-service-3.1.2.jar:3.1.2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_275]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_275]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[hadoop-common-2.7.2.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:329) ~[hive-service-3.1.2.jar:3.1.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_275]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_275]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_275]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_275]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_275]
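Because the error is "return code 2 from MapRedTask", the query compiles but the MapReduce job it launches fails. A minimal sketch of checks that separate simple fetches from job-launching queries (the table name is a placeholder):
-- A plain fetch typically does not start a MapReduce job; if this works
-- but count(*) fails, the MR/YARN side is the likely suspect
SELECT * FROM your_db.your_table LIMIT 1;
-- count(*) forces a job and reproduces the failure path
SELECT count(*) FROM your_db.your_table;
-- Show which execution engine Hive is using (mr, tez, spark)
SET hive.execution.engine;
The task-level details behind a MapRedTask failure usually appear in the YARN/MapReduce job logs rather than in the HiveServer2 exception shown above.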
Labels:
- Apache Hive
03-17-2020
08:07 PM
Hi Eric, After copying those two policy jar files to "$JAVA_HOME/jre/lib/security", I still get the error below:
Mar 18 10:53:22.385 ERROR 30 com.cloudera.hiveserver2.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
java.sql.SQLException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98)
at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106)
at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63)
at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1)
at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
Caused by: com.cloudera.hiveserver2.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
... 30 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForName(Unknown Source)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Unknown Source)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Unknown Source)
at java.base/javax.security.auth.login.LoginContext.invoke(Unknown Source)
at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.login.LoginContext.invokePriv(Unknown Source)
at java.base/javax.security.auth.login.LoginContext.login(Unknown Source)
at com.cloudera.hiveserver2.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98)
at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106)
at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63)
at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1)
at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
After that, I tried declaring KRB5CCNAME as an environment variable and observed a different error, as below:
Mar 18 11:06:20.072 DEBUG 31 com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport: Kerberos subject retrieved via ticket cache lookup
Mar 18 11:06:20.222 ERROR 31 com.cloudera.hiveserver2.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.
java.sql.SQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98)
at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106)
at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63)
at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1)
at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
Caused by: com.cloudera.hiveserver2.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.
... 30 more
Caused by: java.lang.RuntimeException: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed
at com.cloudera.hiveserver2.hivecommon.api.HiveServerPrivilegedAction.run(Unknown Source)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.Subject.doAs(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98)
at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106)
at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63)
at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198)
at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1)
at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49)
at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86)
at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: com.cloudera.hive.jdbc4.internal.apache.thrift.transport.TTransportException: GSS initiate failed
at com.cloudera.hive.jdbc4.internal.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
at com.cloudera.hive.jdbc4.internal.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
at com.cloudera.hive.jdbc4.internal.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
... 34 more
Regards,
Choon Kiat
03-17-2020
07:01 PM
Hi Eric, Yes, due to privacy I changed "AIU.XXXXXX" to "Domain". Best Regards, Choon Kiat
03-16-2020
10:16 PM
Hi Eric,
Thanks for your advice. I configured the connection string by referring to the document you provided, but I still face the same error.
I have attached all the info below; I would appreciate your help.
1. jaas.conf
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="c:\ProgramData\MIT\Kerberos5\hive.keytab"
  principal="hive/sthdmgt1.aiu.xxxxxx@AIU.XXXXXX";
  doNotPrompt=true
};
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="c:\ProgramData\MIT\Kerberos5\hive.keytab"
  principal="hive/sthdmgt1.aiu.xxxxxx@Domain";
  doNotPrompt=true
};
2. dbeaver.ini
-startup
plugins/org.eclipse.equinox.launcher_1.5.600.v20191014-2022.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.1100.v20190907-0426
-vmargs
-XX:+IgnoreUnrecognizedVMOptions
--add-modules=ALL-SYSTEM
-Xms64m
-Xmx1024m
-Djavax.security.auth.useSubjectCredsOnly=false
-Djava.security.krb5.conf=C:\Program Files\DBeaver\krb5.conf
-Djava.security.auth.login.config=C:\Program Files\DBeaver\jaas.conf
3. JDBC String
jdbc:hive2://{host}:{port}/{database};AuthMech=1;KrbRealm=Domain;KrbHostFQDN={server};KrbServiceName=hive;KrbAuthType=2;LogLevel=6;LogPath=c:\ProgramData\MIT\Kerberos5\log.log
4. Krb5.ini
[libdefaults]
default_realm = Domain
dns_lookup_kdc = false
dns_lookup_realm = false
ticket_lifetime = 86400
renew_lifetime = 604800
forwardable = true
default_tgs_enctypes = aes256-cts-hmac-sha1-96
default_tkt_enctypes = aes256-cts-hmac-sha1-96
permitted_enctypes = aes256-cts-hmac-sha1-96
udp_preference_limit = 1
kdc_timeout = 3000

[realms]
AIU.XXXXXX = {
kdc = sthdnj1-pvt.Domain
admin_server = sthdnj1-pvt.Domain
}

[domain_realm]
5. Klist Info
C:\Program Files\MIT\Kerberos\bin>klist
Ticket cache: FILE:C:\temp\krb
Default principal: hive/sthdmgt1-pvt.Domain@Domain

Valid starting       Expires              Service principal
03/17/20 13:07:24    03/18/20 13:07:24    krbtgt/Domain@Domain
        renew until 03/22/20 13:07:24
Error Code:
Mar 17 13:12:05.063 ERROR 31 com.cloudera.hiveserver2.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
java.sql.SQLException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.ui.dialogs.connection.ConnectionWizard$ConnectionTester.run(ConnectionWizard.java:247)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
Caused by: com.cloudera.hiveserver2.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
... 20 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForName(Unknown Source)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Unknown Source)
at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Unknown Source)
at java.base/javax.security.auth.login.LoginContext.invoke(Unknown Source)
at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.login.LoginContext.invokePriv(Unknown Source)
at java.base/javax.security.auth.login.LoginContext.login(Unknown Source)
at com.cloudera.hiveserver2.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source)
at com.cloudera.hiveserver2.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.cloudera.hiveserver2.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.cloudera.hiveserver2.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106)
at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70)
at org.jkiss.dbeaver.ui.dialogs.connection.ConnectionWizard$ConnectionTester.run(ConnectionWizard.java:247)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Thanks & Regards,
Tan Choon Kiat
03-16-2020
01:20 AM
@EricL
Are you referring to the JDBC URL? If yes, below is the JDBC URL that I am using:
jdbc:hive2://10.11.121.20:10001/default;AuthMech=1;principal=hive/domain@domain;KrbHostFQDN=10.11.121.21;KrbServiceName=hive;KrbAuthType=2;LogLevel=6;LogPath=c:\ProgramData\MIT\Kerberos5\log.log
And below is the error log from my latest testing:
Mar 16 17:24:25.121 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.DSIConnection(com.cloudera.hive.hive.core.HiveJDBCEnvironment@3c7b0e50): +++++ enter +++++ Mar 16 17:24:25.122 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(101, Variant[type: TYPE_WSTRING, value: HiveJDBC]): +++++ enter +++++ Mar 16 17:24:25.122 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(139, Variant[type: TYPE_WSTRING, value: User]): +++++ enter +++++ Mar 16 17:24:25.123 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(22, Variant[type: TYPE_WSTRING, value: Hive]): +++++ enter +++++ Mar 16 17:24:25.127 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(58, Variant[type: TYPE_WSTRING, value: `]): +++++ enter +++++ Mar 16 17:24:25.127 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(66, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:24:25.127 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(68, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:24:25.128 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(76, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:24:25.128 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(81, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:24:25.128 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(83, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:24:25.129 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(80, Variant[type: TYPE_WSTRING, value: N]): +++++ enter +++++ Mar 16 17:24:25.129 TRACE 41 com.cloudera.hive.hive.core.HiveJDBCConnection.HiveJDBCConnection(com.cloudera.hive.hive.core.HiveJDBCEnvironment@3c7b0e50): +++++ enter +++++ Mar 16 17:24:25.147 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.registerWarningListener(com.cloudera.hive.jdbc.common.SWarningListener@755cbca6): +++++ enter +++++ Mar 16 17:24:25.147 TRACE 41 com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.updateConnectionSettings(): +++++ enter +++++ Mar 16 17:24:25.151 TRACE 41 com.cloudera.hive.jdbc.common.CommonCoreUtils.logConnectionFunctionEntrance({AuthMech=Variant[type: TYPE_WSTRING, value: 1], ConnSchema=Variant[type: TYPE_WSTRING, value: default], DatabaseType=Variant[type: TYPE_WSTRING, value: Hive], HiveServerType=Variant[type: TYPE_WSTRING, value: 2], Host=Variant[type: TYPE_WSTRING, value: 10.11.121.20], KrbAuthType=Variant[type: TYPE_WSTRING, value: 2], KrbHostFQDN=Variant[type: TYPE_WSTRING, value: 10.11.121.21], KrbRealm=Variant[type: TYPE_WSTRING, value: AIU.XXXXX], KrbServiceName=Variant[type: TYPE_WSTRING, value: hive], LogLevel=Variant[type: TYPE_WSTRING, value: 6], LogPath=Variant[type: TYPE_WSTRING, value: c:\ProgramData\MIT\Kerberos5\log.log], Port=Variant[type: TYPE_WSTRING, value: 10001], principal=Variant[type: TYPE_WSTRING, value: hive/sthdmgt1-pvt.aiu.xxxxxx@AIU.XXXXXX], sskTrustStore=Variant[type: TYPE_WSTRING, value: C:\ProgramData\MIT\Kerberos5\hive.truststore], ssl=Variant[type: TYPE_WSTRING, value: 1], trustStorePassword=Variant[type: TYPE_WSTRING, value: "AiuHive"]}, "Major Version: 2", "Minor Version: 5", "Hot Fix Version: 15", "Build Number: 1040", "java.vendor:AdoptOpenJDK", "java.version:11.0.5", "os.arch:amd64", "os.name:Windows 10", "os.version:10.0", "Runtime.totalMemory:82837504", "Runtime.maxMemory:1073741824", "Runtime.avaialableProcessors:8", URLClassLoader.getURLs(): No 
URLClassLoader available.): +++++ enter +++++ Mar 16 17:24:25.395 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaAccessControlContext(): +++++ enter +++++ Mar 16 17:24:25.406 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaJAASConfig(): +++++ enter +++++ Mar 16 17:24:25.406 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(): +++++ enter +++++ Mar 16 17:24:25.441 ERROR 41 com.cloudera.hive.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: CONN_KERBEROS_AUTHENTICATION_ERROR_GET_TICKETCACHE. java.sql.SQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: CONN_KERBEROS_AUTHENTICATION_ERROR_GET_TICKETCACHE. at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98) at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106) at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63) at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1) at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) Caused by: 
com.cloudera.hive.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: CONN_KERBEROS_AUTHENTICATION_ERROR_GET_TICKETCACHE. ... 30 more Caused by: com.cloudera.hive.support.exceptions.GeneralException: CONN_KERBEROS_AUTHENTICATION_ERROR_GET_TICKETCACHE ... 30 more Caused by: javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForName(Unknown Source) at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Unknown Source) at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Unknown Source) at java.base/javax.security.auth.login.LoginContext.invoke(Unknown Source) at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source) at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source) at java.base/java.security.AccessController.doPrivileged(Native Method) at java.base/javax.security.auth.login.LoginContext.invokePriv(Unknown Source) at java.base/javax.security.auth.login.LoginContext.login(Unknown Source) at com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(Unknown Source) at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98) at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106) at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63) at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1) at 
org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Mar 16 17:33:37.773 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.DSIConnection(com.cloudera.hive.hive.core.HiveJDBCEnvironment@327a194b): +++++ enter +++++ Mar 16 17:33:37.778 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(101, Variant[type: TYPE_WSTRING, value: HiveJDBC]): +++++ enter +++++ Mar 16 17:33:37.778 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(139, Variant[type: TYPE_WSTRING, value: User]): +++++ enter +++++ Mar 16 17:33:37.779 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(22, Variant[type: TYPE_WSTRING, value: Hive]): +++++ enter +++++ Mar 16 17:33:37.780 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(58, Variant[type: TYPE_WSTRING, value: `]): +++++ enter +++++ Mar 16 17:33:37.780 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(66, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:33:37.781 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(68, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:33:37.782 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(76, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:33:37.782 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(81, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:33:37.783 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(83, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++ Mar 16 17:33:37.784 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.setProperty(80, Variant[type: TYPE_WSTRING, value: N]): +++++ enter +++++ Mar 16 17:33:37.785 TRACE 41 com.cloudera.hive.hive.core.HiveJDBCConnection.HiveJDBCConnection(com.cloudera.hive.hive.core.HiveJDBCEnvironment@327a194b): +++++ enter +++++ Mar 16 17:33:37.805 TRACE 41 com.cloudera.hive.dsi.core.impl.DSIConnection.registerWarningListener(com.cloudera.hive.jdbc.common.SWarningListener@13ff0f12): +++++ enter +++++ Mar 16 17:33:37.805 TRACE 41 com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.updateConnectionSettings(): +++++ enter +++++ Mar 16 17:33:37.805 TRACE 41 com.cloudera.hive.jdbc.common.CommonCoreUtils.logConnectionFunctionEntrance({AuthMech=Variant[type: TYPE_WSTRING, value: 1], ConnSchema=Variant[type: TYPE_WSTRING, value: default], DatabaseType=Variant[type: TYPE_WSTRING, value: Hive], HiveServerType=Variant[type: TYPE_WSTRING, value: 2], Host=Variant[type: TYPE_WSTRING, value: 10.11.121.20], KrbAuthType=Variant[type: TYPE_WSTRING, value: 2], KrbHostFQDN=Variant[type: TYPE_WSTRING, value: 10.11.121.21], KrbRealm=Variant[type: TYPE_WSTRING, value: AIU.XXXXXX], KrbServiceName=Variant[type: TYPE_WSTRING, value: hive], LogLevel=Variant[type: TYPE_WSTRING, value: 6], LogPath=Variant[type: TYPE_WSTRING, value: c:\ProgramData\MIT\Kerberos5\log.log], Port=Variant[type: TYPE_WSTRING, value: 10001], principal=Variant[type: TYPE_WSTRING, value: hive/sthdmgt1-pvt.aiu.xxxxxx@AIU.XXXXXX], sskTrustStore=Variant[type: TYPE_WSTRING, value: C:\ProgramData\MIT\Kerberos5\hive.truststore], ssl=Variant[type: TYPE_WSTRING, value: 1], trustStorePassword=Variant[type: TYPE_WSTRING, value: "AiuHive"]}, "Major Version: 2", "Minor Version: 5", "Hot Fix Version: 15", "Build Number: 1040", "java.vendor:AdoptOpenJDK", "java.version:11.0.5", "os.arch:amd64", "os.name:Windows 10", "os.version:10.0", "Runtime.totalMemory:80740352", "Runtime.maxMemory:1073741824", "Runtime.avaialableProcessors:8", URLClassLoader.getURLs(): No 
URLClassLoader available.): +++++ enter +++++ Mar 16 17:33:38.213 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaAccessControlContext(): +++++ enter +++++ Mar 16 17:33:38.231 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaJAASConfig(): +++++ enter +++++ Mar 16 17:33:38.231 TRACE 41 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(): +++++ enter +++++ Mar 16 17:33:38.302 ERROR 41 com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport: Kerberos subject retrieved via ticket cache lookup Mar 16 17:33:39.169 ERROR 41 com.cloudera.hive.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed Also, could not send response: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target. java.sql.SQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed Also, could not send response: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target. at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98) at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106) at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63) at 
org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1) at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) Caused by: com.cloudera.hive.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed Also, could not send response: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target. ... 30 more Caused by: java.lang.RuntimeException: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed Also, could not send response: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target at com.cloudera.hive.hivecommon.api.HiveServerPrivilegedAction.run(Unknown Source) at java.base/java.security.AccessController.doPrivileged(Native Method) at java.base/javax.security.auth.Subject.doAs(Unknown Source) at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at 
org.jkiss.dbeaver.runtime.jobs.ConnectJob.runSync(ConnectJob.java:98) at org.jkiss.dbeaver.ui.actions.datasource.DataSourceHandler.connectToDataSource(DataSourceHandler.java:106) at org.jkiss.dbeaver.ui.actions.datasource.UIServiceConnectionsImpl.initConnection(UIServiceConnectionsImpl.java:63) at org.jkiss.dbeaver.model.navigator.DBNDataSource.initializeNode(DBNDataSource.java:151) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:198) at org.jkiss.dbeaver.model.navigator.DBNDatabaseNode.getChildren(DBNDatabaseNode.java:1) at org.jkiss.dbeaver.model.navigator.DBNUtils.getNodeChildrenFiltered(DBNUtils.java:70) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:49) at org.jkiss.dbeaver.ui.navigator.database.load.TreeLoadService.evaluate(TreeLoadService.java:1) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:86) at org.jkiss.dbeaver.ui.LoadingJob.run(LoadingJob.java:71) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63) Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed Also, could not send response: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ... 34 more
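For reference, the PKIX error in the trace above typically means the CA that signed the HiveServer2 TLS certificate is not in the truststore the driver is using. Below is a minimal, illustrative Java sketch of a Kerberos plus SSL connection with the Cloudera Hive JDBC driver; the host name, realm and truststore path are placeholders, and the URL property names (AuthMech, KrbRealm, KrbHostFQDN, KrbServiceName, SSL, SSLTrustStore, SSLTrustStorePwd) should be verified against the driver's install guide for your version:

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveKerberosSslCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host, realm and truststore path; adjust for your cluster.
        String url = "jdbc:hive2://hs2-host.example.com:10000/default;"
                + "AuthMech=1;KrbRealm=EXAMPLE.COM;"
                + "KrbHostFQDN=hs2-host.example.com;KrbServiceName=hive;"
                + "SSL=1;SSLTrustStore=/path/to/truststore.jks;"
                + "SSLTrustStorePwd=changeit";
        // The truststore must contain the CA chain for the HiveServer2 certificate,
        // otherwise the SSLHandshakeException (PKIX path building failed) recurs.
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected, closed=" + conn.isClosed());
        }
    }
}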
03-15-2020
09:04 PM
The authentication is Kerberos-based.
I have obtained the Kerberos ticket with MIT Kerberos for Windows, but I receive the following error message when trying to connect to Cloudera Hive with DBeaver:
Mar 16 11:56:00.397 TRACE 33 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaAccessControlContext(): +++++ enter +++++ Mar 16 11:56:00.398 TRACE 33 com.cloudera.hive.jdbc.kerberos.Kerberos.getSubjectViaJAASConfig(): +++++ enter +++++ Mar 16 11:56:00.399 ERROR 33 com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport: Kerberos subject retrieved via JAAS config Mar 16 11:56:00.920 ERROR 33 com.cloudera.hive.exceptions.ExceptionConverter.toSQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed. java.sql.SQLException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed. at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at org.jkiss.dbeaver.ui.dialogs.connection.ConnectionWizard$ConnectionTester.run(ConnectionWizard.java:247) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) Caused by: com.cloudera.hive.support.exceptions.GeneralException: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed. ... 
20 more Caused by: java.lang.RuntimeException: [Cloudera][HiveJDBCDriver](500168) Unable to connect to server: GSS initiate failed at com.cloudera.hive.hivecommon.api.HiveServerPrivilegedAction.run(Unknown Source) at java.base/java.security.AccessController.doPrivileged(Native Method) at java.base/javax.security.auth.Subject.doAs(Unknown Source) at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) at com.cloudera.hive.hivecommon.api.ZooKeeperEnabledExtendedHS2Factory.createClient(Unknown Source) at com.cloudera.hive.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source) at com.cloudera.hive.hive.core.HiveJDBCConnection.connect(Unknown Source) at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) at com.cloudera.hive.jdbc.common.AbstractDriver.connect(Unknown Source) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.lambda$0(JDBCDataSource.java:157) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:174) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:124) at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:91) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:86) at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:52) at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:109) at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:106) at org.jkiss.dbeaver.ext.generic.model.meta.GenericMetaModel.createDataSourceImpl(GenericMetaModel.java:72) at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:95) at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:801) at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:70) at org.jkiss.dbeaver.ui.dialogs.connection.ConnectionWizard$ConnectionTester.run(ConnectionWizard.java:247) at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:103) at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63) Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ... 24 more
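"GSS initiate failed" at this stage usually means the driver could not obtain a usable TGT from the environment it runs in. As a hedged sketch (not the confirmed fix for this thread), the JVM can be pointed at the same krb5 configuration and ticket cache that MIT kinit used; with DBeaver these -D properties would normally go into dbeaver.ini after -vmargs. The file paths and the jaas.conf contents shown in the comment are assumptions to adapt:

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveKerberosTicketCacheCheck {
    public static void main(String[] args) throws Exception {
        // Point the JVM at the krb5 configuration used by kinit (placeholder path).
        System.setProperty("java.security.krb5.conf", "C:/ProgramData/MIT/Kerberos5/krb5.ini");
        // JAAS configuration that reuses the existing ticket cache, e.g. a jaas.conf containing:
        //   Client { com.sun.security.auth.module.Krb5LoginModule required
        //            useTicketCache=true doNotPrompt=true; };
        System.setProperty("java.security.auth.login.config", "C:/kerberos/jaas.conf");
        // Allow GSS to fall back to the ticket cache instead of requiring an explicit Subject.
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        String url = "jdbc:hive2://hs2-host.example.com:10000/default;"
                + "AuthMech=1;KrbRealm=EXAMPLE.COM;"
                + "KrbHostFQDN=hs2-host.example.com;KrbServiceName=hive";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Kerberos login and connection succeeded");
        }
    }
}

On Windows with MIT Kerberos it is also worth confirming that the ticket was obtained with kinit into the cache the JVM actually reads, rather than only into the Windows LSA cache.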
Labels: Apache Hive, Kerberos
01-13-2020
11:41 AM
How did you roll back MySQL? We downgraded from version 5.7 to 5.5.
Did you roll back to a previous backup of the database? No backup had been taken.
Did you shut down the cluster when you did the database upgrade? Yes, before the upgrade we shut down the whole cluster and all services.
You mentioned that, based on the error message in my latest post, there appear to be some out-of-sync issues between the CM database and the cluster activities. What action could be taken?
01-13-2020
10:24 AM
Please refer to the error messages below. I have rolled back to the previous version of the database and checked db.properties; cloudera-scm-server is currently up and running, but I am still getting some errors.
2020-01-14 02:22:42,620 WARN com.cloudera.cmf.scheduler-1_Worker-1:com.cloudera.cmf.command.flow.CmdStep: Unexpected exception during command work
java.lang.NullPointerException
at com.cloudera.cmf.service.ScheduledSnapshotsCmdWork.doWork(ScheduledSnapshotsCmdWork.java:128)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqCmdWork.doWork(SeqCmdWork.java:107)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqFlowCmd.run(SeqFlowCmd.java:117)
at com.cloudera.cmf.command.CmdWorkCommand.execute(CmdWorkCommand.java:94)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommandHelper(ServiceHandlerRegistry.java:885)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:845)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:840)
at com.cloudera.server.cmf.components.OperationsManagerImpl.executeServiceCmd(OperationsManagerImpl.java:1790)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.dispatchCommand(CommandDispatcherJob.java:234)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.executeCommand(CommandDispatcherJob.java:191)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.access$500(CommandDispatcherJob.java:61)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:168)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:165)
at com.cloudera.server.common.RetryWrapper.executeWithRetry(RetryWrapper.java:32)
at com.cloudera.server.common.RetryUtils.executeWithRetryHelper(RetryUtils.java:210)
at com.cloudera.server.common.RetryUtils.executeWithRetryConstantSleep(RetryUtils.java:73)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.execute(CommandDispatcherJob.java:162)
at org.quartz.core.JobRunShell.run(JobRunShell.java:206)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:548)
2020-01-14 02:22:42,620 WARN com.cloudera.cmf.scheduler-1_Worker-1:com.cloudera.cmf.command.flow.CmdStep: Unexpected exception during command work
java.lang.NullPointerException
at com.cloudera.cmf.service.ScheduledSnapshotsCmdWork.doWork(ScheduledSnapshotsCmdWork.java:128)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqCmdWork.doWork(SeqCmdWork.java:107)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqFlowCmd.run(SeqFlowCmd.java:117)
at com.cloudera.cmf.command.CmdWorkCommand.execute(CmdWorkCommand.java:94)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommandHelper(ServiceHandlerRegistry.java:885)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:845)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:840)
at com.cloudera.server.cmf.components.OperationsManagerImpl.executeServiceCmd(OperationsManagerImpl.java:1790)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.dispatchCommand(CommandDispatcherJob.java:234)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.executeCommand(CommandDispatcherJob.java:191)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.access$500(CommandDispatcherJob.java:61)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:168)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:165)
at com.cloudera.server.common.RetryWrapper.executeWithRetry(RetryWrapper.java:32)
at com.cloudera.server.common.RetryUtils.executeWithRetryHelper(RetryUtils.java:210)
at com.cloudera.server.common.RetryUtils.executeWithRetryConstantSleep(RetryUtils.java:73)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.execute(CommandDispatcherJob.java:162)
at org.quartz.core.JobRunShell.run(JobRunShell.java:206)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:548)
2020-01-14 02:22:42,621 ERROR com.cloudera.cmf.scheduler-1_Worker-1:com.cloudera.cmf.scheduler.CommandDispatcherJob: Failed to invoke command (HdfsScheduledSnapshotsCommand) for schedule (8)
com.cloudera.cmf.command.CmdExecException: java.lang.NullPointerException
at com.cloudera.cmf.command.flow.SeqFlowCmd.run(SeqFlowCmd.java:119)
at com.cloudera.cmf.command.CmdWorkCommand.execute(CmdWorkCommand.java:94)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommandHelper(ServiceHandlerRegistry.java:885)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:845)
at com.cloudera.cmf.service.ServiceHandlerRegistry.executeCommand(ServiceHandlerRegistry.java:840)
at com.cloudera.server.cmf.components.OperationsManagerImpl.executeServiceCmd(OperationsManagerImpl.java:1790)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.dispatchCommand(CommandDispatcherJob.java:234)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.executeCommand(CommandDispatcherJob.java:191)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.access$500(CommandDispatcherJob.java:61)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:168)
at com.cloudera.cmf.scheduler.CommandDispatcherJob$1.call(CommandDispatcherJob.java:165)
at com.cloudera.server.common.RetryWrapper.executeWithRetry(RetryWrapper.java:32)
at com.cloudera.server.common.RetryUtils.executeWithRetryHelper(RetryUtils.java:210)
at com.cloudera.server.common.RetryUtils.executeWithRetryConstantSleep(RetryUtils.java:73)
at com.cloudera.cmf.scheduler.CommandDispatcherJob.execute(CommandDispatcherJob.java:162)
at org.quartz.core.JobRunShell.run(JobRunShell.java:206)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:548)
Caused by: java.lang.NullPointerException
at com.cloudera.cmf.service.ScheduledSnapshotsCmdWork.doWork(ScheduledSnapshotsCmdWork.java:128)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqCmdWork.doWork(SeqCmdWork.java:107)
at com.cloudera.cmf.command.flow.CmdStep.doWork(CmdStep.java:177)
at com.cloudera.cmf.command.flow.SeqFlowCmd.run(SeqFlowCmd.java:117)
... 16 more
2020-01-14 02:22:42,624 INFO com.cloudera.cmf.scheduler-1_Worker-1:com.cloudera.cmf.service.ServiceHandlerRegistry: Executing command GlobalPoolsRefresh BasicCmdArgs{scheduleId=1, scheduledTime=2016-04-03T00:00:00.000Z}.
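The NullPointerException above is thrown while executing a scheduled HDFS snapshot command (schedule 8), which suggests the schedule's backing entities no longer line up after the database rollback. As a purely illustrative sketch, and with the REST path and API version stated as assumptions to verify against the Cloudera Manager API documentation for your release, listing the configured snapshot policies to see whether the schedule still references a valid service could look like this:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class ListSnapshotPolicies {
    public static void main(String[] args) throws Exception {
        // Assumed endpoint shape; substitute your CM host, API version, cluster and service names.
        URL url = new URL("http://cm-host.example.com:7180/api/v19/clusters/Cluster%201/services/hdfs/snapshots/policies");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Prints the JSON list of snapshot policies, if the endpoint matches your CM version.
                System.out.println(line);
            }
        }
    }
}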
01-13-2020
08:04 AM
Could someone please advise on this?
After upgrading MySQL to the latest version, cloudera-scm-server fails to start.
/var/log/cloudera-scm-server/cloudera-scm-server.log
2020-01-13 23:58:49,993 WARN main:org.hibernate.engine.jdbc.spi.SqlExceptionHelper: SQL Error: 0, SQLState: null
2020-01-13 23:58:49,994 ERROR main:org.hibernate.engine.jdbc.spi.SqlExceptionHelper: Connections could not be acquired from the underlying database!
2020-01-13 23:58:50,008 INFO main:org.springframework.beans.factory.support.DefaultListableBeanFactory: Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@6200f9cb: defining beans [commandLineConfigurationBean,entityManagerFactoryBean,com.cloudera.server.cmf.TrialState,com.cloudera.server.cmf.TrialManager,com.cloudera.cmf.crypto.LicenseLoader]; root of factory hierarchy
2020-01-13 23:58:50,008 ERROR main:com.cloudera.server.cmf.Main: Server failed.
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.cloudera.server.cmf.TrialState': Cannot resolve reference to bean 'entityManagerFactoryBean' while setting constructor argument; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactoryBean': FactoryBean threw exception on object creation; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:328)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:106)
at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:616)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:148)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1003)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:907)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:485)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:293)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:290)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:192)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:585)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
at com.cloudera.server.cmf.Main.bootstrapSpringContext(Main.java:393)
at com.cloudera.server.cmf.Main.<init>(Main.java:243)
at com.cloudera.server.cmf.Main.main(Main.java:216)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactoryBean': FactoryBean threw exception on object creation; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:149)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:102)
at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1440)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:247)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:192)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:322)
... 17 more
Caused by: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection
at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1387)
at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1310)
at org.hibernate.ejb.AbstractEntityManagerImpl.throwPersistenceException(AbstractEntityManagerImpl.java:1397)
at org.hibernate.ejb.TransactionImpl.begin(TransactionImpl.java:62)
at com.cloudera.enterprise.AbstractWrappedEntityManager.beginForRollbackAndReadonly(AbstractWrappedEntityManager.java:89)
at com.cloudera.enterprise.dbutil.DbUtil.isInnoDbEnabled(DbUtil.java:549)
at com.cloudera.server.cmf.bootstrap.EntityManagerFactoryBean.checkMysqlTableEngineType(EntityManagerFactoryBean.java:139)
at com.cloudera.server.cmf.bootstrap.EntityManagerFactoryBean.getObject(EntityManagerFactoryBean.java:122)
at com.cloudera.server.cmf.bootstrap.EntityManagerFactoryBean.getObject(EntityManagerFactoryBean.java:65)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:142)
... 22 more
Caused by: org.hibernate.exception.GenericJDBCException: Could not open connection
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:54)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:125)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:110)
at org.hibernate.engine.jdbc.internal.LogicalConnectionImpl.obtainConnection(LogicalConnectionImpl.java:221)
at org.hibernate.engine.jdbc.internal.LogicalConnectionImpl.getConnection(LogicalConnectionImpl.java:157)
at org.hibernate.engine.transaction.internal.jdbc.JdbcTransaction.doBegin(JdbcTransaction.java:67)
at org.hibernate.engine.transaction.spi.AbstractTransactionImpl.begin(AbstractTransactionImpl.java:160)
at org.hibernate.internal.SessionImpl.beginTransaction(SessionImpl.java:1426)
at org.hibernate.ejb.TransactionImpl.begin(TransactionImpl.java:59)
... 28 more
Caused by: java.sql.SQLException: Connections could not be acquired from the underlying database!
at com.mchange.v2.sql.SqlUtils.toSQLException(SqlUtils.java:106)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool.checkoutPooledConnection(C3P0PooledConnectionPool.java:529)
at com.mchange.v2.c3p0.impl.AbstractPoolBackedDataSource.getConnection(AbstractPoolBackedDataSource.java:128)
at org.hibernate.service.jdbc.connections.internal.C3P0ConnectionProvider.getConnection(C3P0ConnectionProvider.java:84)
at org.hibernate.internal.AbstractSessionImpl$NonContextualJdbcConnectionAccess.obtainConnection(AbstractSessionImpl.java:292)
at org.hibernate.engine.jdbc.internal.LogicalConnectionImpl.obtainConnection(LogicalConnectionImpl.java:214)
... 33 more
Caused by: com.mchange.v2.resourcepool.CannotAcquireResourceException: A ResourcePool could not acquire a resource from its primary factory or source.
at com.mchange.v2.resourcepool.BasicResourcePool.awaitAvailable(BasicResourcePool.java:1319)
at com.mchange.v2.resourcepool.BasicResourcePool.prelimCheckoutResource(BasicResourcePool.java:557)
at com.mchange.v2.resourcepool.BasicResourcePool.checkoutResource(BasicResourcePool.java:477)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool.checkoutPooledConnection(C3P0PooledConnectionPool.java:525)
... 37 more
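"Connections could not be acquired from the underlying database" almost always means the JDBC URL, credentials or MySQL server that Cloudera Manager reads from /etc/cloudera-scm-server/db.properties no longer work after the MySQL change. A minimal standalone connectivity check, assuming the usual com.cloudera.cmf.db.* keys in that file (verify the key names against your own copy) and a MySQL JDBC driver on the classpath:

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ScmDbConnectivityCheck {
    public static void main(String[] args) throws Exception {
        Properties p = new Properties();
        try (FileInputStream in = new FileInputStream("/etc/cloudera-scm-server/db.properties")) {
            p.load(in);
        }
        // Key names assumed from a typical db.properties; adjust if your file differs.
        String host = p.getProperty("com.cloudera.cmf.db.host", "localhost");
        String name = p.getProperty("com.cloudera.cmf.db.name", "scm");
        String user = p.getProperty("com.cloudera.cmf.db.user", "scm");
        String pass = p.getProperty("com.cloudera.cmf.db.password", "");
        String url = "jdbc:mysql://" + host + "/" + name;
        try (Connection conn = DriverManager.getConnection(url, user, pass)) {
            System.out.println("Connected to " + url + ", valid=" + conn.isValid(5));
        }
    }
}

If this small check fails with the same error, the problem is in MySQL itself (service down, changed authentication plugin, or dropped user/grants after the upgrade) rather than in Cloudera Manager.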
Labels: Cloudera Manager
11-22-2019
09:20 AM
Hi Sir, please refer to the log below:
2019-11-23 01:18:10,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443053160 last_version = 84 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:16,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:22,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:28,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:34,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:40,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:46,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:52,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443093524 last_version = 85 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-23 01:18:58,551 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574443133669 last_version = 86 cur_worker_name = sthddn06.aiu.axiata,60020,1574441947956 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
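The same split task stays in_progress in every status line above, which points at a WAL under the ...-splitting directory that the region servers cannot finish replaying. As an illustrative sketch only (the HDFS path below is reconstructed from the URL-encoded task name in the log and must be double-checked against hbase.rootdir; sidelining a WAL risks data loss and should only be done with guidance), listing that directory with the Hadoop FileSystem API shows whether the file is still present and how large it is:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListStuckWalSplitDir {
    public static void main(String[] args) throws Exception {
        // Decoded from the SplitLogManager task name; confirm the real path with `hdfs dfs -ls`.
        // Requires the cluster's core-site.xml/hdfs-site.xml on the classpath.
        Path splittingDir = new Path("/hbase/WALs/sthddn12.aiu.axiata,60020,1563800191679-splitting");
        FileSystem fs = FileSystem.get(new Configuration());
        for (FileStatus st : fs.listStatus(splittingDir)) {
            System.out.println(st.getPath() + " len=" + st.getLen() + " mtime=" + st.getModificationTime());
        }
    }
}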
11-22-2019
02:42 AM
Hi Sir, as per your request:
[root@sthdnn1-pvt hbase]# tail -f hbase-cmf-hbase-MASTER-sthdnn1-pvt.aiu.axiata.log.out
2019-11-22 18:36:44,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574418972254 last_version = 71 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:36:50,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574418972254 last_version = 71 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:36:56,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:02,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:08,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:14,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:20,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:26,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419012458 last_version = 72 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:32,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:38,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:44,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:50,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:37:56,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:38:02,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:38:08,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419052627 last_version = 73 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
2019-11-22 18:38:14,667 INFO org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 1 unassigned = 0 tasks={/hbase/splitWAL/WALs%2Fsthddn12.aiu.axiata%2C60020%2C1563800191679-splitting%2Fsthddn12.aiu.axiata%252C60020%252C1563800191679.meta.1571407653211.meta=last_update = 1574419092734 last_version = 74 cur_worker_name = sthddn19.aiu.axiata,60020,1574417963955 status = in_progress incarnation = 1 resubmits = 0 batch = installed = 1 done = 0 error = 0}
11-22-2019
02:26 AM
Hi Sir,
FYI
2019-11-22 18:21:06,962 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:SUPERVISOR_PROCESS_NAME=6214-hbase-MASTER 2019-11-22 18:21:06,962 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:SUPERVISOR_GROUP_NAME=6214-hbase-MASTER 2019-11-22 18:21:06,962 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CGROUP_ROOT_BLKIO=/var/run/cloudera-scm-agent/cgroups/blkio 2019-11-22 18:21:06,962 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HBASE_OPTS=-Djava.net.preferIPv4Stack=true -Djava.security.auth.login.config=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER/jaas.conf -Xms1073741824 -Xmx1073741824 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:ReservedCodeCacheSize=256m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/hbase_hbase-MASTER-02da15bc011658b41f9bf0ad81ff1d8d_pid41509.hprof -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh -Dhbase.log.dir=/var/log/hbase -Dhbase.log.file=hbase-cmf-hbase-MASTER-sthdmgt1-pvt.aiu.axiata.log.out -Dhbase.home.dir=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase -Dhbase.id.str= -Dhbase.root.logger=INFO,RFA -Djava.library.path=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/native/Linux-amd64-64 -Dhbase.security.logger=INFO,RFAS 2019-11-22 18:21:06,963 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/solr 2019-11-22 18:21:06,963 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hive-hcatalog 2019-11-22 18:21:06,963 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HBASE_LOGFILE=hbase-cmf-hbase-MASTER-sthdmgt1-pvt.aiu.axiata.log.out 2019-11-22 18:21:06,963 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/oozie 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/sentry 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:UPSTART_JOB=rc 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CONF_DIR=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:LANG=en_US.UTF-8 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:SPARK_LIBRARY_PATH=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HADOOP_CLASSPATH=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/* 2019-11-22 18:21:06,964 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CGROUP_ROOT_MEMORY=/var/run/cloudera-scm-agent/cgroups/memory 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CGROUP_GROUP_CPU= 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:ORACLE_HOME=/usr/share/oracle/instantclient 2019-11-22 18:21:06,965 INFO 
org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/sqoop2 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HBASE_CLASSPATH=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/*:/usr/share/cmf/lib/plugins/event-publish-5.14.3-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.14.3.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.13.3-shaded.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/* 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CONSOLETYPE=vt 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hive 2019-11-22 18:21:06,965 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop-kms 2019-11-22 18:21:06,966 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:previous=N 2019-11-22 18:21:06,966 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/impala 2019-11-22 18:21:06,966 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HBASE_ZNODE_FILE=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER/znode41509 2019-11-22 18:21:06,966 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_SPARK_CLASSPATH=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/spark-netlib/lib/* 2019-11-22 18:21:06,966 INFO org.apache.hadoop.hbase.util.ServerCommandLine: 
env:CLASSPATH=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/usr/java/latest/lib/tools.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/aws-java-sdk-bundle-1.11.134.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-codec-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-logging-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-math-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/core-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/disruptor-3.3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/findbugs-annotations-1.3.9-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/guava-12.0.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-annotations-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-annotations-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-client-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-common-1.2.0-cdh5.14.2.jar:/opt/cloud
era/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-common-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-examples-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-external-blockcache-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop2-compat-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop2-compat-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop-compat-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop-compat-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-it-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-it-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-prefix-tree-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-procedure-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-protocol-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-resource-bundle-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rest-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rsgroup-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rsgroup-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-server-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-server-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-shell-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-spark-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-thrift-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/high-scale-lib-1.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hsqldb-1.8.0.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-annotations-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-core-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-databind-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jamon-runtime-2.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/java-xmlbuilder-0.4.jar:/opt
/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jaxb-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jcodings-1.0.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-client-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jettison-1.3.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-sslengine-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/joni-2.1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jruby-cloudera-1.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-2.1-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/junit-4.12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/libthrift-0.9.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/metrics-core-2.2.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/netty-all-4.0.23.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/spymemcached-2.11.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/zookeeper.jar:/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-y
arn/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//*:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/*:/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/*:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/*:/usr/share/cmf/lib/plugins/event-publish-5.14.3-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.14.3.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.13.3-shaded.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/* 2019-11-22 18:21:06,973 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/bin/hadoop 2019-11-22 18:21:06,973 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop-0.20-mapreduce 2019-11-22 18:21:06,973 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CM_STATUS_CODES=STATUS_NONE HDFS_DFS_DIR_NOT_EMPTY HBASE_TABLE_DISABLED HBASE_TABLE_ENABLED JOBTRACKER_IN_STANDBY_MODE YARN_RM_IN_STANDBY_MODE 2019-11-22 18:21:06,973 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop-yarn 2019-11-22 18:21:06,974 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:SEARCH_HOME=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/search 2019-11-22 18:21:06,974 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:HOME=/var/run/hbase 2019-11-22 18:21:06,974 INFO org.apache.hadoop.hbase.util.ServerCommandLine: env:MALLOC_ARENA_MAX=4 2019-11-22 18:21:06,977 INFO org.apache.hadoop.hbase.util.ServerCommandLine: vmName=Java HotSpot(TM) 64-Bit Server VM, vmVendor=Oracle Corporation, vmVersion=25.144-b01 2019-11-22 18:21:06,978 INFO org.apache.hadoop.hbase.util.ServerCommandLine: vmInputArguments=[-Dproc_master, -XX:OnOutOfMemoryError=kill -9 %p, -Djava.net.preferIPv4Stack=true, -Djava.security.auth.login.config=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER/jaas.conf, -Xms1073741824, -Xmx1073741824, -XX:+UseParNewGC, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=70, -XX:+CMSParallelRemarkEnabled, -XX:ReservedCodeCacheSize=256m, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=/tmp/hbase_hbase-MASTER-02da15bc011658b41f9bf0ad81ff1d8d_pid41509.hprof, -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh, -Dhbase.log.dir=/var/log/hbase, -Dhbase.log.file=hbase-cmf-hbase-MASTER-sthdmgt1-pvt.aiu.axiata.log.out, -Dhbase.home.dir=/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase, -Dhbase.id.str=, -Dhbase.root.logger=INFO,RFA, -Djava.library.path=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/native/Linux-amd64-64, -Dhbase.security.logger=INFO,RFAS] 2019-11-22 18:21:07,288 INFO org.apache.hadoop.hbase.regionserver.RSRpcServices: 
master/sthdmgt1-pvt.aiu.axiata/192.168.225.71:60000 server-side HConnection retries=350 2019-11-22 18:21:07,300 INFO org.apache.hadoop.hbase.ipc.SimpleRpcScheduler: Using fifo as user call queue, count=3 2019-11-22 18:21:07,315 INFO org.apache.hadoop.hbase.ipc.RpcServer: master/sthdmgt1-pvt.aiu.axiata/192.168.225.71:60000: started 10 reader(s) listening on port=60000 2019-11-22 18:21:07,381 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties 2019-11-22 18:21:07,455 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s). 2019-11-22 18:21:07,456 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system started 2019-11-22 18:21:07,773 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user hbase/email@redacted.host using keytab file hbase.keytab 2019-11-22 18:21:07,778 INFO org.apache.hadoop.hbase.io.hfile.CacheConfig: Allocating LruBlockCache size=395.95 MB, blockSize=64 KB 2019-11-22 18:21:07,807 INFO org.apache.hadoop.hbase.io.hfile.CacheConfig: blockCache=LruBlockCache{blockCount=0, currentSize=427112, freeSize=414756568, maxSize=415183680, heapSize=427112, minSize=394424480, minFactor=0.95, multiSize=197212240, multiFactor=0.5, singleSize=98606120, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false, cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false, prefetchOnOpen=false 2019-11-22 18:21:07,809 INFO org.apache.hadoop.hbase.io.hfile.CacheConfig: blockCache=LruBlockCache{blockCount=0, currentSize=427112, freeSize=414756568, maxSize=415183680, heapSize=427112, minSize=394424480, minFactor=0.95, multiSize=197212240, multiFactor=0.5, singleSize=98606120, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false, cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false, prefetchOnOpen=false 2019-11-22 18:21:07,812 INFO org.apache.hadoop.hbase.mob.MobFileCache: MobFileCache is initialized, and the cache size is 1000 2019-11-22 18:21:08,446 INFO org.apache.hadoop.conf.Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS 2019-11-22 18:21:08,459 INFO org.apache.hadoop.hbase.fs.HFileSystem: Added intercepting call to namenode#getBlockLocations so can do block reordering using class class org.apache.hadoop.hbase.fs.HFileSystem$ReorderWALBlocks 2019-11-22 18:21:08,465 INFO org.apache.hadoop.hbase.fs.HFileSystem: Added intercepting call to namenode#getBlockLocations so can do block reordering using class class org.apache.hadoop.hbase.fs.HFileSystem$ReorderWALBlocks 2019-11-22 18:21:08,497 INFO org.apache.hadoop.conf.Configuration.deprecation: hadoop.native.lib is deprecated. 
Instead, use io.native.lib.available 2019-11-22 18:21:08,606 INFO org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper: Process identifier=master:60000 connecting to ZooKeeper ensemble=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181 2019-11-22 18:21:08,611 INFO org.apache.zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-cdh5.14.2--1, built on 03/27/2018 20:39 GMT 2019-11-22 18:21:08,612 INFO org.apache.zookeeper.ZooKeeper: Client environment:host.name=sthdmgt1-pvt.aiu.axiata 2019-11-22 18:21:08,612 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.version=1.8.0_144 2019-11-22 18:21:08,612 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation 2019-11-22 18:21:08,612 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.8.0_144/jre 2019-11-22 18:21:08,612 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.class.path=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/usr/java/latest/lib/tools.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/aws-java-sdk-bundle-1.11.134.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-codec-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-logging-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-math-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/core-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-fram
ework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/disruptor-3.3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/findbugs-annotations-1.3.9-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/guava-12.0.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-annotations-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-annotations-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-client-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-common-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-common-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-examples-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-external-blockcache-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop2-compat-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop2-compat-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop-compat-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-hadoop-compat-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-it-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-it-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-prefix-tree-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-procedure-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-protocol-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-resource-bundle-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rest-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rsgroup-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-rsgroup-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-server-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-server-1.2.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-shell-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-spark-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hbase-thrift-1.2.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/high-scale-lib-1.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/hsqldb-1.8.0.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core-3.2.0-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/htrace-core.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.
2.p0.3/lib/hbase/lib/jackson-annotations-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-core-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-databind-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jamon-runtime-2.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jaxb-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jcodings-1.0.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-client-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jettison-1.3.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-sslengine-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/joni-2.1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jruby-cloudera-1.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-2.1-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/junit-4.12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/libthrift-0.9.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/metrics-core-2.2.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/netty-all-4.0.23.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3
/lib/hbase/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/spymemcached-2.11.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/zookeeper.jar:/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/aws-java-sdk-bundle-1.11.134.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/httpcore-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/httpclient-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/logredactor-1.0.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-
5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/hue-plugins-3.9.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/azure-data-lake-store-sdk-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/.
./../hadoop/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/mockito-all-1.8.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/lib/stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-azure-datalake-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-annotations-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-scrooge_2.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format-javadoc.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-aws-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-scala_2.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-test-hadoop2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-2.6.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-tools.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-pig.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format-sources.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-nfs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH
-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-auth-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xml-apis-1.3.04.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/
CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/guice-3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/guice-servlet-3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-guice-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-client-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn
/lib/spark-yarn-shuffle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jline-2.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/spark-1.6.0-cdh5.14.2-yarn-shuffle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-api-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-registry-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-
yarn/.//hadoop-yarn-common-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-common-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-client-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/guice-3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/guice-servlet-3.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/jersey-guice-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/opt/cloude
ra/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//okhttp-2.4.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-rumen-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//httpcore-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../.
./hadoop-mapreduce/.//jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-distcp-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//httpclient-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-core-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-sls-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.
3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-datajoin-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-archives-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//metrics-core-3.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-ant.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-archive-logs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-5.14.2-1.
cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-streaming-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-gridmix-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//okio-1.4.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-ant-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-azure-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-extras-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//mockito-all-1.8.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-openstack-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/libexec/../../ha
doop-mapreduce/.//hadoop-auth-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hadoop-lzo.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.14.2.jar:/var/run/cloudera-scm-agent/process/6214-hbase-MASTER:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-azure-datalake-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-thrift.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-annotations-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-scrooge_2.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-nfs.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-format-javadoc.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-aws-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-common-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-protobuf.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-scala_2.10.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-test-hadoop2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-common-2.6.0-cdh5.14.2-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-annotations.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-tools.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-pig.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-auth.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-format-sources.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-encoding.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-cascading.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-generator.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-hadoop.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-jackson.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-nfs-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-aws.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-column.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-format.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/parquet-common.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/hadoop-auth-2.6.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/aws-java-sdk-bundle-1.11.134.jar:/opt/cloudera/parcels/
CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/httpcore-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/httpclient-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/logredactor-1.0.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hue-plugins-3.9.0-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/azure-data-lake-store-sdk-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/li
b/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/mockito-all-1.8.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/zookeeper-3.4.5-cdh5.14.2.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/jline-2.11.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/bin/../lib/zookeeper/lib/log4j-1.2.16.jar:/usr/share/cmf/lib/plugins/event-publish-5.14.3-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.14.3.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.13.3-shaded.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hadoop-lzo.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.14.2.jar 2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/GPLEXTRAS-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.2-1.cdh5.14.2.p0.3/lib/hbase/lib/native/Linux-amd64-64 2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp 2019-11-22 18:21:08,632 INFO 
org.apache.zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.name=Linux
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.arch=amd64
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.version=2.6.32-696.20.1.el6.x86_64
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.name=hbase
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.home=/var/run/hbase
2019-11-22 18:21:08,632 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.dir=/var/run/cloudera-scm-agent/process/6214-hbase-MASTER
2019-11-22 18:21:08,633 INFO org.apache.zookeeper.ZooKeeper: Initiating client connection, connectString=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181 sessionTimeout=60000 watcher=master:600000x0, quorum=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181, baseZNode=/hbase
2019-11-22 18:21:08,651 INFO org.apache.zookeeper.Login: Client successfully logged in.
2019-11-22 18:21:08,653 INFO org.apache.zookeeper.Login: TGT refresh thread started.
2019-11-22 18:21:08,656 INFO org.apache.zookeeper.Login: TGT valid starting at: Fri Nov 22 18:21:08 MYT 2019
2019-11-22 18:21:08,657 INFO org.apache.zookeeper.Login: TGT expires: Sat Nov 23 18:21:08 MYT 2019
2019-11-22 18:21:08,657 INFO org.apache.zookeeper.Login: TGT refresh sleeping until: Sat Nov 23 14:44:23 MYT 2019
2019-11-22 18:21:08,657 INFO org.apache.zookeeper.client.ZooKeeperSaslClient: Client will use GSSAPI as SASL mechanism.
2019-11-22 18:21:08,661 INFO org.apache.zookeeper.ClientCnxn: Opening socket connection to server sthdjt1-pvt.aiu.axiata/192.168.225.73:2181. Will attempt to SASL-authenticate using Login Context section 'Client'
2019-11-22 18:21:08,665 INFO org.apache.zookeeper.ClientCnxn: Socket connection established, initiating session, client: /192.168.225.71:53476, server: sthdjt1-pvt.aiu.axiata/192.168.225.73:2181
2019-11-22 18:21:08,671 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server sthdjt1-pvt.aiu.axiata/192.168.225.73:2181, sessionid = 0x16e9289b7c40050, negotiated timeout = 60000
2019-11-22 18:21:08,754 INFO org.apache.hadoop.hbase.ipc.RpcServer: RpcServer.responder: starting
2019-11-22 18:21:08,755 INFO org.apache.hadoop.hbase.ipc.RpcServer: RpcServer.listener,port=60000: starting
2019-11-22 18:21:08,885 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2019-11-22 18:21:08,889 INFO org.apache.hadoop.hbase.http.HttpRequestLog: Http request log for http.requests.master is not defined
2019-11-22 18:21:08,898 INFO org.apache.hadoop.hbase.http.HttpServer: Added global filter 'safety' (class=org.apache.hadoop.hbase.http.HttpServer$QuotingInputFilter)
2019-11-22 18:21:08,899 INFO org.apache.hadoop.hbase.http.HttpServer: Added global filter 'clickjackingprevention' (class=org.apache.hadoop.hbase.http.ClickjackingPreventionFilter)
2019-11-22 18:21:08,901 INFO org.apache.hadoop.hbase.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.hbase.http.lib.StaticUserWebFilter$StaticUserFilter) to context master
2019-11-22 18:21:08,901 INFO org.apache.hadoop.hbase.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.hbase.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2019-11-22 18:21:08,901 INFO org.apache.hadoop.hbase.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.hbase.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2019-11-22 18:21:08,938 INFO org.apache.hadoop.hbase.http.HttpServer: Jetty bound to port 60010
2019-11-22 18:21:08,938 INFO org.mortbay.log: jetty-6.1.26.cloudera.4
2019-11-22 18:21:09,249 INFO org.mortbay.log: Started email@redacted.host:60010
2019-11-22 18:21:09,254 INFO org.apache.hadoop.hbase.master.HMaster: hbase.rootdir=hdfs://nameservice1/hbase, hbase.cluster.distributed=true
2019-11-22 18:21:09,265 INFO org.apache.hadoop.hbase.master.HMaster: Adding backup master ZNode /hbase/backup-masters/sthdmgt1-pvt.aiu.axiata,60000,1574418067476
2019-11-22 18:21:09,320 INFO org.apache.hadoop.hbase.master.ActiveMasterManager: Another master is the active master, sthdnn1-pvt.aiu.axiata,60000,1574417963519; waiting to become the next active master
2019-11-22 18:21:09,349 INFO org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x18022608 connecting to ZooKeeper ensemble=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181
2019-11-22 18:21:09,349 INFO org.apache.zookeeper.ZooKeeper: Initiating client connection, connectString=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181 sessionTimeout=60000 watcher=hconnection-0x180226080x0, quorum=sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181, baseZNode=/hbase
2019-11-22 18:21:09,350 INFO org.apache.zookeeper.client.ZooKeeperSaslClient: Client will use GSSAPI as SASL mechanism.
2019-11-22 18:21:09,351 INFO org.apache.zookeeper.ClientCnxn: Opening socket connection to server sthdmgt1-pvt.aiu.axiata/192.168.225.71:2181. Will attempt to SASL-authenticate using Login Context section 'Client'
2019-11-22 18:21:09,351 INFO org.apache.zookeeper.ClientCnxn: Socket connection established, initiating session, client: /192.168.225.71:44370, server: sthdmgt1-pvt.aiu.axiata/192.168.225.71:2181
2019-11-22 18:21:09,354 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server sthdmgt1-pvt.aiu.axiata/192.168.225.71:2181, sessionid = 0x26e9289b9c5003a, negotiated timeout = 60000
2019-11-22 18:21:09,377 INFO org.apache.hadoop.hbase.regionserver.HRegionServer: ClusterId : 3dfaf54a-b78e-41c7-b846-7328284c8656
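The backup-master message above indicates this node is only waiting for the active master on sthdnn1 to finish initializing. As a hedged sketch (assuming the stock ZooKeeper Java client, the default zookeeper.znode.parent of /hbase, and that the client has the Kerberos/JAAS 'Client' login context this secured cluster requires), something like the following could confirm whether an active master has actually registered its znode. The class name ActiveMasterZnodeCheck is made up for illustration, not part of my setup:

import org.apache.zookeeper.ZooKeeper;
import org.apache.zookeeper.data.Stat;

public class ActiveMasterZnodeCheck {
    public static void main(String[] args) throws Exception {
        // Same ensemble as in the master log above.
        String quorum = "sthdjt1-pvt.aiu.axiata:2181,sthdmgt1-pvt.aiu.axiata:2181,sthdnn1-pvt.aiu.axiata:2181";
        ZooKeeper zk = new ZooKeeper(quorum, 60000, event -> { });
        try {
            // /hbase/master is an ephemeral znode owned by the currently active master.
            Stat active = zk.exists("/hbase/master", false);
            System.out.println(active != null
                    ? "/hbase/master exists - an active master has registered"
                    : "/hbase/master is missing - no active master yet");
            // Backup masters (like the one in the log above) register under /hbase/backup-masters.
            System.out.println("Backup masters: " + zk.getChildren("/hbase/backup-masters", false));
        } finally {
            zk.close();
        }
    }
}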
11-22-2019
01:25 AM
Could someone help look into this issue? Running 'status' in the HBase shell returns the error below:
19/11/22 17:24:09 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.0-cdh5.14.2, rUnknown, Tue Mar 27 13:32:17 PDT 2018
hbase(main):001:0> status
ERROR: org.apache.hadoop.hbase.PleaseHoldException: Master is initializing
at org.apache.hadoop.hbase.master.HMaster.checkInitialized(HMaster.java:2378)
at org.apache.hadoop.hbase.master.MasterRpcServices.getClusterStatus(MasterRpcServices.java:784)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55652)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2191)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
Show cluster status. Can be 'summary', 'simple', 'detailed', or 'replication'. The default is 'summary'. Examples:
hbase> status
hbase> status 'simple'
hbase> status 'summary'
hbase> status 'detailed'
hbase> status 'replication'
hbase> status 'replication', 'source'
hbase> status 'replication', 'sink'
hbase(main):002:0>
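Since 'status' only fails with PleaseHoldException while the master is still initializing, a small retry loop against the Java Admin API can show whether the master eventually comes up instead of failing on the first call. This is just a sketch under the assumption of an HBase 1.2 client with hbase-site.xml (and the Kerberos login) on the classpath; the class name MasterInitProbe and the 10 x 10-second retry window are made up for illustration, and depending on client retry settings the exception may arrive wrapped rather than as PleaseHoldException directly:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.ClusterStatus;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.PleaseHoldException;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class MasterInitProbe {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // picks up hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            for (int attempt = 1; attempt <= 10; attempt++) {
                try {
                    // Same RPC the shell's 'status' command uses.
                    ClusterStatus status = admin.getClusterStatus();
                    System.out.println("Active master: " + status.getMaster()
                            + ", region servers: " + status.getServersSize());
                    return;
                } catch (PleaseHoldException e) {
                    // Master is still initializing (the same condition the shell reports);
                    // wait and retry instead of failing immediately.
                    System.out.println("Attempt " + attempt + ": master still initializing, retrying...");
                    Thread.sleep(10_000L);
                } catch (IOException e) {
                    // Any other RPC failure is surfaced as-is.
                    throw e;
                }
            }
            System.err.println("Master did not finish initializing within the retry window.");
        }
    }
}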
Tags:
- HBase
- hbase-shell
Labels:
- Apache HBase