[root@cm-hue-01 scc-dev]# yarn logs -applicationId application_1654062818531_0167
WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.

Container: container_e20_1654062818531_0167_01_000001 on data-01.novalocal_8041
LogAggregationType: AGGREGATED
===============================================================================
LogType:container-localizer-syslog
LogLastModifiedTime:Thu Jun 09 11:31:06 +0530 2022
LogLength:506
LogContents:
2022-06-09 11:30:04,938 INFO [main] org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer: Disk Validator: yarn.nodemanager.disk-validator is loaded.
2022-06-09 11:30:06,170 WARN [ContainerLocalizer Downloader] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error

End of LogType:container-localizer-syslog
*******************************************************************************************

End of LogType:prelaunch.err
******************************************************************************

Container: container_e20_1654062818531_0167_01_000001 on data-01.novalocal_8041
LogAggregationType: AGGREGATED
===============================================================================
LogType:prelaunch.out
LogLastModifiedTime:Thu Jun 09 11:31:06 +0530 2022
LogLength:70
LogContents:
Setting up env variables
Setting up job resources
Launching container

End of LogType:prelaunch.out
******************************************************************************

Container: container_e20_1654062818531_0167_01_000001 on data-01.novalocal_8041
LogAggregationType: AGGREGATED
===============================================================================
LogType:stderr
LogLastModifiedTime:Thu Jun 09 11:31:06 +0530 2022
LogLength:15073
LogContents:
DEBUG StatusLogger Loaded Provider Provider[priority=10, className=org.apache.logging.log4j.core.impl.Log4jContextFactory, url=jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-core-2.8.2.jar!/META-INF/log4j-provider.properties, classLoader=sun.misc.Launcher$AppClassLoader@3b764bce]
DEBUG StatusLogger Loaded Provider Provider[priority=10, className=org.apache.logging.log4j.core.impl.Log4jContextFactory, url=jar:file:/data/yarn/nm/filecache/6219/3.0.0-cdh6.3.2-mr-framework.tar.gz/log4j-core-2.8.2.jar!/META-INF/log4j-provider.properties, classLoader=sun.misc.Launcher$AppClassLoader@3b764bce]
DEBUG StatusLogger Using ShutdownCallbackRegistry class org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry
DEBUG StatusLogger Using ShutdownCallbackRegistry class org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry
DEBUG StatusLogger Took 0.154259 seconds to load 226 plugins from sun.misc.Launcher$AppClassLoader@3b764bce
DEBUG StatusLogger PluginManager 'Converter' found 45 plugins
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-1
DEBUG StatusLogger Starting LoggerContext[name=3b764bce, org.apache.logging.log4j.core.LoggerContext@767e20cf]...
DEBUG StatusLogger Initializing Thread Context Data Service Providers
DEBUG StatusLogger Thread Context Data Service Provider initialization complete
DEBUG StatusLogger Reconfiguration started for context[name=3b764bce] at URI null (org.apache.logging.log4j.core.LoggerContext@767e20cf) with optional ClassLoader: null
DEBUG StatusLogger PluginManager 'ConfigurationFactory' found 4 plugins
DEBUG StatusLogger Missing dependencies for Yaml support, ConfigurationFactory org.apache.logging.log4j.core.config.yaml.YamlConfigurationFactory is inactive
DEBUG StatusLogger Using configurationFactory org.apache.logging.log4j.core.config.ConfigurationFactory$Factory@7103cb56
ERROR StatusLogger Reconfiguration failed: No configuration found for '3b764bce' at 'null' in 'null'
DEBUG StatusLogger Shutdown hook enabled. Registering a new one.
DEBUG StatusLogger LoggerContext[name=3b764bce, org.apache.logging.log4j.core.LoggerContext@767e20cf] started OK.
org.apache.hive.service.cli.HiveSQLException: RuntimeException: couldn't retrieve HBase table (cim.Locations) info: Failed after attempts=4, exceptions:
Thu Jun 09 11:31:03 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
CAUSED BY: RetriesExhaustedException: Failed after attempts=4, exceptions:
Thu Jun 09 11:31:03 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
CAUSED BY: NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
CAUSED BY: RemoteWithExtrasException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
    at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:266)
    at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:252)
    at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:318)
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:259)
    at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:473)
    at com.cisco.cim.oozie.util.DBManager.executeImpalaCmdQuery(DBManager.java:150)
    at com.cisco.cim.oozie.util.ETL.getCities(ETL.java:296)
    at com.cisco.cim.oozie.util.ETL.ETLloop(ETL.java:1291)
    at com.cisco.cim.oozie.action.ImpalaETLAction.main(ImpalaETLAction.java:209)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
    at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
    at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
    at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
    at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)

End of LogType:stderr
***********************************************************************

Container: container_e20_1654062818531_0167_01_000001 on data-01.novalocal_8041
LogAggregationType: AGGREGATED
===============================================================================
LogType:stdout
LogLastModifiedTime:Thu Jun 09 11:31:06 +0530 2022
LogLength:1140261
LogContents:
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since startup])
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since last successful login])
09:09:14.408 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
Launcher AM configuration loaded
09:09:14.408 [main] DEBUG org.apache.hadoop.security.SecurityUtil - Setting hadoop.security.token.service.use_ip to true
09:09:14.408 [main] DEBUG org.apache.hadoop.util.Shell - Failed to detect a valid hadoop home directory
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
    at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:469)
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:440)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:517)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1651)
    at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:102)
    at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:86)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:326)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:380)
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:135)
09:09:14.408 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
09:09:14.408 [main] DEBUG org.apache.hadoop.security.Groups - Creating new Groups object
09:09:14.408 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using local user:UnixPrincipal: oozie
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "UnixPrincipal: oozie" with name oozie
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "oozie"
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Reading credentials from location set in HADOOP_TOKEN_FILE_LOCATION: /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/container_tokens
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Loaded 4 tokens
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - UGI loginUser:oozie (auth:SIMPLE)
Executing Oozie Launcher with tokens:
Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 167 cluster_timestamp: 1654062818531 } attemptId: 1 } keyId: -1310509196)
Kind: RM_DELEGATION_TOKEN, Service: 10.106.8.128:8032,10.106.8.129:8032, Ident: (RM_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401835, maxDate=1655359201835, sequenceNumber=48642, masterKeyId=388)
Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
Kind: MR_DELEGATION_TOKEN, Service: 10.106.8.129:10020, Ident: (MR_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401773, maxDate=1655359201773, sequenceNumber=167, masterKeyId=10)
09:09:14.408 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.408 [main] DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.async.AMRMClientAsync entered state INITED
09:09:14.408 [main] DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl entered state INITED
09:09:14.409 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:147)
09:09:14.409 [main] DEBUG org.apache.hadoop.yarn.ipc.YarnRPC - Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
09:09:14.409 [main] DEBUG org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC - Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationMasterProtocol
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Server - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2d0399f4
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Client - ipc.client.bind.wildcard.addr set to true. Will bind client sockets to wildcard address.
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.409 [main] DEBUG org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl is started
09:09:14.409 [main] DEBUG org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.yarn.client.api.async.AMRMClientAsync is started
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8030
09:09:14.409 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB info:org.apache.hadoop.yarn.security.SchedulerSecurityInfo$1@5286c33a
09:09:14.409 [main] DEBUG org.apache.hadoop.yarn.security.AMRMTokenSelector - Looking for a token with service 10.106.8.128:8030
09:09:14.409 [main] DEBUG org.apache.hadoop.yarn.security.AMRMTokenSelector - Token kind is YARN_AM_RM_TOKEN and the token's service name is 10.106.8.128:8030,10.106.8.129:8030
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ApplicationMasterProtocolPB
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: Cg4KCginARDjyc7ukTAQARD07oyP+/////8B
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.409 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"Cg4KCginARDjyc7ukTAQARD07oyP+/////8B\",realm=\"default\",nonce=\"woRVwIY220LrzNUIZrI3Q1wrENx9Opqro5PQ/OtE\",nc=00000001,cnonce=\"GKFKhlobfuwvxyi6Gk6hcd3uyg5bk2vOaoj52rY/\",digest-uri=\"/default\",maxbuf=65536,response=9fee447ced4ae076708537ab8b322f18,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.Client - Negotiated QOP is :auth
09:09:14.409 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie: starting, having connections 1
09:09:14.409 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie sending #0 org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB.registerApplicationMaster
09:09:14.409 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie got value #0
09:09:14.409 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: registerApplicationMaster took 647ms
Removing token from the Ugi: YARN_AM_RM_TOKEN
Executing Action Main with tokens:
Kind: RM_DELEGATION_TOKEN, Service: 10.106.8.128:8032,10.106.8.129:8032, Ident: (RM_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401835, maxDate=1655359201835, sequenceNumber=48642, masterKeyId=388)
Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
Kind: MR_DELEGATION_TOKEN, Service: 10.106.8.129:10020, Ident: (MR_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401773, maxDate=1655359201773, sequenceNumber=167, masterKeyId=10)
09:09:14.409 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
Starting the execution of prepare actions
Completed the execution of prepare actions successfully
Files in current dir:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/.
====================== File: launch_container.sh File: container_tokens File: datanucleus-rdbms-4.1.7.jar File: log4j2.xml File: failureaccess-1.0.1.jar File: jsp-api-2.1.jar File: disruptor-3.3.6.jar File: slf4j-simple-1.7.25.jar File: hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.0.jar File: websocket-api-9.3.25.v20180904.jar File: woodstox-core-5.0.3.jar File: kerby-config-1.0.0.jar File: jul-to-slf4j-1.7.25.jar File: xalan-2.7.2.jar File: mssql-jdbc-6.2.1.jre7.jar File: jackson-module-jaxb-annotations-2.9.9.jar File: jersey-guice-1.19.jar File: hbase-http-2.1.0-cdh6.3.0.jar File: dozer-5.5.1.jar File: websocket-server-9.3.25.v20180904.jar File: hadoop-yarn-common-3.0.0-cdh6.3.0.jar File: aopalliance-repackaged-2.5.0-b32.jar File: csvjdbc-1.0.34.jar File: netty-transport-classes-kqueue-4.1.71.Final.jar File: javax.inject-2.5.0-b32.jar File: fastutil-6.5.6.jar File: kerb-core-1.0.0.jar File: netty-codec-http2-4.1.71.Final.jar File: commons-net-3.1.jar File: commons-el-1.0.jar File: jackson-core-2.13.0.jar File: jaxb2-basics-runtime-1.11.1.jar File: commons-collections-3.2.2.jar File: hive-jdbc-2.1.1-cdh6.3.0.jar File: netty-common-4.1.71.Final.jar File: guice-4.0.jar File: serializer-2.7.2.jar File: jaxb-impl-2.2.3-1.jar File: gson-2.2.4.jar File: netty-resolver-dns-classes-macos-4.1.71.Final.jar File: commons-httpclient-3.1.jar File: json-io-2.5.1.jar File: javax.inject-1.jar File: antlr-runtime-3.5.2.jar File: jetty-xml-9.3.25.v20180904.jar File: j2objc-annotations-1.3.jar File: checker-qual-3.5.0.jar File: commons-exec-1.3.jar File: netty-transport-4.1.71.Final.jar File: objenesis-1.0.jar File: jsr311-api-1.1.1.jar File: hive-llap-tez-2.1.1-cdh6.3.0.jar File: jersey-guava-2.25.1.jar File: batik-dom-1.10.jar File: kerb-identity-1.0.0.jar File: fst-2.50.jar File: logback-core-1.3.0-alpha11.jar File: netty-resolver-4.1.71.Final.jar File: netty-transport-native-unix-common-4.1.71.Final.jar File: javassist-3.20.0-GA.jar File: kerby-util-1.0.0.jar File: hadoop-yarn-registry-2.7.1.jar File: javax.annotation-api-1.3.2.jar File: jetty-annotations-9.3.25.v20180904.jar File: zookeeper-3.6.0.jar File: twill-api-0.6.0-incubating.jar File: jetty-rewrite-9.3.25.v20180904.jar File: re2j-1.1.jar File: asm-6.0.jar File: jackson-jaxrs-base-2.9.9.jar File: jcip-annotations-1.0-1.jar File: log4j-web-2.8.2.jar File: hbase-shaded-protobuf-2.2.1.jar File: oozie-client-5.1.0-cdh6.3.3.jar File: hive-llap-common-2.1.1-cdh6.3.0.jar File: commons-pool-1.5.4.jar File: batik-awt-util-1.10.jar File: jdo-api-3.0.1.jar File: jetty-client-9.3.25.v20180904.jar File: container-log4j.properties File: hadoop-auth-3.0.0-cdh6.3.3.jar File: commons-lang3-3.7.jar File: datanucleus-core-4.1.6.jar File: cim-etl-oozie-1.2.2.4.jar File: jaxb2-basics-tools-1.11.1.jar File: libthrift-0.13.0.jar File: ant-1.10.8.jar File: netty-transport-rxtx-4.1.71.Final.jar File: jcommander-1.30.jar File: hadoop-common-3.0.0-cdh6.3.0.jar File: findbugs-annotations-1.3.9-1.jar File: hk2-locator-2.5.0-b32.jar File: htrace-core4-4.2.0-incubating.jar File: hive-orc-2.1.1-cdh6.3.0.jar File: hadoop-hdfs-3.0.0-cdh6.3.2.jar File: commons-io-2.6.jar File: joni-2.1.11.jar File: javax.el-3.0.1-b12.jar File: netty-handler-proxy-4.1.71.Final.jar File: kerb-client-1.0.0.jar File: jta-1.1.jar File: httpcore-4.4.10.jar File: ecj-4.4.2.jar File: batik-svggen-1.10.jar File: transaction-api-1.1.jar File: netty-tcnative-classes-2.0.46.Final.jar File: hadoop-yarn-server-resourcemanager-3.0.0-cdh6.3.0.jar File: jpam-1.1.jar File: twill-zookeeper-0.6.0-incubating.jar 
File: hive-shims-common-2.1.1-cdh6.3.0.jar File: metrics-core-3.2.6.jar File: netty-transport-native-epoll-4.1.71.Final-linux-aarch_64.jar File: bcpkix-jdk15on-1.60.jar File: okhttp-2.7.5.jar File: metrics-json-3.1.0.jar File: netty-codec-haproxy-4.1.71.Final.jar File: jamon-runtime-2.4.1.jar File: hk2-api-2.5.0-b32.jar File: jcl-over-slf4j-1.7.25.jar File: commons-configuration2-2.1.1.jar File: jetty-webapp-9.3.25.v20180904.jar File: jetty-jaas-9.3.25.v20180904.jar File: hadoop-hdfs-client-3.0.0-cdh6.3.0.jar File: oozie-fluent-job-api-5.1.0-cdh6.3.3.jar File: ehcache-3.3.1.jar File: jetty-io-9.3.25.v20180904.jar File: parquet-hadoop-bundle-1.9.0-cdh6.3.0.jar File: jackson-databind-2.13.0.jar File: javax.servlet.jsp-api-2.3.1.jar File: commons-daemon-1.0.13.jar File: kerb-simplekdc-1.0.0.jar File: netty-codec-dns-4.1.71.Final.jar File: nimbus-jose-jwt-4.41.1.jar File: jsch-0.1.54.jar File: listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar File: kerby-pkix-1.0.0.jar File: nashorn-promise-0.1.1.jar File: jasper-runtime-5.5.23.jar File: netty-resolver-dns-4.1.71.Final.jar File: batik-ext-1.10.jar File: hbase-client-2.2.4.jar 09:09:14.409 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie sending #1 org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB.allocate File: hive-metastore-2.1.1-cdh6.3.0.jar File: javax.jdo-3.2.0-m3.jar File: jsp-api-2.0.jar File: stax2-api-3.1.4.jar File: snappy-0.2.jar File: hbase-shaded-miscellaneous-2.2.1.jar File: kerby-xdr-1.0.0.jar File: jersey-server-1.19.jar File: dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar File: json-20160810.jar File: batik-script-1.10.jar File: xz-1.6.jar File: slf4j-api-2.0.0-alpha4.jar File: log4j-api-2.17.0.jar File: protobuf-java-2.5.0.jar File: batik-svgrasterizer-1.10.jar File: log4j-slf4j-impl-2.8.2.jar File: javaparser-1.0.11.jar File: jersey-common-2.25.1.jar File: bcprov-jdk15on-1.60.jar File: hive-shims-0.23-2.1.1-cdh6.3.0.jar File: jetty-http-9.3.25.v20180904.jar File: json-simple-1.1.jar File: twill-common-0.6.0-incubating.jar File: jersey-client-1.19.jar File: commons-lang-2.6.jar File: netty-codec-memcache-4.1.71.Final.jar File: jackson-jaxrs-json-provider-2.9.9.jar File: guice-servlet-4.0.jar File: cim-json-parser-1.1.0.9-SNAPSHOT.jar File: jetty-util-6.1.26.jar File: tephra-api-0.6.0.jar File: hadoop-yarn-server-web-proxy-3.0.0-cdh6.3.0.jar 09:09:14.409 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie got value #1 File: batik-svg-dom-1.10.jar File: netty-transport-udt-4.1.71.Final.jar 09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: allocate took 3ms File: hbase-metrics-api-2.2.4.jar File: ant-launcher-1.10.8.jar File: commons-compress-1.18.jar File: netty-transport-native-kqueue-4.1.71.Final-osx-x86_64.jar File: osgi-resource-locator-1.0.1.jar File: okio-1.6.0.jar File: asm-tree-6.0.jar File: hadoop-yarn-client-3.0.0-cdh6.3.0.jar File: derby-10.14.1.0.jar File: jetty-runner-9.3.25.v20180904.jar File: xml-apis-ext-1.3.04.jar File: netty-codec-xml-4.1.71.Final.jar File: netty-transport-native-kqueue-4.1.71.Final-osx-aarch_64.jar File: oozie-sharelib-oozie.jar File: hive-serde-2.1.1-cdh6.3.0.jar File: avro-1.8.2-cdh6.3.0.jar File: xmlgraphics-commons-2.3.jar File: batik-css-1.10.jar File: 
batik-parser-1.10.jar File: HikariCP-2.6.1.jar File: guice-assistedinject-3.0.jar File: netty-codec-socks-4.1.71.Final.jar File: hive-shims-scheduler-2.1.1-cdh6.3.0.jar File: netty-resolver-dns-native-macos-4.1.71.Final-osx-x86_64.jar File: hbase-protocol-shaded-2.2.4.jar File: javax.ws.rs-api-2.0.1.jar File: hbase-server-2.1.0-cdh6.3.0.jar File: tephra-hbase-compat-1.0-0.6.0.jar File: logback-classic-1.3.0-alpha11.jar File: hive-shims-2.1.1-cdh6.3.0.jar File: hadoop-yarn-api-3.0.0-cdh6.3.0.jar File: hive-classification-2.1.1-cdh6.3.0.jar File: jcodings-1.0.18.jar File: asm-commons-6.0.jar File: netty-transport-native-epoll-4.1.71.Final.jar File: java-util-1.9.0.jar File: hbase-common-2.2.4.jar File: xercesImpl-2.12.0.jar File: javax.servlet.jsp-2.3.2.jar File: kerb-crypto-1.0.0.jar File: netty-transport-sctp-4.1.71.Final.jar File: batik-anim-1.10.jar File: HikariCP-java7-2.4.12.jar File: hadoop-distcp-3.0.0-cdh6.3.2.jar File: netty-codec-4.1.71.Final.jar File: kerb-util-1.0.0.jar File: batik-xml-1.10.jar File: batik-codec-1.10.jar File: httpclient-4.5.6.jar File: hive-storage-api-2.1.1-cdh6.3.0.jar File: commons-math3-3.6.1.jar File: netty-handler-4.1.71.Final.jar File: jetty-plus-9.3.25.v20180904.jar File: hadoop-client-3.0.0-cdh6.3.0.jar File: jms-1.1.jar File: batik-gvt-1.10.jar File: netty-transport-native-epoll-4.1.71.Final-linux-x86_64.jar File: joda-time-2.9.9.jar File: websocket-client-9.3.25.v20180904.jar File: jasper-compiler-5.5.23.jar File: hbase-zookeeper-2.1.0-cdh6.3.0.jar File: asm-all-5.0.2.jar File: commons-codec-1.11.jar File: slf4j-log4j12-1.7.25.jar File: hbase-mapreduce-2.1.0-cdh6.3.0.jar File: javax.activation-api-1.2.0.jar File: hadoop-yarn-server-applicationhistoryservice-3.0.0-cdh6.3.0.jar File: websocket-common-9.3.25.v20180904.jar File: jetty-schemas-3.1.jar File: jetty-util-9.3.25.v20180904.jar File: hive-service-2.1.1-cdh6.3.0.jar File: websocket-servlet-9.3.25.v20180904.jar File: kerby-asn1-1.0.0.jar File: datanucleus-api-jdo-4.2.1.jar File: commons-crypto-1.0.0.jar File: log4j-1.2-api-2.8.2.jar File: netty-all-4.1.71.Final.jar File: batik-util-1.10.jar File: jersey-container-servlet-core-2.25.1.jar File: batik-transcoder-1.10.jar File: tephra-core-0.6.0.jar File: commons-logging-1.2.jar File: opencsv-2.3.jar File: kerb-server-1.0.0.jar File: netty-resolver-dns-native-macos-4.1.71.Final-osx-aarch_64.jar File: log4j-core-2.17.0.jar File: batik-rasterizer-1.10.jar File: jersey-json-1.19.jar File: oozie-sharelib-oozie-5.1.0-cdh6.3.2.jar File: jersey-media-jaxb-2.25.1.jar File: kerb-common-1.0.0.jar File: netty-codec-http-4.1.71.Final.jar File: taglibs-standard-spec-1.2.5.jar File: taglibs-standard-impl-1.2.5.jar File: hbase-procedure-2.1.0-cdh6.3.0.jar File: hbase-metrics-2.2.4.jar File: jaxb2-basics-1.11.1.jar File: apache-jsp-9.3.25.v20180904.jar File: hbase-hadoop-compat-2.2.4.jar File: netty-transport-classes-epoll-4.1.71.Final.jar File: jetty-6.1.26.jar File: commons-dbcp-1.4.jar File: jersey-client-2.25.1.jar File: jersey-server-2.25.1.jar File: audience-annotations-0.5.0.jar File: kerb-admin-1.0.0.jar File: jetty-jndi-9.3.25.v20180904.jar File: bonecp-0.8.0.RELEASE.jar File: curator-client-2.12.0.jar File: jetty-security-9.3.25.v20180904.jar File: aopalliance-1.0.jar File: accessors-smart-1.2.jar File: error_prone_annotations-2.3.4.jar File: jettison-1.1.jar File: mr-framework/accessors-smart-1.2.jar File: mr-framework/netty-3.10.6.Final.jar File: mr-framework/hadoop-mapreduce-client-uploader-3.0.0-cdh6.3.2.jar File: mr-framework/woodstox-core-5.0.3.jar 
File: mr-framework/jersey-servlet-1.19.jar File: mr-framework/hadoop-yarn-server-common-3.0.0-cdh6.3.2.jar File: mr-framework/aopalliance-1.0.jar File: mr-framework/slf4j-api-1.7.25.jar File: mr-framework/protobuf-java-2.5.0.jar File: mr-framework/jersey-json-1.19.jar File: mr-framework/parquet-common.jar File: mr-framework/zookeeper.jar File: mr-framework/xz-1.6.jar File: mr-framework/logredactor-2.0.7.jar File: mr-framework/parquet-hadoop.jar File: mr-framework/jul-to-slf4j-1.7.25.jar File: mr-framework/re2j-1.1.jar File: mr-framework/jetty-util-9.3.25.v20180904.jar File: mr-framework/commons-configuration2-2.1.1.jar File: mr-framework/jackson-core-2.9.9.jar File: mr-framework/json-simple-1.1.1.jar File: mr-framework/hadoop-archives-3.0.0-cdh6.3.2.jar File: mr-framework/kerby-asn1-1.0.0.jar File: mr-framework/commons-cli-1.2.jar File: mr-framework/kerb-crypto-1.0.0.jar File: mr-framework/jetty-webapp-9.3.25.v20180904.jar File: mr-framework/jackson-mapper-asl-1.9.13-cloudera.1.jar File: mr-framework/parquet-encoding.jar File: mr-framework/parquet-thrift.jar File: mr-framework/parquet-generator.jar File: mr-framework/netty-handler-4.1.17.Final.jar File: mr-framework/okhttp-2.7.5.jar File: mr-framework/audience-annotations-0.5.0.jar File: mr-framework/hadoop-azure-3.0.0-cdh6.3.2.jar File: mr-framework/jetty-server-9.3.25.v20180904.jar File: mr-framework/jettison-1.1.jar File: mr-framework/objenesis-1.0.jar File: mr-framework/jackson-jaxrs-base-2.9.9.jar File: mr-framework/mssql-jdbc-6.2.1.jre7.jar File: mr-framework/commons-lang3-3.7.jar File: mr-framework/log4j-core-2.8.2.jar File: mr-framework/stax2-api-3.1.4.jar File: mr-framework/commons-math3-3.1.1.jar File: mr-framework/commons-compress-1.18.jar File: mr-framework/jaxb-impl-2.2.3-1.jar File: mr-framework/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.2-tests.jar File: mr-framework/commons-net-3.1.jar File: mr-framework/jersey-core-1.19.jar File: mr-framework/parquet-pig-bundle.jar File: mr-framework/avro-1.8.2-cdh6.3.2.jar File: mr-framework/hadoop-hdfs-client-3.0.0-cdh6.3.2-tests.jar File: mr-framework/hadoop-archive-logs-3.0.0-cdh6.3.2.jar File: mr-framework/httpclient-4.5.3.jar File: mr-framework/kerb-server-1.0.0.jar File: mr-framework/fst-2.50.jar File: mr-framework/kerb-admin-1.0.0.jar File: mr-framework/kerby-util-1.0.0.jar File: mr-framework/commons-codec-1.11.jar File: mr-framework/parquet-format-sources.jar File: mr-framework/javax.activation-api-1.2.0.jar File: mr-framework/hadoop-sls-3.0.0-cdh6.3.2.jar File: mr-framework/kerby-pkix-1.0.0.jar File: mr-framework/jcip-annotations-1.0-1.jar File: mr-framework/bcpkix-jdk15on-1.60.jar File: mr-framework/javax.servlet-api-3.1.0.jar File: mr-framework/json-io-2.5.1.jar File: mr-framework/event-publish-6.3.0-shaded.jar File: mr-framework/hadoop-yarn-api-3.0.0-cdh6.3.2.jar File: mr-framework/jackson-databind-2.9.9.3.jar File: mr-framework/lz4-java-1.5.0.jar File: mr-framework/slf4j-log4j12.jar File: mr-framework/commons-collections-3.2.2.jar File: mr-framework/gson-2.2.4.jar File: mr-framework/hadoop-kms-3.0.0-cdh6.3.2.jar File: mr-framework/zstd-jni-1.3.8-1.jar File: mr-framework/azure-keyvault-core-0.8.0.jar File: mr-framework/commons-io-2.6.jar File: mr-framework/jetty-util-ajax-9.3.25.v20180904.jar File: mr-framework/hadoop-yarn-applications-distributedshell-3.0.0-cdh6.3.2.jar File: mr-framework/jetty-http-9.3.25.v20180904.jar File: mr-framework/hadoop-resourceestimator-3.0.0-cdh6.3.2.jar File: mr-framework/jersey-guice-1.19.jar File: mr-framework/guice-servlet-4.0.jar File: 
mr-framework/jsp-api-2.1.jar File: mr-framework/jsr311-api-1.1.1.jar File: mr-framework/hadoop-aliyun-3.0.0-cdh6.3.2.jar File: mr-framework/geronimo-jcache_1.0_spec-1.0-alpha-1.jar File: mr-framework/commons-logging-1.1.3.jar File: mr-framework/asm-5.0.4.jar File: mr-framework/jackson-xc-1.9.13.jar File: mr-framework/hadoop-common-3.0.0-cdh6.3.2.jar File: mr-framework/kerb-util-1.0.0.jar File: mr-framework/jackson-core-asl-1.9.13.jar File: mr-framework/hadoop-yarn-registry-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-annotations-3.0.0-cdh6.3.2.jar File: mr-framework/jsch-0.1.54.jar File: mr-framework/hadoop-streaming-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-aws-3.0.0-cdh6.3.2.jar File: mr-framework/parquet-hadoop-bundle.jar File: mr-framework/hadoop-mapreduce-client-core-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-yarn-server-web-proxy-3.0.0-cdh6.3.2.jar File: mr-framework/jetty-io-9.3.25.v20180904.jar File: mr-framework/jsr305-3.0.0.jar File: mr-framework/hadoop-nfs-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-datajoin-3.0.0-cdh6.3.2.jar File: mr-framework/jersey-server-1.19.jar File: mr-framework/parquet-cascading.jar File: mr-framework/jackson-annotations-2.9.9.jar File: mr-framework/jetty-xml-9.3.25.v20180904.jar File: mr-framework/gcs-connector-hadoop3-1.9.10-cdh6.3.2-shaded.jar File: mr-framework/curator-framework-2.12.0.jar File: mr-framework/zookeeper-3.4.5-cdh6.3.2.jar File: mr-framework/hadoop-mapreduce-client-hs-plugins-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-rumen-3.0.0-cdh6.3.2.jar File: mr-framework/parquet-cascading3.jar File: mr-framework/hadoop-yarn-server-tests-3.0.0-cdh6.3.2.jar File: mr-framework/curator-recipes-2.12.0.jar File: mr-framework/spark-2.4.0-cdh6.3.2-yarn-shuffle.jar File: mr-framework/avro.jar File: mr-framework/hadoop-hdfs-nfs-3.0.0-cdh6.3.2.jar File: mr-framework/htrace-core4-4.1.0-incubating.jar File: mr-framework/parquet-column.jar File: mr-framework/parquet-format-javadoc.jar File: mr-framework/metrics-core-3.0.1.jar File: mr-framework/hadoop-yarn-client-3.0.0-cdh6.3.2.jar File: mr-framework/kerby-config-1.0.0.jar File: mr-framework/kerb-core-1.0.0.jar File: mr-framework/okio-1.6.0.jar File: mr-framework/hadoop-hdfs-native-client-3.0.0-cdh6.3.2.jar File: mr-framework/jdom-1.1.jar File: mr-framework/kafka-clients-2.2.1-cdh6.3.2.jar File: mr-framework/parquet-protobuf.jar File: mr-framework/guice-4.0.jar File: mr-framework/aws-java-sdk-bundle-1.11.271.jar File: mr-framework/azure-data-lake-store-sdk-2.2.9.jar File: mr-framework/bcprov-jdk15on-1.60.jar File: mr-framework/jackson-jaxrs-1.9.13.jar File: mr-framework/hadoop-hdfs-3.0.0-cdh6.3.2-tests.jar File: mr-framework/ojalgo-43.0.jar File: mr-framework/jersey-client-1.19.jar File: mr-framework/log4j-1.2.17.jar File: mr-framework/netty-codec-4.1.17.Final.jar File: mr-framework/hadoop-mapreduce-client-common-3.0.0-cdh6.3.2.jar File: mr-framework/kerby-xdr-1.0.0.jar File: mr-framework/hadoop-hdfs-client-3.0.0-cdh6.3.2.jar File: mr-framework/paranamer-2.8.jar File: mr-framework/hadoop-hdfs-httpfs-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-mapreduce-client-nativetask-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-yarn-common-3.0.0-cdh6.3.2.jar File: mr-framework/HikariCP-java7-2.4.12.jar File: mr-framework/hadoop-auth-3.0.0-cdh6.3.2.jar File: mr-framework/commons-lang-2.6.jar File: mr-framework/kerb-identity-1.0.0.jar File: mr-framework/hadoop-mapreduce-client-hs-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-hdfs-3.0.0-cdh6.3.2.jar File: mr-framework/parquet-pig.jar File: 
mr-framework/kerb-simplekdc-1.0.0.jar File: mr-framework/parquet-scala_2.11.jar File: mr-framework/hadoop-distcp-3.0.0-cdh6.3.2.jar File: mr-framework/kerb-client-1.0.0.jar File: mr-framework/netty-common-4.1.17.Final.jar File: mr-framework/azure-storage-5.4.0.jar File: mr-framework/parquet-jackson.jar File: mr-framework/java-util-1.9.0.jar File: mr-framework/hadoop-yarn-server-nodemanager-3.0.0-cdh6.3.2.jar File: mr-framework/javax.inject-1.jar File: mr-framework/snappy-java-1.1.4.jar File: mr-framework/parquet-avro.jar File: mr-framework/hadoop-kafka-3.0.0-cdh6.3.2.jar File: mr-framework/ehcache-3.3.1.jar File: mr-framework/kerb-common-1.0.0.jar File: mr-framework/guava-11.0.2.jar File: mr-framework/commons-beanutils-1.9.4.jar File: mr-framework/jaxb-api-2.2.11.jar File: mr-framework/jetty-servlet-9.3.25.v20180904.jar File: mr-framework/parquet-format.jar File: mr-framework/hadoop-openstack-3.0.0-cdh6.3.2.jar File: mr-framework/aliyun-sdk-oss-2.8.3.jar File: mr-framework/nimbus-jose-jwt-4.41.1.jar File: mr-framework/hadoop-mapreduce-client-app-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-gridmix-3.0.0-cdh6.3.2.jar File: mr-framework/curator-client-2.12.0.jar File: mr-framework/jackson-jaxrs-json-provider-2.9.9.jar File: mr-framework/jetty-security-9.3.25.v20180904.jar File: mr-framework/netty-transport-4.1.17.Final.jar File: mr-framework/commons-daemon-1.0.13.jar File: mr-framework/netty-codec-http-4.1.17.Final.jar File: mr-framework/hadoop-yarn-applications-unmanaged-am-launcher-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.2.jar File: mr-framework/jackson-module-jaxb-annotations-2.9.9.jar File: mr-framework/hadoop-common-3.0.0-cdh6.3.2-tests.jar File: mr-framework/hadoop-azure-datalake-3.0.0-cdh6.3.2.jar File: mr-framework/netty-resolver-4.1.17.Final.jar File: mr-framework/hadoop-mapreduce-client-shuffle-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-extras-3.0.0-cdh6.3.2.jar File: mr-framework/hadoop-hdfs-native-client-3.0.0-cdh6.3.2-tests.jar File: mr-framework/wildfly-openssl-1.0.4.Final.jar File: mr-framework/hadoop-mapreduce-examples-3.0.0-cdh6.3.2.jar File: mr-framework/tt-instrumentation-6.3.0.jar File: mr-framework/netty-buffer-4.1.17.Final.jar File: mr-framework/httpcore-4.4.6.jar File: mr-framework/json-smart-2.3.jar File: mr-framework/log4j-api-2.8.2.jar File: mr-framework/leveldbjni-all-1.8.jar File: validation-api-1.1.0.Final.jar File: jackson-annotations-2.13.0.jar File: snappy-java-1.1.4.jar File: hive-llap-client-2.1.1-cdh6.3.0.jar File: jsr305-3.0.2.jar File: xml-apis-1.4.01.jar File: netty-codec-stomp-4.1.71.Final.jar File: libfb303-0.9.3.jar File: hbase-hadoop2-compat-2.2.4.jar File: leveldbjni-all-1.8.jar File: curator-recipes-2.12.0.jar File: javolution-5.5.1.jar File: twill-discovery-core-0.6.0-incubating.jar File: hbase-shaded-netty-3.5.1.jar File: jersey-core-1.19.jar File: metrics-jvm-3.1.0.jar File: twill-core-0.6.0-incubating.jar File: hbase-protocol-2.2.4.jar File: geronimo-jcache_1.0_spec-1.0-alpha-1.jar File: hadoop-yarn-server-common-3.0.0-cdh6.3.0.jar File: netty-codec-mqtt-4.1.71.Final.jar File: hadoop-mapreduce-client-core-3.0.0-cdh6.3.0.jar File: launcher.xml File: json-smart-2.3.jar File: batik-i18n-1.10.jar File: hadoop-annotations-3.0.0-cdh6.3.3.jar File: hive-llap-server-2.1.1-cdh6.3.0.jar File: netty-buffer-4.1.71.Final.jar File: jersey-servlet-1.19.jar File: hive-common-2.1.1-cdh6.3.0.jar File: javax.servlet-api-3.1.0.jar File: hadoop-mapreduce-client-common-3.0.0-cdh6.3.0.jar File: 
apache-jstl-9.3.25.v20180904.jar File: twill-discovery-api-0.6.0-incubating.jar File: jetty-util-ajax-9.3.25.v20180904.jar File: commons-beanutils-1.9.3.jar File: jetty-servlet-9.3.25.v20180904.jar File: servlet-api-2.4.jar File: action.xml File: curator-framework-2.12.0.jar File: commons-cli-1.4.jar File: hive-service-rpc-2.1.1-cdh6.3.0.jar File: slider-core-0.90.2-incubating.jar File: guava-30.1-jre.jar File: batik-bridge-1.10.jar File: zookeeper-jute-3.6.0.jar File: graphviz-java-0.7.0.jar File: paranamer-2.8.jar File: netty-codec-redis-4.1.71.Final.jar File: jetty-server-9.3.25.v20180904.jar File: jaxb-api-2.2.11.jar File: batik-constants-1.10.jar File: hbase-replication-2.1.0-cdh6.3.0.jar File: log4j-1.2.17.jar File: netty-codec-smtp-4.1.71.Final.jar File: hk2-utils-2.5.0-b32.jar

Oozie Launcher Application Master configuration
===============================================
Workflow job id   : 0000084-220601112151640-oozie-oozi-W
Workflow action id: 0000084-220601112151640-oozie-oozi-W@impala-etl

Classpath         :
------------------------
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/datanucleus-rdbms-4.1.7.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/failureaccess-1.0.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsp-api-2.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/disruptor-3.3.6.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/slf4j-simple-1.7.25.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-api-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/woodstox-core-5.0.3.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-config-1.0.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jul-to-slf4j-1.7.25.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xalan-2.7.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mssql-jdbc-6.2.1.jre7.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-module-jaxb-annotations-2.9.9.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-guice-1.19.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-http-2.1.0-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/dozer-5.5.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-server-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-common-3.0.0-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/aopalliance-repackaged-2.5.0-b32.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/csvjdbc-1.0.34.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-classes-kqueue-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.inject-2.5.0-b32.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/fastutil-6.5.6.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-core-1.0.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-http2-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-net-3.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-el-1.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-core-2.13.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb2-basics-runtime-1.11.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-collections-3.2.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-jdbc-2.1.1-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-common-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/guice-4.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/serializer-2.7.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb-impl-2.2.3-1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/gson-2.2.4.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-dns-classes-macos-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-httpclient-3.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/json-io-2.5.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.inject-1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/antlr-runtime-3.5.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-xml-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/j2objc-annotations-1.3.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/checker-qual-3.5.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-exec-1.3.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/objenesis-1.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsr311-api-1.1.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-tez-2.1.1-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-guava-2.25.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-dom-1.10.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-identity-1.0.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/fst-2.50.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/logback-core-1.3.0-alpha11.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-unix-common-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javassist-3.20.0-GA.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-util-1.0.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-registry-2.7.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.annotation-api-1.3.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-annotations-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/zookeeper-3.6.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-api-0.6.0-incubating.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-rewrite-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/re2j-1.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/asm-6.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-jaxrs-base-2.9.9.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcip-annotations-1.0-1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-web-2.8.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-shaded-protobuf-2.2.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/oozie-client-5.1.0-cdh6.3.3.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-common-2.1.1-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-pool-1.5.4.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-awt-util-1.10.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jdo-api-3.0.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-client-9.3.25.v20180904.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-auth-3.0.0-cdh6.3.3.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-lang3-3.7.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/datanucleus-core-4.1.6.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/cim-etl-oozie-1.2.2.4.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb2-basics-tools-1.11.1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/libthrift-0.13.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ant-1.10.8.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-rxtx-4.1.71.Final.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcommander-1.30.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-common-3.0.0-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/findbugs-annotations-1.3.9-1.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hk2-locator-2.5.0-b32.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/htrace-core4-4.2.0-incubating.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-orc-2.1.1-cdh6.3.0.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-hdfs-3.0.0-cdh6.3.2.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-io-2.6.jar
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/joni-2.1.11.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.el-3.0.1-b12.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-handler-proxy-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-client-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jta-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/httpcore-4.4.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ecj-4.4.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-svggen-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/transaction-api-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-tcnative-classes-2.0.46.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-server-resourcemanager-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jpam-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-zookeeper-0.6.0-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-shims-common-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/metrics-core-3.2.6.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-epoll-4.1.71.Final-linux-aarch_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/bcpkix-jdk15on-1.60.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/okhttp-2.7.5.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/metrics-json-3.1.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-haproxy-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jamon-runtime-2.4.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hk2-api-2.5.0-b32.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcl-over-slf4j-1.7.25.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-configuration2-2.1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-webapp-9.3.25.v20180904.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-jaas-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-hdfs-client-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/oozie-fluent-job-api-5.1.0-cdh6.3.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ehcache-3.3.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-io-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/parquet-hadoop-bundle-1.9.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-databind-2.13.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.servlet.jsp-api-2.3.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-daemon-1.0.13.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-simplekdc-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-dns-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/nimbus-jose-jwt-4.41.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsch-0.1.54.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-pkix-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/nashorn-promise-0.1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jasper-runtime-5.5.23.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-dns-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-ext-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-client-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-metastore-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.jdo-3.2.0-m3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsp-api-2.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/stax2-api-3.1.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/snappy-0.2.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-shaded-miscellaneous-2.2.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-xdr-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-server-1.19.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/json-20160810.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-script-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xz-1.6.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/slf4j-api-2.0.0-alpha4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-api-2.17.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/protobuf-java-2.5.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-svgrasterizer-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-slf4j-impl-2.8.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javaparser-1.0.11.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-common-2.25.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/bcprov-jdk15on-1.60.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-shims-0.23-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-http-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/json-simple-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-common-0.6.0-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-client-1.19.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-lang-2.6.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-memcache-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-jaxrs-json-provider-2.9.9.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/guice-servlet-4.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/cim-json-parser-1.1.0.9-SNAPSHOT.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-util-6.1.26.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/tephra-api-0.6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-server-web-proxy-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-svg-dom-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-udt-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-metrics-api-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ant-launcher-1.10.8.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-compress-1.18.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-kqueue-4.1.71.Final-osx-x86_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/osgi-resource-locator-1.0.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/okio-1.6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/asm-tree-6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-client-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/derby-10.14.1.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-runner-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xml-apis-ext-1.3.04.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-xml-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-kqueue-4.1.71.Final-osx-aarch_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/oozie-sharelib-oozie.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-serde-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/avro-1.8.2-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xmlgraphics-commons-2.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-css-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-parser-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/HikariCP-2.6.1.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/guice-assistedinject-3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-socks-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-shims-scheduler-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-dns-native-macos-4.1.71.Final-osx-x86_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-protocol-shaded-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.ws.rs-api-2.0.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-server-2.1.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/tephra-hbase-compat-1.0-0.6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/logback-classic-1.3.0-alpha11.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-shims-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-api-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-classification-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcodings-1.0.18.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/asm-commons-6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-epoll-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/java-util-1.9.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-common-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xercesImpl-2.12.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.servlet.jsp-2.3.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-crypto-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-sctp-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-anim-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/HikariCP-java7-2.4.12.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-distcp-3.0.0-cdh6.3.2.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-util-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-xml-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-codec-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/httpclient-4.5.6.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-storage-api-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-math3-3.6.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-handler-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-plus-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-client-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jms-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-gvt-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-epoll-4.1.71.Final-linux-x86_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/joda-time-2.9.9.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-client-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jasper-compiler-5.5.23.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-zookeeper-2.1.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/asm-all-5.0.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-codec-1.11.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/slf4j-log4j12-1.7.25.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-mapreduce-2.1.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.activation-api-1.2.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-server-applicationhistoryservice-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-common-9.3.25.v20180904.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-schemas-3.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-util-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-service-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-servlet-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-asn1-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/datanucleus-api-jdo-4.2.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-crypto-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-1.2-api-2.8.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-all-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-util-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-container-servlet-core-2.25.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-transcoder-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/tephra-core-0.6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-logging-1.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/opencsv-2.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-server-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-dns-native-macos-4.1.71.Final-osx-aarch_64.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-core-2.17.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-rasterizer-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-json-1.19.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/oozie-sharelib-oozie-5.1.0-cdh6.3.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-media-jaxb-2.25.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-common-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-http-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/taglibs-standard-spec-1.2.5.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/taglibs-standard-impl-1.2.5.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-procedure-2.1.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-metrics-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb2-basics-1.11.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/apache-jsp-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-hadoop-compat-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-classes-epoll-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-6.1.26.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-dbcp-1.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-client-2.25.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-server-2.25.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/audience-annotations-0.5.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-admin-1.0.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-jndi-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/bonecp-0.8.0.RELEASE.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/curator-client-2.12.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-security-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/aopalliance-1.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/accessors-smart-1.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/error_prone_annotations-2.3.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jettison-1.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/validation-api-1.1.0.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-annotations-2.13.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/snappy-java-1.1.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-client-2.1.1-cdh6.3.0.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsr305-3.0.2.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xml-apis-1.4.01.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-stomp-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/libfb303-0.9.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-hadoop2-compat-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/leveldbjni-all-1.8.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/curator-recipes-2.12.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javolution-5.5.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-discovery-core-0.6.0-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-shaded-netty-3.5.1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-core-1.19.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/metrics-jvm-3.1.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-core-0.6.0-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-protocol-2.2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/geronimo-jcache_1.0_spec-1.0-alpha-1.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-server-common-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-mqtt-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-mapreduce-client-core-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/json-smart-2.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-i18n-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-annotations-3.0.0-cdh6.3.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-server-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-buffer-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-servlet-1.19.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-common-2.1.1-cdh6.3.0.jar 
/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.servlet-api-3.1.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-mapreduce-client-common-3.0.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/apache-jstl-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-discovery-api-0.6.0-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-util-ajax-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-beanutils-1.9.3.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-servlet-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/servlet-api-2.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/curator-framework-2.12.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-cli-1.4.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-service-rpc-2.1.1-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/slider-core-0.90.2-incubating.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/guava-30.1-jre.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-bridge-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/zookeeper-jute-3.6.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/graphviz-java-0.7.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/paranamer-2.8.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-redis-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-server-9.3.25.v20180904.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb-api-2.2.11.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-constants-1.10.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-replication-2.1.0-cdh6.3.0.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-1.2.17.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-smtp-4.1.71.Final.jar /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hk2-utils-2.5.0-b32.jar 
/etc/hadoop/conf.cloudera.yarn /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/gcs-connector-hadoop3-shaded.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-annotations.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-auth.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-aws.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-azure-datalake.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-azure.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-common-tests.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-common.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-kms.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-nfs.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-aws-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-annotations-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-common-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-auth-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-nfs-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/gcs-connector-hadoop3-1.9.10-cdh6.3.2-shaded.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-azure-datalake-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-common-3.0.0-cdh6.3.2-tests.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-kms-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/hadoop-azure-3.0.0-cdh6.3.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-thrift.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-scala_2.11.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-protobuf.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-pig.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-pig-bundle.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-jackson.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-hadoop.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-hadoop-bundle.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-generator.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-encoding.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-common.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-column.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-cascading3.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-cascading.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-avro.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-format.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-format-sources.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/parquet-format-javadoc.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/audience-annotations-0.5.0.jar 
/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/aws-java-sdk-bundle-1.11.271.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/log4j-core-2.8.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/htrace-core4-4.1.0-incubating.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-util-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-beanutils-1.9.4.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-util-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-lang3-3.7.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-logging-1.1.3.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerby-util-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/curator-client-2.12.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/curator-framework-2.12.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/paranamer-2.8.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/gson-2.2.4.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-codec-1.11.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-annotations-2.9.9.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-core-2.9.9.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/asm-5.0.4.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-databind-2.9.9.3.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-mapper-asl-1.9.13-cloudera.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-xc-1.9.13.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jsr305-3.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jaxb-api-2.2.11.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/httpclient-4.5.3.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/httpcore-4.4.6.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/log4j-1.2.17.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jersey-core-1.19.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/curator-recipes-2.12.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jersey-servlet-1.19.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jersey-server-1.19.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-io-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-server-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-http-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/json-smart-2.3.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jcip-annotations-1.0-1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-xml-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-servlet-9.3.25.v20180904.jar 
/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jsr311-api-1.1.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jsp-api-2.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/nimbus-jose-jwt-4.41.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-client-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/re2j-1.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-crypto-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-compress-1.18.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-server-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jul-to-slf4j-1.7.25.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerby-pkix-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jsch-0.1.54.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-cli-1.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerby-xdr-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-common-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/netty-3.10.6.Final.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/accessors-smart-1.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jersey-json-1.19.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/snappy-java-1.1.4.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/logredactor-2.0.7.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-admin-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-math3-3.1.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/xz-1.6.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-configuration2-2.1.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/slf4j-api-1.7.25.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/metrics-core-3.0.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerb-simplekdc-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/javax.activation-api-1.2.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-net-3.1.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/azure-data-lake-store-sdk-2.2.9.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/zookeeper.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/avro.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerby-config-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/kerby-asn1-1.0.0.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/jetty-webapp-9.3.25.v20180904.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/commons-lang-2.6.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/guava-11.0.2.jar /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/wildfly-openssl-1.0.4.Final.jar 
[... CLASSPATH listing truncated: several hundred jars under /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib, lib/hadoop-hdfs, lib/hadoop-hdfs/lib, lib/hadoop-yarn and lib/hadoop-yarn/lib, plus /etc/hadoop/conf.cloudera.yarn and the localized mr-framework jars in /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework ...]
------------------------
Main class : org.apache.oozie.action.hadoop.JavaMain
Maximum output : 2048

Java System Properties:
------------------------
#
#Thu Jun 09 11:30:09 IST 2022
java.runtime.name=OpenJDK Runtime Environment
sun.boot.library.path=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/amd64
java.vm.version=25.332-b08
hadoop.root.logger=INFO,CLA
java.vm.vendor=Amazon.com Inc.
java.vendor.url=https\://aws.amazon.com/corretto/
log4j1.compatibility=true
path.separator=\:
java.vm.name=OpenJDK 64-Bit Server VM
file.encoding.pkg=sun.io
user.country=US
sun.java.launcher=SUN_STANDARD
sun.os.patch.level=unknown
java.vm.specification.name=Java Virtual Machine Specification
user.dir=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001
java.runtime.version=1.8.0_332-b08
java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
java.endorsed.dirs=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/endorsed
os.arch=amd64
yarn.app.container.log.dir=/data/yarn/container-logs/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001
java.io.tmpdir=/tmp
line.separator=\n
java.vm.specification.vendor=Oracle Corporation
os.name=Linux
log4j.configuration=container-log4j.properties
sun.jnu.encoding=ANSI_X3.4-1968
java.library.path=\:/usr/java/packages/lib/amd64\:/usr/lib64\:/lib64\:/lib\:/usr/lib
java.specification.name=Java Platform API Specification
java.class.version=52.0
sun.management.compiler=HotSpot 64-Bit Tiered Compilers
os.version=3.10.0-1127.19.1.el7.x86_64
hadoop.root.logfile=syslog
yarn.app.container.log.filesize=1048576
user.home=/var/lib/oozie
user.timezone=Asia/Kolkata
java.awt.printerjob=sun.print.PSPrinterJob
file.encoding=ANSI_X3.4-1968
java.specification.version=1.8
submitter.user=oozie
log4j.debug=true
java.class.path=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/datanucleus-rdbms-4.1.7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/failureaccess-1.0.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsp-api-2.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/disruptor-3.3.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/slf4j-simple-1.7.25.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-api-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/woodstox-core-5.0.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-config-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jul-to-slf4j-1.7.25.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/xalan-2.7.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mssql-jdbc-6.2.1.jre7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-module-jaxb-annotations-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-guice-1.19.jar\:/data/yarn/nm/usercache/oozie/appc
ache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-http-2.1.0-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/dozer-5.5.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/websocket-server-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-common-3.0.0-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/aopalliance-repackaged-2.5.0-b32.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/csvjdbc-1.0.34.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-classes-kqueue-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.inject-2.5.0-b32.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/fastutil-6.5.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-core-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-codec-http2-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-net-3.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-el-1.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-core-2.13.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb2-basics-runtime-1.11.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-collections-3.2.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-jdbc-2.1.1-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-common-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/guice-4.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/serializer-2.7.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb-impl-2.2.3-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/gson-2.2.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-dns-classes-macos-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-httpclient-3.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/json-io-2.5.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818
531_0167/container_e20_1654062818531_0167_01_000001/javax.inject-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/antlr-runtime-3.5.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-xml-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/j2objc-annotations-1.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/checker-qual-3.5.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-exec-1.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/objenesis-1.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jsr311-api-1.1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-tez-2.1.1-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jersey-guava-2.25.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-dom-1.10.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerb-identity-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/fst-2.50.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/logback-core-1.3.0-alpha11.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-resolver-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-native-unix-common-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javassist-3.20.0-GA.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/kerby-util-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-yarn-registry-2.7.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/javax.annotation-api-1.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-annotations-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/zookeeper-3.6.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/twill-api-0.6.0-incubating.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-rewrite-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_16540
62818531_0167_01_000001/re2j-1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/asm-6.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jackson-jaxrs-base-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcip-annotations-1.0-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/log4j-web-2.8.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hbase-shaded-protobuf-2.2.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/oozie-client-5.1.0-cdh6.3.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-llap-common-2.1.1-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-pool-1.5.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/batik-awt-util-1.10.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jdo-api-3.0.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jetty-client-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-auth-3.0.0-cdh6.3.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/commons-lang3-3.7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/datanucleus-core-4.1.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/cim-etl-oozie-1.2.2.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jaxb2-basics-tools-1.11.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/libthrift-0.13.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ant-1.10.8.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/netty-transport-rxtx-4.1.71.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/jcommander-1.30.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-common-3.0.0-cdh6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/findbugs-annotations-1.3.9-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hk2-locator-2.5.0-b32.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/htrace-core4-4.2.0-incubating.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hive-orc-2.1.1-cdh6.3.0.j
ar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/hadoop-hdfs-3.0.0-cdh6.3.2.jar\:
[java.class.path continues with several hundred further jar entries drawn from three roots:
  /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/ (container-localized jars, including the MR framework tarball under .../mr-framework/),
  /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/{hadoop,hadoop-hdfs,hadoop-yarn}/ with their lib/ subdirectories, and
  /etc/hadoop/conf.cloudera.yarn.
The full listing places multiple versions of the same libraries on one classpath, notably: hbase-client-2.2.4.jar, hbase-common-2.2.4.jar and hbase-protocol-shaded-2.2.4.jar alongside hbase-server-2.1.0-cdh6.3.0.jar, hbase-mapreduce-2.1.0-cdh6.3.0.jar and hbase-zookeeper-2.1.0-cdh6.3.0.jar; log4j-core-2.17.0.jar and log4j-api-2.17.0.jar alongside log4j-slf4j-impl-2.8.2.jar and the parcel's log4j-core-2.8.2.jar and log4j-api-2.8.2.jar; three SLF4J bindings at once (log4j-slf4j-impl-2.8.2.jar, slf4j-log4j12-1.7.25.jar, logback-classic-1.3.0-alpha11.jar) plus slf4j-api-2.0.0-alpha4.jar alongside slf4j-api-1.7.25.jar; jackson-databind-2.13.0.jar alongside jackson-databind-2.9.9.3.jar; guava-30.1-jre.jar alongside guava-11.0.2.jar; netty-all-4.1.71.Final.jar alongside netty-3.10.6.Final.jar; and a mix of cdh6.3.0, cdh6.3.2 and cdh6.3.3 builds (e.g. hadoop-hdfs-client-3.0.0-cdh6.3.0.jar vs hadoop-hdfs-3.0.0-cdh6.3.2.jar, hadoop-annotations-3.0.0-cdh6.3.3.jar, oozie-fluent-job-api-5.1.0-cdh6.3.3.jar vs oozie-sharelib-oozie-5.1.0-cdh6.3.2.jar). The listing resumes below:]
...mr-framework/hadoop-yarn-server-common-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/a
pplication_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/aopalliance-1.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/slf4j-api-1.7.25.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/protobuf-java-2.5.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jersey-json-1.19.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-common.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/zookeeper.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/xz-1.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/logredactor-2.0.7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-hadoop.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jul-to-slf4j-1.7.25.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/re2j-1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-util-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-configuration2-2.1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-core-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/json-simple-1.1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-archives-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerby-asn1-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-cli-1.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-crypto-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-webapp-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-mapper-asl-1.9.13-cloudera.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-encoding.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-thrift.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-generator
.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-handler-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/okhttp-2.7.5.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/audience-annotations-0.5.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-azure-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-server-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jettison-1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/objenesis-1.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-jaxrs-base-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/mssql-jdbc-6.2.1.jre7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-lang3-3.7.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/log4j-core-2.8.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/stax2-api-3.1.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-math3-3.1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-compress-1.18.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jaxb-impl-2.2.3-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.2-tests.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-net-3.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jersey-core-1.19.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-pig-bundle.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/avro-1.8.2-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-client-3.0.0-cdh6.3.2-tests.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-archive-logs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/httpclient-4.5.3.jar\:/data/y
arn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-server-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/fst-2.50.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-admin-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerby-util-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-codec-1.11.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-format-sources.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/javax.activation-api-1.2.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-sls-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerby-pkix-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jcip-annotations-1.0-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/bcpkix-jdk15on-1.60.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/javax.servlet-api-3.1.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/json-io-2.5.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/event-publish-6.3.0-shaded.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-api-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-databind-2.9.9.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/lz4-java-1.5.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/slf4j-log4j12.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-collections-3.2.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/gson-2.2.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-kms-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/zstd-jni-1.3.8-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/azure-keyvault-core-0.8.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/contain
er_e20_1654062818531_0167_01_000001/mr-framework/commons-io-2.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-util-ajax-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-applications-distributedshell-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-http-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-resourceestimator-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jersey-guice-1.19.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/guice-servlet-4.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jsp-api-2.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jsr311-api-1.1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-aliyun-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/geronimo-jcache_1.0_spec-1.0-alpha-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-logging-1.1.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/asm-5.0.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-xc-1.9.13.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-common-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-util-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-core-asl-1.9.13.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-registry-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-annotations-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jsch-0.1.54.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-streaming-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-aws-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-hadoop-bundle.jar\:/data/yarn/nm/usercache/oozie/appc
ache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-core-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-server-web-proxy-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-io-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jsr305-3.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-nfs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-datajoin-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jersey-server-1.19.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-cascading.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-annotations-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-xml-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/gcs-connector-hadoop3-1.9.10-cdh6.3.2-shaded.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/curator-framework-2.12.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/zookeeper-3.4.5-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-hs-plugins-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-rumen-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-cascading3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-server-tests-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/curator-recipes-2.12.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/spark-2.4.0-cdh6.3.2-yarn-shuffle.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/avro.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-nfs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/htrace-core4-4.1.0-incubating.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/containe
r_e20_1654062818531_0167_01_000001/mr-framework/parquet-column.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-format-javadoc.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/metrics-core-3.0.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-client-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerby-config-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-core-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/okio-1.6.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-native-client-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jdom-1.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kafka-clients-2.2.1-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-protobuf.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/guice-4.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/aws-java-sdk-bundle-1.11.271.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/azure-data-lake-store-sdk-2.2.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/bcprov-jdk15on-1.60.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-jaxrs-1.9.13.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-3.0.0-cdh6.3.2-tests.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/ojalgo-43.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jersey-client-1.19.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/log4j-1.2.17.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-codec-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-common-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerby-xdr-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_00000
1/mr-framework/hadoop-hdfs-client-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/paranamer-2.8.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-httpfs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-nativetask-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-common-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/HikariCP-java7-2.4.12.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-auth-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-lang-2.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-identity-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-hs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-pig.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-simplekdc-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-scala_2.11.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-distcp-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-client-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-common-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/azure-storage-5.4.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-jackson.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/java-util-1.9.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-server-nodemanager-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/javax.inject-1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/snappy-java-1.1.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/contai
ner_e20_1654062818531_0167_01_000001/mr-framework/parquet-avro.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-kafka-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/ehcache-3.3.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/kerb-common-1.0.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/guava-11.0.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-beanutils-1.9.4.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jaxb-api-2.2.11.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-servlet-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/parquet-format.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-openstack-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/aliyun-sdk-oss-2.8.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/nimbus-jose-jwt-4.41.1.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-app-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-gridmix-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/curator-client-2.12.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-jaxrs-json-provider-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jetty-security-9.3.25.v20180904.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-transport-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/commons-daemon-1.0.13.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-codec-http-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-yarn-applications-unmanaged-am-launcher-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-jobclient-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/jackson-module-jaxb-an
notations-2.9.9.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-common-3.0.0-cdh6.3.2-tests.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-azure-datalake-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-resolver-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-client-shuffle-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-extras-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-hdfs-native-client-3.0.0-cdh6.3.2-tests.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/wildfly-openssl-1.0.4.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/hadoop-mapreduce-examples-3.0.0-cdh6.3.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/tt-instrumentation-6.3.0.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/netty-buffer-4.1.17.Final.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/httpcore-4.4.6.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/json-smart-2.3.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/log4j-api-2.8.2.jar\:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/leveldbjni-all-1.8.jar\:
user.name=oozie
java.vm.specification.version=1.8
sun.java.command=org.apache.oozie.action.hadoop.LauncherAM
java.home=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre
sun.arch.data.model=64
user.language=en
java.specification.vendor=Oracle Corporation
awt.toolkit=sun.awt.X11.XToolkit
java.vm.info=mixed mode
java.version=1.8.0_332
java.ext.dirs=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/ext\:/usr/java/packages/lib/ext
sun.boot.class.path=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/resources.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/rt.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/sunrsasign.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/jsse.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/jce.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/charsets.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/jfr.jar\:/usr/lib/jvm/java-1.8.0-amazon-corretto/jre/classes
java.vendor=Amazon.com Inc.
file.separator=/
java.vendor.url.bug=https\://github.com/corretto/corretto-8/issues/
sun.io.unicode.encoding=UnicodeLittle
sun.cpu.endian=little
sun.cpu.isalist=
------------------------
Environment variables
------------------------
HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3314-yarn-NODEMANAGER
JAVA_HOME=/usr/lib/jvm/java-1.8.0-amazon-corretto/
APP_SUBMIT_TIME_ENV=1654754403115
NM_HOST=data-01.novalocal
LD_LIBRARY_PATH=
HADOOP_HDFS_HOME=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-hdfs
LOGNAME=oozie
JVM_PID=9513
HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-mapreduce
PWD=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001
HADOOP_COMMON_HOME=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop
_=/usr/lib/jvm/java-1.8.0-amazon-corretto//bin/java
LOCAL_DIRS=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167
APPLICATION_WEB_PROXY_BASE=/proxy/application_1654062818531_0167
NM_HTTP_PORT=8042
HADOOP_CLIENT_CONF_DIR=/etc/hadoop/conf.cloudera.yarn
LOG_DIRS=/data/yarn/container-logs/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001
PRELAUNCH_OUT=/data/yarn/container-logs/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/prelaunch.out
NM_AUX_SERVICE_mapreduce_shuffle=AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
NM_PORT=8041
HADOOP_YARN_HOME=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-yarn
USER=oozie
CLASSPATH=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/*:/etc/hadoop/conf.cloudera.yarn:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/*:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/*:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-hdfs/*:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-yarn/*:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop-yarn/lib/*::/etc/hadoop/conf.cloudera.yarn:/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/mr-framework/*:
PRELAUNCH_ERR=/data/yarn/container-logs/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/prelaunch.err
HADOOP_TOKEN_FILE_LOCATION=/data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/container_tokens
LOCAL_USER_DIRS=/data/yarn/nm/usercache/oozie/
HOME=/home/
SHLVL=1
CONTAINER_ID=container_e20_1654062818531_0167_01_000001
MALLOC_ARENA_MAX=4
------------------------
=================================================================
>>> Invoking Main class now >>>
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - Loading filesystems
09:09:14.409 [AMRM Heartbeater thread] INFO org.apache.hadoop.conf.Configuration - resource-types.xml not found
09:09:14.409 [AMRM Heartbeater thread] INFO org.apache.hadoop.yarn.util.resource.ResourceUtils - Unable to find 'resource-types.xml'.
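
Two of the environment variables above matter for reading the rest of this log: CLASSPATH, which mixes the NodeManager-provided parcel jars with the localized mr-framework archive, and HADOOP_TOKEN_FILE_LOCATION, which is where YARN hands the container its credentials; the HDFS_DELEGATION_TOKEN entries cloned a few lines below are read from that file. If a token problem is suspected, the file can be dumped while the container is still on disk. A minimal sketch, assuming shell access on data-01.novalocal as a user that can read the yarn usercache (the path is copied verbatim from the log; the appcache directory is normally removed once the application finishes):

hdfs dtutil print /data/yarn/nm/usercache/oozie/appcache/application_1654062818531_0167/container_e20_1654062818531_0167_01_000001/container_tokens
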
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - file:// = class org.apache.hadoop.fs.LocalFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Adding resource type - name = memory-mb, units = Mi, type = COUNTABLE
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Adding resource type - name = vcores, units = , type = COUNTABLE
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Mandatory Resource 'yarn.resource-types.memory-mb.minimum-allocation' is not configured in resource-types config file. Setting allocation specified using 'yarn.scheduler.minimum-allocation-mb'
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Mandatory Resource 'yarn.resource-types.memory-mb.maximum-allocation' is not configured in resource-types config file. Setting allocation specified using 'yarn.scheduler.maximum-allocation-mb'
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Mandatory Resource 'yarn.resource-types.vcores.minimum-allocation' is not configured in resource-types config file. Setting allocation specified using 'yarn.scheduler.minimum-allocation-vcores'
09:09:14.409 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.util.resource.ResourceUtils - Mandatory Resource 'yarn.resource-types.vcores.maximum-allocation' is not configured in resource-types config file. Setting allocation specified using 'yarn.scheduler.maximum-allocation-vcores'
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - ftp:// = class org.apache.hadoop.fs.ftp.FTPFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - har:// = class org.apache.hadoop.fs.HarFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /data/yarn/nm/filecache/0/8498/hadoop-common-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /data/yarn/nm/filecache/0/8448/hadoop-hdfs-client-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /data/yarn/nm/filecache/0/8448/hadoop-hdfs-client-3.0.0-cdh6.3.0.jar
09:09:14.409 [main] DEBUG org.apache.hadoop.fs.FileSystem - swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /data/yarn/nm/filecache/0/8448/hadoop-hdfs-client-3.0.0-cdh6.3.0.jar
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - gs:// = class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem from /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/gcs-connector-hadoop3-1.9.10-cdh6.3.2-shaded.jar
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - s3n:// = class org.apache.hadoop.fs.s3native.NativeS3FileSystem from /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/hadoop-aws-3.0.0-cdh6.3.2.jar
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking for FS supporting hdfs
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - looking for configuration option fs.hdfs.impl
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking in service filesystems for implementation class
09:09:14.410 [main] DEBUG org.apache.hadoop.fs.FileSystem - FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.use.legacy.blockreader.local = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.read.shortcircuit = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.domain.socket.data.traffic = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.domain.socket.path = /var/run/hdfs-sockets/dn
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
09:09:14.410 [main] DEBUG org.apache.hadoop.security.token.Token - Cloned private token Kind: HDFS_DELEGATION_TOKEN, Service: 10.106.8.128:8020, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681) from Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 to namenode cm-hue-01.novalocal/10.106.8.128:8020
09:09:14.410 [main] DEBUG org.apache.hadoop.security.token.Token - Cloned private token Kind: HDFS_DELEGATION_TOKEN, Service: 10.106.8.129:8020, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681) from Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 to namenode name-01.novalocal/10.106.8.129:8020
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.use.legacy.blockreader.local = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.read.shortcircuit = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.domain.socket.data.traffic = false
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.domain.socket.path = /var/run/hdfs-sockets/dn
09:09:14.410 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.410 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
09:09:14.410 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
09:09:14.410 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
09:09:14.410 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
09:09:14.410 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Both short-circuit local reads and UNIX domain socket are disabled.
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
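
The two "Mapped HA service delegation token" entries above show the client expanding the logical URI hdfs://nameservice1 to its two physical NameNodes, cm-hue-01.novalocal (10.106.8.128) and name-01.novalocal (10.106.8.129). The entries that follow show the first connection being answered with StandbyException and the retry handler failing over to the second NameNode, which succeeds, so the exception at this point is the client's normal HA probing rather than the job failure itself. To confirm from a shell which NameNode is currently active, a minimal check along these lines works (assuming a standard HA setup; the IDs passed to -getServiceState are whatever the first command prints, nn1/nn2 here are only placeholders):

hdfs getconf -confKey dfs.ha.namenodes.nameservice1
hdfs haadmin -ns nameservice1 -getServiceState nn1
hdfs haadmin -ns nameservice1 -getServiceState nn2
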
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8020 09:09:14.410 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795) 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector) 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ClientNamenodeProtocolPB 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk= 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default 09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=\",realm=\"default\",nonce=\"r1ZeIT4RdhEfWeOGWubjjNC9w4ajXtXlJel4WyWj\",nc=00000001,cnonce=\"jvRzrjCBfmf/xqSTOCixMkSkyJuikCxf9DFQgJY+\",digest-uri=\"/default\",maxbuf=65536,response=ff15196dd14ba4d97ad552c28d590feb,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" } 09:09:14.410 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error 09:09:14.410 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719) 09:09:14.410 [main] WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error 09:09:14.410 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error 09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - closing ipc connection to cm-hue-01.novalocal/10.106.8.128:8020: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. 
Visit https://s.apache.org/sbnn-error at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:374) at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614) at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410) at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799) at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795) at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560) at org.apache.hadoop.ipc.Client.call(Client.java:1391) at org.apache.hadoop.ipc.Client.call(Client.java:1355) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:875) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source) at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630) at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1496) at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1493) at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1508) at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1617) at org.apache.oozie.action.hadoop.HdfsOperations.fileExists(HdfsOperations.java:77) at org.apache.oozie.action.hadoop.LauncherAM.setRecoveryId(LauncherAM.java:473) at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:403) at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55) at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217) at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) 
        at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8020 from oozie: closed
09:09:14.410 [main] DEBUG org.apache.hadoop.io.retry.RetryInvocationHandler - org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error, while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over cm-hue-01.novalocal/10.106.8.128:8020. Trying to failover immediately.
org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
        at org.apache.hadoop.ipc.Client.call(Client.java:1445)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:875)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1496)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1493)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1508)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1617)
        at org.apache.oozie.action.hadoop.HdfsOperations.fileExists(HdfsOperations.java:77)
        at org.apache.oozie.action.hadoop.LauncherAM.setRecoveryId(LauncherAM.java:473)
        at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:403)
        at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
        at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
        at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.410 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to name-01.novalocal/10.106.8.129:8020
09:09:14.410 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ClientNamenodeProtocolPB
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.410 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=\",realm=\"default\",nonce=\"oFfhBfFD0RbcPYkGELXtNNetsjLI5LBbzUPrNtQL\",nc=00000001,cnonce=\"k4lV1K7+dgugCmgerD3iF9kH66TN/+tH1+6AZa6y\",digest-uri=\"/default\",maxbuf=65536,response=9daf5dfa3c5191fc0d7b906045244d32,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.Client - Negotiated QOP is :auth
09:09:14.410 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie: starting, having connections 2
09:09:14.410 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #2 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
09:09:14.410 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #2
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getFileInfo took 8ms
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0: masked={ masked: rw-r--r--, unmasked: rw-rw-rw- }
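The failover visible above (StandbyException from cm-hue-01, then an immediate retry against name-01) is standard HDFS HA client behavior driven by the logical nameservice configuration. As a minimal sketch, assuming the two NameNode hosts taken from this log and illustrative NameNode IDs nn1/nn2 (not confirmed by the log), a client resolving hdfs://nameservice1 would be wired up roughly like this; the property names are the standard Hadoop ones:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HaClientConfigSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://nameservice1");
            conf.set("dfs.nameservices", "nameservice1");
            // Two NameNodes behind the logical URI; "nn1"/"nn2" are assumed IDs.
            conf.set("dfs.ha.namenodes.nameservice1", "nn1,nn2");
            conf.set("dfs.namenode.rpc-address.nameservice1.nn1", "cm-hue-01.novalocal:8020");
            conf.set("dfs.namenode.rpc-address.nameservice1.nn2", "name-01.novalocal:8020");
            // The proxy provider is what catches StandbyException and retries the
            // other NameNode ("Trying to failover immediately." in the log).
            conf.set("dfs.client.failover.proxy.provider.nameservice1",
                     "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
            FileSystem fs = FileSystem.get(URI.create("hdfs://nameservice1"), conf);
            System.out.println(fs.exists(new Path("/user/oozie")));
        }
    }

With this in place the client never needs to know which NameNode is active; the repeated StandbyException lines in this log are expected noise as long as the failover to the other NameNode succeeds, which it does here.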
09:09:14.410 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #3 org.apache.hadoop.hdfs.protocol.ClientProtocol.create
09:09:14.410 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #3
09:09:14.410 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: create took 12ms
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0, chunkSize=516, chunksPerPacket=126, packetSize=65016
09:09:14.410 [LeaseRenewer:oozie@nameservice1] DEBUG org.apache.hadoop.hdfs.client.impl.LeaseRenewer - Lease renewer daemon for [DFSClient_NONMAPREDUCE_-484883445_1] with renew id 1 started
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - WriteChunk allocating new packet seqno=0, src=/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0, packetSize=65016, chunksPerPacket=126, bytesCurBlock=0, DFSOutputStream:block==null
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - Queued packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 30, block==null
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - Queued packet seqno: 1 offsetInBlock: 30 lastPacketInBlock: true lastByteOffsetInBlock: 30, block==null
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.DataStreamer - stage=PIPELINE_SETUP_CREATE, block==null
09:09:14.410 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - block==null waiting for ack for: 1
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.DataStreamer - Allocating new block: block==null
09:09:14.410 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #4 org.apache.hadoop.hdfs.protocol.ClientProtocol.addBlock
09:09:14.410 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #4
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: addBlock took 36ms
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.DataStreamer - pipeline = [DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK], DatanodeInfoWithStorage[10.106.8.131:1004,DS-df71d723-6cd0-49c0-bc2a-339500d37ba2,DISK]], blk_1085484631_11743892
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.DataStreamer - Connecting to datanode 10.106.8.130:1004
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.DataStreamer - Send buf size 1838592
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
09:09:14.410 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #5 org.apache.hadoop.hdfs.protocol.ClientProtocol.getServerDefaults
09:09:14.410 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #5
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getServerDefaults took 2ms
09:09:14.410 [Thread-8] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in secured configuration with privileged port for addr = /10.106.8.130, datanodeId = DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK]
09:09:14.411 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - nodes [DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK], DatanodeInfoWithStorage[10.106.8.131:1004,DS-df71d723-6cd0-49c0-bc2a-339500d37ba2,DISK]] storageTypes [DISK, DISK] storageIDs [DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99, DS-df71d723-6cd0-49c0-bc2a-339500d37ba2]
09:09:14.411 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - blk_1085484631_11743892 sending packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 30
09:09:14.411 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - stage=DATA_STREAMING, blk_1085484631_11743892
09:09:14.411 [ResponseProcessor for block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - DFSClient seqno: 0 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 733946 flag: 0 flag: 0
09:09:14.411 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - blk_1085484631_11743892 sending packet seqno: 1 offsetInBlock: 30 lastPacketInBlock: true lastByteOffsetInBlock: 30
09:09:14.411 [ResponseProcessor for block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - DFSClient seqno: 1 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 1168107 flag: 0 flag: 0
09:09:14.411 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/0000084-220601112151640-oozie-oozi-W@impala-etl@0 block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892] DEBUG org.apache.hadoop.hdfs.DataStreamer - Closing old block BP-602310420-10.106.8.128-1602151336997:blk_1085484631_11743892
09:09:14.411 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #6 org.apache.hadoop.hdfs.protocol.ClientProtocol.complete
09:09:14.411 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #6
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: complete took 35ms
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
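The create / addBlock / DataStreamer / complete sequence above is the ordinary HDFS write path, here used for the tiny recovery-id file the Oozie launcher writes (30 bytes, two packets). A minimal sketch of the same kind of write through the public FileSystem API; the path and payload below are illustrative, not the launcher's actual values:

    import java.nio.charset.StandardCharsets;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SmallFileWriteSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            Path marker = new Path("/tmp/recovery-id-example"); // illustrative path
            // create() issues ClientProtocol.create; writes are buffered into
            // packets and streamed through the DataNode pipeline seen in the log.
            try (FSDataOutputStream out = fs.create(marker, true)) {
                out.write("application_1654062818531_0167".getBytes(StandardCharsets.UTF_8));
            } // close() flushes the last packet and calls ClientProtocol.complete
            System.out.println("exists: " + fs.exists(marker));
        }
    }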
Launcher class: class org.apache.oozie.action.hadoop.JavaMain
INFO: loading log4j config file log4j.properties.
INFO: log4j config file log4j.properties loaded successfully.
09:09:14.411 [main] DEBUG org.apache.hadoop.conf.Configuration - Handling deprecation for all properties in config...
09:09:14.411 [main] DEBUG org.apache.hadoop.conf.Configuration - Handling deprecation for oozie.action.id
09:09:14.411 [main] DEBUG org.apache.hadoop.conf.Configuration - Handling deprecation for oozie.job.id
09:09:14.411 [main] DEBUG org.apache.hadoop.conf.Configuration - Handling deprecation for mapreduce.job.tags
09:09:14.411 [main] DEBUG org.apache.hadoop.conf.Configuration - Handling deprecation for oozie.launcher.job.id
{"properties":[{"key":"oozie.launcher.job.id","value":"0000084-220601112151640-oozie-oozi-W","isFinal":false,"resource":"programmatically"},{"key":"oozie.job.id","value":"0000084-220601112151640-oozie-oozi-W","isFinal":false,"resource":"programmatically"},{"key":"oozie.action.id","value":"0000084-220601112151640-oozie-oozi-W@impala-etl","isFinal":false,"resource":"programmatically"},{"key":"mapreduce.job.tags","value":"oozie-e85ca41e4a643e69139ea447d26602fe","isFinal":false,"resource":"programmatically"}]}
Setting [tez.application.tags] tag: oozie-e85ca41e4a643e69139ea447d26602fe
Setting [spark.yarn.tags] tag: oozie-e85ca41e4a643e69139ea447d26602fe
Fetching child yarn jobs tag id : oozie-e85ca41e4a643e69139ea447d26602fe
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:147)
09:09:14.411 [main] DEBUG org.apache.hadoop.yarn.ipc.YarnRPC - Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
09:09:14.411 [main] DEBUG org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC - Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationClientProtocol
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8032
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationClientProtocolPB info:org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo$2@58112bc4
09:09:14.411 [main] DEBUG org.apache.hadoop.yarn.security.client.RMDelegationTokenSelector - Looking for a token with service 10.106.8.128:8032
09:09:14.411 [main] DEBUG org.apache.hadoop.yarn.security.client.RMDelegationTokenSelector - Token kind is RM_DELEGATION_TOKEN and the token's service name is 10.106.8.128:8032,10.106.8.129:8032
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ApplicationClientProtocolPB
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: CgVvb3ppZRIEeWFybholb296aWUvZGF0YS0wMi5ub3ZhbG9jYWxAQ0lNLklWU0cuQVVUSCCrvLG4lDAoq8Tj2JYwMIL8AjiEAw==
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"CgVvb3ppZRIEeWFybholb296aWUvZGF0YS0wMi5ub3ZhbG9jYWxAQ0lNLklWU0cuQVVUSCCrvLG4lDAoq8Tj2JYwMIL8AjiEAw==\",realm=\"default\",nonce=\"TFvgwfqaf/ZK7wLcseo0svGFqNOqvcWnffLm0Ogt\",nc=00000001,cnonce=\"HY5mcTexlVZbXQN9bdEzNreKqy0h22g30nabXiO4\",digest-uri=\"/default\",maxbuf=65536,response=81c6356db23f454b1986b6afc4246988,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - Negotiated QOP is :auth
09:09:14.411 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie: starting, having connections 3
09:09:14.411 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie sending #7 org.apache.hadoop.yarn.api.ApplicationClientProtocolPB.getApplications
09:09:14.411 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie got value #7
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getApplications took 21ms
No child applications found
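"Fetching child yarn jobs" followed by the getApplications call above is the launcher looking up any child applications carrying its oozie-... tag (propagated via mapreduce.job.tags, tez.application.tags and spark.yarn.tags). A rough equivalent with the public YarnClient API, assuming the tag value from this log and filtering client-side via ApplicationReport.getApplicationTags; this is a sketch of the idea, not necessarily how Oozie itself queries the ResourceManager:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.yarn.api.records.ApplicationReport;
    import org.apache.hadoop.yarn.client.api.YarnClient;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;

    public class ChildJobsByTagSketch {
        public static void main(String[] args) throws Exception {
            String tag = "oozie-e85ca41e4a643e69139ea447d26602fe"; // tag from the log
            YarnClient yarn = YarnClient.createYarnClient();
            yarn.init(new YarnConfiguration(new Configuration()));
            yarn.start();
            try {
                // List applications known to the RM and keep those tagged by this action.
                for (ApplicationReport report : yarn.getApplications()) {
                    if (report.getApplicationTags().contains(tag)) {
                        System.out.println("child app: " + report.getApplicationId());
                    }
                }
            } finally {
                yarn.stop();
            }
        }
    }

Here the lookup comes back empty ("No child applications found"), so the launcher proceeds to run the Java action main class directly.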
Java action main class : com.cisco.cim.oozie.action.ImpalaETLAction
Java action arguments : hdfs://nameservice1/user/oozie/oozie/res/config.properties 2022-06-09T11:30+0530 jdbc:hive2://cm-hue-01.novalocal:10000/;principal=hive/cm-hue-01.novalocal@CIM.IVSG.AUTH jdbc:hive2://cm-hue-01.novalocal:21050/;principal=impala/cm-hue-01.novalocal@CIM.IVSG.AUTH 10 30 100000 GMT+0530 /usr/local/etl/etl.keytab hdfs://nameservice1/user/oozie/oozie/res/
09:09:14.411 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
09:09:14.411 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking for FS supporting hdfs
09:09:14.411 [main] DEBUG org.apache.hadoop.fs.FileSystem - looking for configuration option fs.hdfs.impl
09:09:14.411 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking in service filesystems for implementation class
09:09:14.411 [main] DEBUG org.apache.hadoop.fs.FileSystem - FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.use.legacy.blockreader.local = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.read.shortcircuit = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.domain.socket.data.traffic = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.domain.socket.path = /var/run/hdfs-sockets/dn
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
09:09:14.411 [main] DEBUG org.apache.hadoop.security.token.Token - Cloned private token Kind: HDFS_DELEGATION_TOKEN, Service: 10.106.8.128:8020, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681) from Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1 to namenode cm-hue-01.novalocal/10.106.8.128:8020
09:09:14.411 [main] DEBUG org.apache.hadoop.security.token.Token - Cloned private token Kind: HDFS_DELEGATION_TOKEN, Service: 10.106.8.129:8020, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681) from Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for oozie: HDFS_DELEGATION_TOKEN owner=oozie, renewer=yarn, realUser=oozie/data-02.novalocal@CIM.IVSG.AUTH, issueDate=1654754401776, maxDate=1655359201776, sequenceNumber=76768, masterKeyId=681)
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1 to namenode name-01.novalocal/10.106.8.129:8020
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.use.legacy.blockreader.local = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.read.shortcircuit = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.domain.socket.data.traffic = false
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.domain.socket.path = /var/run/hdfs-sockets/dn
09:09:14.411 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
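The "Cloned private token" / "Mapped HA service delegation token" records above show the single ha-hdfs:nameservice1 delegation token being fanned out to both physical NameNode addresses, so the same credential works whichever NameNode is active. A small illustrative sketch (not part of the launcher) that lists the tokens the current user is carrying, which would print the Kind/Service pairs seen in those records:

    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.hadoop.security.token.Token;
    import org.apache.hadoop.security.token.TokenIdentifier;

    public class TokenDumpSketch {
        public static void main(String[] args) throws Exception {
            UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
            // Each delegation token is keyed by a kind (e.g. HDFS_DELEGATION_TOKEN)
            // and a service (e.g. ha-hdfs:nameservice1 or a host:port pair).
            for (Token<? extends TokenIdentifier> t : ugi.getTokens()) {
                System.out.println(t.getKind() + " -> " + t.getService());
            }
        }
    }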
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8020
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ClientNamenodeProtocolPB
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.411 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=\",realm=\"default\",nonce=\"y9Dmj3yobQMEZrjBUmEGNQwULZoh7D//JxVQpg/S\",nc=00000001,cnonce=\"r5GMmpXLDo+x0w4hgfEix6BUC6hAmbwllA8q50Lq\",digest-uri=\"/default\",maxbuf=65536,response=b0b58545e3642bf6e21cdc1f855b383a,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
09:09:14.411 [main] WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - closing ipc connection to cm-hue-01.novalocal/10.106.8.128:8020: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:374)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
        at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
        at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
        at org.apache.hadoop.ipc.Client.call(Client.java:1391)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:304)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy16.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:859)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:848)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:837)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1005)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:317)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:313)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:325)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:898)
        at com.cisco.cim.oozie.util.HDFSAccessor.readProperties(HDFSAccessor.java:38)
        at com.cisco.cim.oozie.action.ImpalaETLAction.main(ImpalaETLAction.java:83)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
        at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
        at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
        at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
        at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8020 from oozie: closed
09:09:14.411 [main] DEBUG org.apache.hadoop.io.retry.RetryInvocationHandler - org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error, while invoking ClientNamenodeProtocolTranslatorPB.getBlockLocations over cm-hue-01.novalocal/10.106.8.128:8020. Trying to failover immediately.
org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
        at org.apache.hadoop.ipc.Client.call(Client.java:1445)
        at org.apache.hadoop.ipc.Client.call(Client.java:1355)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:304)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy16.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:859)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:848)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:837)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1005)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:317)
        at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:313)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:325)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:898)
        at com.cisco.cim.oozie.util.HDFSAccessor.readProperties(HDFSAccessor.java:38)
        at com.cisco.cim.oozie.action.ImpalaETLAction.main(ImpalaETLAction.java:83)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
        at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
        at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
        at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
        at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.411 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.411 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #8 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations
09:09:14.411 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #8
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getBlockLocations took 2ms
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - newInfo = LocatedBlocks{; fileLength=358; underConstruction=false; blocks=[LocatedBlock{BP-602310420-10.106.8.128-1602151336997:blk_1085484534_11743795; getBlockSize()=358; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.106.8.132:1004,DS-49d8f531-ee28-49e2-b98e-98128258099b,DISK], DatanodeInfoWithStorage[10.106.8.131:1004,DS-df71d723-6cd0-49c0-bc2a-339500d37ba2,DISK]]}]; lastLocatedBlock=LocatedBlock{BP-602310420-10.106.8.128-1602151336997:blk_1085484534_11743795; getBlockSize()=358; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.106.8.131:1004,DS-df71d723-6cd0-49c0-bc2a-339500d37ba2,DISK], DatanodeInfoWithStorage[10.106.8.132:1004,DS-49d8f531-ee28-49e2-b98e-98128258099b,DISK]]}; isLastBlockComplete=true; ecPolicy=null}
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Connecting to datanode 10.106.8.132:1004
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
09:09:14.411 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #9 org.apache.hadoop.hdfs.protocol.ClientProtocol.getServerDefaults
09:09:14.411 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #9
09:09:14.411 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getServerDefaults took 2ms
09:09:14.411 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in secured configuration with privileged port for addr = /10.106.8.132, datanodeId = DatanodeInfoWithStorage[10.106.8.132:1004,DS-49d8f531-ee28-49e2-b98e-98128258099b,DISK]
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using kerberos user:hive/cm-hue-01.novalocal@CIM.IVSG.AUTH
"hive/cm-hue-01.novalocal@CIM.IVSG.AUTH" with name hive/cm-hue-01.novalocal@CIM.IVSG.AUTH 09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "hive/cm-hue-01.novalocal@CIM.IVSG.AUTH" 09:09:14.411 [main] INFO org.apache.hadoop.security.UserGroupInformation - Login successful for user hive/cm-hue-01.novalocal@CIM.IVSG.AUTH using keytab file /usr/local/etl/etl.keytab. Keytab auto renewal enabled : false 09:09:14.411 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:hive/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:com.cisco.cim.oozie.util.DBManager.getHiveConnection(DBManager.java:76) 09:09:14.412 [main] INFO org.apache.hive.jdbc.Utils - Supplied authorities: cm-hue-01.novalocal:10000 09:09:14.412 [main] INFO org.apache.hive.jdbc.Utils - Resolved authority: cm-hue-01.novalocal:10000 09:09:14.412 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Current authMethod = KERBEROS 09:09:14.412 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Not setting UGI conf as passed-in authMethod of kerberos = current. 09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:hive/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208) 09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:hive/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - opening transport org.apache.thrift.transport.TSaslClientTransport@385e36d4 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslClientTransport - Sending mechanism name GSSAPI and initial response of length 651 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status START and payload length 6 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 651 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Start message handled 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 108 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 0 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 32 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status COMPLETE and payload length 32 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Main negotiation loop complete 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: SASL Client receiving last message 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status COMPLETE and payload length 0 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 71 09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login 09:09:14.412 [main] DEBUG 
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using kerberos user:impala/cm-hue-01.novalocal@CIM.IVSG.AUTH
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "impala/cm-hue-01.novalocal@CIM.IVSG.AUTH" with name impala/cm-hue-01.novalocal@CIM.IVSG.AUTH
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "impala/cm-hue-01.novalocal@CIM.IVSG.AUTH"
09:09:14.412 [main] INFO org.apache.hadoop.security.UserGroupInformation - Login successful for user impala/cm-hue-01.novalocal@CIM.IVSG.AUTH using keytab file /usr/local/etl/etl.keytab. Keytab auto renewal enabled : false
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:impala/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:com.cisco.cim.oozie.util.DBManager.getImpalaConnection(DBManager.java:97)
09:09:14.412 [main] INFO org.apache.hive.jdbc.Utils - Supplied authorities: cm-hue-01.novalocal:21050
09:09:14.412 [main] INFO org.apache.hive.jdbc.Utils - Resolved authority: cm-hue-01.novalocal:21050
09:09:14.412 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Current authMethod = KERBEROS
09:09:14.412 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Not setting UGI conf as passed-in authMethod of kerberos = current.
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:impala/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
09:09:14.412 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:impala/cm-hue-01.novalocal@CIM.IVSG.AUTH (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - opening transport org.apache.thrift.transport.TSaslClientTransport@3effd4f3
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslClientTransport - Sending mechanism name GSSAPI and initial response of length 638
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status START and payload length 6
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 638
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Start message handled
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 156
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 0
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 32
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status COMPLETE and payload length 32
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Main negotiation loop complete
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: SASL Client receiving last message
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status COMPLETE and payload length 0
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 71
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 2526
09:09:14.412 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create database if not exists cimdata
09:09:14.412 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 155
09:09:14.413 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.413 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.413 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 254
09:09:14.413 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: invalidate metadata
09:09:14.413 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 137
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.418 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.etlLog (tableName String, sql String, status String, dataStartTimestamp bigint, dataStopTimestamp bigint)
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.418 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 258
09:09:14.419 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie: closed
09:09:14.419 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie: stopped, remaining connections 2
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.420 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: select distinct dataStartTimestamp, dataStopTimestamp from cimdata.etllog
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 191
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
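The "Execute Hive command" / "Execute Impala command" records above are ordinary JDBC statements issued over the two connections opened earlier. A sketch of the same statements via java.sql; the run(...) wrapper and the hive/impala connection variables are illustrative, since DBManager's real structure isn't shown in this log:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class EtlStatementsSketch {
        // "hive" and "impala" are assumed to be connections obtained as in the
        // previous sketch (ports 10000 and 21050 respectively).
        static void run(Connection hive, Connection impala) throws Exception {
            try (Statement h = hive.createStatement()) {
                h.execute("create database if not exists cimdata");
            }
            try (Statement i = impala.createStatement()) {
                // Refresh Impala's view of the metastore after DDL done via Hive.
                i.execute("invalidate metadata");
                i.execute("create table if not exists cimdata.etlLog ("
                        + "tableName String, sql String, status String, "
                        + "dataStartTimestamp bigint, dataStopTimestamp bigint)");
                try (ResultSet rs = i.executeQuery(
                        "select distinct dataStartTimestamp, dataStopTimestamp from cimdata.etllog")) {
                    while (rs.next()) {
                        System.out.println(rs.getLong(1) + " " + rs.getLong(2));
                    }
                }
            }
        }
    }

The long run of Thrift writing/reading records that follows is the client draining that select's result rows.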
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
[the preceding pair of "writing data length: 100" / "CLIENT: reading data length: 53" DEBUG records repeats verbatim many more times; the captured log then ends mid-record:]
09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length:
100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG 
org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 
53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - 
CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG 
org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.420 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 
09:09:14.420 [main] INFO com.cisco.cim.oozie.util.ETLLog - ETLLog start/stop timestamp are: 0/0
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create database if not exists cim
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
09:09:14.421 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create database if not exists cimdata
09:09:14.421 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: invalidate metadata
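The statement the workflow just issued is standard Impala catalog maintenance; a brief annotated form follows (the annotation describes general Impala behavior, not something this log states):

    -- Same statement as in the log entry above.
    -- INVALIDATE METADATA marks Impala's cached catalog stale, so objects
    -- just created through the Hive JDBC session (the 'cim' and 'cimdata'
    -- databases here) become visible to Impala on their next use.
    invalidate metadata;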
09:09:14.421 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie: closed
09:09:14.421 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8032 from oozie: stopped, remaining connections 1
09:09:14.421 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie: closed
09:09:14.421 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie: stopped, remaining connections 0
09:09:14.426 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.etltime (endtimestamp bigint)
09:09:14.426 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Locations ( rowkey string,tenantId string,sid string,city string,sampleTimestamp bigint,country string,name string,receiveTimestamp bigint,parent string,ancestor string,type string,sampleTimestampID int,timezone string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:city,d:sampleTimestamp#b,d:country,d:name,d:receiveTimestamp#b,d:parent,d:ancestor,d:type,d:sampleTimestampID#b,d:timezone') TBLPROPERTIES('hbase.table.name' = 'cim.Locations')
09:09:14.426 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: invalidate metadata
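For readability, the cim.Locations statement logged above is repeated here with line breaks and comments only; the SQL text is unchanged, and the comments describe standard Hive HBaseStorageHandler mapping rules:

    create external table if not exists cim.Locations (
      rowkey string, tenantId string, sid string, city string,
      sampleTimestamp bigint, country string, name string,
      receiveTimestamp bigint, parent string, ancestor string,
      type string, sampleTimestampID int, timezone string)
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES (
      -- ':key' binds the first column (rowkey) to the HBase row key; each
      -- 'd:name' entry binds the next column to a cell in column family 'd';
      -- a '#b' suffix tells the SerDe to decode that cell as binary.
      'hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:city,d:sampleTimestamp#b,d:country,d:name,d:receiveTimestamp#b,d:parent,d:ancestor,d:type,d:sampleTimestampID#b,d:timezone')
    -- The Hive table is a view over the existing HBase table 'cim.Locations'.
    TBLPROPERTIES('hbase.table.name' = 'cim.Locations');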
09:09:14.430 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Locations_hbase ( rowkey string,sid string,city string,sampleTimestamp bigint,name string,parent string,ancestor string,tenantId string,country string,receiveTimestamp bigint,type string,sampleTimestampID int,locationId string,timezone string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:city,d:sampleTimestamp#b,d:name,d:parent,d:ancestor,d:tenantId,d:country,d:receiveTimestamp#b,d:type,d:sampleTimestampID#b,d:locationId,d:timezone') TBLPROPERTIES('hbase.table.name' = 'cim.Locations')
09:09:14.430 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Locations(sampleTimestamp bigint,parent string,country string,city string,sampleTimestampID int,timezone string,ancestor string,tenantId string,name string,receiveTimestamp bigint,type string,sid string)
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingspot_event_hourly(locationid string,city string,timeid int,year int,month int,week int,weekday int,monthweek int,day int,hour int,noparkingzoneviolation_count int,parkingpotentialexpiry_count int,limitexceededviolation_count int,fareexpendedviolation_count int,potentialfareexpiry_count int,parkingspotid string,parkingspaceid string)
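Note that cim.Locations and cim.Locations_hbase both set 'hbase.table.name' = 'cim.Locations': they are two Hive schemas over the same HBase table, with Locations_hbase additionally exposing d:locationId. A small illustrative check (these queries are not in the log):

    -- Both external tables read the same HBase rows, so the counts should
    -- agree; cim.Locations simply does not expose the locationId column.
    select count(*) from cim.Locations;
    select count(*) from cim.Locations_hbase;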
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingarea_event_hourly(locationid string,city string,timeid int,year int,month int,week int,weekday int,monthweek int,day int,hour int,noparkingzoneviolation_count int,limitexceededviolation_count int,fareexpendedviolation_count int,parkingareaid string,parkingspaceid string)
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.roadsegment_event_hourly(locationid string,city string,timeid int,year int,month int,week int,weekday int,monthweek int,day int,hour int,accident_count int,congestion_count int,wrongway_count int,speeding_count int,vehiclestopped_count int,constructionwork_count int,roadsegmentid string)
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.incident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingincident_event_hourly(code string,week smallint,timeid int,city string,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,hour smallint,day smallint,create_count int,severity string,monthweek smallint,total_count int,update_count int,parkingspotid string,destroy_count int,month smallint,locationid string,parkingspaceid string,parkingareaid string,incidenttype string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.lightincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
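The *_incident_event_hourly and *_event_event_hourly tables from here on all share one shape, so a single reformatted example suffices. This is the cimdata.incident_event_hourly statement from above with only whitespace and comments added; note that, unlike the earlier parkingspot/parkingarea/roadsegment tables where year is an ordinary column, these tables move year and tenantid into partition columns:

    create table if not exists cimdata.incident_event_hourly (
      severity string, monthweek smallint, code string, week smallint,
      timeid int, city string, total_count int, update_count int,
      thirdpartyid string, weekday smallint,
      providerdetails_providerid string, providerdetails_provider string,
      close_count int, destroy_count int, month smallint, hour smallint,
      locationid string, day smallint, incidenttype string, create_count int)
    -- Partition columns are declared only here, never in the column list;
    -- each (year, tenantid) pair maps to its own directory of Parquet files.
    partitioned by (year int, tenantid string)
    stored as parquet;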
09:09:14.435 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.trafficincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,roadsegmentlaneid string,locationid string,day smallint,incidenttype string,create_count int,roadsegmentid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.mobilityincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,roadsegmentlaneid string,locationid string,day smallint,incidenttype string,create_count int,roadsegmentid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.transitincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wasteincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.environmentincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.networkincident_event_hourly(severity string,monthweek smallint,code string,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,providerdetails_provider string,deviceid string,devicetype string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,day smallint,incidenttype string,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.genericevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,eventtype string,day smallint,create_count int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingevent_event_hourly(week smallint,timeid int,city string,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,hour smallint,day smallint,create_count int,severity string,monthweek smallint,total_count int,update_count int,parkingspotid string,destroy_count int,month smallint,locationid string,parkingspaceid string,eventtype string,parkingareaid string) partitioned by (year int, tenantid string) stored as parquet
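Because every hourly rollup table is partitioned on (year, tenantid), filters on those columns prune whole partitions. A hypothetical query sketch against cimdata.incident_event_hourly (the query and tenant id are invented for illustration; table and column names are as logged):

    -- Illustrative only: monthly incident totals for one tenant-year.
    -- The year/tenantid predicates hit partition columns, so Impala scans
    -- only the matching partitions' Parquet files.
    select month, sum(total_count) as incidents
    from cimdata.incident_event_hourly
    where year = 2022 and tenantid = 'tenant-a'   -- hypothetical tenant id
    group by month
    order by month;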
data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 655 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.lightevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,eventtype string,day smallint,create_count int) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 589 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.trafficevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,roadsegmentlaneid string,locationid string,eventtype string,day smallint,create_count int,roadsegmentid string) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 637 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.mobilityevent_event_hourly(severity string,monthweek 
smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,roadsegmentlaneid string,locationid string,eventtype string,day smallint,create_count int,roadsegmentid string) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 638 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.transitevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,eventtype string,day smallint,create_count int) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 591 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wasteevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,eventtype string,day smallint,create_count int) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 589 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 
[main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.environmentevent_event_hourly(severity string,monthweek smallint,week smallint,timeid int,city string,total_count int,update_count int,thirdpartyid string,weekday smallint,providerdetails_providerid string,source string,providerdetails_provider string,close_count int,destroy_count int,month smallint,hour smallint,locationid string,eventtype string,day smallint,create_count int) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 595 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wastecollectionridership_state_hourly(week smallint,timeid int,city string,distance_max double,weekday smallint,count_sum bigint,binsvisited_max int,sid string,totalwastecollected_sum double,driverid string,hour smallint,distance_min double,distance_avg double,distance_sum double,day smallint,monthweek smallint,binsvisited_sum bigint,totalwastecollected_max double,agencyid string,wastecollectiontripid string,month smallint,totalwastecollected_min double,totalwastecollected_avg double,locationid string,vehicleid string,binsvisited_min int,binsvisited_avg double) partitioned by (year int, tenantid string) stored as parquet 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 782 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.437 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wastebin_state_hourly(tilt_min double,tilt_avg double,week smallint,timeid int,city string,temperature_max double,weekday smallint,sid 
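Note that every one of the *_event_hourly tables above follows the same template: per-domain hourly counters keyed by time columns, partitioned by (year, tenantid) and stored as Parquet. Filtering on both partition columns lets Impala prune partitions; a minimal query sketch (the tenant id 'tenant-a' is a placeholder, not a value from this log):

    select month, day, hour, sum(create_count) as events_created
    from cimdata.trafficevent_event_hourly
    where year = 2022 and tenantid = 'tenant-a'   -- both partition columns: enables pruning
    group by month, day, hour
    order by month, day, hour;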
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.roadsegment_state_hourly(monthweek smallint,week smallint,timeid int,city string,count int,weekday smallint,sid string,month smallint,hour smallint,freeflowmins_max double,locationid string,freeflowmins_min double,freeflowmins_avg double,day smallint,roadsegmentid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingspace_total(sid string,total int,tenantid string)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.datelist(monthweek smallint,week smallint,month smallint,hour smallint,timeid int,year smallint,city string,sampletimestamp bigint,weekday smallint,utcid int,timezoneoffset int,day smallint)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.datelist_half(monthweek smallint,week smallint,month smallint,hour smallint,timeid int,year smallint,city string,sampletimestamp bigint,weekday smallint,halfutcid int,timezoneoffset int,day smallint)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingspot_update(city string,year int,expectedrevenue double,sid string,sampletimestampid int,month int,locationid string,sampletimestamp bigint,parkingspaceid string,tenantid string,timezoneoffset bigint,sampletimestamp_delta bigint,occupied int)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.temporal_inner(sid string,isvalid boolean,tenantid string,entityid string,entitytype string,mobilitystatsname string,city string,temporalmobilitystatsname string)
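cimdata.datelist and cimdata.datelist_half look like time-dimension tables. Assuming timeid is the common key with the hourly fact tables (the column appears in both; this is an inference, not stated in the log), a dimension join would be a sketch like:

    select d.year, d.month, d.day, d.hour, f.total_count
    from cimdata.parkingevent_event_hourly f
    join cimdata.datelist d on d.timeid = f.timeid   -- assumed join key
    where f.year = 2022 and f.tenantid = 'tenant-a'; -- placeholder tenant id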
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.spatial_inner(sid string,isvalid boolean,tenantid string,entityid string,entitytype string,mobilitystatsname string,city string,spatialmobilitystatsname string)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_update(roiid_before string,receivetimestamp bigint,sampletimestamp bigint,sampletimestamp_before bigint,tenantid string,macaddress string,sampletimestamp_delta bigint,sid string,sampletimestampid bigint) partitioned by (roiid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_update_period(before_sampletimestamp bigint,sampletimestamp bigint,tenantid string,macaddress string,roiid string,sampletimestampid bigint)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_bounce_period(before_sampletimestamp bigint,roiid_before string,sampletimestamp bigint,tenantid string,macaddress string,sampletimestamp_delta bigint,roiid string,sid string,sampletimestampid bigint)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_update_nonredundant(sampletimestamp bigint,macaddress string) partitioned by (roiid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_update_hourly(sessiontime_count_20_30 bigint,monthweek smallint,week smallint,timeid int,city string,count bigint,weekday smallint,repeat_count bigint,sessiontime_count_0_2 bigint,sessiontime_count_10_20 bigint,month smallint,hour smallint,first_count bigint,sampletimestamp bigint,locationid string,sessiontime_count_2_10 bigint,roiid string,day smallint) partitioned by (year int, tenantid string) stored as parquet
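The sessiontime_count_* columns in personaldevice_update_hourly appear to bucket device dwell time, presumably in minutes given the 0_2 / 2_10 / 10_20 / 20_30 ranges; that unit is an inference from the column names. A dwell-time distribution per region of interest might be queried as (sketch, placeholder tenant id):

    select roiid,
           sum(first_count)             as first_time_devices,
           sum(repeat_count)            as repeat_devices,
           sum(sessiontime_count_0_2)   as dwell_0_2,
           sum(sessiontime_count_2_10)  as dwell_2_10,
           sum(sessiontime_count_10_20) as dwell_10_20,
           sum(sessiontime_count_20_30) as dwell_20_30
    from cimdata.personaldevice_update_hourly
    where year = 2022 and tenantid = 'tenant-a'
    group by roiid;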
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.personaldevice_update_bounce_hourly(sessiontime_count_20_30 bigint,monthweek smallint,week smallint,timeid int,city string,count bigint,weekday smallint,repeat_count bigint,sessiontime_count_0_2 bigint,sessiontime_count_10_20 bigint,month smallint,hour smallint,first_count bigint,sampletimestamp bigint,locationid string,sessiontime_count_2_10 bigint,roiid string,day smallint) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parking_session_hourly(revenue_max double,monthweek smallint,week smallint,timeid int,city string,count int,weekday smallint,entityid string,revenue_sum double,sessiontime_min bigint,revenue_min double,sid string,sessiontime_max bigint,entitytype string,month smallint,hour smallint,sessiontime_sum bigint,sampletimestamp bigint,locationid string,parkingspaceid string,turnover boolean,day smallint) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AQIEntryData_state(sampleTimestamp bigint,code string,sources string,timezone string,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,sampleTimestampID int,locationId string,timezoneoffset int,attributeName string,value double,day int,sensitiveGroups string,announcement string,period string,advisory string,entityType string,entityId string,pollutant string,effects string,tenantId string,category string,startTimestamp bigint,isvalid boolean) partitioned by (year int,month int,city string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AQIEntry_statisticdata_hourly(code string,week int,sources string,timeid int,weekday int,parentEntityType string,sid string,hour int,timezoneoffset int,attributeName string,value double,day int,sensitiveGroups string,announcement string,period string,monthweek int,advisory string,value_min double,value_avg Double,entityType string,entityId string,pollutant string,effects string,locationid string,value_max double,category string,startTimestamp bigint, month int,city string) partitioned by (year int,tenantId string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AQIEntry_statistic_hourly(geocoordinates_longitude double,code string,week int,sources string,timeid int,weekday int,parentEntityType string,sid string,hour int,geocoordinates_latitude double,timezoneoffset int,attributeName string,value double,day int,sensitiveGroups string,announcement string,period string,monthweek int,advisory string,value_min double,value_avg Double,entityType string,geocoordinates_altitude double,entityId string,pollutant string,effects string,locationid string,value_max double,category string,startTimestamp bigint,month int,city string) partitioned by (year int,tenantId string) stored as parquet
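The AQIEntry_* hourly tables carry value_min/value_avg/value_max statistics per pollutant alongside the advisory fields. A worst-reading sketch (placeholder tenant id and month; here month is an ordinary column, since the partition columns are year and tenantId):

    select pollutant, category, max(value_max) as worst_reading
    from cimdata.AQIEntry_statistic_hourly
    where year = 2022 and tenantId = 'tenant-a' and month = 6
    group by pollutant, category;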
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingSpace_hierarchy ( rowkey string,length int,entityId string,entityType string,sampleTimestamp bigint,tenantId String,receiveTimestamp bigint,ancestorId string,ancestorType string,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:length#b,d:entityId,d:entityType,d:sampleTimestamp#b,d:tenantId#b,d:receiveTimestamp#b,d:ancestorId,d:ancestorType,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingSpace.hierarchy')
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpace_hierarchy(ancestorType string,sampleTimestamp bigint,sampleTimestampID int,entityType string,ancestorId string,length int,tenantid string,entityId string,receiveTimestamp bigint)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingSpace_hbase ( rowkey string,opParams_zoneType string,sid string,boundary_geoPoint string,demarcated boolean,sampleTimestamp bigint,isValid boolean,parent string,siblingIndex int,ancestor string,tenantId string,receiveTimestamp bigint,label string,undemarcated boolean,opParams_maxDurationMinutes int,sampleTimestampID int,locationId string,levelLabel string,operatedBy string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:opParams_zoneType,d:sid,d:boundary_geoPoint,d:demarcated#b,d:sampleTimestamp#b,d:isValid#b,d:parent,d:siblingIndex#b,d:ancestor,d:tenantId,d:receiveTimestamp#b,d:label,d:undemarcated#b,d:opParams_maxDurationMinutes#b,d:sampleTimestampID#b,d:locationId,d:levelLabel,d:operatedBy') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingSpace')
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingSpace_location(sampleTimestamp bigint,parent string,city String,timezone String,isValid boolean,ancestor string,siblingIndex int,operatedBy string,boundary_geoPoint string,receiveTimestamp bigint,label string,demarcated boolean,sid string,opParams_zoneType string,levelLabel string,sampleTimestampID int,locationId String,tenantId string,undemarcated boolean,opParams_maxDurationMinutes int)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpace(sampleTimestamp bigint,parent string,city string,timezone String,isValid boolean,ancestor string,siblingIndex int,operatedBy string,boundary_geoPoint string,label string,receiveTimestamp bigint,demarcated boolean,sid string,opParams_zoneType string,levelLabel string,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,undemarcated boolean,opParams_maxDurationMinutes int)
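The Hive statements here all use the HBase storage handler: hbase.columns.mapping binds each Hive column, positionally, either to the HBase row key (:key) or to a qualifier in column family d, and the #b suffix marks cells stored in HBase binary encoding rather than as UTF-8 strings. A minimal sketch of the same pattern against a hypothetical HBase table demo.kv (the names demo_kv_hbase and demo.kv are illustrative, not from this log):

    -- External table over HBase: assumes the HBase table 'demo.kv' already exists.
    create external table if not exists demo_kv_hbase (
      rowkey string,   -- first mapping entry ':key' binds this to the HBase row key
      label  string,   -- plain string cell in column family 'd' (no suffix)
      total  int       -- '#b' in the mapping marks a binary-encoded cell
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:label,d:total#b')
    TBLPROPERTIES('hbase.table.name' = 'demo.kv');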
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpace_state_hourly(monthweek int,week int,timeid int,city string,count int,weekday int,total_min int,total_sum int,sid string,occupied_min int,dimensionid bigint,month int,hour int,occupied_sum int,total_max int,locationid string,day int,occupied_max int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpace_state(sampleTimestamp bigint,timezone string,isValid boolean,available int,receiveTimestamp bigint,sid string,total int,dimensionid bigint,reserved int,sampleTimestampID int,floating int,locationId string,tenantId string,timezoneoffset int,day int,occupied int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingSpace_state_hbase ( rowkey string,reserved int,total int,startTimestamp bigint,sid string,tenantId string,occupied int,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,available int,sampleTimestampID int,floating int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:reserved#b,d:total#b,d:startTimestamp#b,d:sid,d:tenantId,d:occupied#b,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:available#b,d:sampleTimestampID#b,d:floating#b') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingSpace.state')
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingSpace_state_period(sampleTimestamp bigint,total int,reserved int,sampleTimestampID int,floating int,isValid boolean,tenantId string,available int,receiveTimestamp bigint,startTimestamp bigint,occupied int,sid string)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Stop_hbase ( rowkey string,geocoordinates_longitude double,boundary_geoPoint string,description string,sampleTimestamp bigint,isValid boolean,providerDetails string,avgWaitTime int,geohash string,label string,tag string,sampleTimestampID int,locationId string,zoneId string,createTimestampID int,sid string,geocoordinates_altitude double,name string,destroyTimestamp bigint,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,wifiAccessPointId string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,wheelchairBoarding int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:boundary_geoPoint,d:description,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:avgWaitTime#b,d:geohash,d:label,d:tag,d:sampleTimestampID#b,d:locationId,d:zoneId,d:createTimestampID#b,d:sid,d:geocoordinates_altitude#b,d:name,d:destroyTimestamp#b,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:wifiAccessPointId,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:wheelchairBoarding#b') TBLPROPERTIES('hbase.table.name' = 'cim.Stop')
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Stop_location(geocoordinates_longitude double,sampleTimestamp bigint,city String,timezone String,description string,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,avgWaitTime int,thirdPartyId string,lastUpdated bigint,wheelchairBoarding int,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,geocoordinates_latitude double,zoneId string,tag string,destroyTimestampID int,geocoordinates_altitude double,isValid boolean,label string,name string,tenantId string,destroyTimestamp bigint,wifiAccessPointId string)
09:09:14.437 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Stop(geocoordinates_longitude double,sampleTimestamp bigint,city string,timezone string,description string,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,avgWaitTime int,thirdPartyId string,lastUpdated bigint,wheelchairBoarding int,sampleTimestampID int,locationId string,geohash string,createTimestampID int,geocoordinates_latitude double,providerDetails string,timezoneoffset int,zoneId string,tag string,destroyTimestampID int,geocoordinates_altitude double,isValid boolean,label string,Zonedimensionid bigint,tenantId string,name string,wifiAccessPointId string,destroyTimestamp bigint)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.RoadSegment_hbase ( rowkey string,objectType string,path_geoPoint string,sid string,sampleTimestamp bigint,isValid boolean,spatialSupported boolean,providerDetails string,refSpeed double,length double,tenantId string,receiveTimestamp bigint,roadClass string,sampleTimestampID int,locationId string,operatedBy string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:objectType,d:path_geoPoint,d:sid,d:sampleTimestamp#b,d:isValid#b,d:spatialSupported#b,d:providerDetails,d:refSpeed#b,d:length#b,d:tenantId,d:receiveTimestamp#b,d:roadClass,d:sampleTimestampID#b,d:locationId,d:operatedBy') TBLPROPERTIES('hbase.table.name' = 'cim.RoadSegment')
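Since every statement is create table if not exists, replaying this workflow leaves existing tables untouched. Whether a given DDL actually took effect can be confirmed from the Impala catalog, for example (a sketch using table names created above; SHOW PARTITIONS only returns rows once data has been written):

    show tables in cimdata like '*_state_hourly';
    describe cimdata.ParkingSpace_state_hourly;
    show partitions cimdata.ParkingSpace_state_hourly;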
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.RoadSegment_location(sampleTimestamp bigint,city String,timezone String,isValid boolean,roadClass string,length double,operatedBy string,path_geoPoint string,receiveTimestamp bigint,objectType string,sid string,sampleTimestampID int,locationId String,providerDetails string,tenantId string,spatialSupported boolean,refSpeed double)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RoadSegment(sampleTimestamp bigint,city string,timezone String,isValid boolean,roadClass string,length double,operatedBy string,path_geoPoint string,receiveTimestamp bigint,sid string,objectType string,sampleTimestampID int,locationId string,tenantId string,providerDetails string,timezoneoffset int,spatialSupported boolean,refSpeed double)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RoadSegment_state(sampleTimestamp bigint,timezone string,isValid boolean,receiveTimestamp bigint,sid string,dimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,freeFlowMins double,congestion int,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.RoadSegment_state_hbase ( rowkey string,startTimestamp bigint,congestion int,sid string,tenantId string,sampleTimestamp bigint,isValid boolean,freeFlowMins double,receiveTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:congestion#b,d:sid,d:tenantId,d:sampleTimestamp#b,d:isValid#b,d:freeFlowMins#b,d:receiveTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.RoadSegment.state')
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.RoadSegment_state_period(sampleTimestamp bigint,sampleTimestampID int,isValid boolean,tenantId string,receiveTimestamp bigint,freeFlowMins double,congestion int,startTimestamp bigint,sid string)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ServiceCalendarException_hbase ( rowkey string,createTimestampID int,nickname string,sid string,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,providerDetails string,exceptionType int,private string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,exceptionDate bigint,custom string,active int,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,serviceId string,sampleTimestampID int,locationId string,source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:nickname,d:sid,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:providerDetails,d:exceptionType#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:exceptionDate#b,d:custom,d:active#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:serviceId,d:sampleTimestampID#b,d:locationId,d:source') TBLPROPERTIES('hbase.table.name' = 'cim.ServiceCalendarException')
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ServiceCalendarException_location(sampleTimestamp bigint,exceptionType int,private string,city String,timezone String,receiveTimestamp bigint,source string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,nickname string,providerDetails string,tag string,destroyTimestampID int,serviceId string,isValid boolean,custom string,active int,label string,tenantId string,exceptionDate bigint,destroyTimestamp bigint)
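Unlike the hourly rollups, the *_state tables such as cimdata.RoadSegment_state are partitioned by (year, month, city), so point-in-time queries should filter on all three to prune partitions. A sketch (the city value is a placeholder):

    select day, avg(congestion) as avg_congestion, max(freeFlowMins) as max_free_flow_mins
    from cimdata.RoadSegment_state
    where year = 2022 and month = 6 and city = 'city-a'
    group by day
    order by day;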
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ServiceCalendarException(exceptionType int,sampleTimestamp bigint,private string,city string,timezone String,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,tag string,serviceId string,destroyTimestampID int,custom string,isValid boolean,active int,label string,tenantId string,exceptionDate bigint,destroyTimestamp bigint)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltrasonicSensor_state_hourly(monthweek int,week int,timeid int,city string,distance_max double,count int,weekday int,temperature_max double,sid string,temperature_Source string,dimensionid bigint,month int,hour int,distance_Source string,distance_min double,distance_avg double,locationid string,temperature_min double,temperature_avg double,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltrasonicData_state_hourly(monthweek int,week int,timeid int,city string,distance_max double,count int,weekday int,temperature_max double,sid string,temperature_Source string,dimensionid bigint,month int,hour int,distance_Source string,distance_min double,distance_avg double,locationid string,temperature_min double,temperature_avg double,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltrasonicData_state(sampleTimestamp bigint,distance double,timezone string,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,distance_expiresAt bigint,distance_Source string,sampleTimestampID int,locationId string,distance_reliability double,tenantId string,temperature double,distance_accuracy double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,temperature_expiresAt bigint,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltrasonicSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,distance double,timezone string,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,sampleTimestampID int,locationId string,temperature double,geocoordinates_latitude double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,day int,geocoordinates_altitude double,isValid boolean,temperature_Source string,distance_expiresAt bigint,distance_Source string,distance_reliability double,tenantId string,distance_accuracy double,temperature_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
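The Impala statements targeting the cimdata database use a different layout: plain (non-HBase) tables stored as Parquet, with the partition columns declared separately from the data columns, so year/month/city (or year/tenantid for the hourly rollups) become directory levels rather than stored fields. A trimmed sketch of the cimdata.UltrasonicData_state statement above (most measurement columns omitted):

create table if not exists cimdata.UltrasonicData_state (
  sampleTimestamp bigint,
  distance        double,
  locationId      string,
  tenantId        string
)
partitioned by (year int, month int, city string)  -- virtual partition columns
stored as parquet;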
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.UltrasonicSensor_state_hbase ( rowkey string,temperature double,sid string,sampleTimestamp bigint,temperature_Source string,isValid boolean,distance_expiresAt bigint,distance double,distance_reliability double,lastUpdated bigint,startTimestamp bigint,temperature_expiresAt bigint,tenantId string,distance_Source string,temperature_reliability double,parentEntityType string,distance_accuracy double,temperature_accuracy double,receiveTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:temperature#b,d:sid,d:sampleTimestamp#b,d:temperature_Source,d:isValid#b,d:distance_expiresAt#b,d:distance#b,d:distance_reliability#b,d:lastUpdated#b,d:startTimestamp#b,d:temperature_expiresAt#b,d:tenantId,d:distance_Source,d:temperature_reliability#b,d:parentEntityType,d:distance_accuracy#b,d:temperature_accuracy#b,d:receiveTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.UltrasonicSensor.state')
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.UltrasonicData_state_period(sampleTimestamp bigint,distance double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,distance_expiresAt bigint,distance_Source string,sampleTimestampID int,distance_reliability double,temperature double,tenantId string,distance_accuracy double,temperature_expiresAt bigint,temperature_reliability double,temperature_accuracy double,startTimestamp bigint)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.UltrasonicSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,distance double,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,distance_expiresAt bigint,distance_Source string,sampleTimestampID int,distance_reliability double,temperature double,tenantId string,geocoordinates_latitude double,distance_accuracy double,temperature_expiresAt bigint,temperature_reliability double,temperature_accuracy double,startTimestamp bigint)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MobilityROI_hbase ( rowkey string,objectType string,sid string,boundary_geoPoint string,sampleTimestamp bigint,isValid boolean,spatialSupported boolean,providerDetails string,thirdPartyId string,geohash string,tenantId string,receiveTimestamp bigint,label string,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:objectType,d:sid,d:boundary_geoPoint,d:sampleTimestamp#b,d:isValid#b,d:spatialSupported#b,d:providerDetails,d:thirdPartyId,d:geohash,d:tenantId,d:receiveTimestamp#b,d:label,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.MobilityROI')
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MobilityROI_location(sampleTimestamp bigint,city String,timezone String,isValid boolean,boundary_geoPoint string,receiveTimestamp bigint,label string,objectType string,sid string,thirdPartyId string,sampleTimestampID int,locationId String,geohash string,providerDetails string,tenantId string,spatialSupported boolean)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MobilityROI(sampleTimestamp bigint,city string,timezone String,isValid boolean,boundary_geoPoint string,label string,receiveTimestamp bigint,sid string,objectType string,thirdPartyId string,sampleTimestampID int,locationId string,geohash string,tenantId string,providerDetails string,timezoneoffset int,spatialSupported boolean)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WindSensor_state_hourly(monthweek int,week int,timeid int,city string,count int,weekday int,windDirection_min double,windDirection_avg double,windSpeed_max double,sid string,dimensionid bigint,month int,hour int,locationid string,windSpeed_min double,windSpeed_avg double,windSpeed_Source string,windDirection_Source string,day int,windDirection_max double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WindData_state_hourly(monthweek int,week int,timeid int,city string,count int,weekday int,windDirection_min double,windDirection_avg double,windSpeed_max double,sid string,dimensionid bigint,month int,hour int,locationid string,windSpeed_min double,windSpeed_avg double,windSpeed_Source string,windDirection_Source string,day int,windDirection_max double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WindData_state(windSpeed_reliability double,sampleTimestamp bigint,timezone string,isValid boolean,windSpeed_accuracy double,parentEntityType string,receiveTimestamp bigint,windDirection_reliability double,sid string,windDirection_expiresAt bigint,lastUpdated bigint,windSpeed_expiresAt bigint,sampleTimestampID int,locationId string,tenantId string,windSpeed_Source string,timezoneoffset int,windDirection double,windDirection_Source string,windSpeed double,day int,windDirection_accuracy double,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WindSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone string,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,windSpeed_expiresAt bigint,sampleTimestampID int,locationId string,geocoordinates_latitude double,windSpeed_Source string,timezoneoffset int,windDirection double,windSpeed double,day int,windDirection_accuracy double,windSpeed_reliability double,geocoordinates_altitude double,isValid boolean,windSpeed_accuracy double,windDirection_reliability double,windDirection_expiresAt bigint,tenantId string,windDirection_Source string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WindSensor_state_hbase ( rowkey string,windDirection_accuracy double,windDirection double,windSpeed_reliability double,sid string,sampleTimestamp bigint,isValid boolean,windSpeed_Source string,windSpeed_accuracy double,lastUpdated bigint,windDirection_expiresAt bigint,windSpeed_expiresAt bigint,startTimestamp bigint,tenantId string,windSpeed double,parentEntityType string,windDirection_reliability double,receiveTimestamp bigint,sampleTimestampID int,windDirection_Source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:windDirection_accuracy#b,d:windDirection#b,d:windSpeed_reliability#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:windSpeed_Source,d:windSpeed_accuracy#b,d:lastUpdated#b,d:windDirection_expiresAt#b,d:windSpeed_expiresAt#b,d:startTimestamp#b,d:tenantId,d:windSpeed#b,d:parentEntityType,d:windDirection_reliability#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:windDirection_Source') TBLPROPERTIES('hbase.table.name' = 'cim.WindSensor.state')
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WindData_state_period(windSpeed_reliability double,sampleTimestamp bigint,isValid boolean,windSpeed_accuracy double,parentEntityType string,receiveTimestamp bigint,windDirection_reliability double,sid string,windDirection_expiresAt bigint,lastUpdated bigint,windSpeed_expiresAt bigint,sampleTimestampID int,tenantId string,windSpeed_Source string,windDirection double,windSpeed double,windDirection_Source string,windDirection_accuracy double,startTimestamp bigint)
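Once a *_hbase external table exists, Hive can scan the underlying HBase table with ordinary SQL through the storage handler. Illustrative only, not part of the log (the query is invented; table and column names come from the cim.WindSensor_state_hbase definition above):

-- illustrative Hive query against the HBase-backed table, not from the log
select rowkey, windSpeed, windDirection, lastUpdated
from cim.WindSensor_state_hbase
limit 10;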
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WindSensor_state_period(windSpeed_reliability double,sampleTimestamp bigint,geocoordinates_longitude double,geocoordinates_altitude double,isValid boolean,windSpeed_accuracy double,parentEntityType string,receiveTimestamp bigint,windDirection_reliability double,sid string,windDirection_expiresAt bigint,lastUpdated bigint,windSpeed_expiresAt bigint,sampleTimestampID int,tenantId string,geocoordinates_latitude double,windSpeed_Source string,windDirection double,windSpeed double,windDirection_Source string,windDirection_accuracy double,startTimestamp bigint)
09:09:14.438 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Locations_hierarchy ( rowkey string,length int,entityId string,entityType string,sampleTimestamp bigint,tenantId String,receiveTimestamp bigint,ancestorId string,ancestorType string,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:length#b,d:entityId,d:entityType,d:sampleTimestamp#b,d:tenantId#b,d:receiveTimestamp#b,d:ancestorId,d:ancestorType,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.Locations.hierarchy')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Locations_hierarchy(ancestorType string,sampleTimestamp bigint,sampleTimestampID int,entityType string,ancestorId string,length int,tenantid string,entityId string,receiveTimestamp bigint)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.CustomerComplaints_hbase ( rowkey string,sampleTimestamp bigint,isValid boolean,complaint_description string,providerDetails string,applicableDomain string,label string,customerId string,tag string,sampleTimestampID int,locationId string,createTimestampID int,sid string,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,custom string,customer_info string,wasteBinId string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint,complaintOnDate bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sampleTimestamp#b,d:isValid#b,d:complaint_description,d:providerDetails,d:applicableDomain,d:label,d:customerId,d:tag,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:sid,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:custom,d:customer_info,d:wasteBinId,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b,d:complaintOnDate#b') TBLPROPERTIES('hbase.table.name' = 'cim.CustomerComplaints')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.CustomerComplaints_location(sampleTimestamp bigint,complaintOnDate bigint,private string,complaint_description string,city String,timezone String,receiveTimestamp bigint,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,providerDetails string,customerId string,customer_info string,tag string,destroyTimestampID int,isValid boolean,custom string,label string,wasteBinId string,tenantId string,destroyTimestamp bigint)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.CustomerComplaints(complaintOnDate bigint,sampleTimestamp bigint,private string,complaint_description string,city string,timezone string,WasteBindimensionid bigint,receiveTimestamp bigint,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,customerId string,timezoneoffset int,customer_info string,tag string,destroyTimestampID int,custom string,isValid boolean,label string,wasteBinId string,tenantId string,destroyTimestamp bigint)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TemporalMobilityStats_connection ( rowkey string,entityId string,entityType string,sid string,tenantId String,sampletimestamp bigint,receivetimestamp bigint,name string,destroytimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityId,d:entityType,d:sid,d:tenantId#b,d:sampletimestamp#b,d:receivetimestamp#b,d:name,d:destroytimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.TemporalMobilityStats.connection')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TemporalMobilityStats_connection_bak(receivetimestamp bigint,entityType string,sampletimestamp bigint,name string,tenantId String,entityId string,destroytimestamp bigint,sid string)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TemporalMobilityStats_hbase ( rowkey string,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.TemporalMobilityStats')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TemporalMobilityStats_location(sampleTimestamp bigint,sampleTimestampID int,city String,locationId String,timezone String,isValid boolean,tenantId string,receiveTimestamp bigint,sid string)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TemporalMobilityStats(sampleTimestamp bigint,sampleTimestampID int,city string,locationId string,timezone String,isValid boolean,tenantId string,timezoneoffset int,receiveTimestamp bigint,sid string)
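Every statement in this run is guarded by "if not exists", which makes the whole bootstrap idempotent: re-running the workflow is a no-op for tables that already exist, but it also means an already-created table is never altered to match a changed definition. A hypothetical two-statement demonstration (the table name is invented, not from the log):

-- hypothetical demonstration of "if not exists" semantics
create table if not exists cimdata.demo_bootstrap (id string, ts bigint);
-- re-issuing the DDL, even with a different column list, changes nothing:
create table if not exists cimdata.demo_bootstrap (id string);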
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SolarRadiationSensor_state_hourly(week int,timeid int,city string,reflectedGlobalRadiation_max double,weekday int,sid string,directSolarRadiation_min double,directSolarRadiation_avg double,directSolarRadiation_Source string,dimensionid bigint,hour int,diffuseSkyRadiation_min double,diffuseSkyRadiation_avg double,diffuseSkyRadiation_Source string,day int,directSolarRadiation_max double,monthweek int,count int,diffuseSkyRadiation_max double,month int,locationid string,reflectedGlobalRadiation_min double,reflectedGlobalRadiation_avg double,reflectedGlobalRadiation_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SolarRadiationData_state_hourly(week int,timeid int,city string,reflectedGlobalRadiation_max double,weekday int,sid string,directSolarRadiation_min double,directSolarRadiation_avg double,directSolarRadiation_Source string,dimensionid bigint,hour int,diffuseSkyRadiation_min double,diffuseSkyRadiation_avg double,diffuseSkyRadiation_Source string,day int,directSolarRadiation_max double,monthweek int,count int,diffuseSkyRadiation_max double,month int,locationid string,reflectedGlobalRadiation_min double,reflectedGlobalRadiation_avg double,reflectedGlobalRadiation_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SolarRadiationData_state(sampleTimestamp bigint,reflectedGlobalRadiation_expiresAt bigint,timezone string,diffuseSkyRadiation_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,directSolarRadiation_Source string,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,locationId string,directSolarRadiation_expiresAt bigint,timezoneoffset int,diffuseSkyRadiation_Source string,day int,reflectedGlobalRadiation_reliability double,directSolarRadiation double,isValid boolean,directSolarRadiation_accuracy double,tenantId string,diffuseSkyRadiation double,directSolarRadiation_reliability double,diffuseSkyRadiation_expiresAt bigint,startTimestamp bigint,reflectedGlobalRadiation_Source string) partitioned by (year int,month int,city string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SolarRadiationSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,reflectedGlobalRadiation_expiresAt bigint,timezone string,diffuseSkyRadiation_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,directSolarRadiation_Source string,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,locationId string,directSolarRadiation_expiresAt bigint,geocoordinates_latitude double,timezoneoffset int,diffuseSkyRadiation_Source string,day int,reflectedGlobalRadiation_reliability double,directSolarRadiation double,geocoordinates_altitude double,isValid boolean,directSolarRadiation_accuracy double,tenantId string,diffuseSkyRadiation double,directSolarRadiation_reliability double,diffuseSkyRadiation_expiresAt bigint,startTimestamp bigint,reflectedGlobalRadiation_Source string) partitioned by (year int,month int,city string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.SolarRadiationSensor_state_hbase ( rowkey string,directSolarRadiation double,sid string,sampleTimestamp bigint,diffuseSkyRadiation_Source string,directSolarRadiation_expiresAt bigint,isValid boolean,lastUpdated bigint,diffuseSkyRadiation_reliability double,diffuseSkyRadiation double,directSolarRadiation_Source string,startTimestamp bigint,directSolarRadiation_reliability double,tenantId string,reflectedGlobalRadiation_expiresAt bigint,diffuseSkyRadiation_expiresAt bigint,parentEntityType string,receiveTimestamp bigint,reflectedGlobalRadiation_Source string,sampleTimestampID int,reflectedGlobalRadiation_reliability double,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,reflectedGlobalRadiation_accuracy double,directSolarRadiation_accuracy double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:directSolarRadiation#b,d:sid,d:sampleTimestamp#b,d:diffuseSkyRadiation_Source,d:directSolarRadiation_expiresAt#b,d:isValid#b,d:lastUpdated#b,d:diffuseSkyRadiation_reliability#b,d:diffuseSkyRadiation#b,d:directSolarRadiation_Source,d:startTimestamp#b,d:directSolarRadiation_reliability#b,d:tenantId,d:reflectedGlobalRadiation_expiresAt#b,d:diffuseSkyRadiation_expiresAt#b,d:parentEntityType,d:receiveTimestamp#b,d:reflectedGlobalRadiation_Source,d:sampleTimestampID#b,d:reflectedGlobalRadiation_reliability#b,d:reflectedGlobalRadiation#b,d:diffuseSkyRadiation_accuracy#b,d:reflectedGlobalRadiation_accuracy#b,d:directSolarRadiation_accuracy#b') TBLPROPERTIES('hbase.table.name' = 'cim.SolarRadiationSensor.state')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SolarRadiationData_state_period(sampleTimestamp bigint,reflectedGlobalRadiation_expiresAt bigint,isValid boolean,diffuseSkyRadiation_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,directSolarRadiation_Source string,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,directSolarRadiation_accuracy double,directSolarRadiation_expiresAt bigint,tenantId string,diffuseSkyRadiation double,directSolarRadiation_reliability double,diffuseSkyRadiation_Source string,startTimestamp bigint,diffuseSkyRadiation_expiresAt bigint,reflectedGlobalRadiation_reliability double,directSolarRadiation double,reflectedGlobalRadiation_Source string)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SolarRadiationSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,reflectedGlobalRadiation_expiresAt bigint,diffuseSkyRadiation_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,directSolarRadiation_Source string,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,directSolarRadiation_expiresAt bigint,geocoordinates_latitude double,diffuseSkyRadiation_Source string,reflectedGlobalRadiation_reliability double,directSolarRadiation double,geocoordinates_altitude double,isValid boolean,directSolarRadiation_accuracy double,tenantId string,diffuseSkyRadiation double,directSolarRadiation_reliability double,startTimestamp bigint,diffuseSkyRadiation_expiresAt bigint,reflectedGlobalRadiation_Source string)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteBin_state(sampleTimestamp bigint,timezone string,fillRate double,tilt double,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,temperature double,timezoneoffset int,wasteSpaceId string,lastCollected bigint,destroyTimestampID int,day int,isValid boolean,weight double,fillLevel double,scheduledNextCollection bigint,volume double,tenantId string,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteBin_state_hbase ( rowkey string,fillRate double,createTimestampID int,temperature double,sid string,sampleTimestamp bigint,scheduledNextCollection bigint,isValid boolean,destroyTimestamp bigint,lastUpdated bigint,fillLevel double,destroyTimestampID int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,lastCollected bigint,createTimestamp bigint,volume double,sampleTimestampID int,weight double,tilt double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:fillRate#b,d:createTimestampID#b,d:temperature#b,d:sid,d:sampleTimestamp#b,d:scheduledNextCollection#b,d:isValid#b,d:destroyTimestamp#b,d:lastUpdated#b,d:fillLevel#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:lastCollected#b,d:createTimestamp#b,d:volume#b,d:sampleTimestampID#b,d:weight#b,d:tilt#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteBin.state')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteBin_state_period(sampleTimestamp bigint,isValid boolean,fillRate double,weight double,scheduledNextCollection bigint,fillLevel double,receiveTimestamp bigint,tilt double,sid string,createTimestamp bigint,volume double,lastUpdated bigint,sampleTimestampID int,createTimestampID int,temperature double,tenantId string,destroyTimestampID int,lastCollected bigint,startTimestamp bigint,destroyTimestamp bigint)
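Note the naming split visible in the TBLPROPERTIES clauses: the HBase table names keep their dots ('cim.WasteBin.state'), while the Hive-side names flatten them to underscores, since in Hive SQL a dot already separates database from table and cannot appear inside a table name. Trimmed sketch of the cim.WasteBin_state_hbase statement above (most columns omitted):

create external table if not exists cim.WasteBin_state_hbase (
  rowkey    string,
  fillLevel double,
  weight    double
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:fillLevel#b,d:weight#b')
-- the dotted HBase name survives only as a table property:
TBLPROPERTIES ('hbase.table.name' = 'cim.WasteBin.state');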
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteBin_hbase ( rowkey string,geocoordinates_longitude double,contentType string,sampleTimestamp bigint,maxWeightCollectionCapacity double,isValid boolean,providerDetails string,geohash string,active int,label string,sampleTimestampID int,locationId string,createTimestampID int,nickname string,maxVolumeCollectionCapacity double,status string,sid string,isAvalableForCollection string,scheduledNextCollection bigint,geocoordinates_altitude double,destroyTimestamp bigint,sensors string,private string,agencyId string,thirdPartyId string,lastUpdated bigint,fillLevelThreshold double,destroyTimestampID int,wasteSpaceId string,tenantId string,collectionType string,receiveTimestamp bigint,lastCollected bigint,createTimestamp bigint,geocoordinates_latitude double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:contentType,d:sampleTimestamp#b,d:maxWeightCollectionCapacity#b,d:isValid#b,d:providerDetails,d:geohash,d:active#b,d:label,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:nickname,d:maxVolumeCollectionCapacity#b,d:status,d:sid,d:isAvalableForCollection,d:scheduledNextCollection#b,d:geocoordinates_altitude#b,d:destroyTimestamp#b,d:sensors,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:fillLevelThreshold#b,d:destroyTimestampID#b,d:wasteSpaceId,d:tenantId,d:collectionType,d:receiveTimestamp#b,d:lastCollected#b,d:createTimestamp#b,d:geocoordinates_latitude#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteBin')
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteBin_location(geocoordinates_longitude double,sampleTimestamp bigint,private string,maxVolumeCollectionCapacity double,city String,timezone String,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,isAvalableForCollection string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,nickname string,geocoordinates_latitude double,destroyTimestampID int,wasteSpaceId string,lastCollected bigint,contentType string,geocoordinates_altitude double,isValid boolean,active int,label string,scheduledNextCollection bigint,collectionType string,sensors string,fillLevelThreshold double,tenantId string,maxWeightCollectionCapacity double,status string,destroyTimestamp bigint)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteBin(geocoordinates_longitude double,sampleTimestamp bigint,private string,city string,maxVolumeCollectionCapacity double,timezone string,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,isAvalableForCollection string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,geocoordinates_latitude double,nickname string,providerDetails string,timezoneoffset int,lastCollected bigint,destroyTimestampID int,wasteSpaceId string,contentType string,geocoordinates_altitude double,isValid boolean,active int,label string,scheduledNextCollection bigint,collectionType string,sensors string,WasteSpacedimensionid bigint,fillLevelThreshold double,tenantId string,maxWeightCollectionCapacity double,status string,destroyTimestamp bigint)
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltraVioletSensor_state_hourly(uvc_min double,uvc_avg double,vis_Source string,week int,timeid int,city string,weekday int,vis_max double,vuv_max double,sid string,uvb_min double,uvb_avg double,dimensionid bigint,hour int,vuv_min double,vuv_avg double,day int,uva_max double,uvb_max double,monthweek int,count int,uva_Source string,uva_min double,uva_avg double,uvb_Source string,uvc_Source string,uvc_max double,vuv_Source string,month int,locationid string,vis_min double,vis_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.439 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltraVioletData_state_hourly(uvc_min double,uvc_avg double,vis_Source string,week int,timeid int,city string,weekday int,vis_max double,vuv_max double,sid string,uvb_min double,uvb_avg double,dimensionid bigint,hour int,vuv_min double,vuv_avg double,day int,uva_max double,uvb_max double,monthweek int,count int,uva_Source string,uva_min double,uva_avg double,uvb_Source string,uvc_Source string,uvc_max double,vuv_Source string,month int,locationid string,vis_min double,vis_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.UltraVioletData_state(vis_expiresAt bigint,sampleTimestamp bigint,vis_Source string,timezone string,uvb_reliability double,uva_reliability double,uvc_reliability double,uvc_expiresAt bigint,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,uvb_expiresAt bigint,sampleTimestampID int,vuv double,locationId string,uvb_accuracy double,timezoneoffset int,vuv_reliability double,day int,uvb double,uva double,vis double,uvc_accuracy double,uvc double,isValid boolean,uva_Source string,uvb_Source string,uvc_Source string,vuv_Source string,vuv_expiresAt bigint,vuv_accuracy double,tenantId string,vis_reliability double,vis_accuracy double,uva_accuracy double,uva_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
string,geocoordinates_latitude double,uvb_accuracy double,timezoneoffset int,vuv_reliability double,day int,uvb double,uva double,vis double,uvc_accuracy double,uvc double,geocoordinates_altitude double,isValid boolean,uva_Source string,uvb_Source string,uvc_Source string,vuv_Source string,vuv_expiresAt bigint,vuv_accuracy double,tenantId string,vis_reliability double,vis_accuracy double,uva_accuracy double,uva_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1041 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.UltraVioletSensor_state_hbase ( rowkey string,uvc_expiresAt bigint,vis_reliability double,uvc_Source string,sampleTimestamp bigint,uvb_Source string,isValid boolean,vuv_reliability double,uvb_accuracy double,vis_expiresAt bigint,uvc double,uvb double,vuv_expiresAt bigint,uva double,vis double,parentEntityType string,uva_accuracy double,uva_Source string,uva_expiresAt bigint,vis_accuracy double,vuv double,sampleTimestampID int,vuv_accuracy double,uvb_expiresAt bigint,sid string,uvc_accuracy double,lastUpdated bigint,startTimestamp bigint,tenantId string,receiveTimestamp bigint,uvc_reliability double,vuv_Source string,uvb_reliability double,uva_reliability double,vis_Source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:uvc_expiresAt#b,d:vis_reliability#b,d:uvc_Source,d:sampleTimestamp#b,d:uvb_Source,d:isValid#b,d:vuv_reliability#b,d:uvb_accuracy#b,d:vis_expiresAt#b,d:uvc#b,d:uvb#b,d:vuv_expiresAt#b,d:uva#b,d:vis#b,d:parentEntityType,d:uva_accuracy#b,d:uva_Source,d:uva_expiresAt#b,d:vis_accuracy#b,d:vuv#b,d:sampleTimestampID#b,d:vuv_accuracy#b,d:uvb_expiresAt#b,d:sid,d:uvc_accuracy#b,d:lastUpdated#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:uvc_reliability#b,d:vuv_Source,d:uvb_reliability#b,d:uva_reliability#b,d:vis_Source') TBLPROPERTIES('hbase.table.name' = 'cim.UltraVioletSensor.state') 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1614 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.440 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.440 [main] INFO 
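The cimdata.* tables above are Impala-managed Parquet tables. Note that the columns named in "partitioned by (...)" are deliberately absent from the column list: in Impala, as in Hive, partition columns are virtual and materialize as one HDFS subdirectory per value, so equality filters on them prune whole directories instead of scanning files. A hypothetical query against the hourly table just created (the tenant value is invented):

    select sid, uva_avg, uvb_avg, uvc_avg
    from cimdata.UltraVioletSensor_state_hourly
    where year = 2022 and tenantid = 'tenant1' and month = 6;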
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.UltraVioletData_state_period(sampleTimestamp bigint,vis_expiresAt bigint,vis_Source string,uvb_reliability double,uvc_expiresAt bigint,uvc_reliability double,uva_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,uvb_expiresAt bigint,sampleTimestampID int,vuv double,uvb_accuracy double,vuv_reliability double,uvb double,uva double,vis double,uvc_accuracy double,uvc double,isValid boolean,uva_Source string,uvc_Source string,uvb_Source string,vuv_Source string,vuv_expiresAt bigint,vuv_accuracy double,tenantId string,vis_reliability double,vis_accuracy double,uva_accuracy double,uva_expiresAt bigint,startTimestamp bigint)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.UltraVioletSensor_state_period(sampleTimestamp bigint,vis_expiresAt bigint,geocoordinates_longitude double,vis_Source string,uvb_reliability double,uvc_expiresAt bigint,uvc_reliability double,uva_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,uvb_expiresAt bigint,sampleTimestampID int,vuv double,geocoordinates_latitude double,uvb_accuracy double,vuv_reliability double,uvb double,uva double,vis double,uvc_accuracy double,uvc double,geocoordinates_altitude double,isValid boolean,uva_Source string,uvc_Source string,uvb_Source string,vuv_Source string,vuv_expiresAt bigint,vuv_accuracy double,tenantId string,vis_reliability double,vis_accuracy double,uva_accuracy double,uva_expiresAt bigint,startTimestamp bigint)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.VibratingWireSensor_state_hourly(thermistor_min double,thermistor_avg double,week int,timeid int,city string,engineeringValue_max double,frequency_max double,weekday int,temperature_max double,sid string,frequency_Source string,dimensionid bigint,hour int,temperature_min double,temperature_avg double,day int,monthweek int,count int,thermistor_Source string,frequency_min double,frequency_avg double,temperature_Source string,thermistor_max double,month int,locationid string,engineeringValue_min double,engineeringValue_avg double,engineeringValue_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.VibratingWireData_state_hourly(thermistor_min double,thermistor_avg double,week int,timeid int,city string,engineeringValue_max double,frequency_max double,weekday int,temperature_max double,sid string,frequency_Source string,dimensionid bigint,hour int,temperature_min double,temperature_avg double,day int,monthweek int,count int,thermistor_Source string,frequency_min double,frequency_avg double,temperature_Source string,thermistor_max double,month int,locationid string,engineeringValue_min double,engineeringValue_avg double,engineeringValue_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.VibratingWireData_state(sampleTimestamp bigint,timezone string,engineeringValue_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,frequency double,frequency_Source string,lastUpdated bigint,thermistor_accuracy double,sampleTimestampID int,locationId string,engineeringValue double,temperature double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,day int,thermistor_reliability double,engineeringValue_accuracy double,frequency_reliability double,isValid boolean,thermistor_Source string,frequency_expiresAt bigint,thermistor_expiresAt bigint,thermistor double,temperature_Source string,frequency_accuracy double,tenantId string,engineeringValue_expiresAt bigint,temperature_expiresAt bigint,engineeringValue_Source string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.VibratingWireSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone string,engineeringValue_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,frequency double,frequency_Source string,lastUpdated bigint,thermistor_accuracy double,sampleTimestampID int,locationId string,engineeringValue double,temperature double,geocoordinates_latitude double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,day int,thermistor_reliability double,engineeringValue_accuracy double,frequency_reliability double,geocoordinates_altitude double,isValid boolean,thermistor_Source string,frequency_expiresAt bigint,thermistor_expiresAt bigint,thermistor double,temperature_Source string,frequency_accuracy double,tenantId string,engineeringValue_expiresAt bigint,temperature_expiresAt bigint,engineeringValue_Source string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.VibratingWireSensor_state_hbase ( rowkey string,temperature double,sampleTimestamp bigint,isValid boolean,engineeringValue double,frequency double,frequency_accuracy double,thermistor_Source string,temperature_expiresAt bigint,parentEntityType string,temperature_accuracy double,thermistor_expiresAt bigint,sampleTimestampID int,engineeringValue_expiresAt bigint,frequency_reliability double,thermistor_reliability double,sid string,engineeringValue_accuracy double,engineeringValue_Source string,temperature_Source string,lastUpdated bigint,thermistor_accuracy double,engineeringValue_reliability double,startTimestamp bigint,frequency_Source string,thermistor double,tenantId string,temperature_reliability double,frequency_expiresAt bigint,receiveTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:temperature#b,d:sampleTimestamp#b,d:isValid#b,d:engineeringValue#b,d:frequency#b,d:frequency_accuracy#b,d:thermistor_Source,d:temperature_expiresAt#b,d:parentEntityType,d:temperature_accuracy#b,d:thermistor_expiresAt#b,d:sampleTimestampID#b,d:engineeringValue_expiresAt#b,d:frequency_reliability#b,d:thermistor_reliability#b,d:sid,d:engineeringValue_accuracy#b,d:engineeringValue_Source,d:temperature_Source,d:lastUpdated#b,d:thermistor_accuracy#b,d:engineeringValue_reliability#b,d:startTimestamp#b,d:frequency_Source,d:thermistor#b,d:tenantId,d:temperature_reliability#b,d:frequency_expiresAt#b,d:receiveTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.VibratingWireSensor.state')
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.VibratingWireData_state_period(sampleTimestamp bigint,engineeringValue_reliability double,parentEntityType string,receiveTimestamp bigint,frequency double,sid string,frequency_Source string,lastUpdated bigint,thermistor_accuracy double,sampleTimestampID int,temperature double,engineeringValue double,temperature_accuracy double,temperature_reliability double,thermistor_reliability double,engineeringValue_accuracy double,frequency_reliability double,isValid boolean,thermistor_Source string,frequency_expiresAt bigint,thermistor_expiresAt bigint,thermistor double,temperature_Source string,frequency_accuracy double,engineeringValue_expiresAt bigint,tenantId string,temperature_expiresAt bigint,engineeringValue_Source string,startTimestamp bigint)
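Every statement in this run is issued as "create table if not exists ...", so the Oozie action can be re-executed after a failure without tripping over tables created on an earlier attempt. The flip side is that "if not exists" also silently keeps an old schema if a table definition changed between releases, so after an upgrade it can be worth confirming what actually exists, for example (hypothetical interactive session, table names taken from the log):

    show create table cimdata.VibratingWireData_state;
    describe cim.VibratingWireData_state_period;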
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.VibratingWireSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,engineeringValue_reliability double,parentEntityType string,receiveTimestamp bigint,frequency double,sid string,frequency_Source string,lastUpdated bigint,thermistor_accuracy double,sampleTimestampID int,temperature double,engineeringValue double,geocoordinates_latitude double,temperature_accuracy double,temperature_reliability double,thermistor_reliability double,engineeringValue_accuracy double,frequency_reliability double,geocoordinates_altitude double,isValid boolean,thermistor_Source string,frequency_expiresAt bigint,thermistor_expiresAt bigint,thermistor double,temperature_Source string,frequency_accuracy double,engineeringValue_expiresAt bigint,tenantId string,temperature_expiresAt bigint,engineeringValue_Source string,startTimestamp bigint)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientHumiditySensor_state_hourly(relativeHumidity_Source string,vapourPressure_min double,week int,vapourPressure_avg double,timeid int,city string,relativeHumidity_min double,relativeHumidity_avg double,specificHumidity_min double,specificHumidity_avg double,weekday int,mixingRatio_max double,dewPointTemperature_min double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,absoluteHumidity_max double,dimensionid bigint,hour int,day int,dewPointTemperature_max double,absoluteHumidity_min double,absoluteHumidity_avg double,monthweek int,vapourPressure_Source string,absoluteHumidity_Source string,specificHumidity_Source string,count int,vapourPressure_max double,relativeHumidity_max double,dewPointTemperature_avg double,month int,specificHumidity_max double,locationid string,mixingRatio_min double,mixingRatio_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientHumidityData_state_hourly(relativeHumidity_Source string,vapourPressure_min double,week int,vapourPressure_avg double,timeid int,city string,relativeHumidity_min double,relativeHumidity_avg double,specificHumidity_min double,specificHumidity_avg double,weekday int,mixingRatio_max double,dewPointTemperature_min double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,absoluteHumidity_max double,dimensionid bigint,hour int,day int,dewPointTemperature_max double,absoluteHumidity_min double,absoluteHumidity_avg double,monthweek int,vapourPressure_Source string,absoluteHumidity_Source string,specificHumidity_Source string,count int,vapourPressure_max double,relativeHumidity_max double,dewPointTemperature_avg double,month int,specificHumidity_max double,locationid string,mixingRatio_min double,mixingRatio_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientHumidityData_state(sampleTimestamp bigint,relativeHumidity_Source string,absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,timezone string,specificHumidity_accuracy double,relativeHumidity_reliability double,vapourPressure_accuracy double,specificHumidity_expiresAt bigint,absoluteHumidity_reliability double,parentEntityType string,receiveTimestamp bigint,specificHumidity double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,lastUpdated bigint,dewPointTemperature_accuracy double,relativeHumidity_accuracy double,sampleTimestampID int,locationId string,timezoneoffset int,day int,dewPointTemperature double,mixingRatio_reliability double,vapourPressure_Source string,mixingRatio_accuracy double,absoluteHumidity_Source string,specificHumidity_Source string,isValid boolean,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,specificHumidity_reliability double,dewPointTemperature_reliability double,relativeHumidity_expiresAt bigint,absoluteHumidity double,mixingRatio double,absoluteHumidity_accuracy double,tenantId string,relativeHumidity double,vapourPressure double,vapourPressure_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientHumiditySensor_state(sampleTimestamp bigint,geocoordinates_longitude double,relativeHumidity_Source string,absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,timezone string,specificHumidity_accuracy double,relativeHumidity_reliability double,vapourPressure_accuracy double,specificHumidity_expiresAt bigint,absoluteHumidity_reliability double,parentEntityType string,receiveTimestamp bigint,specificHumidity double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,lastUpdated bigint,dewPointTemperature_accuracy double,relativeHumidity_accuracy double,sampleTimestampID int,locationId string,geocoordinates_latitude double,timezoneoffset int,day int,dewPointTemperature double,mixingRatio_reliability double,vapourPressure_Source string,mixingRatio_accuracy double,absoluteHumidity_Source string,specificHumidity_Source string,geocoordinates_altitude double,isValid boolean,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,specificHumidity_reliability double,dewPointTemperature_reliability double,relativeHumidity_expiresAt bigint,absoluteHumidity double,mixingRatio double,absoluteHumidity_accuracy double,tenantId string,relativeHumidity double,vapourPressure double,vapourPressure_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AmbientHumiditySensor_state_hbase ( rowkey string,dewPointTemperature_Source string,sampleTimestamp bigint,mixingRatio_Source string,absoluteHumidity_reliability double,vapourPressure double,isValid boolean,vapourPressure_accuracy double,absoluteHumidity double,specificHumidity_Source string,absoluteHumidity_Source string,relativeHumidity_Source string,parentEntityType string,vapourPressure_reliability double,specificHumidity double,dewPointTemperature_expiresAt bigint,sampleTimestampID int,vapourPressure_Source string,sid string,dewPointTemperature_reliability double,absoluteHumidity_accuracy double,mixingRatio_expiresAt bigint,relativeHumidity_expiresAt bigint,relativeHumidity double,vapourPressure_expiresAt bigint,specificHumidity_expiresAt bigint,dewPointTemperature double,lastUpdated bigint,dewPointTemperature_accuracy double,relativeHumidity_reliability double,specificHumidity_reliability double,mixingRatio double,startTimestamp bigint,tenantId string,mixingRatio_accuracy double,relativeHumidity_accuracy double,receiveTimestamp bigint,mixingRatio_reliability double,absoluteHumidity_expiresAt bigint,specificHumidity_accuracy double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:dewPointTemperature_Source,d:sampleTimestamp#b,d:mixingRatio_Source,d:absoluteHumidity_reliability#b,d:vapourPressure#b,d:isValid#b,d:vapourPressure_accuracy#b,d:absoluteHumidity#b,d:specificHumidity_Source,d:absoluteHumidity_Source,d:relativeHumidity_Source,d:parentEntityType,d:vapourPressure_reliability#b,d:specificHumidity#b,d:dewPointTemperature_expiresAt#b,d:sampleTimestampID#b,d:vapourPressure_Source,d:sid,d:dewPointTemperature_reliability#b,d:absoluteHumidity_accuracy#b,d:mixingRatio_expiresAt#b,d:relativeHumidity_expiresAt#b,d:relativeHumidity#b,d:vapourPressure_expiresAt#b,d:specificHumidity_expiresAt#b,d:dewPointTemperature#b,d:lastUpdated#b,d:dewPointTemperature_accuracy#b,d:relativeHumidity_reliability#b,d:specificHumidity_reliability#b,d:mixingRatio#b,d:startTimestamp#b,d:tenantId,d:mixingRatio_accuracy#b,d:relativeHumidity_accuracy#b,d:receiveTimestamp#b,d:mixingRatio_reliability#b,d:absoluteHumidity_expiresAt#b,d:specificHumidity_accuracy#b') TBLPROPERTIES('hbase.table.name' = 'cim.AmbientHumiditySensor.state')
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientHumidityData_state_period(sampleTimestamp bigint,relativeHumidity_Source string,absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,specificHumidity_accuracy double,vapourPressure_accuracy double,specificHumidity_expiresAt bigint,relativeHumidity_reliability double,absoluteHumidity_reliability double,parentEntityType string,receiveTimestamp bigint,specificHumidity double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,lastUpdated bigint,dewPointTemperature_accuracy double,sampleTimestampID int,relativeHumidity_accuracy double,dewPointTemperature double,mixingRatio_reliability double,vapourPressure_Source string,mixingRatio_accuracy double,specificHumidity_Source string,absoluteHumidity_Source string,isValid boolean,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,specificHumidity_reliability double,dewPointTemperature_reliability double,relativeHumidity_expiresAt bigint,absoluteHumidity double,mixingRatio double,absoluteHumidity_accuracy double,vapourPressure double,relativeHumidity double,tenantId string,vapourPressure_expiresAt bigint,startTimestamp bigint)
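Unlike the *_state and *_hourly tables, none of the *_period tables specify "partitioned by" or "stored as parquet", so Impala creates them unpartitioned in its default file format. If you need to confirm the format and HDFS location a given table actually ended up with, the standard statement is (table name taken from the log, output not shown):

    describe formatted cim.AmbientHumidityData_state_period;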
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientHumiditySensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,relativeHumidity_Source string,absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,specificHumidity_accuracy double,vapourPressure_accuracy double,specificHumidity_expiresAt bigint,relativeHumidity_reliability double,absoluteHumidity_reliability double,parentEntityType string,receiveTimestamp bigint,specificHumidity double,mixingRatio_Source string,sid string,dewPointTemperature_Source string,lastUpdated bigint,dewPointTemperature_accuracy double,sampleTimestampID int,relativeHumidity_accuracy double,geocoordinates_latitude double,dewPointTemperature double,mixingRatio_reliability double,vapourPressure_Source string,mixingRatio_accuracy double,specificHumidity_Source string,absoluteHumidity_Source string,geocoordinates_altitude double,isValid boolean,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,specificHumidity_reliability double,dewPointTemperature_reliability double,relativeHumidity_expiresAt bigint,absoluteHumidity double,mixingRatio double,absoluteHumidity_accuracy double,vapourPressure double,relativeHumidity double,tenantId string,vapourPressure_expiresAt bigint,startTimestamp bigint)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.RoadSegmentLane_hbase ( rowkey string,roadSegmentId string,path_geoPoint string,tenantId string,sid string,laneType string,sampleTimestamp bigint,isValid boolean,laneNumber double,receiveTimestamp bigint,providerDetails string,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:roadSegmentId,d:path_geoPoint,d:tenantId,d:sid,d:laneType,d:sampleTimestamp#b,d:isValid#b,d:laneNumber#b,d:receiveTimestamp#b,d:providerDetails,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.RoadSegmentLane')
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.RoadSegmentLane_location(sampleTimestamp bigint,laneType string,city String,timezone String,isValid boolean,path_geoPoint string,laneNumber double,receiveTimestamp bigint,roadSegmentId string,sid string,sampleTimestampID int,locationId String,tenantId string,providerDetails string)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RoadSegmentLane(sampleTimestamp bigint,laneType string,city string,timezone string,isValid boolean,path_geoPoint string,laneNumber double,receiveTimestamp bigint,roadSegmentId string,sid string,sampleTimestampID int,locationId string,RoadSegmentdimensionid bigint,tenantId string,providerDetails string,timezoneoffset int)
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SSIDStats_state(sampleTimestamp bigint,nConnected int,timezone string,isValid boolean,receiveTimestamp bigint,ssid string,sid string,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.SSIDStats_state_hbase ( rowkey string,startTimestamp bigint,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,ssid string,sampleTimestampID int,nConnected int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:ssid,d:sampleTimestampID#b,d:nConnected#b') TBLPROPERTIES('hbase.table.name' = 'cim.SSIDStats.state')
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SSIDStats_state_period(sampleTimestamp bigint,nConnected int,sampleTimestampID int,isValid boolean,tenantId string,receiveTimestamp bigint,ssid string,startTimestamp bigint,sid string)
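The log alternates between "Execute Hive command" and "Execute Impala command", so the job is apparently talking to both HiveServer2 and Impala against the same shared metastore. On CDH 6, Impala caches metastore metadata, so a table created through Hive is typically not visible to Impala until its metadata is refreshed. A hypothetical check after such a run:

    show tables in cim like '*SSIDStats*';
    invalidate metadata cim.SSIDStats_state_hbase;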
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AQIEntry_state_hourly(week int,code string,timeid int,sources string,city string,weekday int,sid string,dimensionid bigint,hour int,attributeName string,value_sum double,day int,sensitiveGroups string,announcement string,monthweek int,period string,advisory string,value_min double,entityType string,count int,entityId string,pollutant string,effects string,month int,locationid string,value_max double,category string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AQIEntry_state(sampleTimestamp bigint,code string,sources string,timezone string,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,sampleTimestampID int,locationId string,timezoneoffset int,attributeName string,value double,day int,sensitiveGroups string,announcement string,period string,advisory string,entityType string,isValid boolean,entityId string,pollutant string,effects string,tenantId string,category string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.440 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AQIEntry_state_hbase ( rowkey string,category string,attributeName string,entityType string,effects string,advisory string,sid string,sampleTimestamp bigint,code string,isValid boolean,pollutant string,lastUpdated bigint,value double,sensitiveGroups string,entityId string,startTimestamp bigint,tenantId string,period string,announcement string,parentEntityType string,receiveTimestamp bigint,sampleTimestampID int,sources string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:category,d:attributeName,d:entityType,d:effects,d:advisory,d:sid,d:sampleTimestamp#b,d:code,d:isValid#b,d:pollutant,d:lastUpdated#b,d:value#b,d:sensitiveGroups,d:entityId,d:startTimestamp#b,d:tenantId,d:period,d:announcement,d:parentEntityType,d:receiveTimestamp#b,d:sampleTimestampID#b,d:sources') TBLPROPERTIES('hbase.table.name' = 'cim.AQIEntry.state')
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AQIEntry_state_period(sampleTimestamp bigint,period string,advisory string,code string,sources string,entityType string,isValid boolean,entityId string,parentEntityType string,receiveTimestamp bigint,sid string,pollutant string,lastUpdated bigint,effects string,sampleTimestampID int,tenantId string,attributeName string,category string,value double,sensitiveGroups string,startTimestamp bigint,announcement string)
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteSpace_hbase ( rowkey string,createTimestampID int,entityType string,nickname string,sid string,boundary_geoPoint string,sampleTimestamp bigint,isValid boolean,wasteDumpSpaceId string,destroyTimestamp bigint,providerDetails string,private string,thirdPartyId string,lastUpdated bigint,geohash string,destroyTimestampID int,active int,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,sampleTimestampID int,locationId string,areaType string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:entityType,d:nickname,d:sid,d:boundary_geoPoint,d:sampleTimestamp#b,d:isValid#b,d:wasteDumpSpaceId,d:destroyTimestamp#b,d:providerDetails,d:private,d:thirdPartyId,d:lastUpdated#b,d:geohash,d:destroyTimestampID#b,d:active#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:sampleTimestampID#b,d:locationId,d:areaType') TBLPROPERTIES('hbase.table.name' = 'cim.WasteSpace')
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteSpace_location(sampleTimestamp bigint,private string,city String,timezone String,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,geohash string,areaType string,nickname string,providerDetails string,destroyTimestampID int,entityType string,isValid boolean,active int,label string,tenantId string,wasteDumpSpaceId string,destroyTimestamp bigint)
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteSpace(sampleTimestamp bigint,private string,city string,timezone string,WasteDumpSpacedimensionid bigint,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,areaType string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,destroyTimestampID int,entityType string,isValid boolean,active int,label string,tenantId string,wasteDumpSpaceId string,destroyTimestamp bigint)
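The event-style entities follow the same layout as the sensors: HBase holds the canonical record (here 'cim.FloatingParkingSession.event', exposed through the Hive external table below), while the Parquet copy in cimdata is partitioned by (year, month, city) for analytical scans. A hypothetical partition-pruned query against that copy (the literal filter values are invented):

    select parkingSpaceId, farePaid, createTimestamp
    from cimdata.FloatingParkingSession_event
    where year = 2022 and month = 6 and city = 'springfield';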
int,sid string,sampleTimestamp bigint,parkingSpaceId string,isValid boolean,destroyTimestamp bigint,vehicleDetails string,destroyTimestampID int,farePaid double,tenantId string,receiveTimestamp bigint,createTimestamp bigint,tag string,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:parkingSpaceId,d:isValid#b,d:destroyTimestamp#b,d:vehicleDetails,d:destroyTimestampID#b,d:farePaid#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:tag,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.FloatingParkingSession.event')
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 993
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.FloatingParkingSession_event_period(sampleTimestamp bigint,parkingSpaceId string,vehicleDetails string,isValid boolean,farePaid double,receiveTimestamp bigint,sid string,createTimestamp bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,tag string,destroyTimestampID int,destroyTimestamp bigint)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 478
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.FloatingParkingSession_event(sampleTimestamp bigint,parkingSpaceId string,vehicleDetails string,timezone string,isValid boolean,farePaid double,receiveTimestamp bigint,sid string,createTimestamp bigint,ParkingSpacedimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,timezoneoffset int,tag string,destroyTimestampID int,day int,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 616
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.nodest_FloatingParkingSession_event(sampleTimestamp bigint,parkingSpaceId string,vehicleDetails string,timezone string,isValid boolean,farePaid double,receiveTimestamp bigint,sid string,createTimestamp bigint,ParkingSpacedimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,timezoneoffset int,tag string,destroyTimestampID int,day int,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 623
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.BarometerSensor_state_hourly(monthweek int,week int,timeid int,city string,pressure_max double,count int,weekday int,sid string,dimensionid bigint,month int,hour int,pressure_min double,pressure_avg double,locationid string,pressure_Source string,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 470
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.BarometerData_state_hourly(monthweek int,week int,timeid int,city string,pressure_max double,count int,weekday int,sid string,dimensionid bigint,month int,hour int,pressure_min double,pressure_avg double,locationid string,pressure_Source string,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 468
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.BarometerData_state(sampleTimestamp bigint,timezone string,isValid boolean,parentEntityType string,pressure double,receiveTimestamp bigint,sid string,lastUpdated bigint,pressure_expiresAt bigint,sampleTimestampID int,pressure_Source string,locationId string,pressure_reliability double,tenantId string,timezoneoffset int,day int,startTimestamp bigint,pressure_accuracy double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 596
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.BarometerSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone string,geocoordinates_altitude double,isValid boolean,parentEntityType string,pressure double,receiveTimestamp bigint,sid string,lastUpdated bigint,pressure_expiresAt bigint,sampleTimestampID int,pressure_Source string,locationId string,pressure_reliability double,tenantId string,geocoordinates_latitude double,timezoneoffset int,day int,startTimestamp bigint,pressure_accuracy double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 692
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.BarometerSensor_state_hbase ( rowkey string,pressure double,sid string,sampleTimestamp bigint,isValid boolean,pressure_expiresAt bigint,pressure_reliability double,lastUpdated bigint,startTimestamp bigint,tenantId string,pressure_Source string,parentEntityType string,receiveTimestamp bigint,pressure_accuracy double,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:pressure#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:pressure_expiresAt#b,d:pressure_reliability#b,d:lastUpdated#b,d:startTimestamp#b,d:tenantId,d:pressure_Source,d:parentEntityType,d:receiveTimestamp#b,d:pressure_accuracy#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.BarometerSensor.state')
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 984
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.BarometerData_state_period(sampleTimestamp bigint,isValid boolean,pressure double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,pressure_expiresAt bigint,sampleTimestampID int,pressure_Source string,pressure_reliability double,tenantId string,startTimestamp bigint,pressure_accuracy double)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 471
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.BarometerSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,geocoordinates_altitude double,isValid boolean,pressure double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,pressure_expiresAt bigint,sampleTimestampID int,pressure_Source string,pressure_reliability double,tenantId string,geocoordinates_latitude double,startTimestamp bigint,pressure_accuracy double)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
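For readability, this is the cim.BarometerSensor_state_hbase DDL from the Hive record above, reformatted but otherwise unchanged. Each Hive column maps positionally to one entry in hbase.columns.mapping: ':key' binds rowkey to the HBase row key, the 'd:' prefix names the column family, and the '#b' suffix tells the HBase SerDe to decode the cell as binary rather than as a UTF-8 string.

-- Reformatted from the "Execute Hive command" log record above; content unchanged.
create external table if not exists cim.BarometerSensor_state_hbase (
  rowkey string,
  pressure double,
  sid string,
  sampleTimestamp bigint,
  isValid boolean,
  pressure_expiresAt bigint,
  pressure_reliability double,
  lastUpdated bigint,
  startTimestamp bigint,
  tenantId string,
  pressure_Source string,
  parentEntityType string,
  receiveTimestamp bigint,
  pressure_accuracy double,
  sampleTimestampID int)
ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:pressure#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:pressure_expiresAt#b,d:pressure_reliability#b,d:lastUpdated#b,d:startTimestamp#b,d:tenantId,d:pressure_Source,d:parentEntityType,d:receiveTimestamp#b,d:pressure_accuracy#b,d:sampleTimestampID#b')
TBLPROPERTIES ('hbase.table.name' = 'cim.BarometerSensor.state');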
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 567
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpot_state_hourly(parkingSpaceId string,monthweek int,expectedRevenue_min double,week int,timeid int,city string,expectedRevenue_sum double,count int,weekday int,sid string,occupied_min int,dimensionid bigint,month int,hour int,occupied_sum int,locationid string,expectedRevenue_max double,day int,occupied_max int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 537
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpot_state(parkingSpaceId string,sampleTimestamp bigint,expectedRevenue double,timezone string,isValid boolean,receiveTimestamp bigint,sid string,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,day int,occupied int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 529
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingSpot_state_hbase ( rowkey string,startTimestamp bigint,expectedRevenue double,sid string,tenantId string,occupied int,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,lastUpdated bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:expectedRevenue#b,d:sid,d:tenantId,d:occupied#b,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:lastUpdated#b') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingSpot.state')
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 783
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingSpot_state_period(sampleTimestamp bigint,expectedRevenue double,lastUpdated bigint,sampleTimestampID int,isValid boolean,tenantId string,receiveTimestamp bigint,startTimestamp bigint,occupied int,sid string)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 363
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingSpot_hbase ( rowkey string,geocoordinates_longitude double,opParams_zoneType string,sid string,sampleTimestamp bigint,parkingSpaceId string,isValid boolean,geocoordinates_altitude double,lastUpdated bigint,videoStream string,tenantId string,receiveTimestamp bigint,label string,geocoordinates_latitude double,opParams_maxDurationMinutes int,sampleTimestampID int,locationId string,operatedBy string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:opParams_zoneType,d:sid,d:sampleTimestamp#b,d:parkingSpaceId,d:isValid#b,d:geocoordinates_altitude#b,d:lastUpdated#b,d:videoStream,d:tenantId,d:receiveTimestamp#b,d:label,d:geocoordinates_latitude#b,d:opParams_maxDurationMinutes#b,d:sampleTimestampID#b,d:locationId,d:operatedBy') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingSpot')
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1104
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingSpot_location(geocoordinates_longitude double,sampleTimestamp bigint,parkingSpaceId string,city String,geocoordinates_altitude double,timezone String,isValid boolean,operatedBy string,receiveTimestamp bigint,label string,sid string,opParams_zoneType string,lastUpdated bigint,videoStream string,sampleTimestampID int,locationId String,tenantId string,geocoordinates_latitude double,opParams_maxDurationMinutes int)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 570
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingSpot(geocoordinates_longitude double,sampleTimestamp bigint,parkingSpaceId string,city string,timezone string,geocoordinates_altitude double,isValid boolean,operatedBy string,label string,receiveTimestamp bigint,sid string,opParams_zoneType string,lastUpdated bigint,videoStream string,ParkingSpacedimensionid bigint,sampleTimestampID int,locationId string,tenantId string,geocoordinates_latitude double,timezoneoffset int,opParams_maxDurationMinutes int)
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 615
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WeatherStationSensor_state_hourly(hailIntensityHits_min double,vapourPressure_min double,hailAccumulationHits_avg double,relativeHumidity_avg double,weekday int,rainDuration_min double,rainPeakIntensity_max double,mixingRatio_Source string,windSpeed_max double,hailIntensityHits_max double,windSpeed_Source string,rainAccumulation_avg double,hailDuration_max double,absoluteHumidity_Source string,windDirection_avg double,diffuseSkyRadiation_max double,temperature_Source string,pressure_avg double,hailDuration_min double,hailPeakIntensity_Source string,pressure_Source string,hailIntensityHits_Source string,reflectedGlobalRadiation_min double,mixingRatio_min double,rainDuration_max double,relativeHumidity_Source string,rainPeakIntensity_Source string,timeid int,city string,hailAccumulationHits_Source string,hailPeakIntensity_min double,specificHumidity_min double,mixingRatio_max double,atmosphericRadiation_avg double,hailPeakIntensity_max double,directSolarRadiation_avg double,directSolarRadiation_Source string,absoluteHumidity_max double,rainIntensity_min double,temperature_avg double,emissionOfGroundSurface_min double,diffuseSkyRadiation_avg double,diffuseSkyRadiation_Source string,hailDuration_Source string,absoluteHumidity_min double,emissionOfGroundSurface_max double,rainIntensity_max double,vapourPressure_Source string,specificHumidity_Source string,atmosphericRadiation_Source string,rainDuration_Source string,specificHumidity_max double,locationid string,windSpeed_min double,rainPeakIntensity_avg double,reflectedGlobalRadiation_Source string,hailIntensityHits_avg double,vapourPressure_avg double,pressure_max double,hailAccumulationHits_min double,relativeHumidity_min double,dewPointTemperature_min double,dewPointTemperature_Source string,dimensionid bigint,rainDuration_avg double,day int,dewPointTemperature_max double,rainAccumulation_min double,count int,windDirection_min double,rainAccumulation_max double,month int,pressure_min double,hailDuration_avg double,reflectedGlobalRadiation_avg double,mixingRatio_avg double,windDirection_Source string,windDirection_max double,week int,rainAccumulation_Source string,reflectedGlobalRadiation_max double,hailPeakIntensity_avg double,specificHumidity_avg double,temperature_max double,sid string,atmosphericRadiation_min double,emissionOfGroundSurface_Source string,directSolarRadiation_min double,hour int,rainIntensity_avg double,temperature_min double,emissionOfGroundSurface_avg double,diffuseSkyRadiation_min double,directSolarRadiation_max double,absoluteHumidity_avg double,monthweek int,vapourPressure_max double,rainIntensity_Source string,relativeHumidity_max double,atmosphericRadiation_max double,dewPointTemperature_avg double,hailAccumulationHits_max double,rainPeakIntensity_min double,windSpeed_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 3017
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.441 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WeatherStationData_state_hourly(hailIntensityHits_min double,vapourPressure_min double,hailAccumulationHits_avg double,relativeHumidity_avg double,weekday int,rainDuration_min double,rainPeakIntensity_max double,mixingRatio_Source string,windSpeed_max double,hailIntensityHits_max double,windSpeed_Source string,rainAccumulation_avg double,hailDuration_max double,absoluteHumidity_Source string,windDirection_avg double,diffuseSkyRadiation_max double,temperature_Source string,pressure_avg double,hailDuration_min double,hailPeakIntensity_Source string,pressure_Source string,hailIntensityHits_Source string,reflectedGlobalRadiation_min double,mixingRatio_min double,rainDuration_max double,relativeHumidity_Source string,rainPeakIntensity_Source string,timeid int,city string,hailAccumulationHits_Source string,hailPeakIntensity_min double,specificHumidity_min double,mixingRatio_max double,atmosphericRadiation_avg double,hailPeakIntensity_max double,directSolarRadiation_avg double,directSolarRadiation_Source string,absoluteHumidity_max double,rainIntensity_min double,temperature_avg double,emissionOfGroundSurface_min double,diffuseSkyRadiation_avg double,diffuseSkyRadiation_Source string,hailDuration_Source string,absoluteHumidity_min double,emissionOfGroundSurface_max double,rainIntensity_max double,vapourPressure_Source string,specificHumidity_Source string,atmosphericRadiation_Source string,rainDuration_Source string,specificHumidity_max double,locationid string,windSpeed_min double,rainPeakIntensity_avg double,reflectedGlobalRadiation_Source string,hailIntensityHits_avg double,vapourPressure_avg double,pressure_max double,hailAccumulationHits_min double,relativeHumidity_min double,dewPointTemperature_min double,dewPointTemperature_Source string,dimensionid bigint,rainDuration_avg double,day int,dewPointTemperature_max double,rainAccumulation_min double,count int,windDirection_min double,rainAccumulation_max double,month int,pressure_min double,hailDuration_avg double,reflectedGlobalRadiation_avg double,mixingRatio_avg double,windDirection_Source string,windDirection_max double,week int,rainAccumulation_Source string,reflectedGlobalRadiation_max double,hailPeakIntensity_avg double,specificHumidity_avg double,temperature_max double,sid string,atmosphericRadiation_min double,emissionOfGroundSurface_Source string,directSolarRadiation_min double,hour int,rainIntensity_avg double,temperature_min double,emissionOfGroundSurface_avg double,diffuseSkyRadiation_min double,directSolarRadiation_max double,absoluteHumidity_avg double,monthweek int,vapourPressure_max double,rainIntensity_Source string,relativeHumidity_max double,atmosphericRadiation_max double,dewPointTemperature_avg double,hailAccumulationHits_max double,rainPeakIntensity_min double,windSpeed_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.441 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 3015
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
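The Impala DDL in this log consistently uses two layouts: hourly aggregate tables partitioned by (year, tenantid), and detail state/event tables partitioned by (year, month, city), both stored as Parquet, while the *_period and *_location tables are created unpartitioned. Reformatted, the cimdata.ParkingSpot_state_hourly command from earlier in the log reads:

-- Reformatted from the "Execute Impala command" log record above; content unchanged.
-- Partition columns (year, tenantid) are declared outside the column list.
create table if not exists cimdata.ParkingSpot_state_hourly (
  parkingSpaceId string,
  monthweek int,
  expectedRevenue_min double,
  week int,
  timeid int,
  city string,
  expectedRevenue_sum double,
  count int,
  weekday int,
  sid string,
  occupied_min int,
  dimensionid bigint,
  month int,
  hour int,
  occupied_sum int,
  locationid string,
  expectedRevenue_max double,
  day int,
  occupied_max int)
partitioned by (year int, tenantid string)
stored as parquet;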
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WeatherStationData_state(absoluteHumidity_expiresAt bigint,hailPeakIntensity_accuracy double,vapourPressure_reliability double,mixingRatio_Source string,emissionOfGroundSurface double,lastUpdated bigint,windSpeed_expiresAt bigint,diffuseSkyRadiation_accuracy double,atmosphericRadiation_reliability double,rainPeakIntensity_expiresAt bigint,rainDuration_expiresAt bigint,atmosphericRadiation_accuracy double,windSpeed_Source string,temperature_reliability double,reflectedGlobalRadiation_reliability double,windDirection_accuracy double,pressure_accuracy double,hailDuration_expiresAt bigint,windSpeed_reliability double,rainDuration double,absoluteHumidity_Source string,rainPeakIntensity double,specificHumidity_reliability double,temperature_Source string,windDirection_expiresAt bigint,rainIntensity_accuracy double,hailPeakIntensity_Source string,pressure_Source string,hailIntensityHits_Source string,directSolarRadiation_accuracy double,rainAccumulation_accuracy double,hailAccumulationHits_accuracy double,relativeHumidity_Source string,rainPeakIntensity_Source string,hailAccumulationHits_Source string,specificHumidity_accuracy double,diffuseSkyRadiation_reliability double,rainPeakIntensity_reliability double,hailIntensityHits double,specificHumidity_expiresAt bigint,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,hailPeakIntensity_reliability double,directSolarRadiation_Source string,locationId string,directSolarRadiation_expiresAt bigint,temperature double,timezoneoffset int,diffuseSkyRadiation_Source string,hailDuration_Source string,vapourPressure_Source string,specificHumidity_Source string,isValid boolean,windSpeed_accuracy double,atmosphericRadiation_Source string,hailPeakIntensity_expiresAt bigint,dewPointTemperature_reliability double,emissionOfGroundSurface_accuracy double,rainDuration_Source string,pressure_reliability double,atmosphericRadiation_expiresAt bigint,tenantId string,diffuseSkyRadiation double,rainIntensity_expiresAt bigint,hailDuration_accuracy double,hailDuration double,reflectedGlobalRadiation_Source string,emissionOfGroundSurface_reliability double,relativeHumidity_reliability double,vapourPressure_accuracy double,receiveTimestamp bigint,dewPointTemperature_Source string,hailAccumulationHits_expiresAt bigint,pressure_expiresAt bigint,relativeHumidity_accuracy double,temperature_accuracy double,windDirection double,windSpeed double,day int,mixingRatio_reliability double,mixingRatio_accuracy double,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,emissionOfGroundSurface_expiresAt bigint,pressure double,relativeHumidity_expiresAt bigint,absoluteHumidity double,absoluteHumidity_accuracy double,directSolarRadiation_reliability double,windDirection_Source string,diffuseSkyRadiation_expiresAt bigint,sampleTimestamp bigint,rainDuration_accuracy double,reflectedGlobalRadiation_expiresAt bigint,rainAccumulation_Source string,rainIntensity double,timezone string,absoluteHumidity_reliability double,parentEntityType string,specificHumidity double,sid string,emissionOfGroundSurface_Source string,rainIntensity_reliability double,dewPointTemperature_accuracy double,reflectedGlobalRadiation double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,atmosphericRadiation double,hailAccumulationHits double,rainPeakIntensity_accuracy double,dewPointTemperature double,directSolarRadiation double,hailIntensityHits_expiresAt bigint,windDirection_reliability double,hailIntensityHits_accuracy double,rainIntensity_Source string,rainDuration_reliability double,mixingRatio double,rainAccumulation_expiresAt bigint,hailPeakIntensity double,relativeHumidity double,vapourPressure double,temperature_expiresAt bigint,rainAccumulation double,hailDuration_reliability double,vapourPressure_expiresAt bigint,startTimestamp bigint,rainAccumulation_reliability double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 4092
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WeatherStationSensor_state(absoluteHumidity_expiresAt bigint,hailPeakIntensity_accuracy double,vapourPressure_reliability double,mixingRatio_Source string,emissionOfGroundSurface double,lastUpdated bigint,windSpeed_expiresAt bigint,diffuseSkyRadiation_accuracy double,atmosphericRadiation_reliability double,rainPeakIntensity_expiresAt bigint,rainDuration_expiresAt bigint,atmosphericRadiation_accuracy double,windSpeed_Source string,temperature_reliability double,reflectedGlobalRadiation_reliability double,windDirection_accuracy double,pressure_accuracy double,hailDuration_expiresAt bigint,windSpeed_reliability double,rainDuration double,absoluteHumidity_Source string,rainPeakIntensity double,specificHumidity_reliability double,temperature_Source string,windDirection_expiresAt bigint,rainIntensity_accuracy double,hailPeakIntensity_Source string,pressure_Source string,hailIntensityHits_Source string,directSolarRadiation_accuracy double,rainAccumulation_accuracy double,hailAccumulationHits_accuracy double,relativeHumidity_Source string,rainPeakIntensity_Source string,hailAccumulationHits_Source string,specificHumidity_accuracy double,diffuseSkyRadiation_reliability double,rainPeakIntensity_reliability double,hailIntensityHits double,specificHumidity_expiresAt bigint,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,hailPeakIntensity_reliability double,directSolarRadiation_Source string,locationId string,directSolarRadiation_expiresAt bigint,temperature double,timezoneoffset int,diffuseSkyRadiation_Source string,hailDuration_Source string,vapourPressure_Source string,specificHumidity_Source string,isValid boolean,windSpeed_accuracy double,atmosphericRadiation_Source string,hailPeakIntensity_expiresAt bigint,dewPointTemperature_reliability double,emissionOfGroundSurface_accuracy double,rainDuration_Source string,pressure_reliability double,atmosphericRadiation_expiresAt bigint,tenantId string,diffuseSkyRadiation double,rainIntensity_expiresAt bigint,hailDuration_accuracy double,hailDuration double,reflectedGlobalRadiation_Source string,emissionOfGroundSurface_reliability double,relativeHumidity_reliability double,vapourPressure_accuracy double,receiveTimestamp bigint,dewPointTemperature_Source string,hailAccumulationHits_expiresAt bigint,pressure_expiresAt bigint,relativeHumidity_accuracy double,geocoordinates_latitude double,temperature_accuracy double,windDirection double,windSpeed double,day int,mixingRatio_reliability double,mixingRatio_accuracy double,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,emissionOfGroundSurface_expiresAt bigint,pressure double,relativeHumidity_expiresAt bigint,absoluteHumidity double,absoluteHumidity_accuracy double,directSolarRadiation_reliability double,windDirection_Source string,diffuseSkyRadiation_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,rainDuration_accuracy double,reflectedGlobalRadiation_expiresAt bigint,rainAccumulation_Source string,rainIntensity double,timezone string,absoluteHumidity_reliability double,parentEntityType string,specificHumidity double,sid string,emissionOfGroundSurface_Source string,rainIntensity_reliability double,dewPointTemperature_accuracy double,reflectedGlobalRadiation double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,atmosphericRadiation double,hailAccumulationHits double,rainPeakIntensity_accuracy double,dewPointTemperature double,directSolarRadiation double,hailIntensityHits_expiresAt bigint,geocoordinates_altitude double,windDirection_reliability double,hailIntensityHits_accuracy double,rainIntensity_Source string,rainDuration_reliability double,mixingRatio double,rainAccumulation_expiresAt bigint,hailPeakIntensity double,relativeHumidity double,vapourPressure double,temperature_expiresAt bigint,rainAccumulation double,hailDuration_reliability double,vapourPressure_expiresAt bigint,startTimestamp bigint,rainAccumulation_reliability double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 4188
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WeatherStationSensor_state_hbase ( rowkey string,windDirection double,rainDuration_accuracy double,mixingRatio_Source string,vapourPressure double,windSpeed_Source string,hailPeakIntensity double,diffuseSkyRadiation_reliability double,rainDuration_reliability double,relativeHumidity_Source string,parentEntityType string,windDirection_reliability double,hailIntensityHits_expiresAt bigint,specificHumidity double,sampleTimestampID int,reflectedGlobalRadiation_accuracy double,rainPeakIntensity double,hailIntensityHits_accuracy double,sid string,rainDuration double,temperature_Source string,relativeHumidity double,specificHumidity_expiresAt bigint,dewPointTemperature_accuracy double,emissionOfGroundSurface_Source string,mixingRatio double,hailPeakIntensity_reliability double,hailIntensityHits_reliability double,reflectedGlobalRadiation_expiresAt bigint,diffuseSkyRadiation_expiresAt bigint,rainDuration_Source string,rainDuration_expiresAt bigint,reflectedGlobalRadiation_reliability double,reflectedGlobalRadiation double,diffuseSkyRadiation_accuracy double,dewPointTemperature_Source string,rainIntensity_expiresAt bigint,rainIntensity double,hailDuration_Source string,vapourPressure_accuracy double,directSolarRadiation_reliability double,hailDuration_accuracy double,temperature_accuracy double,reflectedGlobalRadiation_Source string,pressure_accuracy double,vapourPressure_reliability double,hailPeakIntensity_accuracy double,rainPeakIntensity_expiresAt bigint,vapourPressure_Source string,atmosphericRadiation_reliability double,windDirection_accuracy double,windSpeed_reliability double,dewPointTemperature_reliability double,absoluteHumidity_accuracy double,diffuseSkyRadiation_Source string,dewPointTemperature double,windDirection_expiresAt bigint,relativeHumidity_reliability double,startTimestamp bigint,mixingRatio_accuracy double,rainIntensity_reliability double,hailIntensityHits double,absoluteHumidity_expiresAt bigint,pressure double,absoluteHumidity_reliability double,directSolarRadiation_expiresAt bigint,isValid boolean,pressure_expiresAt bigint,atmosphericRadiation_Source string,emissionOfGroundSurface_reliability double,pressure_reliability double,windSpeed_expiresAt bigint,hailIntensityHits_Source string,specificHumidity_Source string,absoluteHumidity_Source string,rainAccumulation_Source string,hailAccumulationHits double,hailDuration double,mixingRatio_expiresAt bigint,hailDuration_reliability double,vapourPressure_expiresAt bigint,rainAccumulation_expiresAt bigint,windSpeed_accuracy double,hailAccumulationHits_expiresAt bigint,hailAccumulationHits_accuracy double,receiveTimestamp bigint,rainIntensity_Source string,directSolarRadiation_accuracy double,rainAccumulation_accuracy double,temperature double,sampleTimestamp bigint,emissionOfGroundSurface double,diffuseSkyRadiation double,rainAccumulation double,emissionOfGroundSurface_accuracy double,absoluteHumidity double,temperature_expiresAt bigint,hailPeakIntensity_expiresAt bigint,hailPeakIntensity_Source string,rainPeakIntensity_reliability double,dewPointTemperature_expiresAt bigint,rainAccumulation_reliability double,rainPeakIntensity_Source string,emissionOfGroundSurface_expiresAt bigint,directSolarRadiation double,rainPeakIntensity_accuracy double,relativeHumidity_expiresAt bigint,atmosphericRadiation_accuracy double,rainIntensity_accuracy double,atmosphericRadiation double,lastUpdated bigint,hailAccumulationHits_reliability double,hailAccumulationHits_Source string,specificHumidity_reliability double,directSolarRadiation_Source string,atmosphericRadiation_expiresAt bigint,tenantId string,pressure_Source string,windSpeed double,relativeHumidity_accuracy double,temperature_reliability double,mixingRatio_reliability double,windDirection_Source string,specificHumidity_accuracy double,hailDuration_expiresAt bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:windDirection#b,d:rainDuration_accuracy#b,d:mixingRatio_Source,d:vapourPressure#b,d:windSpeed_Source,d:hailPeakIntensity#b,d:diffuseSkyRadiation_reliability#b,d:rainDuration_reliability#b,d:relativeHumidity_Source,d:parentEntityType,d:windDirection_reliability#b,d:hailIntensityHits_expiresAt#b,d:specificHumidity#b,d:sampleTimestampID#b,d:reflectedGlobalRadiation_accuracy#b,d:rainPeakIntensity#b,d:hailIntensityHits_accuracy#b,d:sid,d:rainDuration#b,d:temperature_Source,d:relativeHumidity#b,d:specificHumidity_expiresAt#b,d:dewPointTemperature_accuracy#b,d:emissionOfGroundSurface_Source,d:mixingRatio#b,d:hailPeakIntensity_reliability#b,d:hailIntensityHits_reliability#b,d:reflectedGlobalRadiation_expiresAt#b,d:diffuseSkyRadiation_expiresAt#b,d:rainDuration_Source,d:rainDuration_expiresAt#b,d:reflectedGlobalRadiation_reliability#b,d:reflectedGlobalRadiation#b,d:diffuseSkyRadiation_accuracy#b,d:dewPointTemperature_Source,d:rainIntensity_expiresAt#b,d:rainIntensity#b,d:hailDuration_Source,d:vapourPressure_accuracy#b,d:directSolarRadiation_reliability#b,d:hailDuration_accuracy#b,d:temperature_accuracy#b,d:reflectedGlobalRadiation_Source,d:pressure_accuracy#b,d:vapourPressure_reliability#b,d:hailPeakIntensity_accuracy#b,d:rainPeakIntensity_expiresAt#b,d:vapourPressure_Source,d:atmosphericRadiation_reliability#b,d:windDirection_accuracy#b,d:windSpeed_reliability#b,d:dewPointTemperature_reliability#b,d:absoluteHumidity_accuracy#b,d:diffuseSkyRadiation_Source,d:dewPointTemperature#b,d:windDirection_expiresAt#b,d:relativeHumidity_reliability#b,d:startTimestamp#b,d:mixingRatio_accuracy#b,d:rainIntensity_reliability#b,d:hailIntensityHits#b,d:absoluteHumidity_expiresAt#b,d:pressure#b,d:absoluteHumidity_reliability#b,d:directSolarRadiation_expiresAt#b,d:isValid#b,d:pressure_expiresAt#b,d:atmosphericRadiation_Source,d:emissionOfGroundSurface_reliability#b,d:pressure_reliability#b,d:windSpeed_expiresAt#b,d:hailIntensityHits_Source,d:specificHumidity_Source,d:absoluteHumidity_Source,d:rainAccumulation_Source,d:hailAccumulationHits#b,d:hailDuration#b,d:mixingRatio_expiresAt#b,d:hailDuration_reliability#b,d:vapourPressure_expiresAt#b,d:rainAccumulation_expiresAt#b,d:windSpeed_accuracy#b,d:hailAccumulationHits_expiresAt#b,d:hailAccumulationHits_accuracy#b,d:receiveTimestamp#b,d:rainIntensity_Source,d:directSolarRadiation_accuracy#b,d:rainAccumulation_accuracy#b,d:temperature#b,d:sampleTimestamp#b,d:emissionOfGroundSurface#b,d:diffuseSkyRadiation#b,d:rainAccumulation#b,d:emissionOfGroundSurface_accuracy#b,d:absoluteHumidity#b,d:temperature_expiresAt#b,d:hailPeakIntensity_expiresAt#b,d:hailPeakIntensity_Source,d:rainPeakIntensity_reliability#b,d:dewPointTemperature_expiresAt#b,d:rainAccumulation_reliability#b,d:rainPeakIntensity_Source,d:emissionOfGroundSurface_expiresAt#b,d:directSolarRadiation#b,d:rainPeakIntensity_accuracy#b,d:relativeHumidity_expiresAt#b,d:atmosphericRadiation_accuracy#b,d:rainIntensity_accuracy#b,d:atmosphericRadiation#b,d:lastUpdated#b,d:hailAccumulationHits_reliability#b,d:hailAccumulationHits_Source,d:specificHumidity_reliability#b,d:directSolarRadiation_Source,d:atmosphericRadiation_expiresAt#b,d:tenantId,d:pressure_Source,d:windSpeed#b,d:relativeHumidity_accuracy#b,d:temperature_reliability#b,d:mixingRatio_reliability#b,d:windDirection_Source,d:specificHumidity_accuracy#b,d:hailDuration_expiresAt#b') TBLPROPERTIES('hbase.table.name' = 'cim.WeatherStationSensor.state')
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
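One property of mappings as large as the one above is worth noting: the HBase storage handler requires exactly one comma-separated entry in hbase.columns.mapping per Hive column, matched by position, so a single missing or reordered entry in a hundred-plus column list fails the whole create statement. A minimal sketch of that contract, using a hypothetical table that is not part of this log:

-- Hypothetical two-column illustration (cim.Example is not one of this log's tables).
-- Two Hive columns require exactly two mapping entries, in the same order:
-- ':key' binds to the HBase row key; 'd:val#b' is column family d, qualifier val, binary-encoded.
create external table if not exists cim.Example_hbase (
  rowkey string,
  val double)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:val#b')
TBLPROPERTIES ('hbase.table.name' = 'cim.Example');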
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 7602
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WeatherStationData_state_period(absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,hailPeakIntensity_accuracy double,mixingRatio_Source string,emissionOfGroundSurface double,lastUpdated bigint,windSpeed_expiresAt bigint,diffuseSkyRadiation_accuracy double,atmosphericRadiation_reliability double,rainPeakIntensity_expiresAt bigint,rainDuration_expiresAt bigint,atmosphericRadiation_accuracy double,windSpeed_Source string,temperature_reliability double,reflectedGlobalRadiation_reliability double,windDirection_accuracy double,pressure_accuracy double,windSpeed_reliability double,hailDuration_expiresAt bigint,rainDuration double,absoluteHumidity_Source string,rainPeakIntensity double,specificHumidity_reliability double,temperature_Source string,windDirection_expiresAt bigint,hailPeakIntensity_Source string,rainIntensity_accuracy double,hailIntensityHits_Source string,pressure_Source string,directSolarRadiation_accuracy double,rainAccumulation_accuracy double,hailAccumulationHits_accuracy double,relativeHumidity_Source string,rainPeakIntensity_Source string,hailAccumulationHits_Source string,diffuseSkyRadiation_reliability double,specificHumidity_accuracy double,specificHumidity_expiresAt bigint,hailIntensityHits double,rainPeakIntensity_reliability double,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,hailPeakIntensity_reliability double,directSolarRadiation_Source string,directSolarRadiation_expiresAt bigint,temperature double,diffuseSkyRadiation_Source string,hailDuration_Source string,vapourPressure_Source string,specificHumidity_Source string,isValid boolean,windSpeed_accuracy double,atmosphericRadiation_Source string,hailPeakIntensity_expiresAt bigint,dewPointTemperature_reliability double,emissionOfGroundSurface_accuracy double,rainDuration_Source string,pressure_reliability double,atmosphericRadiation_expiresAt bigint,tenantId string,diffuseSkyRadiation double,rainIntensity_expiresAt bigint,hailDuration_accuracy double,reflectedGlobalRadiation_Source string,hailDuration double,emissionOfGroundSurface_reliability double,vapourPressure_accuracy double,relativeHumidity_reliability double,receiveTimestamp bigint,dewPointTemperature_Source string,hailAccumulationHits_expiresAt bigint,pressure_expiresAt bigint,relativeHumidity_accuracy double,temperature_accuracy double,windDirection double,windSpeed double,mixingRatio_reliability double,mixingRatio_accuracy double,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,emissionOfGroundSurface_expiresAt bigint,pressure double,relativeHumidity_expiresAt bigint,absoluteHumidity double,absoluteHumidity_accuracy double,directSolarRadiation_reliability double,windDirection_Source string,diffuseSkyRadiation_expiresAt bigint,sampleTimestamp bigint,rainDuration_accuracy double,reflectedGlobalRadiation_expiresAt bigint,rainAccumulation_Source string,rainIntensity double,absoluteHumidity_reliability double,parentEntityType string,specificHumidity double,sid string,emissionOfGroundSurface_Source string,rainIntensity_reliability double,dewPointTemperature_accuracy double,reflectedGlobalRadiation double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,atmosphericRadiation double,hailAccumulationHits double,dewPointTemperature double,rainPeakIntensity_accuracy double,directSolarRadiation double,hailIntensityHits_expiresAt bigint,windDirection_reliability double,hailIntensityHits_accuracy double,rainIntensity_Source string,rainDuration_reliability double,mixingRatio double,hailPeakIntensity double,rainAccumulation_expiresAt bigint,vapourPressure double,relativeHumidity double,temperature_expiresAt bigint,startTimestamp bigint,hailDuration_reliability double,vapourPressure_expiresAt bigint,rainAccumulation double,rainAccumulation_reliability double)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 3967
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WeatherStationSensor_state_period(absoluteHumidity_expiresAt bigint,vapourPressure_reliability double,hailPeakIntensity_accuracy double,mixingRatio_Source string,emissionOfGroundSurface double,lastUpdated bigint,windSpeed_expiresAt bigint,diffuseSkyRadiation_accuracy double,atmosphericRadiation_reliability double,rainPeakIntensity_expiresAt bigint,rainDuration_expiresAt bigint,atmosphericRadiation_accuracy double,windSpeed_Source string,temperature_reliability double,reflectedGlobalRadiation_reliability double,windDirection_accuracy double,pressure_accuracy double,windSpeed_reliability double,hailDuration_expiresAt bigint,rainDuration double,absoluteHumidity_Source string,rainPeakIntensity double,specificHumidity_reliability double,temperature_Source string,windDirection_expiresAt bigint,hailPeakIntensity_Source string,rainIntensity_accuracy double,hailIntensityHits_Source string,pressure_Source string,directSolarRadiation_accuracy double,rainAccumulation_accuracy double,hailAccumulationHits_accuracy double,relativeHumidity_Source string,rainPeakIntensity_Source string,hailAccumulationHits_Source string,diffuseSkyRadiation_reliability double,specificHumidity_accuracy double,specificHumidity_expiresAt bigint,hailIntensityHits double,rainPeakIntensity_reliability double,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,hailPeakIntensity_reliability double,directSolarRadiation_Source string,directSolarRadiation_expiresAt bigint,temperature double,diffuseSkyRadiation_Source string,hailDuration_Source string,vapourPressure_Source string,specificHumidity_Source string,isValid boolean,windSpeed_accuracy double,atmosphericRadiation_Source string,hailPeakIntensity_expiresAt bigint,dewPointTemperature_reliability double,emissionOfGroundSurface_accuracy double,rainDuration_Source string,pressure_reliability double,atmosphericRadiation_expiresAt bigint,tenantId string,diffuseSkyRadiation double,rainIntensity_expiresAt bigint,hailDuration_accuracy double,reflectedGlobalRadiation_Source string,hailDuration double,emissionOfGroundSurface_reliability double,vapourPressure_accuracy double,relativeHumidity_reliability double,receiveTimestamp bigint,dewPointTemperature_Source string,hailAccumulationHits_expiresAt bigint,pressure_expiresAt bigint,relativeHumidity_accuracy double,geocoordinates_latitude double,temperature_accuracy double,windDirection double,windSpeed double,mixingRatio_reliability double,mixingRatio_accuracy double,dewPointTemperature_expiresAt bigint,mixingRatio_expiresAt bigint,emissionOfGroundSurface_expiresAt bigint,pressure double,relativeHumidity_expiresAt bigint,absoluteHumidity double,absoluteHumidity_accuracy double,directSolarRadiation_reliability double,windDirection_Source string,diffuseSkyRadiation_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,rainDuration_accuracy double,reflectedGlobalRadiation_expiresAt bigint,rainAccumulation_Source string,rainIntensity double,absoluteHumidity_reliability double,parentEntityType string,specificHumidity double,sid string,emissionOfGroundSurface_Source string,rainIntensity_reliability double,dewPointTemperature_accuracy double,reflectedGlobalRadiation double,reflectedGlobalRadiation_accuracy double,sampleTimestampID int,atmosphericRadiation double,hailAccumulationHits double,dewPointTemperature double,rainPeakIntensity_accuracy double,directSolarRadiation double,hailIntensityHits_expiresAt bigint,geocoordinates_altitude double,windDirection_reliability double,hailIntensityHits_accuracy double,rainIntensity_Source string,rainDuration_reliability double,mixingRatio double,hailPeakIntensity double,rainAccumulation_expiresAt bigint,vapourPressure double,relativeHumidity double,temperature_expiresAt bigint,startTimestamp bigint,hailDuration_reliability double,vapourPressure_expiresAt bigint,rainAccumulation double,rainAccumulation_reliability double)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 4063
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientTemperatureSensor_state_hourly(monthweek int,week int,timeid int,city string,count int,weekday int,temperature_max double,sid string,temperature_Source string,dimensionid bigint,month int,hour int,locationid string,temperature_min double,temperature_avg double,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 491 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientTemperatureData_state_hourly(monthweek int,week int,timeid int,city string,count int,weekday int,temperature_max double,sid string,temperature_Source string,dimensionid bigint,month int,hour int,locationid string,temperature_min double,temperature_avg double,day int) partitioned by (year int, tenantid string) stored as parquet 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 489 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientTemperatureData_state(sampleTimestamp bigint,timezone string,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,sampleTimestampID int,locationId string,tenantId string,temperature double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,temperature_expiresAt bigint,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 620 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientTemperatureSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone 
string,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,sampleTimestampID int,locationId string,tenantId string,temperature double,geocoordinates_latitude double,timezoneoffset int,temperature_reliability double,temperature_accuracy double,temperature_expiresAt bigint,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 716 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AmbientTemperatureSensor_state_hbase ( rowkey string,temperature double,sid string,sampleTimestamp bigint,temperature_Source string,isValid boolean,lastUpdated bigint,startTimestamp bigint,temperature_expiresAt bigint,tenantId string,temperature_reliability double,parentEntityType string,temperature_accuracy double,receiveTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:temperature#b,d:sid,d:sampleTimestamp#b,d:temperature_Source,d:isValid#b,d:lastUpdated#b,d:startTimestamp#b,d:temperature_expiresAt#b,d:tenantId,d:temperature_reliability#b,d:parentEntityType,d:temperature_accuracy#b,d:receiveTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.AmbientTemperatureSensor.state') 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1032 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientTemperatureData_state_period(sampleTimestamp bigint,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,sampleTimestampID int,temperature double,tenantId string,temperature_expiresAt bigint,temperature_reliability double,temperature_accuracy double,startTimestamp bigint) 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.442 [main] DEBUG 
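The Hive commands in this log use the HBase storage handler rather than native Hive storage. For readability, here is the cim.AmbientTemperatureSensor_state_hbase statement from above, reformatted but otherwise unchanged: each entry in hbase.columns.mapping pairs one Hive column, in declaration order, with either the HBase row key (:key) or a cell in column family d, and a #b suffix tells the HBaseSerDe to decode that cell as a binary-encoded value instead of a UTF-8 string.

    create external table if not exists cim.AmbientTemperatureSensor_state_hbase (
      rowkey string,
      temperature double,
      sid string,
      sampleTimestamp bigint,
      temperature_Source string,
      isValid boolean,
      lastUpdated bigint,
      startTimestamp bigint,
      temperature_expiresAt bigint,
      tenantId string,
      temperature_reliability double,
      parentEntityType string,
      temperature_accuracy double,
      receiveTimestamp bigint,
      sampleTimestampID int)
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' =
      ':key,d:temperature#b,d:sid,d:sampleTimestamp#b,d:temperature_Source,d:isValid#b,d:lastUpdated#b,d:startTimestamp#b,d:temperature_expiresAt#b,d:tenantId,d:temperature_reliability#b,d:parentEntityType,d:temperature_accuracy#b,d:receiveTimestamp#b,d:sampleTimestampID#b')
    TBLPROPERTIES('hbase.table.name' = 'cim.AmbientTemperatureSensor.state')

Because it is an external table, it stores no data in Hive itself; dropping it leaves the underlying cim.AmbientTemperatureSensor.state table in HBase untouched.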
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 495
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientTemperatureSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,temperature_Source string,lastUpdated bigint,sampleTimestampID int,temperature double,tenantId string,geocoordinates_latitude double,temperature_expiresAt bigint,temperature_reliability double,temperature_accuracy double,startTimestamp bigint)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 591
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MobilityPOMDirection_hbase ( rowkey string,sid string,sampleTimestamp bigint,isValid boolean,providerDetails string,thirdPartyId string,direction string,tenantId string,pomId string,receiveTimestamp bigint,directionLabel string,sampleTimestampID int,locationId string,bearing int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:thirdPartyId,d:direction,d:tenantId,d:pomId,d:receiveTimestamp#b,d:directionLabel,d:sampleTimestampID#b,d:locationId,d:bearing#b') TBLPROPERTIES('hbase.table.name' = 'cim.MobilityPOMDirection')
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 864
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MobilityPOMDirection_location(sampleTimestamp bigint,city String,bearing int,timezone String,isValid boolean,receiveTimestamp bigint,sid string,thirdPartyId string,directionLabel string,sampleTimestampID int,locationId String,pomId string,providerDetails string,tenantId string,direction string)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 444
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MobilityPOMDirection(sampleTimestamp bigint,city string,timezone string,bearing int,isValid boolean,receiveTimestamp bigint,sid string,thirdPartyId string,directionLabel string,sampleTimestampID int,locationId string,pomId string,MobilityPOMdimensionid bigint,tenantId string,providerDetails string,timezoneoffset int,direction string)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 488
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MobilityStats_connection ( rowkey string,entityId string,entityType string,sid string,tenantId String,sampletimestamp bigint,receivetimestamp bigint,name string,destroytimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityId,d:entityType,d:sid,d:tenantId#b,d:sampletimestamp#b,d:receivetimestamp#b,d:name,d:destroytimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.MobilityStats.connection')
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 698
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MobilityStats_connection_bak(receivetimestamp bigint,entityType string,sampletimestamp bigint,name string,tenantId String,entityId string,destroytimestamp bigint,sid string)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 322
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MobilityStats_hbase ( rowkey string,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.MobilityStats')
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 661
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MobilityStats_location(sampleTimestamp bigint,sampleTimestampID int,city String,locationId String,timezone String,isValid boolean,tenantId string,receiveTimestamp bigint,sid string)
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 330
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MobilityStats(sampleTimestamp bigint,sampleTimestampID int,city string,locationId string,timezone String,isValid boolean,tenantId string,timezoneoffset int,receiveTimestamp bigint,sid string)
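All of the *_hbase and *_connection tables created through Hive here are thin SQL views over HBase tables, keyed by the rowkey column that is mapped to :key. As an illustration only (the row-key format is not shown anywhere in this log, so the literal below is hypothetical), a lookup against one of them filters on that column, which the storage handler can satisfy with an HBase point get or a narrow scan instead of a full table scan:

    select sid, locationId, sampleTimestamp
    from cim.MobilityStats_hbase
    where rowkey = 'tenant1|device-0001'   -- hypothetical key value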
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 344
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.442 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.EnvironmentSensor_hbase ( rowkey string,deviceType string,sampleTimestamp bigint,isValid boolean,providerDetails string,model string,sensorType string,applicableDomain string,tag string,label string,type string,sampleTimestampID int,connectivityType string,locationId string,instanceId string,outsourceablePolicyTemplate string,operatedBy string,createTimestampID int,status string,sid string,destroyTimestamp bigint,rawHealth string,private string,thirdPartyId string,lastUpdated bigint,supportedMode string,mode string,policyHandler string,destroyTimestampID int,dependentOn string,isIndependent int,custom string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:deviceType,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:model,d:sensorType,d:applicableDomain,d:tag,d:label,d:type,d:sampleTimestampID#b,d:connectivityType,d:locationId,d:instanceId,d:outsourceablePolicyTemplate,d:operatedBy,d:createTimestampID#b,d:status,d:sid,d:destroyTimestamp#b,d:rawHealth,d:private,d:thirdPartyId,d:lastUpdated#b,d:supportedMode,d:mode,d:policyHandler,d:destroyTimestampID#b,d:dependentOn,d:isIndependent#b,d:custom,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.EnvironmentSensor')
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.442 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1608
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.EnvironmentSensor_location(sampleTimestamp bigint,private string,isIndependent int,city String,timezone String,receiveTimestamp bigint,type string,rawHealth string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,connectivityType string,mode string,thirdPartyId string,lastUpdated bigint,instanceId string,sampleTimestampID int,dependentOn string,locationId String,sensorType string,createTimestampID int,providerDetails string,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,isValid boolean,custom string,operatedBy string,label string,supportedMode string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 863
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.EnvironmentSensor(sampleTimestamp bigint,private string,isIndependent int,city string,timezone String,receiveTimestamp bigint,type string,rawHealth string,sid string,parentDomain string,createTimestamp bigint,connectivityType string,mode string,applicableDomain string,thirdPartyId string,lastUpdated bigint,instanceId string,dependentOn string,sampleTimestampID int,locationId string,sensorType string,createTimestampID int,providerDetails string,timezoneoffset int,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,isValid boolean,custom string,operatedBy string,label string,supportedMode string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 877
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.EnvironmentSensor_state(geocoordinates_longitude double,sampleTimestamp bigint,deviceState_connState_connected int,deviceState_connState_since bigint,geocoordinates_altitude double,timezone string,isValid boolean,deviceState_batteryPercentage double,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,geocoordinates_latitude double,timezoneoffset int,destroyTimestampID int,day int,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 771
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.EnvironmentSensor_state_hbase ( rowkey string,geocoordinates_longitude double,createTimestampID int,sid string,sampleTimestamp bigint,isValid boolean,geocoordinates_altitude double,deviceState_connState_connected int,deviceState_connState_since bigint,destroyTimestamp bigint,lastUpdated bigint,destroyTimestampID int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,deviceState_batteryPercentage double,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:geocoordinates_altitude#b,d:deviceState_connState_connected#b,d:deviceState_connState_since#b,d:destroyTimestamp#b,d:lastUpdated#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:deviceState_batteryPercentage#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.EnvironmentSensor.state')
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1293
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.EnvironmentSensor_state_period(geocoordinates_longitude double,sampleTimestamp bigint,deviceState_connState_connected int,deviceState_connState_since bigint,geocoordinates_altitude double,isValid boolean,deviceState_batteryPercentage double,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,sampleTimestampID int,createTimestampID int,tenantId string,geocoordinates_latitude double,destroyTimestampID int,startTimestamp bigint,destroyTimestamp bigint)
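The cimdata.*_state tables are parquet tables partitioned by (year int,month int,city string). The partition columns are deliberately absent from the parenthesized column list; in Impala they are supplied through a PARTITION clause at load time, and each distinct (year, month, city) combination becomes its own HDFS directory. A minimal loading sketch, assuming a staging table stg.EnvironmentSensor_state_raw (hypothetical, not in the log) whose last three selected columns feed the partition keys:

    insert into cimdata.EnvironmentSensor_state partition (year, month, city)
    select
      geocoordinates_longitude, sampleTimestamp, deviceState_connState_connected,
      deviceState_connState_since, geocoordinates_altitude, timezone, isValid,
      deviceState_batteryPercentage, receiveTimestamp, sid, createTimestamp,
      lastUpdated, dimensionid, sampleTimestampID, locationId, createTimestampID,
      tenantId, geocoordinates_latitude, timezoneoffset, destroyTimestampID,
      day, startTimestamp, destroyTimestamp,
      year, month, city   -- partition key values come last
    from stg.EnvironmentSensor_state_raw

A query that filters on the partition columns, for example where year = 2022 and month = 6, then reads only the matching directories.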
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 627
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.LightGrpCtrl_hbase ( rowkey string,geocoordinates_longitude double,deviceType string,sampleTimestamp bigint,isValid boolean,providerDetails string,geohash string,model string,applicableDomain string,tag string,label string,type string,sampleTimestampID int,connectivityType string,locationId string,instanceId string,outsourceablePolicyTemplate string,createTimestampID int,powerMeter string,status string,sid string,geocoordinates_altitude double,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,supportedMode string,mode string,policyHandler string,destroyTimestampID int,dependentOn string,isIndependent int,custom string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:deviceType,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:geohash,d:model,d:applicableDomain,d:tag,d:label,d:type,d:sampleTimestampID#b,d:connectivityType,d:locationId,d:instanceId,d:outsourceablePolicyTemplate,d:createTimestampID#b,d:powerMeter,d:status,d:sid,d:geocoordinates_altitude#b,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:supportedMode,d:mode,d:policyHandler,d:destroyTimestampID#b,d:dependentOn,d:isIndependent#b,d:custom,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b') TBLPROPERTIES('hbase.table.name' = 'cim.LightGrpCtrl')
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1742
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.LightGrpCtrl_location(geocoordinates_longitude double,sampleTimestamp bigint,powerMeter string,private string,isIndependent int,city String,timezone String,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,connectivityType string,mode string,thirdPartyId string,lastUpdated bigint,instanceId string,sampleTimestampID int,dependentOn string,locationId String,geohash string,createTimestampID int,providerDetails string,geocoordinates_latitude double,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,label string,supportedMode string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 932
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.LightGrpCtrl(powerMeter string,geocoordinates_longitude double,sampleTimestamp bigint,private string,isIndependent int,city string,timezone String,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,connectivityType string,mode string,applicableDomain string,thirdPartyId string,lastUpdated bigint,instanceId string,dependentOn string,sampleTimestampID int,locationId string,geohash string,createTimestampID int,geocoordinates_latitude double,providerDetails string,timezoneoffset int,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,label string,supportedMode string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 946
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.LightGrpCtrl_state_hourly(monthweek int,week int,timeid int,powerConsumption_max double,city string,minutepoweron_sum int,intensityLevel_sum double,count int,weekday int,intensityLevel_max double,sid string,powerconsumption_count int,dimensionid bigint,month int,hour int,locationid string,powerConsumption_min double,intensitylevel_count int,intensityLevel_min double,day int,powerConsumption_sum double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 620
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.LightGrpCtrl_state(intensityLevel double,sampleTimestamp bigint,deviceState_connState_connected int,numLights int,timezone string,reliability double,receiveTimestamp bigint,sid string,createTimestamp bigint,numOnline int,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,day int,deviceState_connState_since bigint,powerConsumption double,isValid boolean,deviceState_batteryPercentage double,tenantId string,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 765
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.LightGrpCtrl_state_hbase ( rowkey string,numOnline int,powerConsumption double,createTimestampID int,sid string,intensityLevel double,sampleTimestamp bigint,isValid boolean,deviceState_connState_connected int,deviceState_connState_since bigint,destroyTimestamp bigint,lastUpdated bigint,reliability double,destroyTimestampID int,startTimestamp bigint,tenantId string,numLights int,receiveTimestamp bigint,createTimestamp bigint,deviceState_batteryPercentage double,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:numOnline#b,d:powerConsumption#b,d:createTimestampID#b,d:sid,d:intensityLevel#b,d:sampleTimestamp#b,d:isValid#b,d:deviceState_connState_connected#b,d:deviceState_connState_since#b,d:destroyTimestamp#b,d:lastUpdated#b,d:reliability#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:numLights#b,d:receiveTimestamp#b,d:createTimestamp#b,d:deviceState_batteryPercentage#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.LightGrpCtrl.state')
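The *_state_hourly tables are pre-computed rollups: the count, *_sum, *_min and *_max columns hold hourly aggregates of the corresponding *_state table, and they are partitioned by (year int, tenantid string). A hypothetical read against cimdata.LightGrpCtrl_state_hourly (the year and tenant literals are invented, shown only to illustrate pruning on that partition layout):

    select locationid, hour,
           sum(powerConsumption_sum) as total_power,
           max(intensityLevel_max)   as peak_intensity
    from cimdata.LightGrpCtrl_state_hourly
    where year = 2022 and tenantid = 'tenant1'   -- prunes to one partition
    group by locationid, hour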
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1281
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.LightGrpCtrl_state_period(intensityLevel double,sampleTimestamp bigint,deviceState_connState_connected int,deviceState_connState_since bigint,powerConsumption double,numLights int,isValid boolean,reliability double,deviceState_batteryPercentage double,receiveTimestamp bigint,sid string,createTimestamp bigint,numOnline int,lastUpdated bigint,sampleTimestampID int,createTimestampID int,tenantId string,destroyTimestampID int,startTimestamp bigint,destroyTimestamp bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 621
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.SpatialMobilityStats_connection ( rowkey string,entityId string,entityType string,sid string,tenantId String,sampletimestamp bigint,receivetimestamp bigint,name string,destroytimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityId,d:entityType,d:sid,d:tenantId#b,d:sampletimestamp#b,d:receivetimestamp#b,d:name,d:destroytimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.SpatialMobilityStats.connection')
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 712
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SpatialMobilityStats_connection_bak(receivetimestamp bigint,entityType string,sampletimestamp bigint,name string,tenantId String,entityId string,destroytimestamp bigint,sid string)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 329
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.SpatialMobilityStats_hbase ( rowkey string,status string,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,since bigint,sampleTimestampID int,lastUpdated bigint,locationId string,createTime bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:status,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:since#b,d:sampleTimestampID#b,d:lastUpdated#b,d:locationId,d:createTime#b') TBLPROPERTIES('hbase.table.name' = 'cim.SpatialMobilityStats')
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 789
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SpatialMobilityStats_location(sampleTimestamp bigint,city String,timezone String,isValid boolean,receiveTimestamp bigint,sid string,lastUpdated bigint,sampleTimestampID int,createTime bigint,locationId String,tenantId string,status string,since bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 401
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SpatialMobilityStats(sampleTimestamp bigint,city string,timezone String,isValid boolean,receiveTimestamp bigint,sid string,lastUpdated bigint,createTime bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,status string,since bigint)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 415
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SpatialMobilityStats_plus(sampleTimestamp bigint,city string,timezone String,spatialmobilitystatsname string,isValid boolean,entityid string,receiveTimestamp bigint,sid string,lastUpdated bigint,entitytype string,createTime bigint,sampleTimestampID int,locationId string,tenantId string,mobilityisvalid boolean,timezoneoffset int,status string,since bigint,mobilitystatsname string)
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 535
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.443 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.SpatialMobilityStats_statistic_hbase ( rowkey string,startTimestamp bigint,dwellTime double,deltaTimestamp bigint,tenantId string,sid string,sampleTimestamp bigint,count int,isValid boolean,receiveTimestamp bigint,endTimestamp bigint,sampleTimestampID int,density double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:dwellTime#b,d:deltaTimestamp#b,d:tenantId,d:sid,d:sampleTimestamp#b,d:count#b,d:isValid#b,d:receiveTimestamp#b,d:endTimestamp#b,d:sampleTimestampID#b,d:density#b') TBLPROPERTIES('hbase.table.name' = 'cim.SpatialMobilityStats.statistic')
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.443 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 861
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.SpatialMobilityStats_statistic_period(sampleTimestamp bigint,density double,year int,city string,spatialmobilitystatsname String,isValid boolean,count int,deltaTimestamp bigint,entityid String,receiveTimestamp bigint,sid string,month int,entitytype String,sampleTimestampID int,locationid String,tenantId string,mobilityisvalid boolean,timezoneoffset int,endTimestamp bigint,dwellTime double,startTimestamp bigint,mobilitystatsname String)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 588
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SpatialMobilityStats_statistic(sampleTimestamp bigint,density double,spatialmobilitystatsname String,isValid boolean,count int,deltaTimestamp bigint,entityid String,receiveTimestamp bigint,sid string,entitytype String,sampleTimestampID int,locationid String,tenantId string,mobilityisvalid boolean,timezoneoffset int,endTimestamp bigint,dwellTime double,startTimestamp bigint,mobilitystatsname String) partitioned by (year int,month int,city string) stored as parquet
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 621
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.SpatialMobilityStats_statistic_hourly(week int,timeid int,city string,spatialmobilitystatsname String,weekday int,entityid String,count_sum bigint,sid string,dwellTime_sum double,hour int,dwellTime_min double,dwellTime_avg Double,density_max double,timezoneoffset int,day int,mobilitystatsname String,density_sum double,dwellTime_max double,monthweek int,count_max int,isValid boolean,density_min double,density_avg Double,count_avg Double,entitytype String,month int,count_min int,locationid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 715
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteCollectionTrip_hbase ( rowkey string,entityType string,shortName string,sampleTimestamp bigint,isValid boolean,cimDataType string,providerDetails string,headsign string,geohash string,active int,expiresAt bigint,boundId int,label string,tripNextScheduleTime string,sampleTimestampID int,locationId string,createTimestampID int,nickname string,path_geoPoint string,vehicleId string,sid string,destroyTimestamp bigint,private string,agencyId string,thirdPartyId string,lastUpdated bigint,direction string,destroyTimestampID int,driverId string,validity string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,serviceId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityType,d:shortName,d:sampleTimestamp#b,d:isValid#b,d:cimDataType,d:providerDetails,d:headsign,d:geohash,d:active#b,d:expiresAt#b,d:boundId#b,d:label,d:tripNextScheduleTime,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:nickname,d:path_geoPoint,d:vehicleId,d:sid,d:destroyTimestamp#b,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:direction,d:destroyTimestampID#b,d:driverId,d:validity,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:serviceId') TBLPROPERTIES('hbase.table.name' = 'cim.WasteCollectionTrip')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1513
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteCollectionTrip_location(sampleTimestamp bigint,private string,city String,timezone String,boundId int,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,headsign string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,tripNextScheduleTime string,nickname string,cimDataType string,vehicleId string,destroyTimestampID int,serviceId string,direction string,entityType string,isValid boolean,active int,label string,path_geoPoint string,expiresAt bigint,driverId string,tenantId string,validity string,shortName string,destroyTimestamp bigint)
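Note the division of labor in this bootstrap: the STORED BY ... HBaseStorageHandler tables are issued as Hive commands, since Impala does not support creating HBase-backed tables itself, while the parquet and plain tables go through Impala. Impala's catalog does not automatically pick up tables created via Hive, so a bootstrap like this would typically be followed by a metadata refresh from impala-shell, for example:

    invalidate metadata cim.WasteCollectionTrip_hbase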
DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 807
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteCollectionTrip(sampleTimestamp bigint,private string,city string,timezone String,boundId int,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,headsign string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,nickname string,providerDetails string,tripNextScheduleTime string,timezoneoffset int,cimDataType string,vehicleId string,serviceId string,destroyTimestampID int,direction string,entityType string,isValid boolean,active int,path_geoPoint string,label string,expiresAt bigint,driverId string,tenantId string,validity string,shortName string,destroyTimestamp bigint)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 821
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.EnvironmentData_hbase ( rowkey string,geocoordinates_longitude double,createTimestampID int,sid string,sampleTimestamp bigint,isValid boolean,geocoordinates_altitude double,destroyTimestamp bigint,lastUpdated bigint,dataType string,destroyTimestampID int,tenantId string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:geocoordinates_altitude#b,d:destroyTimestamp#b,d:lastUpdated#b,d:dataType,d:destroyTimestampID#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.EnvironmentData')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1084
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.EnvironmentData_location(geocoordinates_longitude double,sampleTimestamp bigint,city String,geocoordinates_altitude double,timezone String,isValid boolean,dataType string,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,tenantId string,geocoordinates_latitude double,destroyTimestampID int,destroyTimestamp bigint)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 553
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.EnvironmentData(geocoordinates_longitude double,sampleTimestamp bigint,city string,geocoordinates_altitude double,timezone String,isValid boolean,dataType string,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,geocoordinates_latitude double,timezoneoffset int,destroyTimestampID int,destroyTimestamp bigint)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 567
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.PersonalDevice_hbase ( rowkey string,destroyTimestampID int,createTimestampID int,tenantId string,sid string,sampleTimestamp bigint,macAddress string,isValid boolean,receiveTimestamp bigint,createTimestamp bigint,destroyTimestamp bigint,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:destroyTimestampID#b,d:createTimestampID#b,d:tenantId,d:sid,d:sampleTimestamp#b,d:macAddress,d:isValid#b,d:receiveTimestamp#b,d:createTimestamp#b,d:destroyTimestamp#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.PersonalDevice')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 872
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.PersonalDevice_state(sampleTimestamp bigint,timezone string,isValid boolean,receiveTimestamp bigint,sid string,createTimestamp bigint,entryTime bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,timezoneoffset int,destroyTimestampID int,roiId string,day int,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 577
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.PersonalDevice_state_hbase ( rowkey string,createTimestampID int,sid string,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,entryTime bigint,destroyTimestampID int,startTimestamp bigint,roiId string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:entryTime#b,d:destroyTimestampID#b,d:startTimestamp#b,d:roiId,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.PersonalDevice.state')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 915
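All of the *_hbase DDLs in this log follow the same Hive HBaseStorageHandler pattern: 'hbase.columns.mapping' pairs each Hive column with an HBase cell, ':key' binds the rowkey column to the HBase row key, 'd:' is the column family, and a '#b' suffix makes the SerDe read the cell as a raw binary value instead of a UTF-8 string. A minimal sketch of the pattern (the cim.Sample names are hypothetical, not tables from this job):

create external table if not exists cim.Sample_hbase (
  rowkey string,          -- ':key' in the mapping = the HBase row key
  sampleTimestamp bigint, -- '#b' = cell stores a binary-encoded value
  sid string              -- no suffix = cell stores a UTF-8 string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sampleTimestamp#b,d:sid')
TBLPROPERTIES ('hbase.table.name' = 'cim.Sample');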
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.PersonalDevice_state_period(sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,sid string,createTimestamp bigint,entryTime bigint,sampleTimestampID int,createTimestampID int,tenantId string,destroyTimestampID int,roiId string,startTimestamp bigint,destroyTimestamp bigint)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 433
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.HarmonicaSectionOfDay_hbase ( rowkey string,entityId string,attributeName string,entityType string,tenantId string,sid string,sampleTimestamp bigint,parentEntityType string,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityId,d:attributeName,d:entityType,d:tenantId,d:sid,d:sampleTimestamp#b,d:parentEntityType,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.HarmonicaSectionOfDay')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 815
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HarmonicaSectionOfDay_location(sampleTimestamp bigint,city String,entityType string,timezone String,isValid boolean,entityId string,parentEntityType string,receiveTimestamp bigint,sid string,sampleTimestampID int,locationId String,tenantId string,attributeName string)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 417
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaSectionOfDay(sampleTimestamp bigint,city string,entityType string,timezone String,isValid boolean,entityId string,parentEntityType string,receiveTimestamp bigint,sid string,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,attributeName string)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 431
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.HarmonicaSectionOfDay_statistic_daily_hbase ( rowkey string,attributeName string,bgneq double,timeSlot_end string,entityType string,sid string,sampleTimestamp bigint,bgnIndex double,entityId string,startTimestamp bigint,timeSlot_start string,globalIndex double,evtIndex double,tenantId string,evteq double,startTime bigint,parentEntityType string,receiveTimestamp bigint,startTimestampID int,sampleTimestampID int,weight double,colour string,laeq double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:attributeName,d:bgneq#b,d:timeSlot_end,d:entityType,d:sid,d:sampleTimestamp#b,d:bgnIndex#b,d:entityId,d:startTimestamp#b,d:timeSlot_start,d:globalIndex#b,d:evtIndex#b,d:tenantId,d:evteq#b,d:startTime#b,d:parentEntityType,d:receiveTimestamp#b,d:startTimestampID#b,d:sampleTimestampID#b,d:weight#b,d:colour,d:laeq#b') TBLPROPERTIES('hbase.table.name' = 'cim.HarmonicaSectionOfDay.statistic.daily')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1184
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HarmonicaSectionOfDay_statistic_daily_period(sampleTimestamp bigint,year int,city string,evteq double,parentEntityType string,bgnIndex double,receiveTimestamp bigint,sid string,evtIndex double,startTimestampID int,laeq double,sampleTimestampID int,timezoneoffset int,attributeName string,startTime bigint,bgneq double,entityType string,weight double,entityId string,timeSlot_end string,timeSlot_start string,globalIndex double,colour string,month int,locationid String,tenantId string,startTimestamp bigint)
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 656
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaSectionOfDay_statisticdata_daily(evteq_sum double,weekday int,evteq double,receiveTimestamp bigint,laeq_avg Double,laeq_min double,evtIndex double,startTimestampID int,laeq_sum double,day int,globalIndex_min double,globalIndex_avg Double,bgneq double,entityType string,evtIndex_sum double,weight double,timeSlot_end string,bgneq_max double,evteq_max double,month int,globalIndex_max double,bgneq_sum double,evteq_min double,evteq_avg Double,bgneq_min double,bgneq_avg Double,sampleTimestamp bigint,week int,timeid int,city string,bgnIndex_max double,evtIndex_max double,parentEntityType string,bgnIndex double,sid string,laeq double,hour int,sampleTimestampID int,globalIndex_sum double,timezoneoffset int,attributeName string,startTime bigint,evtIndex_min double,evtIndex_avg Double,monthweek int,entityId string,timeSlot_start string,globalIndex double,colour string,bgnIndex_min double,bgnIndex_avg Double,locationid string,laeq_max double,bgnIndex_sum double,startTimestamp bigint) partitioned by (year int, tenantid string) stored as parquet
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1209
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
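The *_statisticdata_daily rollup above is an Impala Parquet table partitioned by (year, tenantid); the partition columns live only in the partitioned-by clause, not the column list, so equality predicates on them prune whole partition directories. A sketch of the kind of query this layout is built for (the year and tenantid values are made up):

select city, day, avg(laeq) as laeq_avg
from cimdata.HarmonicaSectionOfDay_statisticdata_daily
where year = 2022 and tenantid = 'tenant-a' -- hypothetical values; partition pruning happens on these predicates
group by city, day;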
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaSectionOfDay_statistic_daily(evteq_sum double,weekday int,evteq double,receiveTimestamp bigint,laeq_avg Double,laeq_min double,evtIndex double,startTimestampID int,laeq_sum double,geocoordinates_latitude double,day int,globalIndex_min double,globalIndex_avg Double,bgneq double,entityType string,evtIndex_sum double,weight double,timeSlot_end string,bgneq_max double,evteq_max double,month int,globalIndex_max double,bgneq_sum double,evteq_min double,evteq_avg Double,bgneq_min double,bgneq_avg Double,sampleTimestamp bigint,geocoordinates_longitude double,week int,timeid int,city string,bgnIndex_max double,evtIndex_max double,parentEntityType string,bgnIndex double,sid string,laeq double,hour int,sampleTimestampID int,globalIndex_sum double,timezoneoffset int,attributeName string,startTime bigint,evtIndex_min double,evtIndex_avg Double,monthweek int,geocoordinates_altitude double,entityId string,timeSlot_start string,globalIndex double,colour string,bgnIndex_min double,bgnIndex_avg Double,locationid string,laeq_max double,bgnIndex_sum double,startTimestamp bigint) partitioned by (year int, tenantid string) stored as parquet
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1299
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.444 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Vehicle_hbase ( rowkey string,Colour string,bturl string,engineStopTime bigint,lpr string,sampleTimestamp bigint,isValid boolean,listedAs string,engineStartTime bigint,providerDetails string,active int,padding string,containerInfo string,wifiAccessPoint string,label string,type string,sampleTimestampID int,locationId string,createTimestampID int,nickname string,dimension string,sid string,bluetooth_devices string,destroyTimestamp bigint,Model string,private string,agencyId string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,custom string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,Vin string,serviceOnDate bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:Colour,d:bturl,d:engineStopTime#b,d:lpr,d:sampleTimestamp#b,d:isValid#b,d:listedAs,d:engineStartTime#b,d:providerDetails,d:active#b,d:padding,d:containerInfo,d:wifiAccessPoint,d:label,d:type,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:nickname,d:dimension,d:sid,d:bluetooth_devices,d:destroyTimestamp#b,d:Model,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:Vin,d:serviceOnDate#b') TBLPROPERTIES('hbase.table.name' = 'cim.Vehicle')
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.444 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1528
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Vehicle_location(sampleTimestamp bigint,private string,city String,bturl string,timezone String,bluetooth_devices string,agencyId string,receiveTimestamp bigint,type string,Colour string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,engineStartTime bigint,sampleTimestampID int,locationId String,createTimestampID int,providerDetails string,nickname string,destroyTimestampID int,dimension string,serviceOnDate bigint,padding string,engineStopTime bigint,wifiAccessPoint string,isValid boolean,custom string,active int,label string,lpr string,listedAs string,containerInfo string,Model string,tenantId string,Vin string,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 820
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Vehicle(sampleTimestamp bigint,private string,city string,bturl string,timezone String,bluetooth_devices string,agencyId string,receiveTimestamp bigint,type string,sid string,Colour string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,engineStartTime bigint,sampleTimestampID int,locationId string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,destroyTimestampID int,dimension string,serviceOnDate bigint,padding string,engineStopTime bigint,wifiAccessPoint string,isValid boolean,custom string,active int,label string,lpr string,listedAs string,tenantId string,Model string,containerInfo string,Vin string,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 834
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Vehicle_state(geocoordinates_longitude double,sampleTimestamp bigint,timezone string,receiveTimestamp bigint,speed double,sid string,createTimestamp bigint,lastUpdated bigint,wheelchairBoarding int,maximumLoad double,dimensionid bigint,engineStartTime bigint,sampleTimestampID int,fuelConsumed double,locationId string,createTimestampID int,geocoordinates_latitude double,timezoneoffset int,isOnDuty string,destroyTimestampID int,day int,serviceOnDate bigint,numClient int,engineStopTime bigint,geocoordinates_altitude double,isValid boolean,velocity double,vehicleBrakeUsedCount int,serviceTime int,carrier int,fuelType int,tenantId string,isTripOn string,seatingCapacity double,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 946
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Vehicle_state_hbase ( rowkey string,geocoordinates_longitude double,speed double,isOnDuty string,engineStopTime bigint,sampleTimestamp bigint,isValid boolean,engineStartTime bigint,isTripOn string,seatingCapacity double,numClient int,velocity double,vehicleBrakeUsedCount int,sampleTimestampID int,createTimestampID int,sid string,geocoordinates_altitude double,carrier int,destroyTimestamp bigint,lastUpdated bigint,fuelConsumed double,destroyTimestampID int,serviceTime int,startTimestamp bigint,maximumLoad double,fuelType int,tenantId string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,wheelchairBoarding int,serviceOnDate bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:speed#b,d:isOnDuty,d:engineStopTime#b,d:sampleTimestamp#b,d:isValid#b,d:engineStartTime#b,d:isTripOn,d:seatingCapacity#b,d:numClient#b,d:velocity#b,d:vehicleBrakeUsedCount#b,d:sampleTimestampID#b,d:createTimestampID#b,d:sid,d:geocoordinates_altitude#b,d:carrier#b,d:destroyTimestamp#b,d:lastUpdated#b,d:fuelConsumed#b,d:destroyTimestampID#b,d:serviceTime#b,d:startTimestamp#b,d:maximumLoad#b,d:fuelType#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:wheelchairBoarding#b,d:serviceOnDate#b') TBLPROPERTIES('hbase.table.name' = 'cim.Vehicle.state')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
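Every statement in this run is written as create table if not exists / create external table if not exists, so DBManager can replay the entire schema on each Oozie launch and already-existing tables are simply skipped. Worth noting that "if not exists" does not reconcile a changed column list against an existing table; that would need an explicit alter, roughly like this (the odometer column is hypothetical):

alter table cimdata.Vehicle_state add columns (odometer double); -- hypothetical schema change, not from this job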
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1615
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Vehicle_state_period(geocoordinates_longitude double,sampleTimestamp bigint,receiveTimestamp bigint,speed double,sid string,createTimestamp bigint,lastUpdated bigint,wheelchairBoarding int,maximumLoad double,engineStartTime bigint,sampleTimestampID int,fuelConsumed double,createTimestampID int,geocoordinates_latitude double,isOnDuty string,destroyTimestampID int,serviceOnDate bigint,numClient int,engineStopTime bigint,geocoordinates_altitude double,isValid boolean,velocity double,vehicleBrakeUsedCount int,serviceTime int,carrier int,fuelType int,isTripOn string,tenantId string,seatingCapacity double,startTimestamp bigint,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 802
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.LightZone_hbase ( rowkey string,deviceType string,boundary_geoPoint string,sampleTimestamp bigint,isValid boolean,providerDetails string,geohash string,applicableDomain string,tag string,label string,type string,sampleTimestampID int,locationId string,instanceId string,outsourceablePolicyTemplate string,createTimestampID int,status string,sid string,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,policyHandler string,destroyTimestampID int,dependentOn string,isIndependent int,custom string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:deviceType,d:boundary_geoPoint,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:geohash,d:applicableDomain,d:tag,d:label,d:type,d:sampleTimestampID#b,d:locationId,d:instanceId,d:outsourceablePolicyTemplate,d:createTimestampID#b,d:status,d:sid,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:policyHandler,d:destroyTimestampID#b,d:dependentOn,d:isIndependent#b,d:custom,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.LightZone')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1451
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.LightZone_location(sampleTimestamp bigint,private string,isIndependent int,city String,timezone String,boundary_geoPoint string,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,thirdPartyId string,lastUpdated bigint,instanceId string,sampleTimestampID int,dependentOn string,locationId String,geohash string,createTimestampID int,providerDetails string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,isValid boolean,custom string,label string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 772
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.LightZone(sampleTimestamp bigint,private string,isIndependent int,city string,timezone String,boundary_geoPoint string,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,thirdPartyId string,lastUpdated bigint,instanceId string,dependentOn string,sampleTimestampID int,locationId string,geohash string,createTimestampID int,providerDetails string,timezoneoffset int,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,isValid boolean,custom string,label string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 786
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.LightZone_state(intensityLevel double,sampleTimestamp bigint,numOn int,timezone string,isValid boolean,receiveTimestamp bigint,sid string,createTimestamp bigint,numOff int,lastUpdated bigint,total int,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,timezoneoffset int,destroyTimestampID int,day int,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 614
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.LightZone_state_hbase ( rowkey string,numOn int,createTimestampID int,sid string,intensityLevel double,sampleTimestamp bigint,isValid boolean,numOff int,destroyTimestamp bigint,lastUpdated bigint,destroyTimestampID int,total int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:numOn#b,d:createTimestampID#b,d:sid,d:intensityLevel#b,d:sampleTimestamp#b,d:isValid#b,d:numOff#b,d:destroyTimestamp#b,d:lastUpdated#b,d:destroyTimestampID#b,d:total#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.LightZone.state')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 991
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.LightZone_state_period(intensityLevel double,sampleTimestamp bigint,numOn int,isValid boolean,receiveTimestamp bigint,sid string,createTimestamp bigint,numOff int,lastUpdated bigint,total int,sampleTimestampID int,createTimestampID int,tenantId string,destroyTimestampID int,startTimestamp bigint,destroyTimestamp bigint)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
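As the LightZone trio shows, the Hive table name (cim.LightZone_state_hbase, underscore style) and the backing HBase table name ('cim.LightZone.state', dotted style) are independent; TBLPROPERTIES('hbase.table.name') is the only thing tying them together, and dots are legal characters in an HBase table qualifier. The full mapping can be read back from the metastore if needed, for example:

show create table cim.LightZone_state_hbase; -- should echo the SERDEPROPERTIES/TBLPROPERTIES logged above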
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 470
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.EntryPOM_hbase ( rowkey string,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,pomDirId string,roiEntityRef_entityId string,sampleTimestampID int,locationId string,roiEntityRef_entityType string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:pomDirId,d:roiEntityRef_entityId,d:sampleTimestampID#b,d:locationId,d:roiEntityRef_entityType') TBLPROPERTIES('hbase.table.name' = 'cim.EntryPOM')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 788
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.EntryPOM_location(sampleTimestamp bigint,roiEntityRef_entityId string,sampleTimestampID int,city String,locationId String,timezone String,isValid boolean,tenantId string,receiveTimestamp bigint,pomDirId string,roiEntityRef_entityType string,sid string)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 401
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.EntryPOM(sampleTimestamp bigint,roiEntityRef_entityId string,city string,timezone string,isValid boolean,receiveTimestamp bigint,roiEntityRef_entityType string,sid string,MobilityPOMDirectiondimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,pomDirId string)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 454
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingArea_state_hourly(parkingSpaceId string,monthweek int,week int,timeid int,city string,count int,weekday int,sid string,occupied_min int,dimensionid bigint,month int,hour int,occupied_sum int,locationid string,day int,occupied_max int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 456
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingArea_state(parkingSpaceId string,sampleTimestamp bigint,timezone string,isValid boolean,receiveTimestamp bigint,sid string,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,day int,occupied int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 506
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingArea_state_hbase ( rowkey string,startTimestamp bigint,sid string,tenantId string,occupied int,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,lastUpdated bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:sid,d:tenantId,d:occupied#b,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:lastUpdated#b') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingArea.state')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 740
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingArea_state_period(sampleTimestamp bigint,lastUpdated bigint,sampleTimestampID int,isValid boolean,tenantId string,receiveTimestamp bigint,startTimestamp bigint,occupied int,sid string)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 340
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ParkingArea_hbase ( rowkey string,opParams_zoneType string,sid string,sampleTimestamp bigint,parkingSpaceId string,area_geoPoint string,isValid boolean,providerDetails string,lastUpdated bigint,tenantId string,receiveTimestamp bigint,label string,opParams_maxDurationMinutes int,sampleTimestampID int,locationId string,operatedBy string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:opParams_zoneType,d:sid,d:sampleTimestamp#b,d:parkingSpaceId,d:area_geoPoint,d:isValid#b,d:providerDetails,d:lastUpdated#b,d:tenantId,d:receiveTimestamp#b,d:label,d:opParams_maxDurationMinutes#b,d:sampleTimestampID#b,d:locationId,d:operatedBy') TBLPROPERTIES('hbase.table.name' = 'cim.ParkingArea')
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 970
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
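ParkingArea shows the complete table family this schema uses per entity: raw state mirrored from HBase (cim.ParkingArea_state_hbase), a partitioned Parquet state table (cimdata.ParkingArea_state), an unpartitioned period snapshot (cim.ParkingArea_state_period), and an hourly rollup (cimdata.ParkingArea_state_hourly) carrying occupied_min/occupied_max/occupied_sum. The log does not show the aggregation itself; a guess at its shape, using only columns from the DDLs above (the filter values are made up):

select locationId, day,
       count(*)      as cnt,            -- the rollup table calls this column "count"
       min(occupied) as occupied_min,
       max(occupied) as occupied_max,
       sum(occupied) as occupied_sum
from cimdata.ParkingArea_state
where year = 2022 and month = 6 and city = 'somecity' -- hypothetical partition filter
group by locationId, day;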
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ParkingArea_location(sampleTimestamp bigint,parkingSpaceId string,city String,timezone String,isValid boolean,operatedBy string,receiveTimestamp bigint,label string,sid string,opParams_zoneType string,lastUpdated bigint,sampleTimestampID int,locationId String,providerDetails string,tenantId string,area_geoPoint string,opParams_maxDurationMinutes int)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 501
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.445 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ParkingArea(sampleTimestamp bigint,parkingSpaceId string,city string,timezone string,isValid boolean,operatedBy string,label string,receiveTimestamp bigint,sid string,opParams_zoneType string,lastUpdated bigint,ParkingSpacedimensionid bigint,sampleTimestampID int,locationId string,tenantId string,providerDetails string,timezoneoffset int,area_geoPoint string,opParams_maxDurationMinutes int)
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.445 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 546
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TransitRidership_state(sampleTimestamp bigint,nextStopInfo_stopId string,distanceTravelled double,prevStopInfo_stopId string,prevStopInfo_numDisembarked int,timezone string,prevStopInfo_departure bigint,tripId string,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,revenue double,dimensionid bigint,numPassengers int,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,tripStartTime bigint,destroyTimestampID int,day int,nextStopInfo_status string,tripDelaySecs double,nextStopInfo_eta bigint,isValid boolean,currStopInfo_arrival bigint,tripEndTime bigint,tenantId string,prevStopInfo_arrival bigint,prevStopInfo_numBoarded int,prevStopInfo_avgWaitTime double,startTimestamp bigint,currStopInfo_stopId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1011
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TransitRidership_state_hbase ( rowkey string,nextStopInfo_stopId string,nextStopInfo_status string,nextStopInfo_eta bigint,sampleTimestamp bigint,prevStopInfo_departure bigint,isValid boolean,tripEndTime bigint,revenue double,currStopInfo_stopId string,prevStopInfo_stopId string,prevStopInfo_avgWaitTime double,prevStopInfo_numBoarded int,sampleTimestampID int,createTimestampID int,currStopInfo_arrival bigint,sid string,destroyTimestamp bigint,lastUpdated bigint,tripDelaySecs double,destroyTimestampID int,startTimestamp bigint,distanceTravelled double,tripStartTime bigint,tenantId string,numPassengers int,receiveTimestamp bigint,createTimestamp bigint,prevStopInfo_arrival bigint,prevStopInfo_numDisembarked int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:nextStopInfo_stopId,d:nextStopInfo_status,d:nextStopInfo_eta#b,d:sampleTimestamp#b,d:prevStopInfo_departure#b,d:isValid#b,d:tripEndTime#b,d:revenue#b,d:currStopInfo_stopId,d:prevStopInfo_stopId,d:prevStopInfo_avgWaitTime#b,d:prevStopInfo_numBoarded#b,d:sampleTimestampID#b,d:createTimestampID#b,d:currStopInfo_arrival#b,d:sid,d:destroyTimestamp#b,d:lastUpdated#b,d:tripDelaySecs#b,d:destroyTimestampID#b,d:startTimestamp#b,d:distanceTravelled#b,d:tripStartTime#b,d:tenantId,d:numPassengers#b,d:receiveTimestamp#b,d:createTimestamp#b,d:prevStopInfo_arrival#b,d:prevStopInfo_numDisembarked#b') TBLPROPERTIES('hbase.table.name' = 'cim.TransitRidership.state')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1710
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TransitRidership_state_period(sampleTimestamp bigint,nextStopInfo_stopId string,prevStopInfo_stopId string,distanceTravelled double,prevStopInfo_numDisembarked int,prevStopInfo_departure bigint,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,revenue double,sampleTimestampID int,numPassengers int,createTimestampID int,tripStartTime bigint,destroyTimestampID int,nextStopInfo_status string,tripDelaySecs double,nextStopInfo_eta bigint,isValid boolean,currStopInfo_arrival bigint,tripEndTime bigint,tenantId string,prevStopInfo_arrival bigint,prevStopInfo_numBoarded int,prevStopInfo_avgWaitTime double,startTimestamp bigint,currStopInfo_stopId string,destroyTimestamp bigint)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 853
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TransitRidership_hbase ( rowkey string,createTimestampID int,vehicleId string,sid string,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,tripEndTime bigint,providerDetails string,thirdPartyId string,lastUpdated bigint,geohash string,destroyTimestampID int,driverId string,tripStartTime bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,sampleTimestampID int,locationId string,source string,tripId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:vehicleId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:tripEndTime#b,d:providerDetails,d:thirdPartyId,d:lastUpdated#b,d:geohash,d:destroyTimestampID#b,d:driverId,d:tripStartTime#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:sampleTimestampID#b,d:locationId,d:source,d:tripId') TBLPROPERTIES('hbase.table.name' = 'cim.TransitRidership')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1195
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
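Unlike the Parquet rollups, the *_period and *_location tables in this run carry no stored-as clause, so Impala falls back to its default file format (TEXTFILE, unless the cluster overrides that default). That is easy to confirm from the shell, and a Parquet variant would need the clause spelled out (the _pq table name is hypothetical):

describe formatted cim.TransitRidership_state_period; -- the "InputFormat" row reveals the file format
create table if not exists cim.TransitRidership_state_period_pq like cim.TransitRidership_state_period stored as parquet; -- hypothetical Parquet twin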
string,driverId string,tripEndTime bigint,tenantId string,destroyTimestamp bigint) 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 627 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TransitRidership(sampleTimestamp bigint,city string,timezone string,tripId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,providerDetails string,timezoneoffset int,vehicleId string,tag string,tripStartTime bigint,destroyTimestampID int,isValid boolean,label string,TransitTripdimensionid bigint,driverId string,tripEndTime bigint,tenantId string,destroyTimestamp bigint) 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 671 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.nodest_GenericEvent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails 
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.nodest_GenericEvent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1513
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.GenericEvent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1506
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1506
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.environmentevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1510
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.lightevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1504
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.trafficevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1506
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.transitevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1506
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wasteevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1504
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
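The cimdata.*_event tables above (nodest_GenericEvent, GenericEvent, parkingevent, environmentevent, lightevent, trafficevent, transitevent, wasteevent, mobilityevent) all share one wide event schema; only the table name changes. The partitioned by (year int,month int,city string) clause declares virtual partition columns: they are not part of the column list, each distinct (year, month, city) combination becomes its own directory of Parquet files, and filters on them let Impala prune partitions instead of scanning the whole table. A reduced sketch of the shape (names illustrative, not from this log):

    create table if not exists cimdata.demo_event (
      eventSid string,
      eventTime bigint,
      severity string)
    partitioned by (year int, month int, city string)  -- partition keys live in the directory path
    stored as parquet;

    -- a filter like this touches only one partition's files:
    -- select count(*) from cimdata.demo_event where year = 2022 and month = 6 and city = 'x';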
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.mobilityevent_event(EventType string,boundary_geoPoint string,source string,receiveTimestamp bigint,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,analyticsDetails string,forecastClosureTime bigint,day int,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,timezone string,eventSid string,reliability string,description string,overCrowdingDetails string,cameraHealthDetails string,roadSegmentId string,providerDetails_providerId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,providerDetails_OUI string,expiresAt bigint,tenantId string,parkedVehicleDetails_expectedRevenue double,socialMediaEventDetails string,graffitiDetails string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1507
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.GenericEvent_event_hbase ( rowkey string,isValid boolean,roadSegmentLaneId string,analyticsDetails string,vehicleDetails string,provider string,expiresAt bigint,socialMediaEventDetails string,dwellTimeDetails string,weatherDetails string,sampleTimestampID int,locationId string,severity string,createTimestampID int,status string,sid string,personDetails string,geocoordinates_altitude double,eventTime bigint,destroyTimestamp bigint,providerDetails_providerId string,thirdPartyId string,queueDetails string,closureTime bigint,alertDetails string,parentDomain string,forecastClosureTime bigint,providerDetails_provider string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,parkingAreaId string,roadSegmentId string,geocoordinates_longitude double,directionDetails string,boundary_geoPoint string,parkingSpaceId string,description string,sampleTimestamp bigint,graffitiDetails string,reliability string,parkedVehicleDetails_expectedRevenue double,seismicDetails string,providerDetails_OUI string,source string,cameraHealthDetails string,parkingSpotId string,lastUpdated bigint,eventSid string,objectDetails string,destroyTimestampID int,tenantId string,EventType string,overCrowdingDetails string,speedDetails string,visualInfo string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:isValid#b,d:roadSegmentLaneId,d:analyticsDetails,d:vehicleDetails,d:provider,d:expiresAt#b,d:socialMediaEventDetails,d:dwellTimeDetails,d:weatherDetails,d:sampleTimestampID#b,d:locationId,d:severity,d:createTimestampID#b,d:status,d:sid,d:personDetails,d:geocoordinates_altitude#b,d:eventTime#b,d:destroyTimestamp#b,d:providerDetails_providerId,d:thirdPartyId,d:queueDetails,d:closureTime#b,d:alertDetails,d:parentDomain,d:forecastClosureTime#b,d:providerDetails_provider,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:parkingAreaId,d:roadSegmentId,d:geocoordinates_longitude#b,d:directionDetails,d:boundary_geoPoint,d:parkingSpaceId,d:description,d:sampleTimestamp#b,d:graffitiDetails,d:reliability,d:parkedVehicleDetails_expectedRevenue#b,d:seismicDetails,d:providerDetails_OUI,d:source,d:cameraHealthDetails,d:parkingSpotId,d:lastUpdated#b,d:eventSid,d:objectDetails,d:destroyTimestampID#b,d:tenantId,d:EventType,d:overCrowdingDetails,d:speedDetails,d:visualInfo') TBLPROPERTIES('hbase.table.name' = 'cim.GenericEvent.event')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 2651
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.GenericEvent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
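The cim.*_event_period tables created next reuse nearly the same column set but are partitioned only by a single string key, eventtype, so every distinct event type gets its own partition; note there is also no stored as clause, so these tables presumably take the engine's default file format rather than Parquet. A minimal sketch of the pattern (names illustrative), including how rows would land in a partition:

    create table if not exists cim.demo_event_period (
      eventSid string,
      eventTime bigint)
    partitioned by (eventtype string);

    -- static-partition insert: all selected rows go to the eventtype='parking' partition
    -- insert into cim.demo_event_period partition (eventtype='parking') select eventSid, eventTime from ...;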
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1417
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.parkingevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1417
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.environmentevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1421
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.lightevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1415
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.trafficevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1417
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.transitevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1417
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.wasteevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1415
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.mobilityevent_event_period(receiveTimestamp bigint,boundary_geoPoint string,source string,objectDetails string,parentDomain string,createTimestamp bigint,personDetails string,closureTime bigint,lastUpdated bigint,analyticsDetails string,geocoordinates_latitude double,forecastClosureTime bigint,alertDetails string,seismicDetails string,parkingSpaceId string,speedDetails string,directionDetails string,roadSegmentLaneId string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,dwellTimeDetails string,weatherDetails string,reliability string,eventSid string,description string,overCrowdingDetails string,cameraHealthDetails string,providerDetails_providerId string,roadSegmentId string,sid string,thirdPartyId string,provider string,sampleTimestampID int,locationId string,createTimestampID int,eventTime bigint,destroyTimestampID int,severity string,parkingAreaId string,vehicleDetails string,geocoordinates_altitude double,isValid boolean,providerDetails_provider string,visualInfo string,queueDetails string,expiresAt bigint,providerDetails_OUI string,tenantId string,socialMediaEventDetails string,parkedVehicleDetails_expectedRevenue double,graffitiDetails string,destroyTimestamp bigint) partitioned by (eventtype string)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1418
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Light_state_hourly(week int,timeid int,powerConsumption_max double,grpCtrl string,city string,intensityLevel_sum double,weekday int,lightZoneId string,sid string,powerconsumption_count int,dimensionid bigint,hour int,powerConsumption_min double,day int,powerConsumption_sum double,monthweek int,minutepoweron_sum int,count int,intensityLevel_max double,month int,locationid string,intensitylevel_count int,intensityLevel_min double) partitioned by (year int, tenantid string) stored as parquet
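cimdata.Light_state_hourly is a pre-aggregated rollup rather than raw state: per-hour min/max/sum/count columns keyed by time dimensions (hour, day, week, month, monthweek, weekday, timeid) and partitioned by (year, tenantid). The log only shows the DDL; a table of this shape is typically maintained with a dynamic-partition aggregate insert along these lines (the target and source tables below are illustrative assumptions, not from this log):

    -- illustrative target of the same shape, then an hourly rollup into it;
    -- in a dynamic-partition insert the partition key columns come last
    create table if not exists cimdata.demo_state_hourly (
      hour int, day int, month int,
      intensityLevel_min double, intensityLevel_max double,
      intensityLevel_sum double, count int)
    partitioned by (year int, tenantid string) stored as parquet;

    insert into cimdata.demo_state_hourly partition (year, tenantid)
    select hour, day, month,
           min(intensityLevel), max(intensityLevel), sum(intensityLevel), count(*),
           year, tenantid
    from cimdata.demo_state
    group by hour, day, month, year, tenantid;

(cimdata.demo_state stands in for a raw state table such as cimdata.Light_state.)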
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 647
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Light_state(intensityLevel double,sampleTimestamp bigint,deviceState_connState_connected int,grpCtrl string,timezone string,reliability double,lightZoneId string,receiveTimestamp bigint,flash_minIntensityLevel double,sid string,createTimestamp bigint,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,flash_activate int,destroyTimestampID int,day int,deviceState_connState_since bigint,powerConsumption double,isValid boolean,deviceState_batteryPercentage double,flash_flashInterval double,tenantId string,flash_maxIntensityLevel double,flash_defaultFlashDuration double,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 906
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Light_state_hbase ( rowkey string,powerConsumption double,createTimestampID int,flash_maxIntensityLevel double,sid string,intensityLevel double,sampleTimestamp bigint,isValid boolean,deviceState_connState_connected int,deviceState_connState_since bigint,destroyTimestamp bigint,lastUpdated bigint,flash_minIntensityLevel double,reliability double,flash_flashInterval double,flash_activate int,destroyTimestampID int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,flash_defaultFlashDuration double,deviceState_batteryPercentage double,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:powerConsumption#b,d:createTimestampID#b,d:flash_maxIntensityLevel#b,d:sid,d:intensityLevel#b,d:sampleTimestamp#b,d:isValid#b,d:deviceState_connState_connected#b,d:deviceState_connState_since#b,d:destroyTimestamp#b,d:lastUpdated#b,d:flash_minIntensityLevel#b,d:reliability#b,d:flash_flashInterval#b,d:flash_activate#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:flash_defaultFlashDuration#b,d:deviceState_batteryPercentage#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.Light.state')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1483
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Light_state_period(intensityLevel double,sampleTimestamp bigint,deviceState_connState_connected int,deviceState_connState_since bigint,powerConsumption double,isValid boolean,reliability double,deviceState_batteryPercentage double,receiveTimestamp bigint,flash_minIntensityLevel double,sid string,createTimestamp bigint,flash_flashInterval double,lastUpdated bigint,sampleTimestampID int,createTimestampID int,tenantId string,flash_activate int,destroyTimestampID int,flash_maxIntensityLevel double,startTimestamp bigint,flash_defaultFlashDuration double,destroyTimestamp bigint)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 728
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Light_hbase ( rowkey string,geocoordinates_longitude double,deviceType string,sampleTimestamp bigint,lightZoneId string,isValid boolean,providerDetails string,geohash string,model string,applicableDomain string,tag string,label string,type string,sampleTimestampID int,connectivityType string,locationId string,instanceId string,outsourceablePolicyTemplate string,createTimestampID int,powerMeter string,status string,sid string,geocoordinates_altitude double,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,supportedMode string,grpCtrl string,mode string,policyHandler string,destroyTimestampID int,dependentOn string,isIndependent int,custom string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:deviceType,d:sampleTimestamp#b,d:lightZoneId,d:isValid#b,d:providerDetails,d:geohash,d:model,d:applicableDomain,d:tag,d:label,d:type,d:sampleTimestampID#b,d:connectivityType,d:locationId,d:instanceId,d:outsourceablePolicyTemplate,d:createTimestampID#b,d:powerMeter,d:status,d:sid,d:geocoordinates_altitude#b,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:supportedMode,d:grpCtrl,d:mode,d:policyHandler,d:destroyTimestampID#b,d:dependentOn,d:isIndependent#b,d:custom,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b') TBLPROPERTIES('hbase.table.name' = 'cim.Light')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1786
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Light_location(geocoordinates_longitude double,sampleTimestamp bigint,powerMeter string,private string,isIndependent int,grpCtrl string,city String,timezone String,lightZoneId string,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,connectivityType string,mode string,thirdPartyId string,lastUpdated bigint,instanceId string,sampleTimestampID int,dependentOn string,locationId String,geohash string,createTimestampID int,providerDetails string,geocoordinates_latitude double,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,label string,supportedMode string,tenantId string,status string,destroyTimestamp bigint)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 959
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Light(powerMeter string,geocoordinates_longitude double,sampleTimestamp bigint,private string,isIndependent int,grpCtrl string,city string,timezone string,lightZoneId string,receiveTimestamp bigint,type string,sid string,parentDomain string,createTimestamp bigint,mode string,applicableDomain string,connectivityType string,thirdPartyId string,lastUpdated bigint,instanceId string,dependentOn string,sampleTimestampID int,locationId string,geohash string,createTimestampID int,geocoordinates_latitude double,providerDetails string,timezoneoffset int,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,label string,LightGrpCtrldimensionid bigint,supportedMode string,tenantId string,status string,destroyTimestamp bigint,LightZonedimensionid bigint)
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1032
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.446 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MobilityPOM_hbase ( rowkey string,geocoordinates_longitude double,objectType string,sid string,sampleTimestamp bigint,isValid boolean,geocoordinates_altitude double,spatialSupported boolean,supportedBearing string,providerDetails string,thirdPartyId string,geohash string,tenantId string,receiveTimestamp bigint,label string,geocoordinates_latitude double,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:objectType,d:sid,d:sampleTimestamp#b,d:isValid#b,d:geocoordinates_altitude#b,d:spatialSupported#b,d:supportedBearing,d:providerDetails,d:thirdPartyId,d:geohash,d:tenantId,d:receiveTimestamp#b,d:label,d:geocoordinates_latitude#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.MobilityPOM')
09:09:14.446 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1078
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MobilityPOM_location(geocoordinates_longitude double,sampleTimestamp bigint,supportedBearing string,city String,geocoordinates_altitude double,timezone String,isValid boolean,receiveTimestamp bigint,label string,objectType string,sid string,thirdPartyId string,sampleTimestampID int,locationId String,geohash string,providerDetails string,tenantId string,geocoordinates_latitude double,spatialSupported boolean)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 560
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MobilityPOM(geocoordinates_longitude double,sampleTimestamp bigint,supportedBearing string,city string,geocoordinates_altitude double,timezone String,isValid boolean,label string,receiveTimestamp bigint,sid string,objectType string,thirdPartyId string,sampleTimestampID int,locationId string,geohash string,tenantId string,geocoordinates_latitude double,providerDetails string,timezoneoffset int,spatialSupported boolean)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 574
574 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Client_hbase ( rowkey string,zoneId string,createTimestampID int,sid string,sampleTimestamp bigint,isValid boolean,name string,destroyTimestamp bigint,providerDetails string,thirdPartyId string,lastUpdated bigint,agencyId string,destroyTimestampID int,custom string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,sampleTimestampID int,locationId string,source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:zoneId,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:name,d:destroyTimestamp#b,d:providerDetails,d:thirdPartyId,d:lastUpdated#b,d:agencyId,d:destroyTimestampID#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:sampleTimestampID#b,d:locationId,d:source') TBLPROPERTIES('hbase.table.name' = 'cim.Client') 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1089 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Client_location(sampleTimestamp bigint,city String,timezone String,isValid boolean,custom string,agencyId string,receiveTimestamp bigint,label string,source string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,name string,providerDetails string,tenantId string,zoneId string,tag string,destroyTimestampID int,destroyTimestamp bigint) 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 571 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.447 [main] INFO 
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Client(sampleTimestamp bigint,city string,timezone string,agencyId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,timezoneoffset int,zoneId string,tag string,Agencydimensionid bigint,destroyTimestampID int,custom string,isValid boolean,label string,Zonedimensionid bigint,tenantId string,name string,destroyTimestamp bigint)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 633
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.InclinometerSensor_state_hourly(tiltZ_min double,tiltZ_avg double,week int,timeid int,city string,weekday int,tiltY_min double,tiltY_avg double,sid string,dimensionid bigint,hour int,tiltX_max double,day int,monthweek int,tiltY_max double,tiltZ_Source string,tiltY_Source string,count int,tiltX_Source string,tiltZ_max double,month int,locationid string,tiltX_min double,tiltX_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 603
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.InclinometerData_state_hourly(tiltZ_min double,tiltZ_avg double,week int,timeid int,city string,weekday int,tiltY_min double,tiltY_avg double,sid string,dimensionid bigint,hour int,tiltX_max double,day int,monthweek int,tiltY_max double,tiltZ_Source string,tiltY_Source string,count int,tiltX_Source string,tiltZ_max double,month int,locationid string,tiltX_min double,tiltX_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
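Note: the cimdata.*_hourly tables just created use Impala's partitioned-Parquet pattern: the columns named in 'partitioned by (year int, tenantid string)' are partition keys (deliberately absent from the main column list), each distinct (year, tenantid) pair becomes its own directory of Parquet files, and filters on those keys let the planner prune whole partitions. An illustrative query against the table above (the literal values are made up):

  select hour, tiltX_avg, tiltX_max
  from cimdata.InclinometerSensor_state_hourly
  where year = 2022            -- partition key: prunes directories before scanning
    and tenantid = 'tenant-a'  -- partition key: hypothetical value
    and month = 6;             -- ordinary column: filtered while scanning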
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 601
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.InclinometerData_state(sampleTimestamp bigint,timezone string,parentEntityType string,receiveTimestamp bigint,tiltZ_accuracy double,sid string,lastUpdated bigint,tiltY_expiresAt bigint,sampleTimestampID int,locationId string,tiltX_accuracy double,timezoneoffset int,day int,tiltY_reliability double,tiltZ_Source string,tiltY_Source string,tiltX_reliability double,isValid boolean,tiltY double,tiltX_Source string,tiltZ double,tiltZ_reliability double,tiltX double,tiltY_accuracy double,tiltX_expiresAt bigint,tenantId string,tiltZ_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 790
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.InclinometerSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone string,parentEntityType string,receiveTimestamp bigint,tiltZ_accuracy double,sid string,lastUpdated bigint,tiltY_expiresAt bigint,sampleTimestampID int,locationId string,geocoordinates_latitude double,tiltX_accuracy double,timezoneoffset int,day int,tiltY_reliability double,tiltZ_Source string,tiltY_Source string,tiltX_reliability double,geocoordinates_altitude double,isValid boolean,tiltY double,tiltX_Source string,tiltZ double,tiltZ_reliability double,tiltX double,tiltY_accuracy double,tiltX_expiresAt bigint,tenantId string,tiltZ_expiresAt bigint,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 886
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.InclinometerSensor_state_hbase ( rowkey string,sid string,sampleTimestamp bigint,tiltX_accuracy double,isValid boolean,tiltX_reliability double,tiltY_accuracy double,tiltZ_expiresAt bigint,lastUpdated bigint,tiltY_reliability double,tiltZ_reliability double,startTimestamp bigint,tiltY_expiresAt bigint,tenantId string,tiltX_expiresAt bigint,parentEntityType string,tiltZ_Source string,receiveTimestamp bigint,tiltZ_accuracy double,tiltX double,tiltY double,tiltZ double,tiltX_Source string,sampleTimestampID int,tiltY_Source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:sampleTimestamp#b,d:tiltX_accuracy#b,d:isValid#b,d:tiltX_reliability#b,d:tiltY_accuracy#b,d:tiltZ_expiresAt#b,d:lastUpdated#b,d:tiltY_reliability#b,d:tiltZ_reliability#b,d:startTimestamp#b,d:tiltY_expiresAt#b,d:tenantId,d:tiltX_expiresAt#b,d:parentEntityType,d:tiltZ_Source,d:receiveTimestamp#b,d:tiltZ_accuracy#b,d:tiltX#b,d:tiltY#b,d:tiltZ#b,d:tiltX_Source,d:sampleTimestampID#b,d:tiltY_Source') TBLPROPERTIES('hbase.table.name' = 'cim.InclinometerSensor.state')
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1338
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.InclinometerData_state_period(sampleTimestamp bigint,tiltY_reliability double,tiltZ_Source string,tiltX_reliability double,tiltY_Source string,isValid boolean,tiltY double,tiltX_Source string,tiltZ double,parentEntityType string,receiveTimestamp bigint,tiltZ_accuracy double,tiltZ_reliability double,sid string,tiltX double,lastUpdated bigint,tiltY_accuracy double,tiltY_expiresAt bigint,tiltX_expiresAt bigint,sampleTimestampID int,tenantId string,tiltX_accuracy double,tiltZ_expiresAt bigint,startTimestamp bigint)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 665
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.InclinometerSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,parentEntityType string,receiveTimestamp bigint,tiltZ_accuracy double,sid string,lastUpdated bigint,tiltY_expiresAt bigint,sampleTimestampID int,geocoordinates_latitude double,tiltX_accuracy double,tiltY_reliability double,tiltZ_Source string,tiltX_reliability double,tiltY_Source string,geocoordinates_altitude double,isValid boolean,tiltY double,tiltX_Source string,tiltZ double,tiltZ_reliability double,tiltX double,tiltY_accuracy double,tiltX_expiresAt bigint,tenantId string,tiltZ_expiresAt bigint,startTimestamp bigint)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 761
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Route_hbase ( rowkey string,url string,way int,shortName string,boundary_geoPoint string,description string,sampleTimestamp bigint,rating double,isValid boolean,providerDetails string,geohash string,active int,tag string,label string,type string,sampleTimestampID int,locationId string,streetIntersectionsId string,createTimestampID int,nickname string,path_geoPoint string,status string,sid string,distance double,destroyTimestamp bigint,private string,agencyId string,thirdPartyId string,lastUpdated bigint,responsible string,destroyTimestampID int,custom string,longName string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,colour string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:url,d:way#b,d:shortName,d:boundary_geoPoint,d:description,d:sampleTimestamp#b,d:rating#b,d:isValid#b,d:providerDetails,d:geohash,d:active#b,d:tag,d:label,d:type,d:sampleTimestampID#b,d:locationId,d:streetIntersectionsId,d:createTimestampID#b,d:nickname,d:path_geoPoint,d:status,d:sid,d:distance#b,d:destroyTimestamp#b,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:responsible,d:destroyTimestampID#b,d:custom,d:longName,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:colour') TBLPROPERTIES('hbase.table.name' = 'cim.Route')
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1530
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Route_location(sampleTimestamp bigint,private string,distance double,city String,timezone String,rating double,description string,agencyId string,boundary_geoPoint string,receiveTimestamp bigint,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,geohash string,createTimestampID int,responsible string,providerDetails string,nickname string,tag string,destroyTimestampID int,streetIntersectionsId string,isValid boolean,custom string,active int,label string,path_geoPoint string,url string,way int,colour string,tenantId string,shortName string,status string,destroyTimestamp bigint,longName string)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 822
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Route(sampleTimestamp bigint,private string,distance double,city string,timezone String,rating double,description string,agencyId string,boundary_geoPoint string,receiveTimestamp bigint,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,responsible string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,streetIntersectionsId string,custom string,isValid boolean,active int,path_geoPoint string,label string,url string,way int,colour string,tenantId string,shortName string,longName string,status string,destroyTimestamp bigint)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 836
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HailSensor_state_hourly(hailIntensityHits_min double,hailIntensityHits_avg double,week int,timeid int,city string,hailAccumulationHits_Source string,hailPeakIntensity_min double,hailPeakIntensity_avg double,hailAccumulationHits_min double,hailAccumulationHits_avg double,weekday int,sid string,hailPeakIntensity_max double,dimensionid bigint,hour int,hailIntensityHits_max double,day int,hailDuration_Source string,monthweek int,hailDuration_max double,count int,month int,hailAccumulationHits_max double,hailDuration_min double,hailDuration_avg double,hailPeakIntensity_Source string,locationid string,hailIntensityHits_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 850
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HailData_state_hourly(hailIntensityHits_min double,hailIntensityHits_avg double,week int,timeid int,city string,hailAccumulationHits_Source string,hailPeakIntensity_min double,hailPeakIntensity_avg double,hailAccumulationHits_min double,hailAccumulationHits_avg double,weekday int,sid string,hailPeakIntensity_max double,dimensionid bigint,hour int,hailIntensityHits_max double,day int,hailDuration_Source string,monthweek int,hailDuration_max double,count int,month int,hailAccumulationHits_max double,hailDuration_min double,hailDuration_avg double,hailPeakIntensity_Source string,locationid string,hailIntensityHits_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 848
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HailData_state(sampleTimestamp bigint,hailAccumulationHits_Source string,hailPeakIntensity_accuracy double,timezone string,hailIntensityHits double,parentEntityType string,receiveTimestamp bigint,sid string,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,lastUpdated bigint,hailPeakIntensity_reliability double,hailAccumulationHits_expiresAt bigint,sampleTimestampID int,locationId string,timezoneoffset int,hailAccumulationHits double,day int,hailDuration_Source string,hailIntensityHits_expiresAt bigint,hailDuration_expiresAt bigint,isValid boolean,hailPeakIntensity_expiresAt bigint,hailIntensityHits_accuracy double,hailPeakIntensity_Source string,hailPeakIntensity double,hailIntensityHits_Source string,tenantId string,hailDuration_accuracy double,hailAccumulationHits_accuracy double,hailDuration_reliability double,startTimestamp bigint,hailDuration double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1115
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HailSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,hailAccumulationHits_Source string,hailPeakIntensity_accuracy double,timezone string,hailIntensityHits double,parentEntityType string,receiveTimestamp bigint,sid string,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,lastUpdated bigint,hailPeakIntensity_reliability double,hailAccumulationHits_expiresAt bigint,sampleTimestampID int,locationId string,geocoordinates_latitude double,timezoneoffset int,hailAccumulationHits double,day int,hailDuration_Source string,hailIntensityHits_expiresAt bigint,hailDuration_expiresAt bigint,geocoordinates_altitude double,isValid boolean,hailPeakIntensity_expiresAt bigint,hailIntensityHits_accuracy double,hailPeakIntensity_Source string,hailPeakIntensity double,hailIntensityHits_Source string,tenantId string,hailDuration_accuracy double,hailAccumulationHits_accuracy double,hailDuration_reliability double,startTimestamp bigint,hailDuration double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1211
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.HailSensor_state_hbase ( rowkey string,sampleTimestamp bigint,isValid boolean,hailPeakIntensity double,hailDuration_Source string,hailIntensityHits_Source string,hailDuration_accuracy double,hailPeakIntensity_expiresAt bigint,hailPeakIntensity_Source string,hailAccumulationHits double,parentEntityType string,hailIntensityHits_expiresAt bigint,sampleTimestampID int,hailPeakIntensity_accuracy double,hailIntensityHits_accuracy double,hailDuration double,sid string,hailDuration_reliability double,lastUpdated bigint,hailAccumulationHits_reliability double,hailAccumulationHits_expiresAt bigint,hailAccumulationHits_Source string,startTimestamp bigint,hailPeakIntensity_reliability double,hailIntensityHits_reliability double,tenantId string,hailAccumulationHits_accuracy double,hailIntensityHits double,receiveTimestamp bigint,hailDuration_expiresAt bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sampleTimestamp#b,d:isValid#b,d:hailPeakIntensity#b,d:hailDuration_Source,d:hailIntensityHits_Source,d:hailDuration_accuracy#b,d:hailPeakIntensity_expiresAt#b,d:hailPeakIntensity_Source,d:hailAccumulationHits#b,d:parentEntityType,d:hailIntensityHits_expiresAt#b,d:sampleTimestampID#b,d:hailPeakIntensity_accuracy#b,d:hailIntensityHits_accuracy#b,d:hailDuration#b,d:sid,d:hailDuration_reliability#b,d:lastUpdated#b,d:hailAccumulationHits_reliability#b,d:hailAccumulationHits_expiresAt#b,d:hailAccumulationHits_Source,d:startTimestamp#b,d:hailPeakIntensity_reliability#b,d:hailIntensityHits_reliability#b,d:tenantId,d:hailAccumulationHits_accuracy#b,d:hailIntensityHits#b,d:receiveTimestamp#b,d:hailDuration_expiresAt#b') TBLPROPERTIES('hbase.table.name' = 'cim.HailSensor.state')
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1971
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HailData_state_period(sampleTimestamp bigint,hailAccumulationHits_Source string,hailPeakIntensity_accuracy double,hailIntensityHits double,parentEntityType string,receiveTimestamp bigint,sid string,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,lastUpdated bigint,hailAccumulationHits_expiresAt bigint,hailPeakIntensity_reliability double,sampleTimestampID int,hailAccumulationHits double,hailDuration_Source string,hailIntensityHits_expiresAt bigint,hailDuration_expiresAt bigint,isValid boolean,hailPeakIntensity_expiresAt bigint,hailIntensityHits_accuracy double,hailPeakIntensity_Source string,hailPeakIntensity double,hailIntensityHits_Source string,tenantId string,hailDuration_accuracy double,hailDuration_reliability double,startTimestamp bigint,hailAccumulationHits_accuracy double,hailDuration double)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 990
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HailSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,hailAccumulationHits_Source string,hailPeakIntensity_accuracy double,hailIntensityHits double,parentEntityType string,receiveTimestamp bigint,sid string,hailAccumulationHits_reliability double,hailIntensityHits_reliability double,lastUpdated bigint,hailAccumulationHits_expiresAt bigint,hailPeakIntensity_reliability double,sampleTimestampID int,geocoordinates_latitude double,hailAccumulationHits double,hailDuration_Source string,hailIntensityHits_expiresAt bigint,hailDuration_expiresAt bigint,geocoordinates_altitude double,isValid boolean,hailPeakIntensity_expiresAt bigint,hailIntensityHits_accuracy double,hailPeakIntensity_Source string,hailPeakIntensity double,hailIntensityHits_Source string,tenantId string,hailDuration_accuracy double,hailDuration_reliability double,startTimestamp bigint,hailAccumulationHits_accuracy double,hailDuration double)
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1086
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientNoiseSensor_state_hourly(week int,timeid int,city string,weekday int,laeqt_lamax_min double,laeqt_lamax_avg double,noise_max double,sid string,dimensionid bigint,hour int,noise_Source string,noise_min double,noise_avg double,day int,laeqt_lamin_max double,monthweek int,laeqt_laeq_min double,laeqt_laeq_avg double,count int,laeqt_lamin_min double,laeqt_lamin_avg double,month int,laeqt_laeq_max double,laeqt_lamax_max double,locationid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 665
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientNoiseData_state_hourly(week int,timeid int,city string,weekday int,laeqt_lamax_min double,laeqt_lamax_avg double,noise_max double,sid string,dimensionid bigint,hour int,noise_Source string,noise_min double,noise_avg double,day int,laeqt_lamin_max double,monthweek int,laeqt_laeq_min double,laeqt_laeq_avg double,count int,laeqt_lamin_min double,laeqt_lamin_avg double,month int,laeqt_laeq_max double,laeqt_lamax_max double,locationid string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 663
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientNoiseData_state(noise_expiresAt bigint,sampleTimestamp bigint,laeqt_lamax double,noise_reliability double,timezone string,isValid boolean,parentEntityType string,receiveTimestamp bigint,laeqt_laeq double,sid string,lastUpdated bigint,laeqt_lamin double,sampleTimestampID int,locationId string,noise_Source string,tenantId string,noise double,timezoneoffset int,noise_accuracy double,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 640
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientNoiseSensor_state(noise_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,laeqt_lamax double,noise_reliability double,timezone string,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,laeqt_laeq double,sid string,lastUpdated bigint,laeqt_lamin double,sampleTimestampID int,locationId string,noise_Source string,tenantId string,noise double,geocoordinates_latitude double,timezoneoffset int,noise_accuracy double,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 736
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.447 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AmbientNoiseSensor_state_hbase ( rowkey string,noise_accuracy double,sid string,noise_expiresAt bigint,sampleTimestamp bigint,isValid boolean,laeqt_lamin double,noise_reliability double,lastUpdated bigint,noise double,startTimestamp bigint,tenantId string,laeqt_laeq double,parentEntityType string,receiveTimestamp bigint,noise_Source string,sampleTimestampID int,laeqt_lamax double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:noise_accuracy#b,d:sid,d:noise_expiresAt#b,d:sampleTimestamp#b,d:isValid#b,d:laeqt_lamin#b,d:noise_reliability#b,d:lastUpdated#b,d:noise#b,d:startTimestamp#b,d:tenantId,d:laeqt_laeq#b,d:parentEntityType,d:receiveTimestamp#b,d:noise_Source,d:sampleTimestampID#b,d:laeqt_lamax#b') TBLPROPERTIES('hbase.table.name' = 'cim.AmbientNoiseSensor.state')
09:09:14.447 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1063
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientNoiseData_state_period(noise_expiresAt bigint,sampleTimestamp bigint,laeqt_lamax double,noise_reliability double,isValid boolean,parentEntityType string,receiveTimestamp bigint,laeqt_laeq double,sid string,lastUpdated bigint,laeqt_lamin double,sampleTimestampID int,noise_Source string,noise double,tenantId string,noise_accuracy double,startTimestamp bigint)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 515
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientNoiseSensor_state_period(noise_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,laeqt_lamax double,noise_reliability double,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,laeqt_laeq double,sid string,lastUpdated bigint,laeqt_lamin double,sampleTimestampID int,noise_Source string,noise double,tenantId string,geocoordinates_latitude double,noise_accuracy double,startTimestamp bigint)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 611
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MarkedEdge_hbase ( rowkey string,toPOM string,objectType string,path_geoPoint string,sid string,sampleTimestamp bigint,fromOrigin int,isValid boolean,geohash string,tenantId string,toDestination int,receiveTimestamp bigint,fromPOM string,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:toPOM,d:objectType,d:path_geoPoint,d:sid,d:sampleTimestamp#b,d:fromOrigin#b,d:isValid#b,d:geohash,d:tenantId,d:toDestination#b,d:receiveTimestamp#b,d:fromPOM,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.MarkedEdge')
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 860
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MarkedEdge_location(sampleTimestamp bigint,toPOM string,city String,toDestination int,timezone String,isValid boolean,path_geoPoint string,receiveTimestamp bigint,objectType string,sid string,sampleTimestampID int,locationId String,geohash string,fromPOM string,tenantId string,fromOrigin int)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 442
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MarkedEdge(sampleTimestamp bigint,toPOM string,city string,toDestination int,timezone String,isValid boolean,path_geoPoint string,receiveTimestamp bigint,sid string,objectType string,sampleTimestampID int,locationId string,geohash string,fromPOM string,tenantId string,timezoneoffset int,fromOrigin int)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 456
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.MarkedEdge_state(sampleTimestamp bigint,timeWindow int,timezone string,isValid boolean,receiveTimestamp bigint,sid string,numObjects int,dimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,travelTimeSecs int,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 500
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
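Note: cimdata.MarkedEdge_state above follows the other partition layout used throughout this job: fact tables partitioned by (year int, month int, city string). A query that restricts all three partition keys touches only the matching Parquet directories; for example (the literal values are illustrative):

  select sid, travelTimeSecs, numObjects
  from cimdata.MarkedEdge_state
  where year = 2022 and month = 6 and city = 'springfield'  -- all three are partition keys
    and isValid = true;                                     -- regular column predicate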
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.MarkedEdge_state_hbase ( rowkey string,startTimestamp bigint,numObjects int,sid string,tenantId string,sampleTimestamp bigint,timeWindow int,isValid boolean,receiveTimestamp bigint,travelTimeSecs int,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:startTimestamp#b,d:numObjects#b,d:sid,d:tenantId,d:sampleTimestamp#b,d:timeWindow#b,d:isValid#b,d:receiveTimestamp#b,d:travelTimeSecs#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.MarkedEdge.state')
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 775
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.MarkedEdge_state_period(sampleTimestamp bigint,timeWindow int,sampleTimestampID int,isValid boolean,tenantId string,receiveTimestamp bigint,travelTimeSecs int,startTimestamp bigint,numObjects int,sid string)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 356
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RainSensor_state_hourly(week int,rainPeakIntensity_Source string,timeid int,city string,rainAccumulation_Source string,weekday int,rainDuration_min double,rainPeakIntensity_max double,sid string,dimensionid bigint,hour int,rainDuration_avg double,rainIntensity_min double,rainIntensity_avg double,day int,rainAccumulation_min double,rainAccumulation_avg double,monthweek int,rainIntensity_max double,count int,rainIntensity_Source string,rainDuration_Source string,rainAccumulation_max double,month int,locationid string,rainPeakIntensity_min double,rainPeakIntensity_avg double,rainDuration_max double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 818
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RainData_state_hourly(week int,rainPeakIntensity_Source string,timeid int,city string,rainAccumulation_Source string,weekday int,rainDuration_min double,rainPeakIntensity_max double,sid string,dimensionid bigint,hour int,rainDuration_avg double,rainIntensity_min double,rainIntensity_avg double,day int,rainAccumulation_min double,rainAccumulation_avg double,monthweek int,rainIntensity_max double,count int,rainIntensity_Source string,rainDuration_Source string,rainAccumulation_max double,month int,locationid string,rainPeakIntensity_min double,rainPeakIntensity_avg double,rainDuration_max double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 816
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RainData_state(sampleTimestamp bigint,rainDuration_accuracy double,rainPeakIntensity_Source string,rainAccumulation_Source string,rainIntensity double,timezone string,rainPeakIntensity_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,rainIntensity_reliability double,lastUpdated bigint,rainPeakIntensity_expiresAt bigint,sampleTimestampID int,rainDuration_expiresAt bigint,locationId string,timezoneoffset int,day int,rainPeakIntensity_accuracy double,rainDuration double,isValid boolean,rainPeakIntensity double,rainIntensity_Source string,rainDuration_Source string,rainDuration_reliability double,rainIntensity_accuracy double,rainAccumulation_expiresAt bigint,tenantId string,rainAccumulation_accuracy double,rainIntensity_expiresAt bigint,rainAccumulation double,startTimestamp bigint,rainAccumulation_reliability double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1075
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.RainSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,rainDuration_accuracy double,rainPeakIntensity_Source string,rainAccumulation_Source string,rainIntensity double,timezone string,rainPeakIntensity_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,rainIntensity_reliability double,lastUpdated bigint,rainPeakIntensity_expiresAt bigint,sampleTimestampID int,rainDuration_expiresAt bigint,locationId string,geocoordinates_latitude double,timezoneoffset int,day int,rainPeakIntensity_accuracy double,rainDuration double,geocoordinates_altitude double,isValid boolean,rainPeakIntensity double,rainIntensity_Source string,rainDuration_Source string,rainDuration_reliability double,rainIntensity_accuracy double,rainAccumulation_expiresAt bigint,tenantId string,rainAccumulation_accuracy double,rainIntensity_expiresAt bigint,rainAccumulation double,startTimestamp bigint,rainAccumulation_reliability double) partitioned by (year int,month int,city string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1171
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.RainSensor_state_hbase ( rowkey string,rainDuration_accuracy double,rainAccumulation_accuracy double,rainIntensity_expiresAt bigint,sampleTimestamp bigint,rainIntensity double,isValid boolean,rainAccumulation double,rainDuration_reliability double,rainAccumulation_Source string,parentEntityType string,rainPeakIntensity_reliability double,sampleTimestampID int,rainAccumulation_reliability double,rainPeakIntensity_expiresAt bigint,rainPeakIntensity_Source string,rainPeakIntensity double,sid string,rainDuration double,rainPeakIntensity_accuracy double,rainIntensity_accuracy double,rainAccumulation_expiresAt bigint,lastUpdated bigint,startTimestamp bigint,tenantId string,rainDuration_Source string,rainIntensity_reliability double,receiveTimestamp bigint,rainDuration_expiresAt bigint,rainIntensity_Source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:rainDuration_accuracy#b,d:rainAccumulation_accuracy#b,d:rainIntensity_expiresAt#b,d:sampleTimestamp#b,d:rainIntensity#b,d:isValid#b,d:rainAccumulation#b,d:rainDuration_reliability#b,d:rainAccumulation_Source,d:parentEntityType,d:rainPeakIntensity_reliability#b,d:sampleTimestampID#b,d:rainAccumulation_reliability#b,d:rainPeakIntensity_expiresAt#b,d:rainPeakIntensity_Source,d:rainPeakIntensity#b,d:sid,d:rainDuration#b,d:rainPeakIntensity_accuracy#b,d:rainIntensity_accuracy#b,d:rainAccumulation_expiresAt#b,d:lastUpdated#b,d:startTimestamp#b,d:tenantId,d:rainDuration_Source,d:rainIntensity_reliability#b,d:receiveTimestamp#b,d:rainDuration_expiresAt#b,d:rainIntensity_Source') TBLPROPERTIES('hbase.table.name' = 'cim.RainSensor.state')
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1891
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.RainData_state_period(sampleTimestamp bigint,rainDuration_accuracy double,rainPeakIntensity_Source string,rainAccumulation_Source string,rainIntensity double,rainPeakIntensity_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,rainIntensity_reliability double,rainPeakIntensity_expiresAt bigint,sampleTimestampID int,rainDuration_expiresAt bigint,rainPeakIntensity_accuracy double,rainDuration double,isValid boolean,rainPeakIntensity double,rainIntensity_Source string,rainDuration_Source string,rainDuration_reliability double,rainIntensity_accuracy double,rainAccumulation_expiresAt bigint,tenantId string,rainAccumulation_accuracy double,rainIntensity_expiresAt bigint,rainAccumulation double,startTimestamp bigint,rainAccumulation_reliability double)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 950
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.RainSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,rainDuration_accuracy double,rainPeakIntensity_Source string,rainAccumulation_Source string,rainIntensity double,rainPeakIntensity_reliability double,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,rainIntensity_reliability double,rainPeakIntensity_expiresAt bigint,sampleTimestampID int,rainDuration_expiresAt bigint,geocoordinates_latitude double,rainPeakIntensity_accuracy double,rainDuration double,geocoordinates_altitude double,isValid boolean,rainPeakIntensity double,rainIntensity_Source string,rainDuration_Source string,rainDuration_reliability double,rainIntensity_accuracy double,rainAccumulation_expiresAt bigint,tenantId string,rainAccumulation_accuracy double,rainIntensity_expiresAt bigint,rainAccumulation double,startTimestamp bigint,rainAccumulation_reliability double)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1046
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteCollectionRidership_state(binCollectionStatus string,sampleTimestamp bigint,distance double,timezone string,receiveTimestamp bigint,sid string,createTimestamp bigint,lastUpdated bigint,dimensionid bigint,totalWasteCollected double,sampleTimestampID int,locationId string,createTimestampID int,totalBinsToVisit int,timezoneoffset int,destroyTimestampID int,day int,wasteCollectionTripId string,tripDelaySecs double,isValid boolean,binsToCollect string,serviceTime int,tenantId string,binsVisited int,startTimestamp bigint,status int,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 781
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteCollectionRidership_state_hbase ( rowkey string,binCollectionStatus string,createTimestampID int,status int,sid string,sampleTimestamp bigint,totalBinsToVisit int,isValid boolean,distance double,destroyTimestamp bigint,totalWasteCollected double,lastUpdated bigint,binsToCollect string,tripDelaySecs double,destroyTimestampID int,serviceTime int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,binsVisited int,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:binCollectionStatus,d:createTimestampID#b,d:status#b,d:sid,d:sampleTimestamp#b,d:totalBinsToVisit#b,d:isValid#b,d:distance#b,d:destroyTimestamp#b,d:totalWasteCollected#b,d:lastUpdated#b,d:binsToCollect,d:tripDelaySecs#b,d:destroyTimestampID#b,d:serviceTime#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:binsVisited#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteCollectionRidership.state')
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1251
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteCollectionRidership_state_period(binCollectionStatus string,sampleTimestamp bigint,distance double,tripDelaySecs double,isValid boolean,receiveTimestamp bigint,binsToCollect string,serviceTime int,sid string,createTimestamp bigint,lastUpdated bigint,totalWasteCollected double,sampleTimestampID int,createTimestampID int,tenantId string,totalBinsToVisit int,binsVisited int,destroyTimestampID int,startTimestamp bigint,status int,destroyTimestamp bigint)
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 608
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.448 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteCollectionRidership_hbase ( rowkey string,entityType string,sampleTimestamp bigint,isValid boolean,providerDetails string,geohash string,active int,label string,sampleTimestampID int,locationId string,createTimestampID int,nickname string,path_geoPoint string,vehicleId string,sid string,destroyTimestamp bigint,wasteCollectionTripId string,private string,agencyId string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,driverId string,tenantId string,receiveTimestamp bigint,createTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = 
':key,d:entityType,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:geohash,d:active#b,d:label,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:nickname,d:path_geoPoint,d:vehicleId,d:sid,d:destroyTimestamp#b,d:wasteCollectionTripId,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:driverId,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteCollectionRidership') 09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.448 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1296 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteCollectionRidership_location(sampleTimestamp bigint,private string,city String,timezone String,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,nickname string,vehicleId string,destroyTimestampID int,wasteCollectionTripId string,entityType string,isValid boolean,active int,label string,path_geoPoint string,driverId string,tenantId string,destroyTimestamp bigint) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 682 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteCollectionRidership(sampleTimestamp bigint,private string,city string,timezone string,agencyId string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,vehicleId string,destroyTimestampID int,wasteCollectionTripId string,entityType string,isValid boolean,active int,path_geoPoint string,label string,WasteCollectionTripdimensionid bigint,driverId string,tenantId string,destroyTimestamp bigint) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 734 09:09:14.449 [main] DEBUG 
org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.PowerMeter_hbase ( rowkey string,geocoordinates_longitude double,deviceType string,sampleTimestamp bigint,isValid boolean,providerDetails string,model string,sensorType string,regulatoryAgency string,applicableDomain string,tag string,label string,type string,sampleTimestampID int,connectivityType string,locationId string,instanceId string,outsourceablePolicyTemplate string,operatedBy string,createTimestampID int,status string,sid string,geocoordinates_altitude double,destroyTimestamp bigint,rawHealth string,private string,thirdPartyId string,lastUpdated bigint,supportedMode string,mode string,policyHandler string,destroyTimestampID int,dependentOn string,isIndependent int,custom string,tenantId string,parentDomain string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:geocoordinates_longitude#b,d:deviceType,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:model,d:sensorType,d:regulatoryAgency,d:applicableDomain,d:tag,d:label,d:type,d:sampleTimestampID#b,d:connectivityType,d:locationId,d:instanceId,d:outsourceablePolicyTemplate,d:operatedBy,d:createTimestampID#b,d:status,d:sid,d:geocoordinates_altitude#b,d:destroyTimestamp#b,d:rawHealth,d:private,d:thirdPartyId,d:lastUpdated#b,d:supportedMode,d:mode,d:policyHandler,d:destroyTimestampID#b,d:dependentOn,d:isIndependent#b,d:custom,d:tenantId,d:parentDomain,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b') TBLPROPERTIES('hbase.table.name' = 'cim.PowerMeter') 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1816 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.PowerMeter_location(geocoordinates_longitude double,sampleTimestamp bigint,private string,isIndependent int,city String,timezone String,receiveTimestamp bigint,type string,rawHealth string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,connectivityType string,mode string,thirdPartyId string,lastUpdated bigint,instanceId string,sampleTimestampID int,dependentOn string,locationId String,sensorType string,createTimestampID int,providerDetails string,geocoordinates_latitude double,model string,outsourceablePolicyTemplate string,tag 
string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,operatedBy string,label string,supportedMode string,regulatoryAgency string,tenantId string,status string,destroyTimestamp bigint) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 974 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.PowerMeter(geocoordinates_longitude double,sampleTimestamp bigint,private string,isIndependent int,city string,timezone String,receiveTimestamp bigint,type string,rawHealth string,sid string,parentDomain string,createTimestamp bigint,connectivityType string,mode string,applicableDomain string,thirdPartyId string,lastUpdated bigint,instanceId string,dependentOn string,sampleTimestampID int,locationId string,sensorType string,createTimestampID int,geocoordinates_latitude double,providerDetails string,timezoneoffset int,model string,outsourceablePolicyTemplate string,tag string,destroyTimestampID int,deviceType string,policyHandler string,geocoordinates_altitude double,isValid boolean,custom string,operatedBy string,label string,supportedMode string,regulatoryAgency string,tenantId string,status string,destroyTimestamp bigint) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 988 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.PowerMeter_state(sampleTimestamp bigint,demandFactor double,apparentEnergy double,fundamentalReactivePower double,deviceState_connState_connected int,timezone string,distortionPowerFactor double,reactiveEnergy double,receiveTimestamp bigint,displacementPowerFactor double,sid string,frequency double,thdi double,createTimestamp bigint,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,maxLoadSupported double,timezoneoffset int,activeEnergy double,destroyTimestampID int,day int,activePower double,fundamentalActivePower double,deviceState_connState_since 
bigint,reactivePower double,connectedLoad double,rmsi double,isValid boolean,apparentPower double,i1 double,deviceState_batteryPercentage double,rmsv double,thdv double,loadFactor double,tenantId string,v1 double,startTimestamp bigint,powerFactor double,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1104 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.PowerMeter_state_hbase ( rowkey string,activeEnergy double,thdi double,sampleTimestamp bigint,fundamentalReactivePower double,isValid boolean,deviceState_connState_connected int,deviceState_connState_since bigint,distortionPowerFactor double,frequency double,maxLoadSupported double,displacementPowerFactor double,thdv double,i1 double,reactivePower double,apparentPower double,connectedLoad double,deviceState_batteryPercentage double,sampleTimestampID int,demandFactor double,fundamentalActivePower double,createTimestampID int,sid string,apparentEnergy double,activePower double,powerFactor double,destroyTimestamp bigint,reactiveEnergy double,rmsi double,lastUpdated bigint,destroyTimestampID int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,loadFactor double,rmsv double,v1 double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:activeEnergy#b,d:thdi#b,d:sampleTimestamp#b,d:fundamentalReactivePower#b,d:isValid#b,d:deviceState_connState_connected#b,d:deviceState_connState_since#b,d:distortionPowerFactor#b,d:frequency#b,d:maxLoadSupported#b,d:displacementPowerFactor#b,d:thdv#b,d:i1#b,d:reactivePower#b,d:apparentPower#b,d:connectedLoad#b,d:deviceState_batteryPercentage#b,d:sampleTimestampID#b,d:demandFactor#b,d:fundamentalActivePower#b,d:createTimestampID#b,d:sid,d:apparentEnergy#b,d:activePower#b,d:powerFactor#b,d:destroyTimestamp#b,d:reactiveEnergy#b,d:rmsi#b,d:lastUpdated#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:loadFactor#b,d:rmsv#b,d:v1#b') TBLPROPERTIES('hbase.table.name' = 'cim.PowerMeter.state') 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1902 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG 
org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.PowerMeter_state_period(sampleTimestamp bigint,demandFactor double,apparentEnergy double,fundamentalReactivePower double,deviceState_connState_connected int,distortionPowerFactor double,reactiveEnergy double,receiveTimestamp bigint,displacementPowerFactor double,thdi double,frequency double,sid string,createTimestamp bigint,lastUpdated bigint,sampleTimestampID int,createTimestampID int,maxLoadSupported double,activeEnergy double,destroyTimestampID int,activePower double,deviceState_connState_since bigint,fundamentalActivePower double,reactivePower double,connectedLoad double,rmsi double,isValid boolean,apparentPower double,i1 double,deviceState_batteryPercentage double,rmsv double,thdv double,loadFactor double,tenantId string,v1 double,startTimestamp bigint,powerFactor double,destroyTimestamp bigint) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 960 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.DirTemporalMobilityStats_hbase ( rowkey string,status string,sid string,sampleTimestamp bigint,temporalStatsId string,isValid boolean,lastUpdated bigint,createTime bigint,tenantId string,receiveTimestamp bigint,directionLabel string,sampleTimestampID int,locationId string,granularity int,bearing int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:status,d:sid,d:sampleTimestamp#b,d:temporalStatsId,d:isValid#b,d:lastUpdated#b,d:createTime#b,d:tenantId,d:receiveTimestamp#b,d:directionLabel,d:sampleTimestampID#b,d:locationId,d:granularity#b,d:bearing#b') TBLPROPERTIES('hbase.table.name' = 'cim.DirTemporalMobilityStats') 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 910 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.DirTemporalMobilityStats_location(sampleTimestamp bigint,city String,temporalStatsId string,bearing int,timezone String,isValid boolean,receiveTimestamp 
bigint,sid string,lastUpdated bigint,createTime bigint,directionLabel string,sampleTimestampID int,locationId String,granularity int,tenantId string,status string) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 465 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.DirTemporalMobilityStats(sampleTimestamp bigint,city string,temporalStatsId string,bearing int,timezone String,isValid boolean,receiveTimestamp bigint,sid string,lastUpdated bigint,directionLabel string,createTime bigint,sampleTimestampID int,granularity int,locationId string,tenantId string,timezoneoffset int,status string) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 479 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.DirTemporalMobilityStats_plus(sampleTimestamp bigint,temporalmobilitystatsname string,city string,temporalStatsId string,bearing int,timezone String,isValid boolean,entityid string,receiveTimestamp bigint,sid string,lastUpdated bigint,entitytype string,directionLabel string,createTime bigint,sampleTimestampID int,granularity int,locationId string,tenantId string,mobilityisvalid boolean,timezoneoffset int,status string,mobilitystatsname string) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 600 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading 
data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.DirTemporalMobilityStats_statistic_hbase ( rowkey string,avgSpeed double,flow double,startTimestamp bigint,deltaTimestamp bigint,sid string,tenantId string,sampleTimestamp bigint,count int,isValid boolean,receiveTimestamp bigint,endTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:avgSpeed#b,d:flow#b,d:startTimestamp#b,d:deltaTimestamp#b,d:sid,d:tenantId,d:sampleTimestamp#b,d:count#b,d:isValid#b,d:receiveTimestamp#b,d:endTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.DirTemporalMobilityStats.statistic') 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 861 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.DirTemporalMobilityStats_statistic_period(sampleTimestamp bigint,temporalmobilitystatsname String,year int,city string,bearing int,isValid boolean,count int,deltaTimestamp bigint,entityid String,receiveTimestamp bigint,sid string,avgSpeed double,month int,entitytype String,sampleTimestampID int,directionLabel string,locationid String,tenantId string,mobilityisvalid boolean,timezoneoffset int,endTimestamp bigint,flow double,startTimestamp bigint,mobilitystatsname String) 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 623 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.DirTemporalMobilityStats_statistic(sampleTimestamp bigint,temporalmobilitystatsname String,bearing int,isValid boolean,count int,deltaTimestamp bigint,entityid String,receiveTimestamp bigint,sid string,avgSpeed double,entitytype String,sampleTimestampID int,directionLabel string,locationid String,tenantId string,mobilityisvalid boolean,timezoneoffset int,endTimestamp bigint,flow double,startTimestamp bigint,mobilitystatsname String) partitioned by (year int,month int,city string) stored as parquet 09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 
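NOTE: every entity above gets the same pair of definitions, and the Hive "create external table ... _hbase" statements hold no data of their own: they only map Hive columns onto an existing HBase table. In 'hbase.columns.mapping', ':key' binds the first column to the HBase rowkey, each 'd:name' entry binds the next column to qualifier "name" in column family "d", and the '#b' suffix tells HBaseSerDe to decode the cell as a binary-encoded number rather than UTF-8 text. A minimal sketch of the same shape (the table and column names here are illustrative, not taken from this log):

    create external table if not exists cim.Example_hbase (
      rowkey string,           -- ':key'  -> the HBase rowkey
      sid string,              -- 'd:sid' -> family d, UTF-8 string cell
      sampleTimestamp bigint   -- 'd:sampleTimestamp#b' -> family d, binary-encoded long
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:sampleTimestamp#b')
    TBLPROPERTIES ('hbase.table.name' = 'cim.Example');

The Hive column list and the mapping string correspond positionally, one entry per column; a mismatch silently shifts every following column.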
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 656
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.DirTemporalMobilityStats_statistic_hourly(week int,timeid int,city string,flow_sum double,weekday int,entityid String,count_sum int,sid string,hour int,directionLabel string,timezoneoffset int,day int,mobilitystatsname String,temporalmobilitystatsname String,monthweek int,bearing int,isValid boolean,avgSpeed_max double,flow_max double,count_avg Double,entitytype String,month int,avgSpeed_sum double,locationid string,flow_min double,flow_avg Double,avgSpeed_avg Double,avgSpeed_min double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 707
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.StopTimes_hbase ( rowkey string,dropOffType int,sampleTimestamp bigint,isValid boolean,providerDetails string,headsign string,departure bigint,label string,tag string,pickupType int,sampleTimestampID int,locationId string,source string,arrival bigint,stopId string,createTimestampID int,sid string,destroyTimestamp bigint,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,tenantId string,sequence int,receiveTimestamp bigint,createTimestamp bigint,tripId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:dropOffType#b,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:headsign,d:departure#b,d:label,d:tag,d:pickupType#b,d:sampleTimestampID#b,d:locationId,d:source,d:arrival#b,d:stopId,d:createTimestampID#b,d:sid,d:destroyTimestamp#b,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:tenantId,d:sequence#b,d:receiveTimestamp#b,d:createTimestamp#b,d:tripId') TBLPROPERTIES('hbase.table.name' = 'cim.StopTimes')
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1222
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.StopTimes_location(sampleTimestamp bigint,dropOffType int,arrival bigint,city String,timezone String,tripId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,headsign string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,providerDetails string,tag string,destroyTimestampID int,isValid boolean,stopId string,label string,pickupType int,sequence int,tenantId string,departure bigint,destroyTimestamp bigint)
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 638
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.StopTimes(sampleTimestamp bigint,dropOffType int,city string,arrival bigint,timezone string,tripId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,headsign string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,Stopdimensionid bigint,isValid boolean,stopId string,label string,TransitTripdimensionid bigint,pickupType int,sequence int,tenantId string,departure bigint,destroyTimestamp bigint)
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 705
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.449 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ServiceCalendar_hbase ( rowkey string,sampleTimestamp bigint,isValid boolean,monday int,wednesday int,tuesday int,providerDetails string,friday int,active int,thursday int,tag string,label string,sampleTimestampID int,startDate bigint,locationId string,source string,saturday int,createTimestampID int,nickname string,status int,sid string,sunday int,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,custom string,tenantId string,receiveTimestamp bigint,endDate bigint,createTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sampleTimestamp#b,d:isValid#b,d:monday#b,d:wednesday#b,d:tuesday#b,d:providerDetails,d:friday#b,d:active#b,d:thursday#b,d:tag,d:label,d:sampleTimestampID#b,d:startDate#b,d:locationId,d:source,d:saturday#b,d:createTimestampID#b,d:nickname,d:status#b,d:sid,d:sunday#b,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:endDate#b,d:createTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.ServiceCalendar')
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.449 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1362
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ServiceCalendar_location(sampleTimestamp bigint,saturday int,private string,endDate bigint,city String,timezone String,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,providerDetails string,nickname string,wednesday int,friday int,tag string,destroyTimestampID int,monday int,isValid boolean,custom string,active int,thursday int,label string,sunday int,tuesday int,tenantId string,startDate bigint,status int,destroyTimestamp bigint)
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 708
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ServiceCalendar(sampleTimestamp bigint,saturday int,private string,endDate bigint,city string,timezone String,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,nickname string,providerDetails string,wednesday int,friday int,timezoneoffset int,tag string,destroyTimestampID int,monday int,isValid boolean,custom string,thursday int,active int,label string,sunday int,tuesday int,tenantId string,startDate bigint,status int,destroyTimestamp bigint)
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 722
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TerrestrialRadiationSensor_state_hourly(emissionOfGroundSurface_max double,monthweek int,week int,timeid int,city string,count int,weekday int,atmosphericRadiation_Source string,sid string,atmosphericRadiation_min double,atmosphericRadiation_avg double,emissionOfGroundSurface_Source string,atmosphericRadiation_max double,dimensionid bigint,month int,hour int,locationid string,emissionOfGroundSurface_min double,emissionOfGroundSurface_avg double,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 672
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TerrestrialRadiationData_state_hourly(emissionOfGroundSurface_max double,monthweek int,week int,timeid int,city string,count int,weekday int,atmosphericRadiation_Source string,sid string,atmosphericRadiation_min double,atmosphericRadiation_avg double,emissionOfGroundSurface_Source string,atmosphericRadiation_max double,dimensionid bigint,month int,hour int,locationid string,emissionOfGroundSurface_min double,emissionOfGroundSurface_avg double,day int) partitioned by (year int, tenantid string) stored as parquet
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 670
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
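NOTE: in the Impala "create table ... partitioned by (...) stored as parquet" statements, the partition columns (year/month/city for the raw state tables, year/tenantid for the hourly rollups) are deliberately absent from the main column list: Impala materialises them as directory levels rather than as fields inside the parquet files, so a predicate on them skips whole directories instead of scanning data. A sketch of the same convention (the table and staging names below are made up for illustration):

    create table if not exists cimdata.Example_state_hourly (
      sid string,
      hour int,
      flow_sum double
    )
    partitioned by (year int, tenantid string)
    stored as parquet;

    -- dynamic-partition insert: values for the partition columns go last in the select list
    insert into cimdata.Example_state_hourly partition (year, tenantid)
    select sid, hour, flow_sum, year, tenantid from cimdata.Example_state_staging;

    -- filtering on a partition column reads only the matching year=.../tenantid=... directories
    select sid, sum(flow_sum) from cimdata.Example_state_hourly
    where year = 2022 and tenantid = 'tenant1' group by sid;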
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TerrestrialRadiationData_state(sampleTimestamp bigint,emissionOfGroundSurface_reliability double,timezone string,isValid boolean,emissionOfGroundSurface_expiresAt bigint,parentEntityType string,receiveTimestamp bigint,atmosphericRadiation_Source string,emissionOfGroundSurface double,sid string,emissionOfGroundSurface_accuracy double,emissionOfGroundSurface_Source string,lastUpdated bigint,atmosphericRadiation_reliability double,sampleTimestampID int,atmosphericRadiation double,locationId string,atmosphericRadiation_expiresAt bigint,tenantId string,atmosphericRadiation_accuracy double,timezoneoffset int,day int,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 860
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TerrestrialRadiationSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,emissionOfGroundSurface_reliability double,timezone string,parentEntityType string,receiveTimestamp bigint,emissionOfGroundSurface double,sid string,emissionOfGroundSurface_Source string,lastUpdated bigint,atmosphericRadiation_reliability double,sampleTimestampID int,atmosphericRadiation double,locationId string,atmosphericRadiation_accuracy double,geocoordinates_latitude double,timezoneoffset int,day int,geocoordinates_altitude double,isValid boolean,emissionOfGroundSurface_expiresAt bigint,atmosphericRadiation_Source string,emissionOfGroundSurface_accuracy double,atmosphericRadiation_expiresAt bigint,tenantId string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 956
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
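NOTE: all of these statements are guarded with "if not exists", so re-running the Oozie action is safe; a table that already exists is simply skipped rather than failing the workflow. One caveat when mixing the two engines, as this job does: tables created through Hive (the _hbase ones) are not visible to Impala until its metadata cache is refreshed. The statements below are standard Impala, shown against tables that appear earlier in this log:

    invalidate metadata cim.RainSensor_state_hbase;  -- pick up the Hive-created table in Impala
    show tables in cimdata like '*TerrestrialRadiation*';
    show create table cimdata.TerrestrialRadiationData_state;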
09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TerrestrialRadiationSensor_state_hbase ( rowkey string,atmosphericRadiation_reliability double,emissionOfGroundSurface_expiresAt bigint,sid string,sampleTimestamp bigint,atmosphericRadiation_accuracy double,isValid boolean,atmosphericRadiation double,atmosphericRadiation_Source string,emissionOfGroundSurface_reliability double,emissionOfGroundSurface double,lastUpdated bigint,emissionOfGroundSurface_accuracy double,emissionOfGroundSurface_Source string,startTimestamp bigint,atmosphericRadiation_expiresAt bigint,tenantId string,parentEntityType string,receiveTimestamp bigint,sampleTimestampID int) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:atmosphericRadiation_reliability#b,d:emissionOfGroundSurface_expiresAt#b,d:sid,d:sampleTimestamp#b,d:atmosphericRadiation_accuracy#b,d:isValid#b,d:atmosphericRadiation#b,d:atmosphericRadiation_Source,d:emissionOfGroundSurface_reliability#b,d:emissionOfGroundSurface#b,d:lastUpdated#b,d:emissionOfGroundSurface_accuracy#b,d:emissionOfGroundSurface_Source,d:startTimestamp#b,d:atmosphericRadiation_expiresAt#b,d:tenantId,d:parentEntityType,d:receiveTimestamp#b,d:sampleTimestampID#b') TBLPROPERTIES('hbase.table.name' = 'cim.TerrestrialRadiationSensor.state') 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1495 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TerrestrialRadiationData_state_period(sampleTimestamp bigint,emissionOfGroundSurface_reliability double,isValid boolean,emissionOfGroundSurface_expiresAt bigint,parentEntityType string,receiveTimestamp bigint,atmosphericRadiation_Source string,emissionOfGroundSurface double,sid string,emissionOfGroundSurface_accuracy double,emissionOfGroundSurface_Source string,lastUpdated bigint,atmosphericRadiation_reliability double,sampleTimestampID int,atmosphericRadiation double,atmosphericRadiation_expiresAt bigint,atmosphericRadiation_accuracy double,tenantId string,startTimestamp bigint) 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 735 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data 
length: 53 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TerrestrialRadiationSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,emissionOfGroundSurface_reliability double,geocoordinates_altitude double,isValid boolean,emissionOfGroundSurface_expiresAt bigint,parentEntityType string,receiveTimestamp bigint,atmosphericRadiation_Source string,emissionOfGroundSurface double,sid string,emissionOfGroundSurface_accuracy double,emissionOfGroundSurface_Source string,lastUpdated bigint,atmosphericRadiation_reliability double,sampleTimestampID int,atmosphericRadiation double,atmosphericRadiation_expiresAt bigint,atmosphericRadiation_accuracy double,tenantId string,geocoordinates_latitude double,startTimestamp bigint) 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 831 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Driver_hbase ( rowkey string,shift string,isOnDuty string,sampleTimestamp bigint,isValid boolean,providerDetails string,isTripOn string,joinDate bigint,active int,canDriveVehicleTypes string,tag string,label string,driverDetails string,DriverBreakUsedStats string,sampleTimestampID int,locationId string,source string,createTimestampID int,nickname string,sid string,destroyTimestamp bigint,private string,thirdPartyId string,lastUpdated bigint,agencyId string,destroyTimestampID int,serviceTime int,custom string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,serviceOnDate bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:shift,d:isOnDuty,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:isTripOn,d:joinDate#b,d:active#b,d:canDriveVehicleTypes,d:tag,d:label,d:driverDetails,d:DriverBreakUsedStats,d:sampleTimestampID#b,d:locationId,d:source,d:createTimestampID#b,d:nickname,d:sid,d:destroyTimestamp#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:agencyId,d:destroyTimestampID#b,d:serviceTime#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:serviceOnDate#b') TBLPROPERTIES('hbase.table.name' = 'cim.Driver') 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1435 09:09:14.450 [main] DEBUG 
org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Driver_location(sampleTimestamp bigint,private string,city String,timezone String,shift string,agencyId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,DriverBreakUsedStats string,joinDate bigint,sampleTimestampID int,locationId String,createTimestampID int,providerDetails string,nickname string,isOnDuty string,tag string,destroyTimestampID int,serviceOnDate bigint,isValid boolean,custom string,active int,label string,serviceTime int,canDriveVehicleTypes string,driverDetails string,isTripOn string,tenantId string,destroyTimestamp bigint) 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 762 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Driver(sampleTimestamp bigint,private string,city string,timezone string,shift string,agencyId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,DriverBreakUsedStats string,joinDate bigint,sampleTimestampID int,locationId string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,isOnDuty string,tag string,Agencydimensionid bigint,destroyTimestampID int,serviceOnDate bigint,custom string,isValid boolean,active int,label string,serviceTime int,canDriveVehicleTypes string,driverDetails string,tenantId string,isTripOn string,destroyTimestamp bigint) 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 801 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.450 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106 09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Segment_hbase ( rowkey string,avgSpeed double,avgTime 
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Segment_location(sampleTimestamp bigint,private string,distance double,city String,timezone String,boundary_geoPoint string,primaryRoad string,receiveTimestamp bigint,type int,sid string,createTimestamp bigint,numberOfLanes int,thirdPartyId string,lastUpdated bigint,routeId string,avgTime double,sampleTimestampID int,signalCount int,locationId String,geohash string,createTimestampID int,responsible string,providerDetails string,nickname string,tag string,destroyTimestampID int,isValid boolean,custom string,active int,label string,path_geoPoint string,way int,avgSpeed double,sequence int,width double,name string,tenantId string,destroyTimestamp bigint)
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Segment(sampleTimestamp bigint,private string,distance double,city string,timezone String,boundary_geoPoint string,primaryRoad string,receiveTimestamp bigint,type int,sid string,createTimestamp bigint,numberOfLanes int,thirdPartyId string,lastUpdated bigint,routeId string,avgTime double,sampleTimestampID int,locationId string,signalCount int,geohash string,responsible string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,custom string,isValid boolean,active int,path_geoPoint string,label string,way int,avgSpeed double,sequence int,tenantId string,name string,width double,destroyTimestamp bigint)
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.ExitPOM_hbase ( rowkey string,tenantId string,sid string,sampleTimestamp bigint,isValid boolean,receiveTimestamp bigint,pomDirId string,roiEntityRef_entityId string,sampleTimestampID int,locationId string,roiEntityRef_entityType string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:tenantId,d:sid,d:sampleTimestamp#b,d:isValid#b,d:receiveTimestamp#b,d:pomDirId,d:roiEntityRef_entityId,d:sampleTimestampID#b,d:locationId,d:roiEntityRef_entityType') TBLPROPERTIES('hbase.table.name' = 'cim.ExitPOM')
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.ExitPOM_location(sampleTimestamp bigint,roiEntityRef_entityId string,sampleTimestampID int,city String,locationId String,timezone String,isValid boolean,tenantId string,receiveTimestamp bigint,pomDirId string,roiEntityRef_entityType string,sid string)
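The same trio repeats for every entity (Driver, Segment, ExitPOM, and so on): a Hive external table cim.<Entity>_hbase mapped onto HBase, an Impala table cim.<Entity>_location that adds city and timezone columns, and an Impala table cimdata.<Entity> that additionally carries timezoneoffset and *dimensionid columns. Every statement is guarded with "if not exists", so the Oozie action that issues them is re-runnable. A minimal sketch of the trio for a hypothetical entity Foo (all names and columns here are invented for illustration):

    create external table if not exists cim.Foo_hbase (rowkey string, sid string, lastUpdated bigint)
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:lastUpdated#b')
    TBLPROPERTIES ('hbase.table.name' = 'cim.Foo');

    create table if not exists cim.Foo_location (sid string, city string, timezone string);

    create table if not exists cimdata.Foo (sid string, city string, timezoneoffset int, Foodimensionid bigint);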
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.ExitPOM(sampleTimestamp bigint,roiEntityRef_entityId string,city string,timezone string,isValid boolean,receiveTimestamp bigint,roiEntityRef_entityType string,sid string,MobilityPOMDirectiondimensionid bigint,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,pomDirId string)
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Agency_hbase ( rowkey string,url string,sampleTimestamp bigint,isValid boolean,providerDetails string,active int,tag string,label string,phone string,type string,sampleTimestampID int,locationId string,source string,createTimestampID int,nickname string,sid string,name string,destroyTimestamp bigint,email string,private string,language string,thirdPartyId string,lastUpdated bigint,destroyTimestampID int,custom string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,fareURL string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:url,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:active#b,d:tag,d:label,d:phone,d:type,d:sampleTimestampID#b,d:locationId,d:source,d:createTimestampID#b,d:nickname,d:sid,d:name,d:destroyTimestamp#b,d:email,d:private,d:language,d:thirdPartyId,d:lastUpdated#b,d:destroyTimestampID#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:fareURL') TBLPROPERTIES('hbase.table.name' = 'cim.Agency')
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Agency_location(sampleTimestamp bigint,private string,city String,timezone String,language string,source string,receiveTimestamp bigint,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,fareURL string,createTimestampID int,providerDetails string,nickname string,tag string,destroyTimestampID int,email string,isValid boolean,custom string,active int,label string,url string,phone string,name string,tenantId string,destroyTimestamp bigint)
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Agency(sampleTimestamp bigint,private string,city string,timezone String,language string,source string,receiveTimestamp bigint,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,fareURL string,locationId string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,email string,custom string,isValid boolean,active int,label string,url string,phone string,tenantId string,name string,destroyTimestamp bigint)
09:09:14.450 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Zone_hbase ( rowkey string,createTimestampID int,sid string,boundary_geoPoint string,sampleTimestamp bigint,isValid boolean,name string,destroyTimestamp bigint,providerDetails string,private string,thirdPartyId string,lastUpdated bigint,geohash string,destroyTimestampID int,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,type string,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:sid,d:boundary_geoPoint,d:sampleTimestamp#b,d:isValid#b,d:name,d:destroyTimestamp#b,d:providerDetails,d:private,d:thirdPartyId,d:lastUpdated#b,d:geohash,d:destroyTimestampID#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:type,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.Zone')
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Zone_location(sampleTimestamp bigint,private string,city String,timezone String,isValid boolean,boundary_geoPoint string,receiveTimestamp bigint,label string,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,createTimestampID int,geohash string,name string,providerDetails string,tenantId string,tag string,destroyTimestampID int,destroyTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Zone(sampleTimestamp bigint,private string,city string,timezone String,isValid boolean,boundary_geoPoint string,label string,receiveTimestamp bigint,type string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,tenantId string,name string,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,destroyTimestamp bigint)
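Note that the generated DDL mixes "String" with "string" (and later "Double" with "double") inside one column list. Hive and Impala treat type keywords case-insensitively, so the inconsistency is cosmetic only; a describe shows the normalized types:

    describe cim.Zone_location;   -- types are reported lowercased, e.g. city: string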
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientLightSensor_state_hourly(monthweek int,week int,illuminance_max double,timeid int,city string,count int,weekday int,sid string,illuminance_min double,illuminance_avg double,dimensionid bigint,month int,hour int,locationid string,day int,illuminance_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientLightData_state_hourly(monthweek int,week int,illuminance_max double,timeid int,city string,count int,weekday int,sid string,illuminance_min double,illuminance_avg double,dimensionid bigint,month int,hour int,locationid string,day int,illuminance_Source string) partitioned by (year int, tenantid string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientLightData_state(sampleTimestamp bigint,timezone string,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,illuminance_accuracy double,illuminance_expiresAt bigint,illuminance double,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,illuminance_reliability double,day int,illuminance_Source string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AmbientLightSensor_state(sampleTimestamp bigint,geocoordinates_longitude double,timezone string,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,illuminance_accuracy double,illuminance_expiresAt bigint,illuminance double,sampleTimestampID int,locationId string,tenantId string,geocoordinates_latitude double,timezoneoffset int,illuminance_reliability double,day int,illuminance_Source string,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AmbientLightSensor_state_hbase ( rowkey string,sid string,sampleTimestamp bigint,isValid boolean,lastUpdated bigint,illuminance_accuracy double,illuminance double,startTimestamp bigint,tenantId string,illuminance_Source string,parentEntityType string,receiveTimestamp bigint,sampleTimestampID int,illuminance_expiresAt bigint,illuminance_reliability double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:sid,d:sampleTimestamp#b,d:isValid#b,d:lastUpdated#b,d:illuminance_accuracy#b,d:illuminance#b,d:startTimestamp#b,d:tenantId,d:illuminance_Source,d:parentEntityType,d:receiveTimestamp#b,d:sampleTimestampID#b,d:illuminance_expiresAt#b,d:illuminance_reliability#b') TBLPROPERTIES('hbase.table.name' = 'cim.AmbientLightSensor.state')
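The *_hourly aggregate tables switch to a different layout: "partitioned by (year int, tenantid string) stored as parquet". Partition columns live outside the regular column list and behave as virtual columns, so predicates on them prune whole partitions instead of scanning Parquet files. For example, against the cimdata.AmbientLightSensor_state_hourly table created above (the tenant id value is hypothetical):

    select hour, illuminance_min, illuminance_avg, illuminance_max
    from cimdata.AmbientLightSensor_state_hourly
    where year = 2022 and tenantid = 'cim' and month = 6
    order by hour;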
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientLightData_state_period(sampleTimestamp bigint,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,illuminance_accuracy double,illuminance_expiresAt bigint,illuminance double,sampleTimestampID int,tenantId string,illuminance_reliability double,startTimestamp bigint,illuminance_Source string)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AmbientLightSensor_state_period(sampleTimestamp bigint,geocoordinates_longitude double,geocoordinates_altitude double,isValid boolean,parentEntityType string,receiveTimestamp bigint,sid string,lastUpdated bigint,illuminance_accuracy double,illuminance_expiresAt bigint,illuminance double,sampleTimestampID int,tenantId string,geocoordinates_latitude double,illuminance_reliability double,startTimestamp bigint,illuminance_Source string)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.TransitTrip_hbase ( rowkey string,stopTimesId string,shortName string,sampleTimestamp bigint,isValid boolean,providerDetails string,bikesAllowed int,headsign string,geohash string,active int,expiresAt bigint,boundId int,tag string,label string,sampleTimestampID int,locationId string,source string,expectedRevenue double,createTimestampID int,nickname string,path_geoPoint string,status int,sid string,destroyTimestamp bigint,busNumber int,private string,thirdPartyId string,lastUpdated bigint,agencyId string,direction string,destroyTimestampID int,custom string,validity string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,routeId string,wheelchairAccessible int,serviceId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:stopTimesId,d:shortName,d:sampleTimestamp#b,d:isValid#b,d:providerDetails,d:bikesAllowed#b,d:headsign,d:geohash,d:active#b,d:expiresAt#b,d:boundId#b,d:tag,d:label,d:sampleTimestampID#b,d:locationId,d:source,d:expectedRevenue#b,d:createTimestampID#b,d:nickname,d:path_geoPoint,d:status#b,d:sid,d:destroyTimestamp#b,d:busNumber#b,d:private,d:thirdPartyId,d:lastUpdated#b,d:agencyId,d:direction,d:destroyTimestampID#b,d:custom,d:validity,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:routeId,d:wheelchairAccessible#b,d:serviceId') TBLPROPERTIES('hbase.table.name' = 'cim.TransitTrip')
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.TransitTrip_location(sampleTimestamp bigint,private string,stopTimesId string,city String,wheelchairAccessible int,timezone String,boundId int,agencyId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,headsign string,thirdPartyId string,lastUpdated bigint,routeId string,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,bikesAllowed int,nickname string,tag string,destroyTimestampID int,serviceId string,direction string,expectedRevenue double,isValid boolean,custom string,active int,label string,path_geoPoint string,expiresAt bigint,busNumber int,tenantId string,validity string,shortName string,status int,destroyTimestamp bigint)
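By contrast, the cim.*_state_period and cim.*_location tables carry no "partitioned by ... stored as ..." clause, so they are created in Impala's default file format (plain text, unless the cluster overrides that default). The actual layout of any of them can be confirmed with:

    show create table cim.AmbientLightSensor_state_period;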
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.TransitTrip(sampleTimestamp bigint,private string,stopTimesId string,city string,timezone string,wheelchairAccessible int,boundId int,agencyId string,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,headsign string,thirdPartyId string,lastUpdated bigint,routeId string,sampleTimestampID int,locationId string,geohash string,createTimestampID int,nickname string,providerDetails string,bikesAllowed int,timezoneoffset int,tag string,Agencydimensionid bigint,serviceId string,destroyTimestampID int,direction string,StopTimesdimensionid bigint,expectedRevenue double,custom string,isValid boolean,active int,path_geoPoint string,label string,expiresAt bigint,busNumber int,tenantId string,validity string,shortName string,Routedimensionid bigint,status int,destroyTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AirQualitySensor_state_hourly(pm2p5_max double,weekday int,co2_max double,cimno_min double,cimno_avg double,co_Source string,h2s_Source string,pm2p5_min double,pm2p5_avg double,dimensionid bigint,pm10_min double,pm10_avg double,day int,nh3_Source string,pb_Source string,pm2p5_Source string,cimno_Source string,co2_Source string,pb_max double,h2s_min double,h2s_avg double,count int,co_max double,so2_min double,so2_avg double,o3_max double,nh3_min double,nh3_avg double,no2_Source string,month int,pb_min double,pb_avg double,so2_max double,co2_min double,co2_avg double,cimno_max double,nh3_max double,week int,timeid int,city string,pm1_Source string,co_min double,co_avg double,so2_Source string,sid string,o3_min double,o3_avg double,hour int,h2s_max double,o3_Source string,no2_min double,no2_avg double,pm1_max double,monthweek int,no2_max double,pm10_Source string,locationid string,pm10_max double,pm1_min double,pm1_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AirQualityData_state_hourly(pm2p5_max double,weekday int,co2_max double,cimno_min double,cimno_avg double,co_Source string,h2s_Source string,pm2p5_min double,pm2p5_avg double,dimensionid bigint,pm10_min double,pm10_avg double,day int,nh3_Source string,pb_Source string,pm2p5_Source string,cimno_Source string,co2_Source string,pb_max double,h2s_min double,h2s_avg double,count int,co_max double,so2_min double,so2_avg double,o3_max double,nh3_min double,nh3_avg double,no2_Source string,month int,pb_min double,pb_avg double,so2_max double,co2_min double,co2_avg double,cimno_max double,nh3_max double,week int,timeid int,city string,pm1_Source string,co_min double,co_avg double,so2_Source string,sid string,o3_min double,o3_avg double,hour int,h2s_max double,o3_Source string,no2_min double,no2_avg double,pm1_max double,monthweek int,no2_max double,pm10_Source string,locationid string,pm10_max double,pm1_min double,pm1_avg double) partitioned by (year int, tenantid string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AirQualityData_state(pm2p5 double,cimno_accuracy double,nh3_accuracy double,receiveTimestamp bigint,pm2p5_reliability double,no2 double,co_Source string,h2s_Source string,lastUpdated bigint,cimno_expiresAt bigint,o3_accuracy double,pb_expiresAt bigint,co_reliability double,nh3_Source string,day int,pb_Source string,pm2p5_accuracy double,pm2p5_expiresAt bigint,o3 double,pm1_reliability double,pm2p5_Source string,cimno_Source string,co2_Source string,h2s_accuracy double,pb_accuracy double,pm10_accuracy double,o3_reliability double,no2_Source string,nh3_expiresAt bigint,cimno_reliability double,nh3 double,pm10_expiresAt bigint,pm1_expiresAt bigint,sampleTimestamp bigint,co2_reliability double,pm1_Source string,timezone string,co2 double,co2_expiresAt bigint,parentEntityType string,no2_accuracy double,so2_Source string,sid string,nh3_reliability double,pm1 double,sampleTimestampID int,locationId string,so2 double,co2_accuracy double,timezoneoffset int,o3_Source string,h2s_reliability double,h2s_expiresAt bigint,so2_accuracy double,so2_expiresAt bigint,isValid boolean,h2s double,pm10 double,so2_reliability double,cimno double,o3_expiresAt bigint,co double,no2_expiresAt bigint,pb double,co_accuracy double,co_expiresAt bigint,pm10_Source string,pb_reliability double,tenantId string,pm1_accuracy double,pm10_reliability double,no2_reliability double,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.AirQualitySensor_state(pm2p5 double,cimno_accuracy double,nh3_accuracy double,receiveTimestamp bigint,pm2p5_reliability double,no2 double,co_Source string,h2s_Source string,lastUpdated bigint,cimno_expiresAt bigint,o3_accuracy double,pb_expiresAt bigint,geocoordinates_latitude double,co_reliability double,nh3_Source string,day int,pb_Source string,pm2p5_accuracy double,pm2p5_expiresAt bigint,o3 double,pm1_reliability double,pm2p5_Source string,cimno_Source string,co2_Source string,h2s_accuracy double,pb_accuracy double,pm10_accuracy double,o3_reliability double,no2_Source string,nh3_expiresAt bigint,cimno_reliability double,nh3 double,pm10_expiresAt bigint,pm1_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,co2_reliability double,pm1_Source string,timezone string,co2 double,co2_expiresAt bigint,parentEntityType string,no2_accuracy double,so2_Source string,sid string,nh3_reliability double,pm1 double,sampleTimestampID int,locationId string,so2 double,co2_accuracy double,timezoneoffset int,o3_Source string,h2s_reliability double,h2s_expiresAt bigint,so2_accuracy double,so2_expiresAt bigint,geocoordinates_altitude double,isValid boolean,h2s double,pm10 double,so2_reliability double,cimno double,o3_expiresAt bigint,co double,no2_expiresAt bigint,pb double,co_accuracy double,co_expiresAt bigint,pm10_Source string,pb_reliability double,tenantId string,pm1_accuracy double,pm10_reliability double,no2_reliability double,startTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
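The wide AirQuality state tables are partitioned by (year int, month int, city string). Partitions of such a table come into existence either via dynamic-partition inserts or explicitly; a minimal, re-runnable example against the cimdata.AirQualityData_state table created above (the partition values are hypothetical):

    alter table cimdata.AirQualityData_state
      add if not exists partition (year = 2022, month = 6, city = 'chennai');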
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.AirQualitySensor_state_hbase ( rowkey string,pm2p5 double,co2_Source string,nh3_Source string,isValid boolean,pb_reliability double,pm10_reliability double,pb_Source string,o3_accuracy double,pm10_Source string,cimno_expiresAt bigint,pb_expiresAt bigint,o3_expiresAt bigint,nh3_accuracy double,co_reliability double,co_expiresAt bigint,pm1_accuracy double,parentEntityType string,so2_reliability double,h2s_accuracy double,pm1_expiresAt bigint,sampleTimestampID int,pm2p5_Source string,co_Source string,pm1 double,pm1_Source string,sid string,so2_Source string,pm10_accuracy double,o3_reliability double,co2 double,no2_reliability double,cimno_accuracy double,no2_Source string,pm10_expiresAt bigint,pm10 double,receiveTimestamp bigint,cimno double,nh3 double,cimno_reliability double,no2_accuracy double,co2_expiresAt bigint,sampleTimestamp bigint,o3_Source string,co2_reliability double,pm2p5_reliability double,nh3_expiresAt bigint,h2s double,nh3_reliability double,h2s_expiresAt bigint,h2s_Source string,o3 double,pm2p5_accuracy double,co_accuracy double,no2_expiresAt bigint,so2 double,so2_expiresAt bigint,no2 double,pb double,pm1_reliability double,pm2p5_expiresAt bigint,co double,lastUpdated bigint,h2s_reliability double,cimno_Source string,so2_accuracy double,startTimestamp bigint,co2_accuracy double,tenantId string,pb_accuracy double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:pm2p5#b,d:co2_Source,d:nh3_Source,d:isValid#b,d:pb_reliability#b,d:pm10_reliability#b,d:pb_Source,d:o3_accuracy#b,d:pm10_Source,d:cimno_expiresAt#b,d:pb_expiresAt#b,d:o3_expiresAt#b,d:nh3_accuracy#b,d:co_reliability#b,d:co_expiresAt#b,d:pm1_accuracy#b,d:parentEntityType,d:so2_reliability#b,d:h2s_accuracy#b,d:pm1_expiresAt#b,d:sampleTimestampID#b,d:pm2p5_Source,d:co_Source,d:pm1#b,d:pm1_Source,d:sid,d:so2_Source,d:pm10_accuracy#b,d:o3_reliability#b,d:co2#b,d:no2_reliability#b,d:cimno_accuracy#b,d:no2_Source,d:pm10_expiresAt#b,d:pm10#b,d:receiveTimestamp#b,d:cimno#b,d:nh3#b,d:cimno_reliability#b,d:no2_accuracy#b,d:co2_expiresAt#b,d:sampleTimestamp#b,d:o3_Source,d:co2_reliability#b,d:pm2p5_reliability#b,d:nh3_expiresAt#b,d:h2s#b,d:nh3_reliability#b,d:h2s_expiresAt#b,d:h2s_Source,d:o3#b,d:pm2p5_accuracy#b,d:co_accuracy#b,d:no2_expiresAt#b,d:so2#b,d:so2_expiresAt#b,d:no2#b,d:pb#b,d:pm1_reliability#b,d:pm2p5_expiresAt#b,d:co#b,d:lastUpdated#b,d:h2s_reliability#b,d:cimno_Source,d:so2_accuracy#b,d:startTimestamp#b,d:co2_accuracy#b,d:tenantId,d:pb_accuracy#b') TBLPROPERTIES('hbase.table.name' = 'cim.AirQualitySensor.state')
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AirQualityData_state_period(pm2p5 double,cimno_accuracy double,nh3_accuracy double,receiveTimestamp bigint,pm2p5_reliability double,co_Source string,no2 double,h2s_Source string,lastUpdated bigint,o3_accuracy double,cimno_expiresAt bigint,pb_expiresAt bigint,co_reliability double,nh3_Source string,pb_Source string,pm2p5_accuracy double,pm2p5_expiresAt bigint,pm2p5_Source string,o3 double,pm1_reliability double,co2_Source string,h2s_accuracy double,cimno_Source string,pb_accuracy double,pm10_accuracy double,o3_reliability double,no2_Source string,nh3_expiresAt bigint,cimno_reliability double,nh3 double,pm10_expiresAt bigint,pm1_expiresAt bigint,sampleTimestamp bigint,co2_reliability double,pm1_Source string,co2 double,co2_expiresAt bigint,parentEntityType string,no2_accuracy double,so2_Source string,sid string,nh3_reliability double,sampleTimestampID int,pm1 double,so2 double,co2_accuracy double,o3_Source string,h2s_reliability double,h2s_expiresAt bigint,so2_expiresAt bigint,so2_accuracy double,isValid boolean,pm10 double,h2s double,so2_reliability double,o3_expiresAt bigint,cimno double,co double,no2_expiresAt bigint,pb double,co_expiresAt bigint,co_accuracy double,pm10_Source string,pb_reliability double,tenantId string,pm1_accuracy double,pm10_reliability double,no2_reliability double,startTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.AirQualitySensor_state_period(pm2p5 double,cimno_accuracy double,nh3_accuracy double,receiveTimestamp bigint,pm2p5_reliability double,co_Source string,no2 double,h2s_Source string,lastUpdated bigint,o3_accuracy double,cimno_expiresAt bigint,pb_expiresAt bigint,geocoordinates_latitude double,co_reliability double,nh3_Source string,pb_Source string,pm2p5_accuracy double,pm2p5_expiresAt bigint,pm2p5_Source string,o3 double,pm1_reliability double,co2_Source string,h2s_accuracy double,cimno_Source string,pb_accuracy double,pm10_accuracy double,o3_reliability double,no2_Source string,nh3_expiresAt bigint,cimno_reliability double,nh3 double,pm10_expiresAt bigint,pm1_expiresAt bigint,sampleTimestamp bigint,geocoordinates_longitude double,co2_reliability double,pm1_Source string,co2 double,co2_expiresAt bigint,parentEntityType string,no2_accuracy double,so2_Source string,sid string,nh3_reliability double,sampleTimestampID int,pm1 double,so2 double,co2_accuracy double,o3_Source string,h2s_reliability double,h2s_expiresAt bigint,so2_expiresAt bigint,so2_accuracy double,geocoordinates_altitude double,isValid boolean,pm10 double,h2s double,so2_reliability double,o3_expiresAt bigint,cimno double,co double,no2_expiresAt bigint,pb double,co_expiresAt bigint,co_accuracy double,pm10_Source string,pb_reliability double,tenantId string,pm1_accuracy double,pm10_reliability double,no2_reliability double,startTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.FareRules_hbase ( rowkey string,createTimestampID int,sid string,transfers int,paymentMethod int,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,providerDetails string,thirdPartyId string,lastUpdated bigint,transferDuration double,destroyTimestampID int,price double,tenantId string,routeId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,sampleTimestampID int,locationId string,source string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:sid,d:transfers#b,d:paymentMethod#b,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:providerDetails,d:thirdPartyId,d:lastUpdated#b,d:transferDuration#b,d:destroyTimestampID#b,d:price#b,d:tenantId,d:routeId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:sampleTimestampID#b,d:locationId,d:source') TBLPROPERTIES('hbase.table.name' = 'cim.FareRules')
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.FareRules_location(sampleTimestamp bigint,city String,timezone String,isValid boolean,receiveTimestamp bigint,label string,source string,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,routeId string,sampleTimestampID int,transfers int,price double,locationId String,createTimestampID int,providerDetails string,tenantId string,paymentMethod int,tag string,destroyTimestampID int,transferDuration double,destroyTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.FareRules(sampleTimestamp bigint,city string,timezone String,source string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,lastUpdated bigint,routeId string,sampleTimestampID int,price double,transfers int,locationId string,createTimestampID int,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,isValid boolean,label string,tenantId string,paymentMethod int,transferDuration double,destroyTimestamp bigint)
09:09:14.451 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.OccupiedParkingArea_event_hbase ( rowkey string,expectedRevenue double,createTimestampID int,sid string,boundary_geoPoint string,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,providerDetails string,thirdPartyId string,destroyTimestampID int,custom string,tenantId string,receiveTimestamp bigint,createTimestamp bigint,label string,tag string,sampleTimestampID int,locationId string,destroyReason string,parkingAreaId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:expectedRevenue#b,d:createTimestampID#b,d:sid,d:boundary_geoPoint,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:providerDetails,d:thirdPartyId,d:destroyTimestampID#b,d:custom,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:label,d:tag,d:sampleTimestampID#b,d:locationId,d:destroyReason,d:parkingAreaId') TBLPROPERTIES('hbase.table.name' = 'cim.OccupiedParkingArea.event')
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.OccupiedParkingArea_event_period(sampleTimestamp bigint,expectedRevenue double,parkingAreaId string,isValid boolean,custom string,boundary_geoPoint string,receiveTimestamp bigint,label string,sid string,createTimestamp bigint,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,tenantId string,tag string,destroyTimestampID int,destroyReason string,destroyTimestamp bigint)
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.OccupiedParkingArea_event(sampleTimestamp bigint,timezone string,ParkingAreadimensionid bigint,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,day int,expectedRevenue double,parkingAreaId string,isValid boolean,custom string,label string,tenantId string,destroyReason string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.nodest_OccupiedParkingArea_event(sampleTimestamp bigint,timezone string,ParkingAreadimensionid bigint,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,providerDetails string,timezoneoffset int,tag string,destroyTimestampID int,day int,expectedRevenue double,parkingAreaId string,isValid boolean,custom string,label string,tenantId string,destroyReason string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
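The *_hbase tables can be queried from Hive (and, through the shared metastore, from Impala) like any other table, but because the storage handler delegates to HBase, an equality predicate on rowkey can be served as a point lookup rather than a full scan. For example, against cim.FareRules_hbase created above (the key value is hypothetical, since the row-key composition is not shown in this log):

    select rowkey, routeId, price, transfers
    from cim.FareRules_hbase
    where rowkey = 'tenant1:FR-0001';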
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 719
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.HarmonicaIndexEntry_hbase ( rowkey string,entityId string,attributeName string,entityType string,sid string,tenantId string,sampleTimestamp bigint,parentEntityType string,isValid boolean,receiveTimestamp bigint,sampleTimestampID int,locationId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:entityId,d:attributeName,d:entityType,d:sid,d:tenantId,d:sampleTimestamp#b,d:parentEntityType,d:isValid#b,d:receiveTimestamp#b,d:sampleTimestampID#b,d:locationId') TBLPROPERTIES('hbase.table.name' = 'cim.HarmonicaIndexEntry')
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 811
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HarmonicaIndexEntry_location(sampleTimestamp bigint,city String,entityType string,timezone String,isValid boolean,entityId string,parentEntityType string,receiveTimestamp bigint,sid string,sampleTimestampID int,locationId String,tenantId string,attributeName string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 415
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaIndexEntry(sampleTimestamp bigint,city string,entityType string,timezone String,isValid boolean,entityId string,parentEntityType string,receiveTimestamp bigint,sid string,sampleTimestampID int,locationId string,tenantId string,timezoneoffset int,attributeName string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 429
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.HarmonicaIndexEntry_statistic_hourly_hbase ( rowkey string,attributeName string,bgneq double,entityType string,sid string,sampleTimestamp bigint,bgnIndex double,entityId string,startTimestamp bigint,evtIndex double,globalIndex double,tenantId string,evteq double,parentEntityType string,receiveTimestamp bigint,startTimestampID int,sampleTimestampID int,colour string,laeq double,hourId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:attributeName,d:bgneq#b,d:entityType,d:sid,d:sampleTimestamp#b,d:bgnIndex#b,d:entityId,d:startTimestamp#b,d:evtIndex#b,d:globalIndex#b,d:tenantId,d:evteq#b,d:parentEntityType,d:receiveTimestamp#b,d:startTimestampID#b,d:sampleTimestampID#b,d:colour,d:laeq#b,d:hourId') TBLPROPERTIES('hbase.table.name' = 'cim.HarmonicaIndexEntry.statistic.hourly')
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1075
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.HarmonicaIndexEntry_statistic_hourly_period(bgneq double,sampleTimestamp bigint,year int,city string,entityType string,evteq double,entityId string,parentEntityType string,bgnIndex double,receiveTimestamp bigint,sid string,evtIndex double,startTimestampID int,globalIndex double,colour string,laeq double,hourId string,month int,sampleTimestampID int,locationid String,tenantId string,timezoneoffset int,attributeName string,startTimestamp bigint)
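The cimdata tables, by contrast, are regular Impala tables stored as Parquet and partitioned by calendar/tenant columns. Partition columns ((year int,month int,city string) above, (year int, tenantid string) below) are not stored inside the data files; each distinct value combination becomes a directory, so predicates on them prune whole partitions at scan time. A minimal sketch with hypothetical names, showing how such a table is typically filled with a dynamic-partition insert:

    create table if not exists demo_event (sid string, sampleTimestamp bigint)
    partitioned by (year int, month int, city string)
    stored as parquet;

    -- the trailing select columns supply the partition key values
    insert into demo_event partition (year, month, city)
    select sid, sampleTimestamp, year, month, city
    from demo_event_staging;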
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 596
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaIndexEntry_statisticdata_hourly(evteq_sum double,weekday int,evteq double,receiveTimestamp bigint,laeq_avg Double,laeq_min double,evtIndex double,startTimestampID int,laeq_sum double,day int,bgneq double,globalIndex_min double,globalIndex_avg Double,entityType string,evtIndex_sum double,bgneq_max double,evteq_max double,month int,bgneq_sum double,globalIndex_max double,evteq_min double,evteq_avg Double,bgneq_min double,bgneq_avg Double,sampleTimestamp bigint,week int,timeid int,city string,bgnIndex_max double,evtIndex_max double,parentEntityType string,bgnIndex double,sid string,laeq double,hourId string,hour int,sampleTimestampID int,globalIndex_sum double,timezoneoffset int,attributeName string,evtIndex_min double,evtIndex_avg Double,monthweek int,entityId string,globalIndex double,colour string,bgnIndex_min double,bgnIndex_avg Double,locationid string,bgnIndex_sum double,laeq_max double,startTimestamp bigint) partitioned by (year int, tenantid string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1149
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.HarmonicaIndexEntry_statistic_hourly(evteq_sum double,weekday int,evteq double,receiveTimestamp bigint,laeq_avg Double,laeq_min double,evtIndex double,startTimestampID int,laeq_sum double,geocoordinates_latitude double,day int,bgneq double,globalIndex_min double,globalIndex_avg Double,entityType string,evtIndex_sum double,bgneq_max double,evteq_max double,month int,bgneq_sum double,globalIndex_max double,evteq_min double,evteq_avg Double,bgneq_min double,bgneq_avg Double,sampleTimestamp bigint,geocoordinates_longitude double,week int,timeid int,city string,bgnIndex_max double,evtIndex_max double,parentEntityType string,bgnIndex double,sid string,laeq double,hourId string,hour int,sampleTimestampID int,globalIndex_sum double,timezoneoffset int,attributeName string,evtIndex_min double,evtIndex_avg Double,monthweek int,geocoordinates_altitude double,entityId string,globalIndex double,colour string,bgnIndex_min double,bgnIndex_avg Double,locationid string,bgnIndex_sum double,laeq_max double,startTimestamp bigint) partitioned by (year int, tenantid string) stored as parquet
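Two readability notes on the statistic tables just created: the mixed-case type names (double vs Double, string vs String) are harmless, since SQL type names are case-insensitive, and the *_min/*_max/*_avg/*_sum columns alongside year/month/week/day/hour keys mark these as pre-aggregated hourly rollups. The populating query is not shown in this log; purely as an illustration of the shape such a rollup takes (all names hypothetical):

    -- hypothetical hourly rollup; not the job's actual query
    select year, tenantid, month, day, hour,
           min(laeq) as laeq_min,
           max(laeq) as laeq_max,
           avg(laeq) as laeq_avg,
           sum(laeq) as laeq_sum
    from demo_harmonica_raw
    group by year, tenantid, month, day, hour;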
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1239
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteDumpSpace_hbase ( rowkey string,contentType string,boundary_geoPoint string,sampleTimestamp bigint,maxWeightCollectionCapacity double,isValid boolean,providerDetails string,geohash string,active int,label string,sampleTimestampID int,locationId string,createTimestampID int,nickname string,maxVolumeCollectionCapacity double,status string,sid string,isAvalableForCollection string,destroyTimestamp bigint,private string,agencyId string,thirdPartyId string,lastUpdated bigint,fillLevelThreshold double,destroyTimestampID int,tenantId string,collectionType string,receiveTimestamp bigint,createTimestamp bigint) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:contentType,d:boundary_geoPoint,d:sampleTimestamp#b,d:maxWeightCollectionCapacity#b,d:isValid#b,d:providerDetails,d:geohash,d:active#b,d:label,d:sampleTimestampID#b,d:locationId,d:createTimestampID#b,d:nickname,d:maxVolumeCollectionCapacity#b,d:status,d:sid,d:isAvalableForCollection,d:destroyTimestamp#b,d:private,d:agencyId,d:thirdPartyId,d:lastUpdated#b,d:fillLevelThreshold#b,d:destroyTimestampID#b,d:tenantId,d:collectionType,d:receiveTimestamp#b,d:createTimestamp#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteDumpSpace')
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1479
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteDumpSpace_location(sampleTimestamp bigint,private string,maxVolumeCollectionCapacity double,city String,timezone String,agencyId string,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,isAvalableForCollection string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId String,geohash string,createTimestampID int,providerDetails string,nickname string,destroyTimestampID int,contentType string,isValid boolean,active int,label string,collectionType string,fillLevelThreshold double,tenantId string,maxWeightCollectionCapacity double,status string,destroyTimestamp bigint)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 778
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteDumpSpace(sampleTimestamp bigint,private string,maxVolumeCollectionCapacity double,city string,timezone String,agencyId string,boundary_geoPoint string,receiveTimestamp bigint,sid string,createTimestamp bigint,isAvalableForCollection string,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,geohash string,createTimestampID int,nickname string,providerDetails string,timezoneoffset int,destroyTimestampID int,contentType string,isValid boolean,active int,label string,collectionType string,fillLevelThreshold double,tenantId string,maxWeightCollectionCapacity double,status string,destroyTimestamp bigint)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 792
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.WasteDumpSpace_state(sampleTimestamp bigint,timezone string,isValid boolean,weight double,fillLevel double,receiveTimestamp bigint,sid string,createTimestamp bigint,volume double,lastUpdated bigint,dimensionid bigint,sampleTimestampID int,locationId string,createTimestampID int,tenantId string,timezoneoffset int,destroyTimestampID int,day int,startTimestamp bigint,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 611
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.WasteDumpSpace_state_hbase ( rowkey string,createTimestampID int,sid string,sampleTimestamp bigint,isValid boolean,destroyTimestamp bigint,lastUpdated bigint,fillLevel double,destroyTimestampID int,startTimestamp bigint,tenantId string,receiveTimestamp bigint,createTimestamp bigint,volume double,sampleTimestampID int,weight double) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:createTimestampID#b,d:sid,d:sampleTimestamp#b,d:isValid#b,d:destroyTimestamp#b,d:lastUpdated#b,d:fillLevel#b,d:destroyTimestampID#b,d:startTimestamp#b,d:tenantId,d:receiveTimestamp#b,d:createTimestamp#b,d:volume#b,d:sampleTimestampID#b,d:weight#b') TBLPROPERTIES('hbase.table.name' = 'cim.WasteDumpSpace.state')
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 979
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.WasteDumpSpace_state_period(sampleTimestamp bigint,isValid boolean,weight double,fillLevel double,receiveTimestamp bigint,sid string,createTimestamp bigint,volume double,lastUpdated bigint,sampleTimestampID int,createTimestampID int,tenantId string,destroyTimestampID int,startTimestamp bigint,destroyTimestamp bigint)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 467
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.nodest_Incident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
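At this point the job has created each entity's tables in three layers: an HBase-backed external table in cim (the raw store), plain cim.*_location/*_period tables, and partitioned Parquet tables in cimdata. If any of these DDLs is suspected of producing the wrong shape, the created definitions can be inspected directly; both statements below are standard Hive/Impala:

    show create table cim.WasteDumpSpace_state_hbase;
    describe formatted cimdata.WasteDumpSpace_state;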
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1252
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.Incident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1245
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.parkingincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1252
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.environmentincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1256
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.lightincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1250
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.trafficincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1252
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.transitincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1252
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.wasteincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1250
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.mobilityincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1253
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cimdata.networkincident_event(incidentTypeName string,incidentType string,boundary_geoPoint string,receiveTimestamp bigint,deviceId string,parentDomain string,createTimestamp bigint,closureTime bigint,lastUpdated bigint,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,day int,parkingSpaceId string,deviceType string,codeDesc string,incidentSid string,roadSegmentLaneId string,eventDetails string,parkingSpotId string,status string,geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,code string,timezone string,reliability string,description string,roadSegmentId string,providerDetails_providerId string,sid string,applicableDomain string,thirdPartyId string,sampleTimestampID int,locationId string,createTimestampID int,timezoneoffset int,destroyTimestampID int,severity string,parkingAreaId string,geocoordinates_altitude double,isValid boolean,custom string,providerDetails_provider string,label string,providerDetails_OUI string,expiresAt bigint,tenantId string,destroyTimestamp bigint) partitioned by (year int,month int,city string) stored as parquet
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1252
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Hive command: create external table if not exists cim.Incident_event_hbase ( rowkey string,roadSegmentId string,geocoordinates_longitude double,deviceType string,boundary_geoPoint string,parkingSpaceId string,description string,sampleTimestamp bigint,code string,isValid boolean,roadSegmentLaneId string,reliability string,eventDetails string,eventSids string,expiresAt bigint,applicableDomain string,providerDetails_OUI string,codeDesc string,label string,sampleTimestampID int,locationId string,severity string,createTimestampID int,incidentTypeName string,incidentTime bigint,status string,sid string,geocoordinates_altitude double,parkingSpotId string,deviceId string,destroyTimestamp bigint,providerDetails_providerId string,thirdPartyId string,lastUpdated bigint,incidentSid string,destroyTimestampID int,custom string,incidentType string,closureTime bigint,tenantId string,parentDomain string,forecastClosureTime bigint,providerDetails_provider string,receiveTimestamp bigint,createTimestamp bigint,geocoordinates_latitude double,parkingAreaId string) ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:roadSegmentId,d:geocoordinates_longitude#b,d:deviceType,d:boundary_geoPoint,d:parkingSpaceId,d:description,d:sampleTimestamp#b,d:code,d:isValid#b,d:roadSegmentLaneId,d:reliability,d:eventDetails,d:eventSids,d:expiresAt#b,d:applicableDomain,d:providerDetails_OUI,d:codeDesc,d:label,d:sampleTimestampID#b,d:locationId,d:severity,d:createTimestampID#b,d:incidentTypeName,d:incidentTime#b,d:status,d:sid,d:geocoordinates_altitude#b,d:parkingSpotId,d:deviceId,d:destroyTimestamp#b,d:providerDetails_providerId,d:thirdPartyId,d:lastUpdated#b,d:incidentSid,d:destroyTimestampID#b,d:custom,d:incidentType,d:closureTime#b,d:tenantId,d:parentDomain,d:forecastClosureTime#b,d:providerDetails_provider,d:receiveTimestamp#b,d:createTimestamp#b,d:geocoordinates_latitude#b,d:parkingAreaId') TBLPROPERTIES('hbase.table.name' = 'cim.Incident.event')
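Note that the per-domain incident tables (parkingincident, environmentincident, lightincident, trafficincident, transitincident, wasteincident, mobilityincident, networkincident) are column-for-column identical to cimdata.Incident_event; only the table name changes, so rows are evidently routed by domain at load time. If a cross-domain view were ever wanted, the identical schemas would make a UNION ALL view trivial; a hypothetical sketch, not something this job creates:

    -- hypothetical; NOT created by this job
    create view if not exists cimdata.all_incident_event as
    select * from cimdata.parkingincident_event
    union all
    select * from cimdata.trafficincident_event;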
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 2172
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 99
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.Incident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1156
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.parkingincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1163
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.environmentincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1167
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.lightincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1161
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.trafficincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1163
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.transitincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1163
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.wasteincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1161
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.mobilityincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1164
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: create table if not exists cim.networkincident_event_period(geocoordinates_longitude double,sampleTimestamp bigint,eventSids string,incidentTypeName string,code string,reliability string,description string,boundary_geoPoint string,receiveTimestamp bigint,roadSegmentId string,deviceId string,providerDetails_providerId string,sid string,parentDomain string,createTimestamp bigint,applicableDomain string,closureTime bigint,thirdPartyId string,lastUpdated bigint,sampleTimestampID int,locationId string,createTimestampID int,geocoordinates_latitude double,incidentTime bigint,forecastClosureTime bigint,destroyTimestampID int,deviceType string,parkingSpaceId string,severity string,parkingAreaId string,codeDesc string,geocoordinates_altitude double,incidentSid string,isValid boolean,custom string,providerDetails_provider string,roadSegmentLaneId string,label string,expiresAt bigint,providerDetails_OUI string,eventDetails string,parkingSpotId string,tenantId string,status string,destroyTimestamp bigint) partitioned by (incidenttype string)
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 1163
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 106
09:09:14.452 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: invalidate metadata
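invalidate metadata is issued here because Impala caches catalog state: tables just created through HiveServer2 (the HBase-backed DDLs above) are not visible to Impala until its cached metadata is refreshed or discarded. The bare form invalidates the whole catalog, which is the heavyweight option; when only specific tables changed, the table-scoped form is cheaper:

    invalidate metadata;                              -- entire catalog
    invalidate metadata cim.Incident_event_hbase;     -- a single table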
org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.452 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 137 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.457 [main] INFO com.cisco.cim.oozie.action.ImpalaETLAction - End time before update: 2022-06-09T11:30+0530(1650371520000) 09:09:14.457 [main] INFO com.cisco.cim.oozie.action.ImpalaETLAction - End time is updated to 2022-04-20T14:02+0530(1650443520000) 09:09:14.457 [main] INFO com.cisco.cim.oozie.action.ImpalaETLAction - Will check user for:cimdata 09:09:14.457 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: invalidate metadata 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.457 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 137 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 53 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 132 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 255 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 136 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 255 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 112 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 125 09:09:14.463 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string: 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 255 09:09:14.463 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string: 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100 09:09:14.463 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 255 09:09:14.463 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string: 09:09:14.463 [main] DEBUG 
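Note: the action drives Impala through the Hive JDBC driver over the HiveServer2 Thrift protocol (hence the TSaslTransport records around every statement). If you need to replay the same DDL/metadata refresh by hand while troubleshooting, a minimal sketch with impala-shell is below; the impalad endpoint is a placeholder (not taken from this log), and -k is only needed because the cluster is kerberized:

# replay the metadata refresh the action issues (assumed impalad host)
impala-shell -k -i <impalad-host> -q "invalidate metadata"
# confirm the tables the action creates are visible
impala-shell -k -i <impalad-host> -q "show tables in cim"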
[... TSaslTransport and HiveQueryResultSet DEBUG records omitted ...]
09:09:14.463 [main] INFO com.cisco.cim.oozie.util.DBManager - Execute Impala command: select distinct city from cim.locations
09:09:14.464 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 10639
[... remaining TSaslTransport DEBUG records omitted ...]
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
Caused by: org.apache.oozie.action.hadoop.JavaMain$JavaMainException: org.apache.hive.service.cli.HiveSQLException: RuntimeException: couldn't retrieve HBase table (cim.Locations) info: Failed after attempts=4, exceptions:
Thu Jun 09 11:31:03 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3328)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3305)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1428)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Thu Jun 09 11:31:04 IST 2022, RpcRetryingCaller{globalStartTime=1654754463926, pause=100, maxAttempts=4}, org.apache.hadoop.hbase.NotServingRegionException: [... the three remaining retry attempts, all at 11:31:04, carry the identical message and server-side trace shown above ...]
CAUSED BY: RetriesExhaustedException: Failed after attempts=4, exceptions: [... the same four retry traces as above ...]
CAUSED BY: NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850 [... same server-side trace as above ...]
CAUSED BY: RemoteWithExtrasException: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is not online on data-02.novalocal,16020,1652959331850 [... same server-side trace as above ...]
	at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
	at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
	... 16 more
Caused by: org.apache.hive.service.cli.HiveSQLException: RuntimeException: couldn't retrieve HBase table (cim.Locations) info: Failed after attempts=4, exceptions: [... same retry traces and CAUSED BY chain as above ...]
	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:266)
	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:252)
	at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:318)
	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:259)
	at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:473)
	at com.cisco.cim.oozie.util.DBManager.executeImpalaCmdQuery(DBManager.java:150)
	at com.cisco.cim.oozie.util.ETL.getCities(ETL.java:296)
	at com.cisco.cim.oozie.util.ETL.ETLloop(ETL.java:1291)
	at com.cisco.cim.oozie.action.ImpalaETLAction.main(ImpalaETLAction.java:209)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
	... 18 more
Failing Oozie Launcher, RuntimeException: couldn't retrieve HBase table (cim.Locations) info: Failed after attempts=4, exceptions: [... same retry traces and CAUSED BY chain as above ...]
[... the HiveSQLException block is then printed once more in full, with the same Hive JDBC, DBManager/ETL, and Oozie LauncherAM frames as above ...]
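Note: every retry fails the same way — the client is directed to data-02.novalocal for hbase:meta, but that RegionServer no longer hosts the meta region (its start code 1652959331850 dates from mid-May, weeks before this run, consistent with a stale cached meta location or an unassigned meta region). A sketch for confirming where meta currently lives is below; it assumes shell access on a cluster node, the default zookeeper.znode.parent of /hbase, and the read-only hbck that ships with the HBase 2.x in CDH 6:

# does a meta scan work at all, and from which server?
echo "scan 'hbase:meta', {LIMIT => 1}" | hbase shell -n
# where ZooKeeper says meta is assigned (znode path is the default, adjust if customized)
hbase zkcli get /hbase/meta-region-server
# read-only consistency report (hbck cannot repair in HBase 2.x)
sudo -u hbase hbase hbck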
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://nameservice1/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq
09:09:14.464 [main] DEBUG org.apache.hadoop.fs.FileSystem - FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem
[... DfsClientConf / short-circuit-read DEBUG records and HDFS_DELEGATION_TOKEN cloning records omitted ...]
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq to namenode cm-hue-01.novalocal/10.106.8.128:8020
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.HAUtilClient - Mapped HA service delegation token for logical URI hdfs://nameservice1/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq to namenode name-01.novalocal/10.106.8.129:8020
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8020
[... SASL DIGEST-MD5 delegation-token negotiation DEBUG records (NEGOTIATE/INITIATE, token username and digest) omitted ...]
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
09:09:14.464 [main] WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:oozie (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - closing ipc connection to cm-hue-01.novalocal/10.106.8.128:8020: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:374)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
    at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
    at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
    at org.apache.hadoop.ipc.Client.call(Client.java:1391)
    at org.apache.hadoop.ipc.Client.call(Client.java:1355)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
    at com.sun.proxy.$Proxy15.create(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:349)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy16.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1182)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1161)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1099)
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:464)
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:461)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:475)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:402)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1052)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1168)
    at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1482)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:287)
    at org.apache.oozie.action.hadoop.SequenceFileWriterFactory.createSequenceFileWriter(SequenceFileWriterFactory.java:30)
    at org.apache.oozie.action.hadoop.HdfsOperations.uploadActionDataToHDFS(HdfsOperations.java:63)
    at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:275)
    at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8020 from oozie: closed
09:09:14.464 [main] DEBUG org.apache.hadoop.io.retry.RetryInvocationHandler - org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error, while invoking ClientNamenodeProtocolTranslatorPB.create over cm-hue-01.novalocal/10.106.8.128:8020. Trying to failover immediately.
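
This is expected HA behavior rather than a crash: the create() call landed on the standby NameNode first, the RetryInvocationHandler recognized the wrapped StandbyException, and the client fails over to the other NameNode. A hedged sketch of the unwrapping step in isolation, using the same Hadoop classes the trace shows; in a real HA setup the failover proxy swallows this internally, and the probe path here is purely illustrative:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.ipc.RemoteException;
import org.apache.hadoop.ipc.StandbyException;

public class StandbyUnwrapSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        try {
            fs.create(new Path("/tmp/standby-probe.seq")).close();
        } catch (RemoteException re) {
            // The server-side exception class travels by name inside the
            // RemoteException; unwrapRemoteException re-instantiates it locally.
            IOException cause = re.unwrapRemoteException(StandbyException.class);
            if (cause instanceof StandbyException) {
                System.err.println("Hit a standby NameNode: " + cause.getMessage());
            } else {
                throw cause;
            }
        }
    }
}
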
org.apache.hadoop.ipc.RemoteException: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
    at org.apache.hadoop.ipc.Client.call(Client.java:1445)
    at org.apache.hadoop.ipc.Client.call(Client.java:1355)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
    at com.sun.proxy.$Proxy15.create(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:349)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy16.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1182)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1161)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1099)
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:464)
    at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:461)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:475)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:402)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1052)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1168)
    at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1482)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:287)
    at org.apache.oozie.action.hadoop.SequenceFileWriterFactory.createSequenceFileWriter(SequenceFileWriterFactory.java:30)
    at org.apache.oozie.action.hadoop.HdfsOperations.uploadActionDataToHDFS(HdfsOperations.java:63)
    at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:275)
    at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
09:09:14.464 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to name-01.novalocal/10.106.8.129:8020
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ClientNamenodeProtocolPB
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"AAVvb3ppZQR5YXJuJW9vemllL2RhdGEtMDIubm92YWxvY2FsQENJTS5JVlNHLkFVVEiKAYFHDF3wigGBaxjh8I0BK+COAqk=\",realm=\"default\",nonce=\"4YwDcFPe42quphDwDXkpFS69367sFSf2A+aXCkxw\",nc=00000001,cnonce=\"Ws5nx0N7xgWk12aLW/nVnsjSUybcAWr5SYKUlAdu\",digest-uri=\"/default\",maxbuf=65536,response=a8b3de78971ee8825817a155c2fb9ec1,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - Negotiated QOP is :auth
09:09:14.464 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie: starting, having connections 1
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #10 org.apache.hadoop.hdfs.protocol.ClientProtocol.create
09:09:14.464 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #10
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: create took 28ms
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq, chunkSize=516, chunksPerPacket=126, packetSize=65016
09:09:14.464 [LeaseRenewer:oozie@nameservice1] DEBUG org.apache.hadoop.hdfs.client.impl.LeaseRenewer - Lease renewer daemon for [DFSClient_NONMAPREDUCE_1716711674_1] with renew id 1 started
09:09:14.464 [main] INFO org.apache.hadoop.io.compress.CodecPool - Got brand-new compressor [.deflate]
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - WriteChunk allocating new packet seqno=0, src=/user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq, packetSize=65016, chunksPerPacket=126, bytesCurBlock=0, DFSOutputStream:block==null
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - Queued packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 1534, block==null
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - Queued packet seqno: 1 offsetInBlock: 1534 lastPacketInBlock: true lastByteOffsetInBlock: 1534, block==null
09:09:14.464 [main] DEBUG org.apache.hadoop.hdfs.DataStreamer - block==null waiting for ack for: 1
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.DataStreamer - stage=PIPELINE_SETUP_CREATE, block==null
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.DataStreamer - Allocating new block: block==null
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #11 org.apache.hadoop.hdfs.protocol.ClientProtocol.addBlock
09:09:14.464 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #11
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: addBlock took 44ms
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.DataStreamer - pipeline = [DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK], DatanodeInfoWithStorage[10.106.8.132:1004,DS-49d8f531-ee28-49e2-b98e-98128258099b,DISK]], blk_1085484633_11743894
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.DataStreamer - Connecting to datanode 10.106.8.130:1004
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.DataStreamer - Send buf size 1838592
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #12 org.apache.hadoop.hdfs.protocol.ClientProtocol.getServerDefaults
09:09:14.464 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #12
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getServerDefaults took 5ms
09:09:14.464 [Thread-16] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in secured configuration with privileged port for addr = /10.106.8.130, datanodeId = DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK]
09:09:14.464 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - nodes [DatanodeInfoWithStorage[10.106.8.130:1004,DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99,DISK], DatanodeInfoWithStorage[10.106.8.132:1004,DS-49d8f531-ee28-49e2-b98e-98128258099b,DISK]] storageTypes [DISK, DISK] storageIDs [DS-3dbf9c6f-f04d-437f-877b-adfb39c65b99, DS-49d8f531-ee28-49e2-b98e-98128258099b]
09:09:14.464 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - blk_1085484633_11743894 sending packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 1534
09:09:14.464 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - stage=DATA_STREAMING, blk_1085484633_11743894
09:09:14.464 [ResponseProcessor for block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - DFSClient seqno: 0 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 563348 flag: 0 flag: 0
09:09:14.464 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - blk_1085484633_11743894 sending packet seqno: 1 offsetInBlock: 1534 lastPacketInBlock: true lastByteOffsetInBlock: 1534
09:09:14.464 [ResponseProcessor for block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - DFSClient seqno: 1 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 939525 flag: 0 flag: 0
09:09:14.464 [DataStreamer for file /user/oozie/oozie-oozi/0000084-220601112151640-oozie-oozi-W/impala-etl--java/action-data.seq block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894] DEBUG org.apache.hadoop.hdfs.DataStreamer - Closing old block BP-602310420-10.106.8.128-1602151336997:blk_1085484633_11743894
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie sending #13 org.apache.hadoop.hdfs.protocol.ClientProtocol.complete
09:09:14.464 [IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to name-01.novalocal/10.106.8.129:8020 from oozie got value #13
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: complete took 32ms
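
At this point the action-data.seq upload has succeeded against the now-active NameNode: create, one block streamed through a two-datanode pipeline with SUCCESS acks, then complete. The RecordCompressWriter constructor and the brand-new .deflate compressor earlier in the trace correspond to a record-compressed SequenceFile. A minimal sketch of the equivalent write; the key/value types and output path are assumptions for illustration, not taken from the Oozie launcher source:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class ActionDataWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path out = new Path("hdfs://nameservice1/tmp/action-data-sketch.seq");
        // RECORD compression matches the RecordCompressWriter/.deflate lines above.
        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(out),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(Text.class),
                SequenceFile.Writer.compression(SequenceFile.CompressionType.RECORD))) {
            // Hypothetical key/value pair; the launcher writes its own entries.
            writer.append(new Text("example.key"), new Text("example-value"));
        }
    }
}
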
Stopping AM
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to cm-hue-01.novalocal/10.106.8.128:8030
09:09:14.464 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:oozie (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB info:org.apache.hadoop.yarn.security.SchedulerSecurityInfo$1@859ea42
09:09:14.464 [main] DEBUG org.apache.hadoop.yarn.security.AMRMTokenSelector - Looking for a token with service 10.106.8.128:8030
09:09:14.464 [main] DEBUG org.apache.hadoop.yarn.security.AMRMTokenSelector - Token kind is YARN_AM_RM_TOKEN and the token's service name is 10.106.8.128:8030,10.106.8.129:8030
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Use TOKEN authentication for protocol ApplicationMasterProtocolPB
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting username: Cg4KCginARDjyc7ukTAQARD07oyP+/////8B
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting userPassword
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - SASL client callback: setting realm: default
09:09:14.464 [main] DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: INITIATE token: "charset=utf-8,username=\"Cg4KCginARDjyc7ukTAQARD07oyP+/////8B\",realm=\"default\",nonce=\"i0ATmvaIkIIrB95wEYbHLaFhKTQoV4JqooJIRPek\",nc=00000001,cnonce=\"mGPgfMAH5lYKiFY+dOIkaGVt8OWVPGbpoZbYv5Ol\",digest-uri=\"/default\",maxbuf=65536,response=33fda70b230cb93356487dda210826a6,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - Negotiated QOP is :auth
09:09:14.464 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie: starting, having connections 2
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie sending #14 org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB.finishApplicationMaster
09:09:14.464 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie got value #14
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: finishApplicationMaster took 15ms
09:09:14.464 [main] INFO org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl - Waiting for application to be successfully unregistered.
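
The launcher then unregisters itself from the ResourceManager scheduler port (8030), authenticating with its YARN_AM_RM_TOKEN. Because the Impala ETL action failed, the final status it reports is not SUCCEEDED. A hedged sketch of that unregistration using the YARN client API; this is only meaningful inside a running AM container that holds an AMRM token, and the host, port, tracking URL, and diagnostics string are placeholders rather than Oozie's actual values:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.FinalApplicationStatus;
import org.apache.hadoop.yarn.client.api.AMRMClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class UnregisterSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new YarnConfiguration();
        AMRMClient<AMRMClient.ContainerRequest> rm = AMRMClient.createAMRMClient();
        rm.init(conf);
        rm.start();
        // Register first; host/port/tracking URL are placeholders.
        rm.registerApplicationMaster("localhost", 0, "");
        try {
            // ... run the action ...
        } finally {
            // Mirrors the finishApplicationMaster calls in the log.
            rm.unregisterApplicationMaster(FinalApplicationStatus.FAILED,
                    "Launcher action failed", "");
            rm.stop();
        }
    }
}
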
09:09:14.464 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie sending #15 org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB.finishApplicationMaster
09:09:14.464 [IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1848125895) connection to cm-hue-01.novalocal/10.106.8.128:8030 from oozie got value #15
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: finishApplicationMaster took 4ms
09:09:14.464 [main] DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.async.AMRMClientAsync entered state STOPPED
09:09:14.464 [AMRM Heartbeater thread] DEBUG org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl - Heartbeater interrupted
java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl$HeartbeatThread.run(AMRMClientAsyncImpl.java:289)
09:09:14.464 [main] DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl entered state STOPPED
09:09:14.464 [main] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [AMRM Callback Handler Thread] DEBUG org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl - Interrupted while waiting for queue
java.lang.InterruptedException: null
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
    at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
    at org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl$CallbackHandlerThread.run(AMRMClientAsyncImpl.java:310)
Callback notification attempts left 0
Callback notification trying http://data-02.novalocal:11000/oozie/callback?id=0000084-220601112151640-oozie-oozi-W@impala-etl&status=FAILED
Callback notification to http://data-02.novalocal:11000/oozie/callback?id=0000084-220601112151640-oozie-oozi-W@impala-etl&status=FAILED succeeded
Callback notification succeeded
09:09:14.464 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client@311bf055
09:09:14.464 [Thread-5] DEBUG org.apache.hadoop.util.ShutdownHookManager - Completed shutdown in 0.004 seconds; Timeouts: 0
09:09:14.464 [Thread-5] DEBUG org.apache.hadoop.util.ShutdownHookManager - ShutdownHookManger completed shutdown.

End of LogType:stdout
***********************************************************************

[root@cm-hue-01 scc-dev]# ^C
[root@cm-hue-01 scc-dev]#