Created on 08-18-2017 09:20 AM - edited 09-16-2022 05:06 AM
Problem summary:
I am unable to read from nested subdirectories in my Spark program, despite setting the required Hadoop configuration (see the attempts below).
I get the error below (full output further down):
Exception in thread "main" java.io.FileNotFoundException: File /user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/*/* does not exist.
Any help is appreciated.
Version:
Spark 2.2.0 on CDH 5.12 (upgraded Spark, Java)
Directory layout:
$ hdfs dfs -ls -R /user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/*/part* | awk '{print $8}'
/user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/batch_id=1502939225073/part-00000-3a44cd00-e895-4a01-9ab9-946064b739d4-c000.parquet
/user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/batch_id=1502939234036/part-00000-cbd47353-0590-4cc1-b10d-c18886df1c25-c000.parquet
/user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/batch_id=1502939238389/part-00000-a3d672fd-4b5c-4ad1-a85c-4c31829c3bd2-c000.parquet
Input directory parameter passed:
/user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/*/*
Attempted (1):
Set the parameter in code:
val sparkSession: SparkSession = SparkSession.builder().master("yarn").getOrCreate()
//Recursive glob support & loglevel
import sparkSession.implicits._
sparkSession.sparkContext.hadoopConfiguration.setBoolean("spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive", true)
I did not see the configuration reflected in the Spark UI.
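For reference, this is the kind of check I can run to see what the job's Hadoop configuration actually holds (sketch only; the unprefixed key is the one Hadoop's FileInputFormat consults, while the spark.hadoop. prefix is just the spark-submit/SparkConf convention for forwarding Hadoop properties):
// Sketch: read both key spellings back from the driver's Hadoop configuration
val hadoopConf = sparkSession.sparkContext.hadoopConfiguration
println(hadoopConf.get("spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive"))  // the key Attempt (1) set
println(hadoopConf.get("mapreduce.input.fileinputformat.input.dir.recursive"))               // the key FileInputFormat reads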
Attempted (2):
Left the setting above as-is, and additionally passed the configuration to spark-submit on the CLI.
I do see the configuration in the Spark UI, but I get the same error; Spark still cannot traverse the directory structure.
Command:
spark-submit --class com....bda.util.CompactParsedLogs --conf spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive=true ...
Code:
//Spark Session
val sparkSession: SparkSession = SparkSession.builder().master("yarn").getOrCreate()
//Recursive glob support
val conf = new SparkConf()
val cliRecursiveGlobConf = conf.get("spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive")
import sparkSession.implicits._
sparkSession.sparkContext.hadoopConfiguration.set("spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive", cliRecursiveGlobConf)
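For reference, a quick way to confirm what actually reaches the driver when the flag is passed via --conf (sketch only; as I understand it, Spark strips the spark.hadoop. prefix and applies the remainder to the Hadoop configuration it builds):
// Sketch: check the flag both as a Spark property and in the Hadoop configuration
val sc = sparkSession.sparkContext
println(sc.getConf.getOption("spark.hadoop.mapreduce.input.fileinputformat.input.dir.recursive"))  // as passed via --conf
println(sc.hadoopConfiguration.get("mapreduce.input.fileinputformat.input.dir.recursive"))         // as applied by Spark (prefix stripped)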
Error & overall output:
17/08/18 15:59:15 INFO spark.SparkContext: Running Spark version 2.2.0.cloudera1
17/08/18 15:59:16 INFO spark.SparkContext: Submitted application: com.chamberlain.bda.util.CompactParsedLogs
17/08/18 15:59:16 INFO spark.SecurityManager: Changing view acls to: akhanolk
17/08/18 15:59:16 INFO spark.SecurityManager: Changing modify acls to: akhanolk
17/08/18 15:59:16 INFO spark.SecurityManager: Changing view acls groups to:
17/08/18 15:59:16 INFO spark.SecurityManager: Changing modify acls groups to:
17/08/18 15:59:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(akhanolk); groups with view permissions: Set(); users with modify permissions: Set(akhanolk); groups with modify permissions: Set()
17/08/18 15:59:16 INFO util.Utils: Successfully started service 'sparkDriver' on port 45481.
17/08/18 15:59:16 INFO spark.SparkEnv: Registering MapOutputTracker
17/08/18 15:59:16 INFO spark.SparkEnv: Registering BlockManagerMaster
17/08/18 15:59:16 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/08/18 15:59:16 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/08/18 15:59:16 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6f104040-1a4a-4645-a545-4d73da098e94
17/08/18 15:59:16 INFO memory.MemoryStore: MemoryStore started with capacity 912.3 MB
17/08/18 15:59:16 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/08/18 15:59:16 INFO util.log: Logging initialized @2062ms
17/08/18 15:59:17 INFO server.Server: jetty-9.3.z-SNAPSHOT
17/08/18 15:59:17 INFO server.Server: Started @2149ms
17/08/18 15:59:17 INFO server.AbstractConnector: Started ServerConnector@28c0b664{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
17/08/18 15:59:17 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d4d8fcf{/jobs,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1cefc4b3{/jobs/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6f6a7463{/jobs/job,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ca320ab{/jobs/job/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e53135d{/stages,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a7704c{/stages/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@619bd14c{/stages/stage,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7561db12{/stages/stage/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24b52d3e{/stages/pool,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e9c413e{/stages/pool/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5af5def9{/storage,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@36dce7ed{/storage/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33d05366{/storage/rdd,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7692cd34{/storage/rdd/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32c0915e{/environment,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70f43b45{/environment/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10ad20cb{/executors,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c282004{/executors/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7bfc3126{/executors/threadDump,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53bc1328{/executors/threadDump/json,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c1e3314{/static,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f3c966c{/,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4102b1b1{/api,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77b325b3{/jobs/job/kill,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e8e8651{/stages/stage/kill,null,AVAILABLE,@Spark}
17/08/18 15:59:17 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.5.0.5:4040
17/08/18 15:59:17 INFO spark.SparkContext: Added JAR file:/home/akhanolk/apps/myqIngest/streaming/MyQIngest-1.0.jar at spark://10.5.0.5:45481/jars/MyQIngest-1.0.jar with timestamp 1503071957190
17/08/18 15:59:17 INFO util.Utils: Using initial executors = 3, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
17/08/18 15:59:18 INFO client.RMProxy: Connecting to ResourceManager at cdh-mn-2b4cb552.cdh-cluster.dev/10.5.0.6:8032
17/08/18 15:59:18 INFO yarn.Client: Requesting a new application from cluster with 4 NodeManagers
17/08/18 15:59:18 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (36070 MB per container)
17/08/18 15:59:18 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/08/18 15:59:18 INFO yarn.Client: Setting up container launch context for our AM
17/08/18 15:59:18 INFO yarn.Client: Setting up the launch environment for our AM container
17/08/18 15:59:18 INFO yarn.Client: Preparing resources for our AM container
17/08/18 15:59:19 INFO yarn.Client: Uploading resource file:/tmp/spark-0e51fc77-0ed0-42fc-99a8-e98614820f13/__spark_conf__368044468245332078.zip -> hdfs://cdh-mn-2b4cb552.cdh-cluster.dev:8020/user/akhanolk/.sparkStaging/application_1501192010062_0278/__spark_conf__.zip
17/08/18 15:59:20 INFO spark.SecurityManager: Changing view acls to: akhanolk
17/08/18 15:59:20 INFO spark.SecurityManager: Changing modify acls to: akhanolk
17/08/18 15:59:20 INFO spark.SecurityManager: Changing view acls groups to:
17/08/18 15:59:20 INFO spark.SecurityManager: Changing modify acls groups to:
17/08/18 15:59:20 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(akhanolk); groups with view permissions: Set(); users with modify permissions: Set(akhanolk); groups with modify permissions: Set()
17/08/18 15:59:20 INFO yarn.Client: Submitting application application_1501192010062_0278 to ResourceManager
17/08/18 15:59:20 INFO impl.YarnClientImpl: Submitted application application_1501192010062_0278
17/08/18 15:59:20 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1501192010062_0278 and attemptId None
17/08/18 15:59:21 INFO yarn.Client: Application report for application_1501192010062_0278 (state: ACCEPTED)
17/08/18 15:59:21 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.users.akhanolk
start time: 1503071960167
final status: UNDEFINED
tracking URL: http://cdh-mn-2b4cb552.cdh-cluster.dev:8088/proxy/application_1501192010062_0278/
user: akhanolk
17/08/18 15:59:22 INFO yarn.Client: Application report for application_1501192010062_0278 (state: ACCEPTED)
17/08/18 15:59:23 INFO yarn.Client: Application report for application_1501192010062_0278 (state: ACCEPTED)
17/08/18 15:59:23 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
17/08/18 15:59:23 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> cdh-mn-2b4cb552.cdh-cluster.dev, PROXY_URI_BASES -> http://cdh-mn-2b4cb552.cdh-cluster.dev:8088/proxy/application_1501192010062_0278), /proxy/application_1501192010062_0278
17/08/18 15:59:23 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/08/18 15:59:24 INFO yarn.Client: Application report for application_1501192010062_0278 (state: RUNNING)
17/08/18 15:59:24 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 10.5.0.8
ApplicationMaster RPC port: 0
queue: root.users.akhanolk
start time: 1503071960167
final status: UNDEFINED
tracking URL: http://cdh-mn-2b4cb552.cdh-cluster.dev:8088/proxy/application_1501192010062_0278/
user: akhanolk
17/08/18 15:59:24 INFO cluster.YarnClientSchedulerBackend: Application application_1501192010062_0278 has started running.
17/08/18 15:59:24 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37415.
17/08/18 15:59:24 INFO netty.NettyBlockTransferService: Server created on 10.5.0.5:37415
17/08/18 15:59:24 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/08/18 15:59:24 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.5.0.5, 37415, None)
17/08/18 15:59:24 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.5.0.5:37415 with 912.3 MB RAM, BlockManagerId(driver, 10.5.0.5, 37415, None)
17/08/18 15:59:24 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.5.0.5, 37415, None)
17/08/18 15:59:24 INFO storage.BlockManager: external shuffle service port = 7337
17/08/18 15:59:24 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.5.0.5, 37415, None)
17/08/18 15:59:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ca54da9{/metrics/json,null,AVAILABLE,@Spark}
17/08/18 15:59:24 INFO scheduler.EventLoggingListener: Logging events to hdfs://cdh-mn-2b4cb552.cdh-cluster.dev:8020/user/spark/spark2ApplicationHistory/application_1501192010062_0278
17/08/18 15:59:24 INFO util.Utils: Using initial executors = 3, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
17/08/18 15:59:27 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.5.0.10:39138) with ID 3
17/08/18 15:59:27 INFO spark.ExecutorAllocationManager: New executor 3 has registered (new total is 1)
17/08/18 15:59:27 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.5.0.10:39142) with ID 1
17/08/18 15:59:27 INFO spark.ExecutorAllocationManager: New executor 1 has registered (new total is 2)
17/08/18 15:59:27 INFO storage.BlockManagerMasterEndpoint: Registering block manager cdh-wn-e043867b.cdh-cluster.dev:41719 with 366.3 MB RAM, BlockManagerId(3, cdh-wn-e043867b.cdh-cluster.dev, 41719, None)
17/08/18 15:59:27 INFO storage.BlockManagerMasterEndpoint: Registering block manager cdh-wn-e043867b.cdh-cluster.dev:42490 with 366.3 MB RAM, BlockManagerId(1, cdh-wn-e043867b.cdh-cluster.dev, 42490, None)
17/08/18 15:59:27 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.5.0.10:39144) with ID 2
17/08/18 15:59:27 INFO spark.ExecutorAllocationManager: New executor 2 has registered (new total is 3)
17/08/18 15:59:27 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/08/18 15:59:27 INFO storage.BlockManagerMasterEndpoint: Registering block manager cdh-wn-e043867b.cdh-cluster.dev:41433 with 366.3 MB RAM, BlockManagerId(2, cdh-wn-e043867b.cdh-cluster.dev, 41433, None)
17/08/18 15:59:27 INFO internal.SharedState: loading hive config file: file:/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/hive-site.xml
17/08/18 15:59:27 INFO internal.SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
17/08/18 15:59:27 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.
17/08/18 15:59:27 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26fadd98{/SQL,null,AVAILABLE,@Spark}
17/08/18 15:59:27 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3db6dd52{/SQL/json,null,AVAILABLE,@Spark}
17/08/18 15:59:27 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23ad2d17{/SQL/execution,null,AVAILABLE,@Spark}
17/08/18 15:59:27 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25f0c5e7{/SQL/execution/json,null,AVAILABLE,@Spark}
17/08/18 15:59:27 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18cf5c52{/static/sql,null,AVAILABLE,@Spark}
17/08/18 15:59:28 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.1.0 using file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-logging-1.1.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-exec-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-exec.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-jdbc-1.1.0-cdh5.12.0-standalone.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-jdbc-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-jdbc-standalone.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-jdbc.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-metastore-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-metastore.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-serde-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-serde.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-service-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-service.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/libfb303-0.9.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/libthrift-0.9.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/log4j-1.2.16.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-client.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-hadoop-compat.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-hadoop2-compat.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-protocol.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-server.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/htrace-core.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/ST4-4.0.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/accumulo-core-1.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/accumulo-fate-1.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/accumulo-start-1.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/accumulo-trace-1.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/activation-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/ant-1.9.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/ant-launcher-1.9.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/antlr-2.7.7.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/antlr-runtime-3.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/apache-log4j-extras-1.2.17.jar:file:/opt/cloud
era/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/asm-3.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/asm-commons-3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/asm-tree-3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/avro.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/bonecp-0.8.0.RELEASE.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/calcite-avatica-1.0.0-incubating.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/calcite-core-1.0.0-incubating.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/calcite-linq4j-1.0.0-incubating.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-beanutils-1.9.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-beanutils-core-1.8.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-cli-1.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-codec-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-collections-3.2.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-compiler-2.7.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-compress-1.4.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-configuration-1.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-dbcp-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-digester-1.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-el-1.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-httpclient-3.0.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-io-2.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-lang-2.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-lang3-3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-math-2.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-pool-1.5.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/commons-vfs2-2.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/curator-client-2.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/curator-framework-2.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/curator-recipes-2.6.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/datanucleus-api-jdo-3.2.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/datanucleus-core-3.2.10.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/datanucleus-rdbms-3.2.9.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/derby-10.11.1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/eigenbase-properties-1.1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.
cdh5.12.0.p0.29/lib/hadoop/../hive/lib/findbugs-annotations-1.3.9-1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/groovy-all-2.4.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/gson-2.2.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/guava-14.0.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hamcrest-core-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hbase-annotations.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/high-scale-lib-1.1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-accumulo-handler-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-accumulo-handler.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-ant-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-ant.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-beeline-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-beeline.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-cli-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-cli.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-common-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-contrib-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-contrib.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-hbase-handler-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-hbase-handler.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-hwi-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-hwi.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-0.23-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-0.23.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-common-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-scheduler-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims-scheduler.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-shims.jar:file:/opt/cloudera
/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-testutils.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jamon-runtime-2.3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jackson-xc-1.9.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jackson-databind-2.2.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jackson-annotations-2.2.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/zookeeper.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/velocity-1.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/snappy-java-1.0.4.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/plexus-utils-1.5.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/paranamer-2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/oro-2.0.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/httpclient-4.2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/xz-1.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/tempus-fugit-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/super-csv-2.2.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/stax-api-1.0.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/servlet-api-2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/opencsv-2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/metrics-jvm-3.0.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/metrics-json-3.0.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/metrics-core-3.0.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/maven-scm-provider-svnexe-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/maven-scm-provider-svn-commons-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/maven-scm-api-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/mail-1.4.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/logredactor-1.0.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/junit-4.11.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jta-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jsr305-3.0.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jsp-api-2.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jpam-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/joda-time-1.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jline-2.12.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jetty-all-server-7.6.0.v20120127.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jetty-all-7.6.0.v20120127.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/l
ib/jersey-servlet-1.14.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jersey-server-1.14.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jdo-api-3.0.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jcommander-1.32.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jasper-runtime-5.5.23.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jasper-compiler-5.5.23.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/janino-2.7.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jackson-jaxrs-1.9.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/jackson-core-2.2.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/ivy-2.0.0-rc2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/parquet-hadoop-bundle.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/stringtemplate-3.2.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/regexp-1.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/httpcore-4.2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/../hive/lib/hive-testutils-1.1.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/activation-1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/activation.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/apacheds-i18n-2.0.0-M15.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/apacheds-i18n.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/apacheds-kerberos-codec-2.0.0-M15.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/apacheds-kerberos-codec.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/api-asn1-api-1.0.0-M20.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/api-asn1-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/api-util-1.0.0-M20.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/api-util.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/avro.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/aws-java-sdk-bundle-1.11.134.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/aws-java-sdk-bundle.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/azure-data-lake-store-sdk-2.1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/azure-data-lake-store-sdk.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-beanutils-1.9.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-beanutils-core-1.8.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-beanutils-core.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-beanutils.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-cli-1.2.jar:file:/opt/clo
udera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-cli.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-codec-1.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-codec.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-collections-3.2.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-collections.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-compress-1.4.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-compress.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-configuration-1.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-configuration.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-digester-1.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-digester.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-httpclient-3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-httpclient.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-io-2.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-io.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-lang-2.6.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-lang.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-logging-1.1.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-logging.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-math3-3.1.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-math3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-net-3.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/commons-net.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-client-2.7.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-client.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-framework-2.7.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-framework.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-recipes-2.7.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/curator-recipes.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/gson-2.2.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/gson.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/guava-11.0.2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/guava.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-annotations-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-annotations.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-auth-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-auth.jar:file:/opt/cloudera/parc
els/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-aws-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-aws.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-azure-datalake-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-azure-datalake.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-common-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-hdfs-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-hdfs.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-app-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-app.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-common-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-core-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-core.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-jobclient.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-mapreduce-client-shuffle.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-api-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-client-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-client.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-common-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-server-common-2.6.0-cdh5.12.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/hadoop-yarn-server-common.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/htrace-core4-4.0.1-incubating.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/htrace-core4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/httpclient-4.2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/httpclient.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/httpcore-4.2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/httpcore.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-annotations-2.2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-annotat
ions.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-core-2.2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-core.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-databind-2.2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-databind.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-jaxrs-1.8.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-jaxrs.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-xc-1.8.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jackson-xc.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/zookeeper.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xz.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xz-1.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xmlenc.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xmlenc-0.52.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xml-apis.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xml-apis-1.3.04.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xercesImpl.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/xercesImpl-2.9.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/stax-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/stax-api-1.0-2.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/snappy-java.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/snappy-java-1.0.4.1.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/slf4j-log4j12.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/slf4j-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/slf4j-api-1.7.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/servlet-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/servlet-api-2.5.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/protobuf-java.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/protobuf-java-2.5.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/paranamer.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/paranamer-2.3.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/netty.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/netty-3.10.5.Final.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/log4j.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/log4j-1.2.17.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/leveldbjni-all.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/leveldbjni-all-1.8.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jsr305.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jsr305-3.0.0.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/je
tty-util.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jetty-util-6.1.26.cloudera.4.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jersey-core.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jersey-core-1.9.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jersey-client.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jersey-client-1.9.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jaxb-api.jar:file:/opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/hadoop/client/jaxb-api-2.2.2.jar
17/08/18 15:59:29 INFO session.SessionState: Created local directory: /tmp/1142bed0-6a21-4016-88a2-2268c070d3b0_resources
17/08/18 15:59:29 INFO session.SessionState: Created HDFS directory: /tmp/hive/akhanolk/1142bed0-6a21-4016-88a2-2268c070d3b0
17/08/18 15:59:29 INFO session.SessionState: Created local directory: /tmp/akhanolk/1142bed0-6a21-4016-88a2-2268c070d3b0
17/08/18 15:59:29 INFO session.SessionState: Created HDFS directory: /tmp/hive/akhanolk/1142bed0-6a21-4016-88a2-2268c070d3b0/_tmp_space.db
17/08/18 15:59:29 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
17/08/18 15:59:29 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.1.0) is /user/hive/warehouse
17/08/18 15:59:29 INFO hive.metastore: Trying to connect to metastore with URI thrift://cdh-mn-2b4cb552.cdh-cluster.dev:9083
17/08/18 15:59:29 INFO hive.metastore: Opened a connection to metastore, current connections: 1
17/08/18 15:59:29 INFO hive.metastore: Connected to metastore.
17/08/18 15:59:29 INFO session.SessionState: Created local directory: /tmp/0e5ab2ac-30ca-4df8-bfaa-f25d8d18a0f2_resources
17/08/18 15:59:29 INFO session.SessionState: Created HDFS directory: /tmp/hive/akhanolk/0e5ab2ac-30ca-4df8-bfaa-f25d8d18a0f2
17/08/18 15:59:29 INFO session.SessionState: Created local directory: /tmp/akhanolk/0e5ab2ac-30ca-4df8-bfaa-f25d8d18a0f2
17/08/18 15:59:29 INFO session.SessionState: Created HDFS directory: /tmp/hive/akhanolk/0e5ab2ac-30ca-4df8-bfaa-f25d8d18a0f2/_tmp_space.db
17/08/18 15:59:29 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
17/08/18 15:59:29 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.1.0) is /user/hive/warehouse
17/08/18 15:59:29 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Exception in thread "main" java.io.FileNotFoundException: File /user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format/batch_id=*/* does not exist.
at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:744)
at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:110)
at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:805)
at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:801)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:801)
at com.chamberlain.bda.util.CompactParsedLogs$.main(CompactParsedLogs.scala:47)
at com.chamberlain.bda.util.CompactParsedLogs.main(CompactParsedLogs.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Created 09-09-2017 10:15 AM
Hi,
Could you please try creating an external table on top of the master directory, setting the Hive properties below, and reading the table through HiveContext? A sketch follows the properties.
SET mapred.input.dir.recursive=true;
SET hive.mapred.supports.subdirectories=true;
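For illustration, a minimal sketch of this approach in Spark 2.x (table name and column list are placeholders; adjust to the real Parquet schema, and verify that these Hive flags fully apply in your setup):
// Sketch only: external table over the master directory, then read it via SparkSession with Hive support
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("yarn").enableHiveSupport().getOrCreate()

// Hive settings for reading nested subdirectories
spark.sql("SET mapred.input.dir.recursive=true")
spark.sql("SET hive.mapred.supports.subdirectories=true")

// Placeholder table and columns; replace with the actual schema of the Parquet files
spark.sql(
  """CREATE EXTERNAL TABLE IF NOT EXISTS myq_app_logs_flat (payload STRING)
    |STORED AS PARQUET
    |LOCATION '/user/akhanolk/data/myq/parsed/myq-app-logs/to-be-compacted/flat-view-format'""".stripMargin)

val df = spark.sql("SELECT * FROM myq_app_logs_flat")
df.show(5)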
I hope this works.
Thanks,
Manu