Hive query returns no results on S3 mapped table

Contributor

I mapped an external table to an S3 bucket using s3a, but my queries return no results.

What am I doing wrong?

 

Details below

I am on CDH5.4.8-Packaging-Hive-2015-10-15_08-45-27

hive-1.1.0+cdh5.4.8+275-1.cdh5.4.8.p0.5~trusty -r Unknown

I followed the solution on https://community.cloudera.com/t5/Batch-SQL-Apache-Hive/Problems-with-Hive-and-S3/td-p/1799

as well as the instructions on http://www.cloudera.com/content/www/en-us/documentation/enterprise/latest/topics/impala_s3.html

and added the following properties:

 

<property>
  <name>fs.s3a.access.key</name>
  <value>MY_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>MY_SECRET_KEY</value>
</property>

Note: my secret key does contain a slash character.
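One quick sanity check for the credentials (a sketch, assuming the s3a connector jars are on the Hive client classpath) is to list the bucket with the Hive CLI's built-in dfs command, which reads the same Hadoop configuration the queries use:

-- Run from the Hive CLI. A credential problem typically fails fast here
-- with an access-denied error, instead of a query that silently returns
-- zero rows.
dfs -ls s3a://archivelogs/;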

 

I placed the XML snippet into the HDFS configuration, in both:

Cluster-wide Advanced Configuration Snippet (Safety Valve) for core-site.xml
and
HDFS Client Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml
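Assuming fs.s3a.access.key and fs.s3a.secret.key are not on hive.conf.restricted.list in this release, the same credentials can also be set per-session from the Hive CLI, which is a convenient way to test key changes without redeploying client configuration:

-- Per-session equivalent of the core-site.xml snippet above
-- (values are placeholders, as in the XML).
SET fs.s3a.access.key=MY_ACCESS_KEY;
SET fs.s3a.secret.key=MY_SECRET_KEY;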

 

 

I created the table as follows:

 

CREATE EXTERNAL TABLE s3bucket.archived_logs (
  logstring string
)
PARTITIONED BY (host string, year string, month string, day string)
LOCATION 's3a://archivelogs/';

 

 

My AWS S3 bucket path is:

 

All Buckets /archivelogs/host=someServer.com/year=2016/month=01/day=06

Inside are gzip files that are server-side encrypted with AWS AES-256.
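For reference, the metastore's recorded view of the table (its LOCATION, partition columns, and SerDe) can be inspected with standard HiveQL:

USE s3bucket;
-- Shows the LOCATION and partition columns the metastore has recorded.
DESCRIBE FORMATTED archived_logs;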

 

 

No errors occurred during table creation,

but when I run a simple query like:

 

select
*
from s3bucket.archived_logs
where host='someServer.com'
  and year='2016'
  and month='01'
  and day='06'
limit 10;

I get no results

 

I tried running the query in Impala and in Hive verbose mode, and cannot see any errors.

 

Hive debug logs for the query are enclosed below:

16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: DEBUG parse.VariableSubstitution: Substitution is on: select
*
from s3bucket.archived_logs
where host='someServer.com'
  and year='2016'
  and month='01'
  and day='06'
limit 10
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO parse.ParseDriver: Parsing command: select
*
from s3bucket.archived_logs
where host='someServer.com'
  and year='2016'
  and month='01'
  and day='06'
limit 10
16/01/16 03:15:18 [main]: INFO parse.ParseDriver: Parse Completed
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=parse start=1452914118111 end=1452914118135 duration=24 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Starting Semantic Analysis
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Completed phase 1 of Semantic Analysis
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Get metadata for source tables
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Get metadata for subqueries
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Get metadata for destination tables
16/01/16 03:15:18 [main]: DEBUG hdfs.DFSClient: /tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1: masked=rwx------
16/01/16 03:15:18 [main]: DEBUG ipc.Client: The ping interval is 60000 ms.
16/01/16 03:15:18 [main]: DEBUG ipc.Client: Connecting to hadoop2-private.com/HadoopClient:8020
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple: starting, having connections 1
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #9
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #9
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 3ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #10
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #10
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [main]: INFO ql.Context: New scratch dir is hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Completed getting MetaData in Semantic Analysis
16/01/16 03:15:18 [main]: INFO parse.BaseSemanticAnalyzer: Not invoking CBO because the statement has too few joins
16/01/16 03:15:18 [main]: DEBUG hive.log: DDL: struct archived_logs { string logstring}
16/01/16 03:15:18 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[logstring] columnTypes=[string] separator=[[B@6ac97a40] nullstring=\N lastColumnTakesRest=false timestampFormats=null
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Table Plan for archived_logs TS[0]
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Filter Plan for null row schema: archived_logs{(logstring,logstring: string)(host,host: string)(year,year: string)(month,month: string)(day,day: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>)}
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: RR before GB archived_logs{(logstring,logstring: string)(host,host: string)(year,year: string)(month,month: string)(day,day: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  after GB archived_logs{(logstring,logstring: string)(host,host: string)(year,year: string)(month,month: string)(day,day: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>)}
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: tree: (TOK_SELECT (TOK_SELEXPR TOK_ALLCOLREF))
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: genSelectPlan: input = archived_logs{(logstring,logstring: string)(host,host: string)(year,year: string)(month,month: string)(day,day: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>)}  starRr = null
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Select Plan row schema: archived_logs{(logstring,_col0: string)(host,_col1: string)(year,_col2: string)(month,_col3: string)(day,_col4: string)}
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Select Plan for clause: insclause-0
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created LimitOperator Plan for clause: insclause-0 row schema: archived_logs{(logstring,_col0: string)(host,_col1: string)(year,_col2: string)(month,_col3: string)(day,_col4: string)}
16/01/16 03:15:18 [main]: DEBUG ql.Context: Created staging dir = hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000/.hive-staging_hive_2016-01-16_03-15-18_111_3767039462042709885-1 for path = hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000
16/01/16 03:15:18 [main]: INFO common.FileUtils: Creating directory if it doesn't exist: hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000/.hive-staging_hive_2016-01-16_03-15-18_111_3767039462042709885-1
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #11
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #11
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #12
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #12
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 0ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #13
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #13
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 0ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #14
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #14
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [main]: DEBUG hdfs.DFSClient: /tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000/.hive-staging_hive_2016-01-16_03-15-18_111_3767039462042709885-1: masked=rwxr-xr-x
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #15
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #15
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #16
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #16
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [main]: DEBUG shims.HadoopShimsSecure: {-chgrp,-R,supergroup,hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000}
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #17
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #17
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #18
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #18
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getListing took 0ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #19
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #19
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getListing took 0ms
16/01/16 03:15:18 [main]: DEBUG shims.HadoopShimsSecure: Return value is :0
16/01/16 03:15:18 [main]: DEBUG shims.HadoopShimsSecure: {-chmod,-R,700,hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000}
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #20
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #20
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #21
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #21
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: setPermission took 0ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #22
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #22
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #23
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #23
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: setPermission took 1ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #24
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #24
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
16/01/16 03:15:18 [main]: DEBUG shims.HadoopShimsSecure: Return value is :0
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #25
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #25
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [main]: DEBUG shims.HadoopShimsSecure: FileStatus{path=hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000; isDirectory=true; modification_time=1452914118404; access_time=0; owner=ripple; group=supergroup; permission=rwx------; isSymlink=false}
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #26
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #26
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/01/16 03:15:18 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[_col0, _col1, _col2, _col3, _col4] columnTypes=[string, string, string, string, string] separator=[[B@674f9e04] nullstring=\N lastColumnTakesRest=false timestampFormats=null
16/01/16 03:15:18 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[_col0, _col1, _col2, _col3, _col4] columnTypes=[string, string, string, string, string] separator=[[B@1c0f4d99] nullstring=\N lastColumnTakesRest=false timestampFormats=null
16/01/16 03:15:18 [main]: INFO parse.CalcitePlanner: Set stats collection dir : hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000/.hive-staging_hive_2016-01-16_03-15-18_111_3767039462042709885-1/-ext-10002
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created FileSink Plan for clause: insclause-0dest_path: hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000 row schema: archived_logs{(logstring,_col0: string)(host,_col1: string)(year,_col2: string)(month,_col3: string)(day,_col4: string)}
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Body Plan for Query Block null
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Created Plan for Query Block null
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: Before logical optimization
TS[0]-FIL[1]-SEL[2]-LIM[3]-FS[4]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:TS[0] with rs:(logstring: string|{archived_logs}logstring,host: string|{archived_logs}host,year: string|{archived_logs}year,month: string|{archived_logs}month,day: string|{archived_logs}day,BLOCK__OFFSET__INSIDE__FILE: bigint|{archived_logs}block__offset__inside__file,INPUT__FILE__NAME: string|{archived_logs}input__file__name,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>|{archived_logs}row__id)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator TS[0]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:FIL[1] with rs:(logstring: string|{archived_logs}logstring,host: string|{archived_logs}host,year: string|{archived_logs}year,month: string|{archived_logs}month,day: string|{archived_logs}day,BLOCK__OFFSET__INSIDE__FILE: bigint|{archived_logs}block__offset__inside__file,INPUT__FILE__NAME: string|{archived_logs}input__file__name,ROW__ID: struct<transactionid:bigint,bucketid:int,rowid:bigint>|{archived_logs}row__id)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator FIL[1]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Old filter FIL[1] conditions:((((host = 'someServer.com') and (year = '2016')) and (month = '01')) and (day = '06'))
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Filter org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual@7c306ab8 is identified as a value assignment, propagate it.
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Filter org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual@4eae95ba is identified as a value assignment, propagate it.
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Filter org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual@27e80eb1 is identified as a value assignment, propagate it.
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Filter org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual@741c9aee is identified as a value assignment, propagate it.
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: New filter FIL[1] conditions:((((host = 'someServer.com') and (year = '2016')) and (month = '01')) and (day = '06'))
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column host: string with constant Const string someServer.com in FIL[1]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column year: string with constant Const string 2016 in FIL[1]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column month: string with constant Const string 01 in FIL[1]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column day: string with constant Const string 06 in FIL[1]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:SEL[2] with rs:(_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.month as archived_logs.month with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.host as archived_logs.host with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.year as archived_logs.year with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.day as archived_logs.day with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [_col1: string, _col3: string, _col2: string, _col4: string] to operator SEL[2]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col1: string with constant Const string someServer.com in SEL[2]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col2: string with constant Const string 2016 in SEL[2]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col3: string with constant Const string 01 in SEL[2]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col4: string with constant Const string 06 in SEL[2]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Folding expression:Column[host] -> Const string someServer.com
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Folding expression:Column[year] -> Const string 2016
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Folding expression:Column[month] -> Const string 01
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Folding expression:Column[day] -> Const string 06
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: New column list:(Column[logstring] Const string someServer.com Const string 2016 Const string 01 Const string 06)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:LIM[3] with rs:(_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.host as archived_logs.host with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.month as archived_logs.month with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.year as archived_logs.year with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Resolved archived_logs.day as archived_logs.day with rs: (_col0: string|{archived_logs}logstring,_col1: string|{archived_logs}host,_col2: string|{archived_logs}year,_col3: string|{archived_logs}month,_col4: string|{archived_logs}day)
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [_col1: string, _col3: string, _col2: string, _col4: string] to operator LIM[3]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col1: string with constant Const string someServer.com in LIM[3]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col2: string with constant Const string 2016 in LIM[3]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col3: string with constant Const string 01 in LIM[3]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcFactory: Replacing column _col4: string with constant Const string 06 in LIM[3]
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:FS[4] with rs:(_col0: string|{},_col1: string|{},_col2: string|{},_col3: string|{},_col4: string|{})
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Can't resolve archived_logs.host(_col1) from rs:(_col0: string|{},_col1: string|{},_col2: string|{},_col3: string|{},_col4: string|{})
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Can't resolve archived_logs.month(_col3) from rs:(_col0: string|{},_col1: string|{},_col2: string|{},_col3: string|{},_col4: string|{})
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Can't resolve archived_logs.year(_col2) from rs:(_col0: string|{},_col1: string|{},_col2: string|{},_col3: string|{},_col4: string|{})
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Can't resolve archived_logs.day(_col4) from rs:(_col0: string|{},_col1: string|{},_col2: string|{},_col3: string|{},_col4: string|{})
16/01/16 03:15:18 [main]: DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator FS[4]
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Processing for FS(4)
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Processing for LIM(3)
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Processing for SEL(2)
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Processing for FIL(1)
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Pushdown Predicates of FIL For Alias : archived_logs
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(host = 'someServer.com')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(year = '2016')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(month = '01')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(day = '06')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Processing for TS(0)
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: Pushdown Predicates of TS For Alias : archived_logs
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(host = 'someServer.com')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(year = '2016')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(month = '01')
16/01/16 03:15:18 [main]: INFO ppd.OpProcFactory: 	(day = '06')
16/01/16 03:15:18 [main]: DEBUG ppd.PredicatePushDown: After PPD:
TS[0]-FIL[5]-SEL[2]-LIM[3]-FS[4] 
... truncated to meet limit ...
16/01/16 03:15:18 [main]: INFO ql.Driver: Semantic Analysis Completed
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: validation start
16/01/16 03:15:18 [main]: DEBUG parse.CalcitePlanner: not validating writeEntity, because entity is neither table nor partition
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1452914118135 end=1452914118708 duration=573 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[logstring] columnTypes=[string] separator=[[B@6c817f5b] nullstring=\N lastColumnTakesRest=false timestampFormats=null
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: Initializing Self TS[0]
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: Operator 0 TS initialized
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: Initializing children of 0 TS
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: Initializing child 5 FIL
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: Initializing Self FIL[5]
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: Operator 5 FIL initialized
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: Initializing children of 5 FIL
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: Initializing child 2 SEL
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: Initializing Self SEL[2]
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: SELECT struct<logstring:string,host:string,year:string,month:string,day:string>
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: Operator 2 SEL initialized
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: Initializing children of 2 SEL
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: Initializing child 3 LIM
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: Initializing Self LIM[3]
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: Operator 3 LIM initialized
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: Initializing children of 3 LIM
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: Initializing child 6 OP
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: Initializing Self OP[6]
16/01/16 03:15:18 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.DelimitedJSONSerDe initialized with: columnNames=[] columnTypes=[] separator=[[B@3177d830] nullstring=NULL lastColumnTakesRest=false timestampFormats=null
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: Operator 6 OP initialized
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: Initialization Done 6 OP
16/01/16 03:15:18 [main]: INFO Configuration.deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: Initialization Done 3 LIM
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: Initialization Done 2 SEL
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: Initialization Done 5 FIL
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: Initialization Done 0 TS
16/01/16 03:15:18 [main]: INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:archived_logs.logstring, type:string, comment:null), FieldSchema(name:archived_logs.host, type:string, comment:null), FieldSchema(name:archived_logs.year, type:string, comment:null), FieldSchema(name:archived_logs.month, type:string, comment:null), FieldSchema(name:archived_logs.day, type:string, comment:null)], properties:null)
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=compile start=1452914118110 end=1452914118732 duration=622 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: DEBUG lockmgr.DummyTxnManager: Adding s3bucket@archived_logs to list of lock inputs
16/01/16 03:15:18 [main]: DEBUG lockmgr.DummyTxnManager: Adding hdfs://hadoop2-private.com:8020/tmp/hive/ripple/2e879c0e-ddcf-4dd8-973f-a4b1abd2cfe1/hive_2016-01-16_03-15-18_111_3767039462042709885-1/-mr-10000 to list of lock outputs
16/01/16 03:15:18 [main]: INFO ZooKeeperHiveLockManager: Acquiring lock for s3bucket with mode IMPLICIT
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 2,1 replyHeader:: 2,176093736733,0 request:: '/hive_zookeeper_namespace_hive/s3bucket,,v{s{31,s{'world,'anyone}}},0 response:: '/hive_zookeeper_namespace_hive/s3bucket
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Got ping response for sessionid: 0x15247ad1e7f01a6 after 2ms
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 3,1 replyHeader:: 3,176093736734,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/LOCK-SHARED-,#726970706c655f32303136303131363033313531355f36376162616537662d343139322d346338302d386464642d3331376564323239313332353a313435323931343131383733343a494d504c494349543a73656c656374a2aa66726f6d2073336275636b65742e73335f726970706c65645f6c6f6773a776865726520686f73743d276e6a6330312e726970706c652e636f6d27a2020616e6420796561723d273230313627a2020616e64206d6f6e74683d27303127a2020616e64206461793d27303627a6c696d69742031303a31302e3132352e3132322e323337,v{s{31,s{'world,'anyone}}},3 response:: '/hive_zookeeper_namespace_hive/s3bucket/LOCK-SHARED-0000000000
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 4,12 replyHeader:: 4,176093736734,0 request:: '/hive_zookeeper_namespace_hive/s3bucket,F response:: v{'LOCK-SHARED-0000000000},s{176093736733,176093736733,1452914118738,1452914118738,0,1,0,0,0,1,176093736734}
16/01/16 03:15:18 [main]: INFO ZooKeeperHiveLockManager: Acquiring lock for s3bucket/archived_logs with mode IMPLICIT
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 5,1 replyHeader:: 5,176093736735,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs,,v{s{31,s{'world,'anyone}}},0 response:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 6,1 replyHeader:: 6,176093736736,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs/LOCK-SHARED-,#726970706c655f32303136303131363033313531355f36376162616537662d343139322d346338302d386464642d3331376564323239313332353a313435323931343131383733343a494d504c494349543a73656c656374a2aa66726f6d2073336275636b65742e73335f726970706c65645f6c6f6773a776865726520686f73743d276e6a6330312e726970706c652e636f6d27a2020616e6420796561723d273230313627a2020616e64206d6f6e74683d27303127a2020616e64206461793d27303627a6c696d69742031303a31302e3132352e3132322e323337,v{s{31,s{'world,'anyone}}},3 response:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs/LOCK-SHARED-0000000000
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 7,12 replyHeader:: 7,176093736736,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs,F response:: v{'LOCK-SHARED-0000000000},s{176093736735,176093736735,1452914118749,1452914118749,0,1,0,0,0,1,176093736736}
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=acquireReadWriteLocks start=1452914118732 end=1452914118753 duration=21 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO ql.Driver: Starting command(queryId=ripple_20160116031515_67abae7f-4192-4c80-8ddd-317ed2291325): select * from s3bucket.archived_logs where host='someServer.com' and year='2016' and month='01' and day='06' limit 10
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1452914118110 end=1452914118755 duration=645 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=runTasks start=1452914118755 end=1452914118755 duration=0 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1452914118753 end=1452914118755 duration=2 from=org.apache.hadoop.hive.ql.Driver>
OK
16/01/16 03:15:18 [main]: INFO ql.Driver: OK
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO ZooKeeperHiveLockManager: about to release lock for s3bucket/archived_logs
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 8,2 replyHeader:: 8,176093736737,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs/LOCK-SHARED-0000000000,-1 response:: null
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 9,12 replyHeader:: 9,176093736737,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs,F response:: v{},s{176093736735,176093736735,1452914118749,1452914118749,0,2,0,0,0,0,176093736737}
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 10,2 replyHeader:: 10,176093736738,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/archived_logs,-1 response:: null
16/01/16 03:15:18 [main]: INFO ZooKeeperHiveLockManager: about to release lock for s3bucket
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 11,2 replyHeader:: 11,176093736739,0 request:: '/hive_zookeeper_namespace_hive/s3bucket/LOCK-SHARED-0000000000,-1 response:: null
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 12,12 replyHeader:: 12,176093736739,0 request:: '/hive_zookeeper_namespace_hive/s3bucket,F response:: v{},s{176093736733,176093736733,1452914118738,1452914118738,0,4,0,0,0,0,176093736739}
16/01/16 03:15:18 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x15247ad1e7f01a6, packet:: clientPath:null serverPath:null finished:false header:: 13,2 replyHeader:: 13,176093736740,0 request:: '/hive_zookeeper_namespace_hive/s3bucket,-1 response:: null
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1452914118755 end=1452914118765 duration=10 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.run start=1452914118110 end=1452914118765 duration=655 from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: 0 finished. closing...
16/01/16 03:15:18 [main]: DEBUG exec.TableScanOperator: Closing child = FIL[5]
16/01/16 03:15:18 [main]: DEBUG exec.FilterOperator: allInitializedParentsAreClosed? parent.state = CLOSE
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: 5 finished. closing...
16/01/16 03:15:18 [main]: DEBUG exec.FilterOperator: Closing child = SEL[2]
16/01/16 03:15:18 [main]: DEBUG exec.SelectOperator: allInitializedParentsAreClosed? parent.state = CLOSE
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: 2 finished. closing...
16/01/16 03:15:18 [main]: DEBUG exec.SelectOperator: Closing child = LIM[3]
16/01/16 03:15:18 [main]: DEBUG exec.LimitOperator: allInitializedParentsAreClosed? parent.state = CLOSE
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: 3 finished. closing...
16/01/16 03:15:18 [main]: DEBUG exec.LimitOperator: Closing child = OP[6]
16/01/16 03:15:18 [main]: DEBUG exec.ListSinkOperator: allInitializedParentsAreClosed? parent.state = CLOSE
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: 6 finished. closing...
16/01/16 03:15:18 [main]: INFO exec.ListSinkOperator: 6 Close done
16/01/16 03:15:18 [main]: INFO exec.LimitOperator: 3 Close done
16/01/16 03:15:18 [main]: INFO exec.SelectOperator: 2 Close done
16/01/16 03:15:18 [main]: INFO exec.FilterOperator: 5 Close done
16/01/16 03:15:18 [main]: INFO exec.TableScanOperator: 0 Close done
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #27
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #27
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: delete took 1ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #28
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #28
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: delete took 1ms
16/01/16 03:15:18 [IPC Parameter Sending Thread #0]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple sending #29
16/01/16 03:15:18 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple got value #29
16/01/16 03:15:18 [main]: DEBUG ipc.ProtobufRpcEngine: Call: delete took 1ms
Time taken: 0.655 seconds
16/01/16 03:15:18 [main]: INFO CliDriver: Time taken: 0.655 seconds
16/01/16 03:15:18 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/01/16 03:15:18 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1452914118773 end=1452914118774 duration=1 from=org.apache.hadoop.hive.ql.Driver>
hive>
16/01/16 03:15:28 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple: closed
16/01/16 03:15:28 [IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple]: DEBUG ipc.Client: IPC Client (591425495) connection to hadoop2-private.com/HadoopClient:8020 from ripple: stopped, remaining connections 0
16/01/16 03:16:12 [main-SendThread(hadoop5-private.com:2181)]: DEBUG zookeeper.ClientCnxn: Got ping response for sessionid: 0x15247ad1e7f01a6 after 0ms

 



 


Re: Hive query returns no results on S3 mapped table

Master Guru
Does your partition show up if you do "SHOW PARTITIONS archived_logs"? If not, you may want to run "MSCK REPAIR TABLE archived_logs" first.

Hive will not automatically pick up a provided HDFS/S3 path as a partition unless the partition is explicitly defined in the metastore.
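To make that concrete, here is a minimal sketch of the repair in standard HiveQL, with the table and partition values taken from the post above. If MSCK has trouble crawling an s3a location on this Hive version, the ALTER TABLE route registers a partition by hand:

USE s3bucket;
-- 1. See which partitions the metastore currently knows about
--    (empty output means none are registered).
SHOW PARTITIONS archived_logs;

-- 2. Scan the table's LOCATION for host=/year=/month=/day= directories
--    and register any partitions missing from the metastore.
MSCK REPAIR TABLE archived_logs;

-- Alternative: register a single partition explicitly.
ALTER TABLE archived_logs ADD IF NOT EXISTS
  PARTITION (host='someServer.com', year='2016', month='01', day='06')
  LOCATION 's3a://archivelogs/host=someServer.com/year=2016/month=01/day=06';

Once the partitions are registered, the SELECT from the original post should start returning rows.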