HIVE show table throws error
Labels: Apache Hive, Apache Ranger
Created ‎01-24-2017 02:45 PM
Team,
I am getting the below error while executing show tables:
0: jdbc:hive2://rwlp508.rw.discoverfinancial.> show tables ;
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: We are setting the hadoop caller context from HIVE_SSN_ID:6805bcb1-84bb-4c18-a394-4407e14bf3f4 to hive_20170124090139_016247db-7df1-4563-af40-0b3256f66efe
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO parse.ParseDriver: Parsing command: show tables
17/01/24 09:01:39 INFO parse.ParseDriver: Parse Completed
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=parse start=1485266499788 end=1485266499789 duration=1 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO metastore.HiveMetaStore: 6: get_database: default
17/01/24 09:01:39 INFO HiveMetaStore.audit: ugi=hive/rw***.COM ip=unknown-ip-addr cmd=get_database: default
17/01/24 09:01:39 INFO ql.Driver: Semantic Analysis Completed
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1485266499789 end=1485266499800 duration=11 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO exec.ListSinkOperator: Initializing operator OP[17]
17/01/24 09:01:39 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=doAuthorization from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=doAuthorization start=1485266499801 end=1485266499802 duration=1 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=compile start=1485266499787 end=1485266499802 duration=15 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: We are resetting the hadoop caller context to HIVE_SSN_ID:6805bcb1-84bb-4c18-a394-4407e14bf3f4
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: Setting caller context to query id hive_20170124090139_016247db-7df1-4563-af40-0b3256f66efe
17/01/24 09:01:39 INFO ql.Driver: Starting command(queryId=hive_20170124090139_016247db-7df1-4563-af40-0b3256f66efe): show tables
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=PreHook.org.apache.hadoop.hive.ql.security.authorization.plugin.DisallowTransformHook from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=PreHook.org.apache.hadoop.hive.ql.security.authorization.plugin.DisallowTransformHook start=1485266499804 end=1485266499804 duration=0 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1485266499803 end=1485266499804 duration=1 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: Starting task [Stage-0:DDL] in serial mode
17/01/24 09:01:39 INFO metastore.HiveMetaStore: 8: get_database: default
17/01/24 09:01:39 INFO HiveMetaStore.audit: ugi=hive/rwl***.COM ip=unknown-ip-addr cmd=get_database: default
17/01/24 09:01:39 INFO metastore.HiveMetaStore: 8: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/01/24 09:01:39 INFO metastore.ObjectStore: ObjectStore, initialize called
17/01/24 09:01:39 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is ORACLE
17/01/24 09:01:39 INFO metastore.ObjectStore: Initialized ObjectStore
17/01/24 09:01:39 INFO metadata.HiveUtils: Adding metastore authorization provider: org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider
17/01/24 09:01:39 INFO metastore.HiveMetaStore: 8: get_tables: db=default pat=.*
17/01/24 09:01:39 ERROR authorizer.RangerHiveAuthorizer: filterListCmdObjects: Internal error: null RangerAccessResult object received back from isAccessAllowed()!
(the above ERROR line repeats 19 times)
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=runTasks start=1485266499804 end=1485266499862 duration=58 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO hooks.ATSHook: Created ATS Hook
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook start=1485266499862 end=1485266499863 duration=1 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: Resetting the caller context to HIVE_SSN_ID:6805bcb1-84bb-4c18-a394-4407e14bf3f4
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1485266499803 end=1485266499863 duration=60 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO ql.Driver: OK
17/01/24 09:01:39 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1485266499863 end=1485266499863 duration=0 from=org.apache.hadoop.hive.ql.Driver>
17/01/24 09:01:39 INFO log.PerfLogger: </PERFLOG method=Driver.run start=1485266499803 end=1485266499864 duration=61 from=org.apache.hadoop.hive.ql.Driver>
+-----------+--+
| tab_name |
+-----------+--+
+-----------+--+
No rows selected (0.159 seconds)
Created ‎01-24-2017 03:12 PM
It appears a Ranger policy may be preventing access to the tables. You can either disable Ranger authorization for Hive through the Hive configs in Ambari, or review the Hive Ranger policies and grant the appropriate authorization. This HCC thread has some additional information: https://community.hortonworks.com/questions/64345/how-to-add-another-hiveserver-for-current-metastor...
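If you prefer to review the Hive policies without clicking through the UI, you can query the Ranger Admin public REST API. This is only a sketch: the host, port, credentials, and the Hive service/repository name (hivedev here) are assumptions you must replace with your own values.

```shell
# List all Ranger policies for the Hive service repository.
# Replace ranger-admin-host, credentials, and serviceName with your own;
# the repository name is shown in Ranger Admin under Access Manager.
curl -s -u admin:admin \
  "http://ranger-admin-host:6080/service/public/v2/api/policy?serviceName=hivedev" \
  | python -m json.tool

# In the output, check that the user running "show tables" appears in at
# least one enabled policy granting "select" on the database in question.
```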
Created ‎01-24-2017 03:15 PM
Yes, Ranger is implemented for Hive. How do I solve this without disabling Ranger?
Created ‎01-24-2017 06:01 PM
Check whether there are any localhost entries under the Ranger configs in Ambari and change them to the appropriate hostname. Also check that the HDFS plugin is correctly installed with all necessary access.
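The checks above can be sketched as commands run on the HiveServer2 host. The config file paths follow the usual HDP layout, and the repository name (hivedev) is an assumption; adjust both for your cluster.

```shell
# 1. Look for stray localhost entries in the Ranger plugin configs for Hive
#    (e.g. ranger.plugin.hive.policy.rest.url pointing at localhost instead
#    of the real Ranger Admin hostname).
grep -ri "localhost" \
  /etc/hive/conf/ranger-hive-security.xml \
  /etc/hive/conf/ranger-hive-audit.xml

# 2. Verify the plugin has actually downloaded policies from Ranger Admin.
#    An empty or missing policy cache commonly goes with the
#    "null RangerAccessResult" error, because the authorizer has no
#    policies to evaluate against.
ls -l /etc/ranger/hivedev/policycache/
```

If the cache directory is empty, fix the policy REST URL (step 1), restart HiveServer2, and confirm the repository's test connection succeeds in Ranger Admin.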
