Member since 05-19-2018 | 4 Posts | 0 Kudos Received | 0 Solutions
05-22-2019
08:55 PM
The rangerusersync password seems to have two parts:
1. Easy to change (used for API/UI authentication) - can be changed via an API call by another admin-level user.
2. Hard to change - the JCEKS file?
If I change part 1, the usersync service fails to sync users from LDAP (I am guessing that, to map users, it retrieves the password from the JCEKS store and then tries to use that password against the API). How do I change the password in part 2 so that the usersync service still works?
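For part 2, a minimal sketch of what usually has to be done, assuming a standard Ranger usersync install - the alias name, keystore path, and restart command below are assumptions, so check ranger.usersync.policymgr.keystore and ranger.usersync.policymgr.alias in ranger-ugsync-site.xml for the real values:
# Assumed alias and keystore -- verify against ranger-ugsync-site.xml before running
NEW_PASS='NewRangerUsersyncPassword1'
JCEKS=/etc/ranger/usersync/conf/rangerusersync.jceks
ALIAS=ranger.usersync.policymgr.password
# Re-create the alias with the new password using the Hadoop credential CLI
hadoop credential delete "$ALIAS" -f -provider "jceks://file$JCEKS"
hadoop credential create "$ALIAS" -value "$NEW_PASS" -provider "jceks://file$JCEKS"
# Restart usersync so it picks up the updated credential (service/script name varies by install)
ranger-usersync restart
The idea is to change the rangerusersync password on the Ranger Admin side first (part 1), then write the same value into the usersync JCEKS entry (part 2), then restart the usersync service.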
Labels: Apache Ranger
07-25-2018
11:12 AM
Env: no Kerberos, no Ranger, no HDFS; EC2 with SSL. Getting this error after running $ATLAS_HOME/bin/quick_start.py https://$componentPrivateDNSRecord:21443 with the correct user/pass:
Creating sample types:
Created type [DB]
Created type [Table]
Created type [StorageDesc]
Created type [Column]
Created type [LoadProcess]
Created type [View]
Created type [JdbcAccess]
Created type [ETL]
Created type [Metric]
Created type [PII]
Created type [Fact]
Created type [Dimension]
Created type [Log Data]
Creating sample entities:
Exception in thread "main" com.sun.jersey.api.client.ClientHandlerException: java.net.SocketTimeoutException: Read timed out
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:155)
at com.sun.jersey.api.client.filter.HTTPBasicAuthFilter.handle(HTTPBasicAuthFilter.java:105)
at com.sun.jersey.api.client.Client.handle(Client.java:652)
at com.sun.jersey.api.client.WebResource.handle(WebResource.java:682)
at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:634)
at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:334)
at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:311)
at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:199)
at org.apache.atlas.AtlasClientV2.createEntity(AtlasClientV2.java:277)
at org.apache.atlas.examples.QuickStartV2.createInstance(QuickStartV2.java:339)
at org.apache.atlas.examples.QuickStartV2.createDatabase(QuickStartV2.java:362)
at org.apache.atlas.examples.QuickStartV2.createEntities(QuickStartV2.java:268)
at org.apache.atlas.examples.QuickStartV2.runQuickstart(QuickStartV2.java:150)
at org.apache.atlas.examples.QuickStartV2.main(QuickStartV2.java:132)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at sun.security.ssl.InputRecord.readFully(InputRecord.java:465)
at sun.security.ssl.InputRecord.read(InputRecord.java:503)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:983)
at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:940)
at sun.security.ssl.AppInputStream.read(AppInputStream.java:105)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1587)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:347)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:253)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:153)
... 14 more
No sample data added to Apache Atlas Server.
Relevant code: https://github.com/apache/incubator-atlas/blob/master/webapp/src/main/java/org/apache/atlas/examples/QuickStartV2.java
// This works
quickStartV2.createTypes();
// This errors
quickStartV2.createEntities();
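One thing that may be worth ruling out while debugging: the Read timed out above is raised on the client side, so temporarily bumping the Atlas client timeouts at least tells you whether the server eventually answers the entity POST or hangs outright. The property names are the standard Atlas client keys; the values are just illustrative:
# Check (and optionally raise) the client-side timeouts before re-running quick_start.py
grep -E 'atlas.client.(read|connect)TimeoutMSecs' $ATLAS_HOME/conf/atlas-application.properties
# e.g. atlas.client.readTimeoutMSecs=300000
#      atlas.client.connectTimeoutMSecs=60000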
First I thought Atlas -> Kafka connectivity was the issue, but then I see:
[ec2-user@ip-10-160-187-181 logs]$ cat atlas_kafka_setup.log
2018-07-25 00:06:14,923 INFO - [main:] ~ Looking for atlas-application.properties in classpath (ApplicationProperties:78)
2018-07-25 00:06:14,926 INFO - [main:] ~ Loading atlas-application.properties from file:/home/ec2-user/atlas/distro/target/apache-atlas-1.0.0-SNAPSHOT-bin/apache-atlas-1.0.0-SNAPSHOT/conf/atlas-application.properties (ApplicationProperties:91)
2018-07-25 00:06:16,512 WARN - [main:] ~ Attempting to create topic ATLAS_HOOK (AtlasTopicCreator:72)
2018-07-25 00:06:17,004 WARN - [main:] ~ Created topic ATLAS_HOOK with partitions 1 and replicas 1 (AtlasTopicCreator:119)
2018-07-25 00:06:17,004 WARN - [main:] ~ Attempting to create topic ATLAS_ENTITIES (AtlasTopicCreator:72)
2018-07-25 00:06:17,024 WARN - [main:] ~ Created topic ATLAS_ENTITIES with partitions 1 and replicas 1 (AtlasTopicCreator:119)
2018-07-25 01:49:45,147 DEBUG - [main:] ~ Calling API [ GET : api/atlas/v2/types/typedefs ] (AtlasBaseClient:319)
2018-07-25 01:49:45,147 DEBUG - [main:] ~ Attempting to configure HTTPS connection using client configuration (SecureClientUtils$4:221)
2018-07-25 01:49:45,166 INFO - [main:] ~ Unable to configure HTTPS connection from configuration. Leveraging JDK properties. (SecureClientUtils$4:240)
2018-07-25 01:49:45,269 DEBUG - [main:] ~ API https://mydns:21443/api/atlas/v2/types/typedefs?name=Dimension returned status 200 (AtlasBaseClient:337)
2018-07-25 01:49:45,270 DEBUG - [main:] ~ Calling API [ GET : api/atlas/v2/types/typedefs ] (AtlasBaseClient:319)
2018-07-25 01:49:45,271 DEBUG - [main:] ~ Attempting to configure HTTPS connection using client configuration (SecureClientUtils$4:221)
2018-07-25 01:49:45,291 INFO - [main:] ~ Unable to configure HTTPS connection from configuration. Leveraging JDK properties. (SecureClientUtils$4:240)
2018-07-25 01:49:45,450 DEBUG - [main:] ~ API https://mydns:21443/api/atlas/v2/types/typedefs?name=Log+Data returned status 200 (AtlasBaseClient:337)
2018-07-25 01:49:45,455 DEBUG - [main:] ~ Calling API [ POST : api/atlas/v2/entity ] <== AtlasEntityWithExtInfo{entity=AtlasEntity{AtlasStruct{typeName='DB', attributes=[owner:John ETL, createTime:1532483385453, name:Sales, description:sales database, locationuri:hdfs://host:8000/apps/warehouse/sales]}guid='-6466195619848', status=null, createdBy='null', updatedBy='null', createTime=null, updateTime=null, version=0, relationshipAttributes=[], classifications=[], },AtlasEntityExtInfo{referredEntities={}}} (AtlasBaseClient:319)
2018-07-25 01:49:45,455 DEBUG - [main:] ~ Attempting to configure HTTPS connection using client configuration (SecureClientUtils$4:221)
2018-07-25 01:49:45,474 INFO - [main:] ~ Unable to configure HTTPS connection from configuration. Leveraging JDK properties. (SecureClientUtils$4:240)
2018-07-25 01:49:33,256 Audit: myuser/10.160.189.35-10.160.189.35 performed request POST https://mydns:21443/api/atlas/v2/types/typedefs (10.160.187.181) at time 2018-07-25T01:49Z
2018-07-25 01:49:45,445 Audit: myuser/10.160.189.35-10.160.189.35 performed request GET https://mydns:21443/api/atlas/v2/types/typedefs?name=Log+Data (10.160.187.181) at time 2018-07-25T01:49Z
2018-07-25 01:49:45,678 Audit: myuser/10.160.189.35-10.160.189.35 performed request POST https://mydns:21443/api/atlas/v2/entity (10.160.187.181) at time 2018-07-25T01:49Z
The two topics are returned by this:
$KAFKA_HOME/bin/kafka-topics.sh --list --zookeeper localhost:2181
Atlas' application.log does have this, not sure why:
2018-07-25 02:18:14,991 DEBUG - [NotificationHookConsumer thread-0:] ~ Give up sending metadata request since no node is available (NetworkClient$DefaultMetadataUpdater:625)
2018-07-25 02:18:15,018 DEBUG - [kafka-producer-network-thread | producer-1:] ~ Initialize connection to node -1 for sending metadata request (NetworkClient$DefaultMetadataUpdater:644)
2018-07-25 02:18:15,018 DEBUG - [kafka-producer-network-thread | producer-1:] ~ Initiating connection to node -1 at localhost:9027. (NetworkClient:496)
2018-07-25 02:18:15,018 DEBUG - [kafka-producer-network-thread | producer-1:] ~ Connection with localhost/127.0.0.1 disconnected (Selector:345)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:51)
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:73)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:309)
at org.apache.kafka.common.network.Selector.poll(Selector.java:283)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:260)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:229)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:134)
at java.lang.Thread.run(Thread.java:748)
2018-07-25 02:18:15,018 DEBUG - [kafka-producer-network-thread | producer-1:] ~ Node -1 disconnected. (NetworkClient:463)
2018-07-25 02:18:15,018 DEBUG - [kafka-producer-network-thread | producer-1:] ~ Give up sending metadata request since no node is available (NetworkClient$DefaultMetadataUpdater:625)
2018-07-25 02:18:15,092 DEBUG - [NotificationHookConsumer thread-0:] ~ Initialize connection to node -1 for sending metadata request (NetworkClient$DefaultMetadataUpdater:644)
2018-07-25 02:18:15,092 DEBUG - [NotificationHookConsumer thread-0:] ~ Initiating connection to node -1 at localhost:9027. (NetworkClient:496)
2018-07-25 02:18:15,092 DEBUG - [NotificationHookConsumer thread-0:] ~ Connection with localhost/127.0.0.1 disconnected (Selector:345)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:51)
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:73)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:309)
at org.apache.kafka.common.network.Selector.poll(Selector.java:283)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:260)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:360)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:224)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:192)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.awaitMetadataUpdate(ConsumerNetworkClient.java:134)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorReady(AbstractCoordinator.java:183)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:973)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:937)
at org.apache.atlas.kafka.AtlasKafkaConsumer.receive(AtlasKafkaConsumer.java:63)
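The Connection refused above is against localhost:9027, which looks like the default atlas.kafka.bootstrap.servers value from the Atlas sample properties (embedded Kafka) rather than the standalone broker you listed topics from via ZooKeeper on 2181. A quick check, assuming the standard Atlas notification property names (the 9092 port is only a guess for your broker):
# Inspect the notification settings Atlas is actually using
grep -E 'atlas.notification.embedded|atlas.kafka.(zookeeper.connect|bootstrap.servers)' $ATLAS_HOME/conf/atlas-application.properties
# For an external broker you would typically expect something like:
#   atlas.notification.embedded=false
#   atlas.kafka.zookeeper.connect=localhost:2181
#   atlas.kafka.bootstrap.servers=localhost:9092
# Confirm the configured broker port is actually reachable from the Atlas host:
nc -vz localhost 9092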
Labels: Apache Atlas, Apache Kafka
05-19-2018
12:40 PM
Before installing Ranger I was able to query anything in Hive OK. After installing Ranger, setting up Solr / ranger audit / ranger admin / usersync / the ranger-hive plugin, creating the Ranger Hive service in the UI, and creating some Allow policies in the UI, I am no longer able to query any Hive tables. When using HiveServer2 (i.e. beeline), I now receive this error, even though I have created an allow policy for the user on * db, * tbl, * cols:
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
If I click "Test Connection" on the Hive service in the admin UI I get this error:
Connection Failed. Unable to retrieve any files using given parameters, You can still save the repository and start creating policies, but you would not be able to use autocomplete for resource names. Check ranger_admin.log for more info. org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].. Unable to execute SQL [show databases like "*"].. Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]. Permission denied: user [dvdsuauu] does not have [USE] privilege on [*].
In the ews admin logs I see:
2018-05-19 03:09:17,331 [http-bio-6182-exec-125] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: rangerusersync
2018-05-19 03:09:17,332 [http-bio-6182-exec-125] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=rangerusersync, sessionId=2023, sessionId=BED0B2377A6E8B70707CD81D52F2E178, requestId=10.160.237.127, epoch=1526699357332
2018-05-19 03:09:17,386 [http-bio-6182-exec-125] INFO org.apache.ranger.biz.XUserMgr (XUserMgr.java:316) - Permission Updated for user: [tsdsuauu] For Module: [Resource Based Policies]
2018-05-19 03:09:17,388 [http-bio-6182-exec-125] INFO org.apache.ranger.biz.XUserMgr (XUserMgr.java:316) - Permission Updated for user: [tsdsuauu] For Module: [Reports]
2018-05-19 03:14:30,453 [http-bio-6182-exec-104] INFO org.apache.ranger.security.handler.RangerAuthenticationProvider (RangerAuthenticationProvider.java:147) - Authentication with SHA-256 failed. Now trying with MD5.
2018-05-19 03:14:30,454 [http-bio-6182-exec-104] INFO org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:70) - Login Successful:admin | Ip Address:10.160.237.127 | sessionId=0F9E9D64CB89B07059BDC402C12E66B3 | Epoch=1526699670454
2018-05-19 03:14:30,455 [http-bio-6182-exec-104] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:429) - admin is a valid user
2018-05-19 03:14:30,694 [http-bio-6182-exec-104] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: admin
2018-05-19 03:14:30,694 [http-bio-6182-exec-104] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=admin, sessionId=2024, sessionId=0F9E9D64CB89B07059BDC402C12E66B3, requestId=10.160.237.127, epoch=1526699670694
2018-05-19 03:14:44,477 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:126) - Can't find keyTab Path : null
2018-05-19 03:14:44,477 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:130) - Can't find principal : null
2018-05-19 03:14:44,478 [timed-executor-pool-0] INFO org.apache.ranger.plugin.client.BaseClient (BaseClient.java:126) - Init Login: security not enabled, using username
2018-05-19 03:14:44,478 [timed-executor-pool-0] INFO apache.ranger.services.hive.client.HiveClient (HiveClient.java:93) - Since Password is NOT provided, Trying to use UnSecure client with username and password
2018-05-19 03:14:44,776 [timed-executor-pool-0] ERROR apache.ranger.services.hive.client.HiveClient$3 (HiveClient.java:117) - <== HiveClient getDatabaseList() :Unable to get the Database List
org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:200)
at org.apache.ranger.services.hive.client.HiveClient.access$400(HiveClient.java:56)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:114)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:107)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.ranger.services.hive.client.HiveClient.getDatabaseList(HiveClient.java:107)
at org.apache.ranger.services.hive.client.HiveClient.connectionTest(HiveClient.java:829)
at org.apache.ranger.services.hive.client.HiveResourceMgr.connectionTest(HiveResourceMgr.java:48)
at org.apache.ranger.services.hive.RangerServiceHive.validateConfig(RangerServiceHive.java:57)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:574)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:561)
at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:522)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:267)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:253)
at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:313)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:253)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:476)
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:179)
... 16 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:206)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:290)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:517)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy31.executeStatementAsync(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:310)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:530)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1437)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1422)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
... 3 more
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:432)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:974)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:761)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:550)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1295)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:204)
... 27 more
2018-05-19 03:14:44,780 [timed-executor-pool-0] ERROR apache.ranger.services.hive.client.HiveResourceMgr (HiveResourceMgr.java:50) - <== HiveResourceMgr.connectionTest Error: org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-19 03:14:44,780 [timed-executor-pool-0] ERROR org.apache.ranger.services.hive.RangerServiceHive (RangerServiceHive.java:59) - <== RangerServiceHive.validateConfig Error:org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-19 03:14:44,780 [timed-executor-pool-0] ERROR org.apache.ranger.biz.ServiceMgr$TimedCallable (ServiceMgr.java:524) - TimedCallable.call: Error:org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-19 03:14:44,780 [http-bio-6182-exec-133] ERROR org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:188) - ==> ServiceMgr.validateConfig Error:org.apache.ranger.plugin.client.HadoopException: org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-18 12:57:41,619 [http-bio-6182-exec-49] INFO org.apache.ranger.security.handler.RangerAuthenticationProvider (RangerAuthenticationProvider.java:147) - Authentication with SHA-256 failed. Now trying with MD5.
2018-05-18 12:57:41,623 [http-bio-6182-exec-49] INFO org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:70) - Login Successful:admin | Ip Address:10.160.237.208 | sessionId=14321F7EB7D6CABDA7F65809B01253CB | Epoch=1526648261623
2018-05-18 12:57:41,628 [http-bio-6182-exec-49] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:429) - admin is a valid user
2018-05-18 12:57:41,759 [http-bio-6182-exec-54] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:232) - UserSession Updated to set new Permissions to User: admin
2018-05-18 12:57:41,759 [http-bio-6182-exec-54] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:184) - Login Success: loginId=admin, sessionId=1225, sessionId=14321F7EB7D6CABDA7F65809B01253CB, requestId=10.160.237.208, epoch=1526648261759
2018-05-18 12:57:42,893 [http-bio-6182-exec-54] INFO org.apache.ranger.solr.SolrMgr (SolrMgr.java:168) - ==>SolrMgr.init()
2018-05-18 12:57:42,893 [http-bio-6182-exec-54] INFO org.apache.ranger.solr.SolrMgr (SolrMgr.java:177) - Loading SolrClient JAAS config from Ranger audit config if present...
2018-05-18 12:57:42,897 [http-bio-6182-exec-54] INFO org.apache.ranger.solr.SolrMgr (SolrMgr.java:182) - <==SolrMgr.init()
2018-05-18 12:57:42,916 [http-bio-6182-exec-54] INFO org.apache.ranger.service.RangerPluginActivityLogger (RangerPluginActivityLogger.java:46) - ranger.plugin.activity.audit.commit.inline = false
2018-05-18 12:57:42,916 [http-bio-6182-exec-54] INFO org.apache.ranger.service.RangerPluginActivityLogger (RangerPluginActivityLogger.java:50) - Will use separate thread for committing scheduled work
2018-05-18 13:18:04,809 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-common-2.3.2.jar
2018-05-18 13:18:04,810 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-exec-2.3.2.jar
2018-05-18 13:18:04,810 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-metastore-2.3.2.jar
2018-05-18 13:18:04,811 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-service-2.3.2.jar
2018-05-18 13:18:04,811 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/ranger-hive-plugin-1.0.0.jar
2018-05-18 13:18:04,811 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-jdbc-2.3.2.jar
2018-05-18 13:18:04,811 [http-bio-6182-exec-56] WARN org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:369) - getFilesInDirectory('ranger-plugins/hive'): adding /usr/lib/apache-ranger-1.0.0/target/ranger-1.0.0-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/232
2018-05-18 13:18:04,829 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:126) - Can't find keyTab Path : null
2018-05-18 13:18:04,829 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:130) - Can't find principal : null
2018-05-18 13:18:04,834 [timed-executor-pool-0] ERROR org.apache.ranger.plugin.util.PasswordUtils (PasswordUtils.java:147) - Unable to decrypt password due to error
javax.crypto.IllegalBlockSizeException: Input length must be multiple of 8 when decrypting with padded cipher
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:936)
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:847)
at com.sun.crypto.provider.PBES1Core.doFinal(PBES1Core.java:416)
at com.sun.crypto.provider.PBEWithMD5AndDESCipher.engineDoFinal(PBEWithMD5AndDESCipher.java:316)
at javax.crypto.Cipher.doFinal(Cipher.java:2164)
at org.apache.ranger.plugin.util.PasswordUtils.decrypt(PasswordUtils.java:132)
at org.apache.ranger.plugin.util.PasswordUtils.decryptPassword(PasswordUtils.java:120)
at org.apache.ranger.plugin.client.BaseClient.login(BaseClient.java:109)
at org.apache.ranger.plugin.client.BaseClient.<init>(BaseClient.java:61)
at org.apache.ranger.plugin.client.BaseClient.<init>(BaseClient.java:53)
at org.apache.ranger.services.hive.client.HiveClient.<init>(HiveClient.java:76)
at org.apache.ranger.services.hive.client.HiveClient.connectionTest(HiveClient.java:827)
at org.apache.ranger.services.hive.client.HiveResourceMgr.connectionTest(HiveResourceMgr.java:48)
at org.apache.ranger.services.hive.RangerServiceHive.validateConfig(RangerServiceHive.java:57)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:574)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:561)
at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:522)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2018-05-18 13:18:04,835 [timed-executor-pool-0] INFO org.apache.ranger.plugin.client.BaseClient (BaseClient.java:111) - Password decryption failed; trying connection with received password string
2018-05-18 13:18:04,835 [timed-executor-pool-0] INFO org.apache.ranger.plugin.client.BaseClient (BaseClient.java:126) - Init Login: security not enabled, using username
2018-05-18 13:18:04,835 [timed-executor-pool-0] INFO apache.ranger.services.hive.client.HiveClient (HiveClient.java:93) - Since Password is NOT provided, Trying to use UnSecure client with username and password
2018-05-18 13:18:04,839 [timed-executor-pool-0] ERROR org.apache.ranger.plugin.util.PasswordUtils (PasswordUtils.java:147) - Unable to decrypt password due to error
javax.crypto.IllegalBlockSizeException: Input length must be multiple of 8 when decrypting with padded cipher
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:936)
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:847)
at com.sun.crypto.provider.PBES1Core.doFinal(PBES1Core.java:416)
at com.sun.crypto.provider.PBEWithMD5AndDESCipher.engineDoFinal(PBEWithMD5AndDESCipher.java:316)
at javax.crypto.Cipher.doFinal(Cipher.java:2164)
at org.apache.ranger.plugin.util.PasswordUtils.decrypt(PasswordUtils.java:132)
at org.apache.ranger.plugin.util.PasswordUtils.decryptPassword(PasswordUtils.java:120)
at org.apache.ranger.services.hive.client.HiveClient.initConnection(HiveClient.java:710)
at org.apache.ranger.services.hive.client.HiveClient.access$100(HiveClient.java:56)
at org.apache.ranger.services.hive.client.HiveClient$2.run(HiveClient.java:98)
at org.apache.ranger.services.hive.client.HiveClient$2.run(HiveClient.java:96)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.ranger.services.hive.client.HiveClient.initHive(HiveClient.java:96)
at org.apache.ranger.services.hive.client.HiveClient.<init>(HiveClient.java:77)
at org.apache.ranger.services.hive.client.HiveClient.connectionTest(HiveClient.java:827)
at org.apache.ranger.services.hive.client.HiveResourceMgr.connectionTest(HiveResourceMgr.java:48)
at org.apache.ranger.services.hive.RangerServiceHive.validateConfig(RangerServiceHive.java:57)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:574)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:561)
at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:522)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2018-05-18 13:18:04,839 [timed-executor-pool-0] INFO apache.ranger.services.hive.client.HiveClient (HiveClient.java:712) - Password decryption failed; trying Hive connection with received password string
2018-05-18 13:18:10,966 [timed-executor-pool-0] ERROR apache.ranger.services.hive.client.HiveClient$3 (HiveClient.java:117) - <== HiveClient getDatabaseList() :Unable to get the Database List
org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:200)
at org.apache.ranger.services.hive.client.HiveClient.access$400(HiveClient.java:56)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:114)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:107)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.ranger.services.hive.client.HiveClient.getDatabaseList(HiveClient.java:107)
at org.apache.ranger.services.hive.client.HiveClient.connectionTest(HiveClient.java:829)
at org.apache.ranger.services.hive.client.HiveResourceMgr.connectionTest(HiveResourceMgr.java:48)
at org.apache.ranger.services.hive.RangerServiceHive.validateConfig(RangerServiceHive.java:57)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:574)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:561)
at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:522)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:267)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:253)
at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:313)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:253)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:476)
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:179)
... 16 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:206)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:290)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:517)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy31.executeStatementAsync(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:310)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:530)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1437)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1422)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
... 3 more
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:432)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:974)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:761)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:550)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1295)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:204)
... 27 more
2018-05-18 13:18:10,999 [timed-executor-pool-0] ERROR apache.ranger.services.hive.client.HiveResourceMgr (HiveResourceMgr.java:50) - <== HiveResourceMgr.connectionTest Error: org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-18 13:18:10,999 [timed-executor-pool-0] ERROR org.apache.ranger.services.hive.RangerServiceHive (RangerServiceHive.java:59) - <== RangerServiceHive.validateConfig Error:org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-18 13:18:10,999 [timed-executor-pool-0] ERROR org.apache.ranger.biz.ServiceMgr$TimedCallable (ServiceMgr.java:524) - TimedCallable.call: Error:org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-18 13:18:11,000 [http-bio-6182-exec-56] ERROR org.apache.ranger.biz.ServiceMgr (ServiceMgr.java:188) - ==> ServiceMgr.validateConfig Error:org.apache.ranger.plugin.client.HadoopException: org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
2018-05-18 13:18:16,587 [http-bio-6182-exec-53] INFO org.apache.ranger.biz.XUserMgr (XUserMgr.java:2220) - User created: dvdsuauu
2018-05-18 13:18:16,620 [http-bio-6182-exec-53] INFO org.apache.ranger.biz.XUserMgr (XUserMgr.java:308) - Permission assigned to user: [dvdsuauu] For Module: [Resource Based Policies]
2018-05-18 13:18:16,632 [http-bio-6182-exec-53] INFO org.apache.ranger.biz.XUserMgr (XUserMgr.java:308) - Permission assigned to user: [dvdsuauu] For Module: [Reports]
2018-05-18 13:18:24,534 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:126) - Can't find keyTab Path : null
2018-05-18 13:18:24,534 [timed-executor-pool-0] WARN org.apache.hadoop.security.SecureClientLogin (SecureClientLogin.java:130) - Can't find principal : null
2018-05-18 13:18:24,535 [timed-executor-pool-0] INFO org.apache.ranger.plugin.client.BaseClient (BaseClient.java:126) - Init Login: security not enabled, using username
2018-05-18 13:18:24,535 [timed-executor-pool-0] INFO apache.ranger.services.hive.client.HiveClient (HiveClient.java:93) - Since Password is NOT provided, Trying to use UnSecure client with username and password
2018-05-18 13:18:24,847 [timed-executor-pool-0] ERROR apache.ranger.services.hive.client.HiveClient$3 (HiveClient.java:117) - <== HiveClient getDatabaseList() :Unable to get the Database List
org.apache.ranger.plugin.client.HadoopException: Unable to execute SQL [show databases like "*"].
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:200)
at org.apache.ranger.services.hive.client.HiveClient.access$400(HiveClient.java:56)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:114)
at org.apache.ranger.services.hive.client.HiveClient$3.run(HiveClient.java:107)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.ranger.services.hive.client.HiveClient.getDatabaseList(HiveClient.java:107)
at org.apache.ranger.services.hive.client.HiveClient.connectionTest(HiveClient.java:829)
at org.apache.ranger.services.hive.client.HiveResourceMgr.connectionTest(HiveResourceMgr.java:48)
at org.apache.ranger.services.hive.RangerServiceHive.validateConfig(RangerServiceHive.java:57)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:574)
at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:561)
at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:522)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:267)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:253)
at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:313)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:253)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:476)
at org.apache.ranger.services.hive.client.HiveClient.getDBList(HiveClient.java:179)
... 16 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:206)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:290)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:517)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy31.executeStatementAsync(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:310)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:530)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1437)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1422)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
... 3 more
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [dvdsuauu] does not have [USE] privilege on [*]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:432)
In the hiveserver2 logs I see:
which: no hbase in (/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/aws/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin)
2018-05-18 16:22:49: Starting HiveServer2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/apache-hive-2.3.2.5-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ec2-user/spark_home/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2018-05-18T16:22:52,820 INFO [main] session.SessionState: Created HDFS directory: /var/hive/scratch/tmp/ec2-user/789b83bc-145f-467d-8d6b-e2c065b9bfcf
2018-05-18T16:22:52,821 INFO [main] session.SessionState: Created local directory: /var/hive/ec2-user/789b83bc-145f-467d-8d6b-e2c065b9bfcf
2018-05-18T16:22:52,821 INFO [main] session.SessionState: Created HDFS directory: /var/hive/scratch/tmp/ec2-user/789b83bc-145f-467d-8d6b-e2c065b9bfcf/_tmp_space.db
2018-05-18T16:22:52,839 WARN [main] authorizer.RangerHiveAuthorizerBase: RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2018-05-18T16:22:52,852 INFO [main] config.RangerConfiguration: addResourceIfReadable(ranger-hive-audit.xml): resource file is file:/usr/lib/apache-hive-2.3.2.5-bin/conf/ranger-hive-audit.xml
2018-05-18T16:22:52,852 INFO [main] config.RangerConfiguration: addResourceIfReadable(ranger-hive-security.xml): resource file is file:/usr/lib/apache-hive-2.3.2.5-bin/conf/ranger-hive-security.xml
2018-05-18T16:22:52,855 INFO [main] provider.AuditProviderFactory: AuditProviderFactory: creating..
2018-05-18T16:22:52,866 INFO [main] provider.AuditProviderFactory: AuditProviderFactory: initializing..
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.log4j.is.async=false
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.is.enabled=true
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs.dir=hdfs://__REPLACE__NAME_NODE_HOST:8020/ranger/audit
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.rest.client.connection.timeoutMs=120000
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.kafka.topic_name=ranger_audits
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.destination.directory=hdfs://__REPLACE__NAME_NODE_HOST:8020/ranger/audit/%app-type%/%time:yyyyMMdd%
2018-05-18T16:22:52,900 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.solr.is.enabled=false
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.archive.max.file.count=10
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.jpa.javax.persistence.jdbc.user=rangerlogger
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.buffer.directory=__REPLACE__LOG_DIR/hive/audit/%app-type%
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.db.is.enabled=false
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.kafka.async.max.flush.interval.ms=1000
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.kafka.broker_list=localhost:9092
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.source.impl=org.apache.ranger.admin.client.RangerAdminRESTClient
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr=true
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.jpa.javax.persistence.jdbc.driver=com.mysql.jdbc.Driver
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.is.async=true
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.cache.dir=/etc/ranger/HIVE_RANGER_E2E/policycache
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.destination.open.retry.interval.seconds=60
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.solr.solr_url=http://localhost:6083/solr/ranger_audits
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.db.is.async=true
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr.batch.filespool.dir=/var/log/hive/audit/solr/spool
2018-05-18T16:22:52,901 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.jpa.javax.persistence.jdbc.url=jdbc:mysql://localhost:3306/ranger_audit
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.destination.file=%hostname%-audit.log
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.rest.ssl.config.file=/etc/hive/conf/ranger-policymgr-ssl.xml
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.buffer.file=%time:yyyyMMdd-HHmm.ss%.log
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.archive.directory=__REPLACE__LOG_DIR/hive/audit/archive/%app-type%
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.db.async.max.queue.size=10240
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.log4j.async.max.queue.size=10240
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.log4j.is.enabled=false
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.async.max.flush.interval.ms=30000
2018-05-18T16:22:52,902 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr.urls=https://myhosthere:6083/solr/ranger_audits
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.service.name=HIVE_RANGER_E2E
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.solr.async.max.flush.interval.ms=1000
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.rest.client.read.timeoutMs=30000
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.solr.async.max.queue.size=1
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.buffer.flush.interval.seconds=60
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.destination.flush.interval.seconds=900
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr.zookeepers=NONE
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.pollIntervalMs=30000
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs.config.fs.azure.account.keyprovider.__REPLACE_AZURE_ACCOUNT_NAME.blob.core.windows.net=__REPLACE_AZURE_ACCOUNT_KEY_PROVIDER
2018-05-18T16:22:52,903 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs.config.fs.azure.account.key.__REPLACE_AZURE_ACCOUNT_NAME.blob.core.windows.net=__REPLACE_AZURE_ACCOUNT_KEY
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.db.async.max.flush.interval.ms=30000
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr.user=NONE
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.kafka.async.max.queue.size=1
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.is.enabled=false
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.async.max.queue.size=1048576
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: ranger.plugin.hive.policy.rest.url=https://myhosthere:6182
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.buffer.file.buffer.size.bytes=8192
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.jpa.javax.persistence.jdbc.password=none
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.credential.provider.file=jceks://file/etc/ranger/hivedev/auditcred.jceks
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.db.batch.size=100
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs.config.fs.azure.shellkeyprovider.script=__REPLACE_AZURE_SHELL_KEY_PROVIDER
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs.batch.filespool.dir=/var/log/hive/audit/hdfs/spool
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.destination.rollover.interval.seconds=86400
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.kafka.is.enabled=false
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.log4j.async.max.flush.interval.ms=30000
2018-05-18T16:22:52,904 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.hive.update.xapolicies.on.grant.revoke=true
2018-05-18T16:22:52,905 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.hdfs=false
2018-05-18T16:22:52,905 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.hdfs.config.local.buffer.rollover.interval.seconds=600
2018-05-18T16:22:52,905 INFO [main] provider.AuditProviderFactory: AUDIT PROPERTY: xasecure.audit.destination.solr.password=NONE
2018-05-18T16:22:52,905 INFO [main] provider.AuditProviderFactory: Audit destination xasecure.audit.destination.solr is set to true
2018-05-18T16:22:52,916 INFO [main] destination.AuditDestination: AuditDestination() enter
2018-05-18T16:22:52,916 INFO [main] destination.SolrAuditDestination: init() called
2018-05-18T16:22:52,916 INFO [main] provider.BaseAuditHandler: BaseAuditProvider.init()
2018-05-18T16:22:52,916 INFO [main] provider.BaseAuditHandler: propPrefix=xasecure.audit.destination.solr
2018-05-18T16:22:52,916 INFO [main] provider.BaseAuditHandler: Using providerName from property prefix. providerName=solr
2018-05-18T16:22:52,917 INFO [main] provider.BaseAuditHandler: providerName=solr
2018-05-18T16:22:52,917 INFO [main] destination.SolrAuditDestination: ==>SolrAuditDestination.init()
2018-05-18T16:22:52,917 INFO [main] destination.SolrAuditDestination: In solrAuditDestination.init() : JAAS Configuration set as [null]
2018-05-18T16:22:52,917 WARN [main] destination.SolrAuditDestination: No Client JAAS config present in solr audit config. Ranger Audit to Kerberized Solr will fail...
2018-05-18T16:22:52,917 INFO [main] destination.SolrAuditDestination: Loading SolrClient JAAS config from Ranger audit config if present...
2018-05-18T16:22:52,923 INFO [main] destination.SolrAuditDestination: In solrAuditDestination.init() (finally) : JAAS Configuration set as [null]
2018-05-18T16:22:52,923 INFO [main] destination.SolrAuditDestination: <==SolrAuditDestination.init()
2018-05-18T16:22:52,923 INFO [main] destination.SolrAuditDestination: Solr zkHosts=null, solrURLs=https://myhosthere:6083/solr/ranger_audits, collectionName=ranger_audits
2018-05-18T16:22:52,923 INFO [main] destination.SolrAuditDestination: Connecting to Solr using URLs=[https://myhosthere:6083/solr/ranger_audits]
2018-05-18T16:22:52,965 INFO [main] provider.AuditProviderFactory: xasecure.audit.destination.solr.queue is not set. Setting queue to batch for solr
2018-05-18T16:22:52,965 INFO [main] provider.AuditProviderFactory: queue for solr is batch
2018-05-18T16:22:52,969 INFO [main] queue.AuditQueue: BaseAuditProvider.init()
2018-05-18T16:22:52,969 INFO [main] provider.BaseAuditHandler: BaseAuditProvider.init()
2018-05-18T16:22:52,969 INFO [main] provider.BaseAuditHandler: propPrefix=xasecure.audit.destination.solr.batch
2018-05-18T16:22:52,969 INFO [main] provider.BaseAuditHandler: providerName=batch
2018-05-18T16:22:52,970 INFO [main] queue.AuditQueue: File spool is enabled for batch, logFolderProp=/var/log/hive/audit/solr/spool, xasecure.audit.destination.solr.batch.filespool.dir=false
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: retryDestinationMS=30000, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: fileRolloverSec=86400, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: maxArchiveFiles=100, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: logFolder=/var/log/hive/audit/solr/spool, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: logFileNameFormat=spool_%app-type%_%time:yyyyMMdd-HHmm.ss%.log, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: archiveFolder=/var/log/hive/audit/solr/spool/archive, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: indexFile=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2.json, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: indexDoneFile=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2_closed.json, queueName=batch
2018-05-18T16:22:52,974 INFO [main] queue.AuditFileSpool: Loading index file. fileName=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2.json
2018-05-18T16:22:52,975 INFO [main] queue.AuditFileSpool: INDEX printIndex() ==== START
2018-05-18T16:22:52,975 INFO [main] queue.AuditFileSpool: INDEX printIndex() ==== END
2018-05-18T16:22:52,975 INFO [main] provider.AuditProviderFactory: Using v3 audit configuration
2018-05-18T16:22:52,975 INFO [main] provider.AuditProviderFactory: AuditSummaryQueue is disabled
2018-05-18T16:22:52,977 INFO [main] queue.AuditQueue: BaseAuditProvider.init()
2018-05-18T16:22:52,977 INFO [main] provider.BaseAuditHandler: BaseAuditProvider.init()
2018-05-18T16:22:52,977 INFO [main] provider.BaseAuditHandler: propPrefix=xasecure.audit.provider.async
2018-05-18T16:22:52,977 INFO [main] provider.BaseAuditHandler: providerName=async
2018-05-18T16:22:52,977 INFO [main] queue.AuditQueue: File spool is disabled for async
2018-05-18T16:22:52,978 INFO [main] provider.AuditProviderFactory: Starting audit queue hiveServer2.async
2018-05-18T16:22:52,978 INFO [main] queue.AuditBatchQueue: Creating ArrayBlockingQueue with maxSize=1048576
2018-05-18T16:22:52,979 INFO [main] queue.AuditFileSpool: Starting writerThread, queueName=hiveServer2.async.batch, consumer=hiveServer2.async.batch.solr
2018-05-18T16:22:52,980 INFO [Ranger async Audit cleanup] provider.AuditProviderFactory: RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
2018-05-18T16:22:52,980 INFO [main] service.RangerBasePlugin: PolicyEngineOptions: { evaluatorType: auto, cacheAuditResult: false, disableContextEnrichers: false, disableCustomConditions: false, disableTrieLookupPrefilter: false }
2018-05-18T16:22:53,027 ERROR [main] utils.RangerCredentialProvider: Unable to get the Credential Provider from the Configuration
java.lang.IllegalArgumentException: The value of property hadoop.security.credential.provider.path must not be null
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92) ~[guava-14.0.1.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1134) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115) ~[hadoop-common-2.7.3.jar:?]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialProviders(RangerCredentialProvider.java:68) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialString(RangerCredentialProvider.java:46) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getCredential(RangerRESTClient.java:382) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getKeyManagers(RangerRESTClient.java:268) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:188) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:149) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:150) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHivePlugin.init(RangerHiveAuthorizer.java:1705) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.<init>(RangerHiveAuthorizer.java:119) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory.createHiveAuthorizer(RangerHiveAuthorizerFactory.java:37) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:875) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1683) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:130) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.init(CLIService.java:114) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:142) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:607) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.access$700(HiveServer2.java:100) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:855) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:724) ~[hive-service-2.3.2.jar:2.3.2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ~[hadoop-common-2.7.3.jar:?]
2018-05-18T16:22:53,027 ERROR [main] utils.RangerCredentialProvider: Unable to get the Credential Provider from the Configuration
java.lang.IllegalArgumentException: The value of property hadoop.security.credential.provider.path must not be null
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92) ~[guava-14.0.1.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1134) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115) ~[hadoop-common-2.7.3.jar:?]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialProviders(RangerCredentialProvider.java:68) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialString(RangerCredentialProvider.java:46) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getCredential(RangerRESTClient.java:382) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getTrustManagers(RangerRESTClient.java:319) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:189) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:149) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:150) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHivePlugin.init(RangerHiveAuthorizer.java:1705) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.<init>(RangerHiveAuthorizer.java:119) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory.createHiveAuthorizer(RangerHiveAuthorizerFactory.java:37) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:875) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1683) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:130) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.init(CLIService.java:114) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:142) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:607) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.access$700(HiveServer2.java:100) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:855) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:724) ~[hive-service-2.3.2.jar:2.3.2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ~[hadoop-common-2.7.3.jar:?]
2018-05-18T16:22:53,028 ERROR [main] util.PolicyRefresher: PolicyRefresher(serviceName=HIVE_RANGER_E2E): failed to refresh policies. Will continue to use last known version of policies (-1)
java.lang.IllegalArgumentException: TrustManager is not specified
at org.apache.commons.lang.Validate.notNull(Validate.java:192) ~[commons-lang-2.6.jar:2.6]
at org.apache.ranger.plugin.util.RangerRESTClient.getSSLContext(RangerRESTClient.java:365) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:190) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.startRefresher(PolicyRefresher.java:149) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:150) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHivePlugin.init(RangerHiveAuthorizer.java:1705) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.<init>(RangerHiveAuthorizer.java:119) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory.createHiveAuthorizer(RangerHiveAuthorizerFactory.java:37) ~[ranger-hive-plugin-1.0.0.jar:1.0.0]
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:875) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1683) ~[hive-exec-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:130) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.cli.CLIService.init(CLIService.java:114) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:142) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:607) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.access$700(HiveServer2.java:100) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:855) ~[hive-service-2.3.2.jar:2.3.2]
at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:724) ~[hive-service-2.3.2.jar:2.3.2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ~[hadoop-common-2.7.3.jar:?]
2018-05-18T16:22:53,028 WARN [main] util.PolicyRefresher: cache file does not exist or not readable '/etc/ranger/HIVE_RANGER_E2E/policycache/hiveServer2_HIVE_RANGER_E2E.json'
2018-05-18T16:22:53,079 INFO [main] service.RangerBasePlugin: Policies will NOT be reordered based on number of evaluations
2018-05-18T16:22:53,080 ERROR [Thread-6] utils.RangerCredentialProvider: Unable to get the Credential Provider from the Configuration
java.lang.IllegalArgumentException: The value of property hadoop.security.credential.provider.path must not be null
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92) ~[guava-14.0.1.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1134) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115) ~[hadoop-common-2.7.3.jar:?]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialProviders(RangerCredentialProvider.java:68) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialString(RangerCredentialProvider.java:46) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getCredential(RangerRESTClient.java:382) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getKeyManagers(RangerRESTClient.java:268) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:188) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.run(PolicyRefresher.java:171) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
2018-05-18T16:22:53,080 ERROR [Thread-6] utils.RangerCredentialProvider: Unable to get the Credential Provider from the Configuration
java.lang.IllegalArgumentException: The value of property hadoop.security.credential.provider.path must not be null
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92) ~[guava-14.0.1.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1134) ~[hadoop-common-2.7.3.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115) ~[hadoop-common-2.7.3.jar:?]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialProviders(RangerCredentialProvider.java:68) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.authorization.hadoop.utils.RangerCredentialProvider.getCredentialString(RangerCredentialProvider.java:46) ~[ranger-plugins-cred-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getCredential(RangerRESTClient.java:382) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getTrustManagers(RangerRESTClient.java:319) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:189) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.run(PolicyRefresher.java:171) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
2018-05-18T16:22:53,080 ERROR [Thread-6] util.PolicyRefresher: PolicyRefresher(serviceName=HIVE_RANGER_E2E): failed to refresh policies. Will continue to use last known version of policies (-1)
java.lang.IllegalArgumentException: TrustManager is not specified
at org.apache.commons.lang.Validate.notNull(Validate.java:192) ~[commons-lang-2.6.jar:2.6]
at org.apache.ranger.plugin.util.RangerRESTClient.getSSLContext(RangerRESTClient.java:365) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.buildClient(RangerRESTClient.java:190) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getClient(RangerRESTClient.java:176) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.RangerRESTClient.getResource(RangerRESTClient.java:156) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.createWebResource(RangerAdminRESTClient.java:275) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:264) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:202) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
at org.apache.ranger.plugin.util.PolicyRefresher.run(PolicyRefresher.java:171) ~[ranger-plugins-common-1.0.0.jar:1.0.0]
2018-05-18T16:22:53,080 WARN [Thread-6] util.PolicyRefresher: cache file does not exist or not readable '/etc/ranger/HIVE_RANGER_E2E/policycache/hiveServer2_HIVE_RANGER_E2E.json'
2018-05-18T16:22:53,081 WARN [main] session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2018-05-18T16:22:53,449 INFO [main] hive.metastore: Trying to connect to metastore with URI thrift://localhost:9083
2018-05-18T16:22:53,471 INFO [main] hive.metastore: Opened a connection to metastore, current connections: 1
2018-05-18T16:22:53,482 INFO [main] hive.metastore: Connected to metastore.
2018-05-18T16:22:53,655 INFO [main] service.CompositeService: Operation log root directory is created: /var/hive/hs2log/tmp
2018-05-18T16:22:53,660 INFO [main] service.CompositeService: HiveServer2: Background operation thread pool size: 100
2018-05-18T16:22:53,660 INFO [main] service.CompositeService: HiveServer2: Background operation thread wait queue size: 100
2018-05-18T16:22:53,660 INFO [main] service.CompositeService: HiveServer2: Background operation thread keepalive time: 10 seconds
2018-05-18T16:22:53,690 WARN [main] authorizer.RangerHiveAuthorizer: filterListCmdObjects: user information not available
2018-05-18T16:22:53,825 INFO [main] server.Server: jetty-7.6.0.v20120127
2018-05-18T16:22:53,884 WARN [main] webapp.WebInfConfiguration: Can't reuse /tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-, using /tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-_6937658403338723795
2018-05-18T16:22:53,885 INFO [main] webapp.WebInfConfiguration: Extract jar:file:/usr/lib/apache-hive-2.3.2.5-bin/lib/hive-service-2.3.2.jar!/hive-webapps/hiveserver2/ to /tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-_6937658403338723795/webapp
2018-05-18T16:22:54,040 INFO [main] handler.ContextHandler: started o.e.j.w.WebAppContext{/,file:/tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-_6937658403338723795/webapp/},jar:file:/usr/lib/apache-hive-2.3.2.5-bin/lib/hive-service-2.3.2.jar!/hive-webapps/hiveserver2
2018-05-18T16:22:54,115 INFO [main] handler.ContextHandler: started o.e.j.s.ServletContextHandler{/static,jar:file:/usr/lib/apache-hive-2.3.2.5-bin/lib/hive-service-2.3.2.jar!/hive-webapps/static}
2018-05-18T16:22:54,134 INFO [main] server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:10002
2018-05-18T16:22:54,136 INFO [main] http.HttpServer: Started HttpServer[hiveserver2] on port 10002
2018-05-18T16:22:54,141 INFO [Thread-8] auth.HiveAuthUtils: SSL Server Socket Enabled Protocols: [SSLv2Hello, TLSv1, TLSv1.1, TLSv1.2]
2018-05-18T16:22:54,144 INFO [main] session.SparkSessionManagerImpl: Setting up the session manager.
2018-05-18T16:22:54,146 INFO [Thread-8] thrift.ThriftCLIService: Starting ThriftBinaryCLIService on port 10000 with 5...500 worker threads
2018-05-18T16:22:54,482 INFO [main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.hfilecleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveHFileCleaner).
Labels:
- Apache Hive
- Apache Ranger