Member since: 02-29-2016
Posts: 108
Kudos Received: 213
Solutions: 14
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1990 | 08-18-2017 02:09 PM |
 | 3459 | 06-16-2017 08:04 PM |
 | 3219 | 01-20-2017 03:36 AM |
 | 8706 | 01-04-2017 03:06 AM |
 | 4337 | 12-09-2016 08:27 PM |
01-18-2017
03:38 PM
2 Kudos
HDP 2.5 secured cluster with Knox installed as gateway. A Ranger policy is created for the default topology and the WEBHDFS service. However, when the policy contains only the hr group, which user hr1 is part of, I get an error from the gateway request:
curl -ik -u hr1 https://<knox-gateway>:8443/gateway/default/webhdfs/v1/hr/exempt?op=LISTSTATUS
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 403 Forbidden</title>
</head>
<body><h2>HTTP ERROR 403</h2>
<p>Problem accessing /gateway/default/webhdfs/v1/hr/exempt. Reason:
<pre> Forbidden</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
When user hr1 is included in the policy directly, it works fine:
curl -ik -u hr1 https://<knox-gateway>:8443/gateway/default/webhdfs/v1/hr/exempt?op=LISTSTATUS
{"FileStatuses":{"FileStatus":[{"accessTime":1483632050751,"blockSize":134217728,"childrenNum":0,"fileId":152421,"group":"hr","length":23,"modificationTime":1483632051087,"owner":"hdfs","pathSuffix":"testfile","permission":"644","replication":3,"storagePolicy":0,"type":"FILE"}]}}
This happens with other Knox services as well, such as Hive through Knox. Group-based policies work fine for other Ranger service types, such as HDFS and Hive; only the policies for Knox seem to have this problem. And in HDFS custom core-site, I have:
hadoop.proxyuser.knox.hosts=*
hadoop.proxyuser.knox.groups=*
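A quick way to confirm those two values are actually in the deployed client configuration (standard hdfs getconf usage; run on a cluster node that has the Ambari-managed HDFS client config) is shown below:
hdfs getconf -confKey hadoop.proxyuser.knox.hosts
hdfs getconf -confKey hadoop.proxyuser.knox.groups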
Labels:
- Apache Knox
- Apache Ranger
01-05-2017
08:01 PM
4 Kudos
Found the answer in another HCC post: https://community.hortonworks.com/questions/16887/beeline-connect-via-knox-ssl-issue.html The truststore and its password need to be included in the Hive connection string:
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;sslTrustStore=/root/myLocalTrustStore.jks;trustStorePassword=<password>;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p <password>
The steps are as follows: first get the certificate from the Knox server, then import the cert into a truststore file that the client will use. Keep the truststore password handy; that is the password used in the connection string. You don't have to add the cert to the Java cacerts file.
openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt
keytool -import -keystore myLocalTrustStore.jks -file knox.crt
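Putting the pieces together, a minimal end-to-end sketch (host names, paths, and passwords are the placeholders used above):
# 1. Export the Knox gateway certificate (run on the Knox host, as earlier in this thread)
knoxserver=$(hostname -f)
openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt
# 2. Import it into a local truststore; the password you set here is the trustStorePassword
keytool -import -keystore /root/myLocalTrustStore.jks -file /tmp/knox.crt
# 3. Point beeline at that truststore in the JDBC URL
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;sslTrustStore=/root/myLocalTrustStore.jks;trustStorePassword=<password>;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p <password>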
01-05-2017
07:19 PM
3 Kudos
@dvillarreal So I used SSLPoke to connect to the Knox server and it works fine:
[root@qwang-hdp2 sslpoke]# java -Djavax.net.ssl.trustStore=/usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert SSLPoke qwang-hdp5.field.hortonworks.com 8443
Successfully connected
I guess the question now is what truststore beeline is using. Where can I get that information?
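One way to see which truststore the beeline JVM actually loads is to enable JSSE debug output, which prints a "trustStore is: ..." line at startup. A sketch, assuming the beeline wrapper script picks up HADOOP_CLIENT_OPTS (that assumption is not confirmed in this thread):
export HADOOP_CLIENT_OPTS="-Djavax.net.debug=ssl"
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p <password> 2>&1 | grep -i truststore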
01-05-2017
05:03 PM
2 Kudos
@dvillarreal I followed your steps; the Java version is the same as you mentioned:
[root@qwang-hdp2 ~]# ps -ef | grep beeline
root 32310 26098 39 16:58 pts/0 00:00:05 /usr/jdk64/jdk1.8.0_77/bin/java -Xmx1024m -Dhdp.version=2.5.3.0-37 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/var/log/hadoop/root -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx1024m -Djava.util.logging.config.file=/usr/hdp/2.5.3.0-37/hive/conf/parquet-logging.properties -Dlog4j.configuration=beeline-log4j.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.3.0-37/hive/lib/hive-beeline-1.2.1000.2.5.3.0-37.jar org.apache.hive.beeline.BeeLine -u jdbc:hive2://qwang-hdp5.field.hortonworks.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive -n hadoopadmin -p password
root 32542 10102 0 16:58 pts/1 00:00:00 grep --color=auto beeline
Then I imported the certificate into /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert:
keytool -import -trustcacerts -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert -storepass changeit -noprompt -alias knox -file /tmp/knox.crt
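To confirm the alias actually landed in that keystore, listing it is a quick check (a sketch using the same path, alias, and store password as above):
keytool -list -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert -storepass changeit -alias knox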
But I still get the same error when starting beeline:
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p password
17/01/05 16:58:46 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
You mentioned restarting the beeline client; I am not sure what exactly that means.
01-05-2017
03:31 PM
2 Kudos
@Vipin Rathor Just included the beeline error in the original message. I tailed /var/log/hive/hiveserver2.log and did not see any errors there when the beeline command failed. From the beeline error, it seems to be SSL-related, which is why I included the commands I used to import the cert on both nodes.
01-05-2017
03:16 PM
1 Kudo
The intention is to access Hive through Knox, not directly.
01-05-2017
03:03 AM
3 Kudos
I am following the security lab and got to lab 8 for Knox: https://github.com/HortonworksUniversity/Security_Labs#lab-8 It was all fine for the WEBHDFS steps, but I hit a problem in the HIVE step. When I use openssl to import the self-signed cert, I get an error; not sure if that is the cause:
knoxserver=$(hostname -f)
openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt
depth=0 C = US, ST = Test, L = Test, O = Hadoop, OU = Test, CN = qwang-hdp5.field.hortonworks.com
verify error:num=18:self signed certificate
verify return:1
depth=0 C = US, ST = Test, L = Test, O = Hadoop, OU = Test, CN = qwang-hdp5.field.hortonworks.com
verify return:1
DONE
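The "self signed certificate" verify error above only reflects that the gateway cert is not signed by a known CA; openssl still writes the certificate out. To inspect what was exported (standard openssl usage, against the file from the command above):
openssl x509 -in /tmp/knox.crt -noout -subject -issuer -dates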
On the beeline node, the cert was imported fine with the following command:
keytool -import -trustcacerts -keystore /etc/pki/java/cacerts -storepass changeit -noprompt -alias knox -file /tmp/knox.crt
But when I use the following command to access Hive, I get an error that looks like the user was not granted access in Ranger, even though the user is already included in the "all - topology, service" policy:
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p password
Error from beeline:
17/01/05 15:22:19 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:297)
at org.apache.thrift.transport.THttpClient.flush(THttpClient.java:313)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
at org.apache.hive.service.cli.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:154)
at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:146)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:552)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:170)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:146)
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:211)
at org.apache.hive.beeline.Commands.connect(Commands.java:1190)
at org.apache.hive.beeline.Commands.connect(Commands.java:1086)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:990)
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:715)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:777)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:491)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:474)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1949)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1509)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:979)
at sun.security.ssl.Handshaker.process_record(Handshaker.java:914)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1062)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:395)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:354)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
at org.apache.http.impl.conn.BasicHttpClientConnectionManager.connect(BasicHttpClientConnectionManager.java:338)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.execchain.ServiceUnavailableRetryExec.execute(ServiceUnavailableRetryExec.java:84)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:117)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:251)
... 30 more
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
at sun.security.validator.Validator.validate(Validator.java:260)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1491)
... 51 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
... 57 more
Error: Could not establish connection to jdbc:hive2://qwang-hdp5.field.hortonworks.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target (state=08S01,code=0)
Looking at the Knox log, it complains about an error writing to the log file:
2017-01-05 03:01:24,993 ERROR provider.BaseAuditHandler (BaseAuditHandler.java:logError(329)) - Error writing to log file.
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2160)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1419)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1443)
at org.apache.ranger.audit.destination.HDFSAuditDestination.getLogFileStream(HDFSAuditDestination.java:273)
at org.apache.ranger.audit.destination.HDFSAuditDestination.access$000(HDFSAuditDestination.java:44)
at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestination.java:159)
at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestination.java:156)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.ranger.audit.destination.HDFSAuditDestination.logJSON(HDFSAuditDestination.java:170)
at org.apache.ranger.audit.queue.AuditFileSpool.sendEvent(AuditFileSpool.java:880)
at org.apache.ranger.audit.queue.AuditFileSpool.runLogAudit(AuditFileSpool.java:828)
at org.apache.ranger.audit.queue.AuditFileSpool.run(AuditFileSpool.java:758)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1552)
at org.apache.hadoop.ipc.Client.call(Client.java:1496)
at org.apache.hadoop.ipc.Client.call(Client.java:1396)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
at com.sun.proxy.$Proxy48.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
at sun.reflect.GeneratedMethodAccessor50.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
at com.sun.proxy.$Proxy49.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
... 17 more
2017-01-05 03:01:24,995 ERROR queue.AuditFileSpool (AuditFileSpool.java:logError(710)) - Error sending logs to consumer. provider=knox.async.multi_dest.batch, consumer=knox.async.multi_dest.batch.hdfs
I couldn't make sense of the error; what does it really mean?
Labels:
- Apache Hive
- Apache Knox
01-04-2017
03:06 AM
3 Kudos
Finally found the resolution: because I have Ranger KMS installed on this cluster, the livy user also needs to be added as a proxy user for Ranger KMS. Add the following in Ambari => Ranger KMS => custom core-site:
hadoop.kms.proxyuser.livy.hosts=*
hadoop.kms.proxyuser.livy.users=*
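To double-check that the change reached the KMS configuration after the Ranger KMS restart, grepping the rendered config on the KMS host is a quick sanity check (a sketch; /etc/ranger/kms/conf is the usual HDP location and is an assumption here):
grep -B1 -A2 'hadoop.kms.proxyuser.livy' /etc/ranger/kms/conf/*.xml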
01-04-2017
02:57 AM
2 Kudos
@Bikas Really appreciate your detailed explanation. I have the following in the Spark config; do I need anything else for livy impersonation?
livy.impersonation.enabled=true
Also, inside Ambari, HDFS => custom core-site, I have:
hadoop.proxyuser.zeppelin.hosts=*
hadoop.proxyuser.zeppelin.groups=*
hadoop.proxyuser.livy.hosts=*
hadoop.proxyuser.livy.groups=*
Also, if I kinit as the same user in the console, I can start spark-shell fine:
[root@qwang-hdp1 ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hadoopadmin@FIELD.HORTONWORKS.COM
Valid starting Expires Service principal
01/04/2017 02:41:42 01/05/2017 02:41:42 krbtgt/FIELD.HORTONWORKS.COM@FIELD.HORTONWORKS.COM
[root@qwang-hdp1 ~]# spark-shell
I also noticed in the exception that, if you look at the URL, the authentication actually failed against Ranger KMS. I did install Ranger KMS on the cluster but have not enabled it for any HDFS folder yet. Do I need to add the livy user somewhere in KMS?
01-03-2017
06:30 PM
2 Kudos
Still getting the same error after setting livy.spark.master=yarn-cluster. And the log in /var/log/spark/ doesn't have much information:
17/01/03 18:16:54 INFO SecurityManager: Changing view acls to: spark
17/01/03 18:16:54 INFO SecurityManager: Changing modify acls to: spark
17/01/03 18:16:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
17/01/03 18:16:54 INFO FsHistoryProvider: Replaying log path: hdfs://qwang-hdp0.field.hortonworks.com:8020/spark-history/local-1483414292027
17/01/03 18:16:54 INFO SecurityManager: Changing acls enabled to: false
17/01/03 18:16:54 INFO SecurityManager: Changing admin acls to:
17/01/03 18:16:54 INFO SecurityManager: Changing view acls to: hadoopadmin
Also, I am seeing the following error in the Livy log (I use the hadoopadmin user to log into Zeppelin). Which authentication caused this failure?
INFO: Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, URL: http://qwang-hdp4.field.hortonworks.com:9292/kms/v1/?op=GETDELEGATIONTOKEN&doAs=hadoopadmin&renewer=rm%2Fqwang-hdp1.field.hortonworks.com%40FIELD.HORTONWORKS.COM&user.name=livy, status: 403, message: Forbidden
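The failing call is the livy user asking Ranger KMS for a delegation token on behalf of hadoopadmin (doAs=hadoopadmin in the URL). It can be replayed outside Zeppelin/Livy with a plain HTTP request to see whether KMS rejects the proxy-user combination; a sketch, where the keytab path is an assumption and the URL is copied from the error above:
# kinit as the livy service principal (keytab path and principal format are assumptions; adjust to your cluster)
kinit -kt /etc/security/keytabs/livy.service.keytab livy/$(hostname -f)
# replay the same KMS delegation-token request with SPNEGO authentication
curl --negotiate -u : "http://qwang-hdp4.field.hortonworks.com:9292/kms/v1/?op=GETDELEGATIONTOKEN&doAs=hadoopadmin&renewer=rm%2Fqwang-hdp1.field.hortonworks.com%40FIELD.HORTONWORKS.COM"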