Member since 04-30-2018
20 Posts · 0 Kudos Received · 0 Solutions
11-08-2018
12:27 PM
Thanks, I tried this:
/usr/hdp/2.6.4.0-91/spark2/bin/spark-submit --keytab clement.service.keytab --principal clement@MYDOMAIN.FR --files clement_client_jaas.conf,clement.service.keytab --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=clement_client_jaas.conf" --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=clement_client_jaas.conf" --jars spark-streaming-kafka-0-8_2.11-2.2.0.jar,spark-streaming_2.11-2.2.0.2.6.4.0-91.jar,spark-streaming-kafka-assembly_2.11-1.6.3.jar consumer.py
but I still get the same error:
18/11/08 13:26:14 WARN VerifiableProperties: Property security.protocol is not valid
18/11/08 13:26:14 WARN ClientCnxn: SASL configuration failed: javax.security.auth.login.LoginException: No key to store Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
18/11/08 13:26:14 WARN AppInfo$: Can't read Kafka version from MANIFEST.MF. Possible cause: java.lang.NullPointerException
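One route often suggested for secured Kafka from PySpark (hedged, since it switches APIs rather than fixing the JAAS setup above): the old spark-streaming-kafka-0-8 receiver behind createStream predates Kafka's security support, while Spark 2.2's Structured Streaming Kafka source accepts consumer security settings through kafka.-prefixed options. A minimal sketch, assuming the org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0 package is added with --packages, and a hypothetical broker address kafkaHDF:6667:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("test").getOrCreate()

df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafkaHDF:6667")   # hypothetical broker host:port
      .option("kafka.security.protocol", "SASL_PLAINTEXT")  # new-consumer name for PLAINTEXTSASL
      .option("subscribe", "mytopic")
      .load())

query = (df.selectExpr("CAST(value AS STRING)")
         .writeStream
         .format("console")
         .start())
query.awaitTermination()
The same --keytab/--principal and JAAS java options from the command line above would still apply for the Kerberos login itself.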
11-08-2018
09:42 AM
Hello, I have a problem with Spark consuming a Kafka topic in a Kerberized environment. I use Spark2 on an HDP 2.6 cluster and Kafka from HDF 3.1.
consumer.py :
import sys
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils
def process(time, rdd):
    print("========= %s =========" % str(time))
    if not rdd.isEmpty():
        rdd.count()
        rdd.first()
sc = SparkContext(appName="test")
ssc = StreamingContext(sc, 5)
sc.setLogLevel("WARN")
print "Connected to spark streaming"
kafkaParams = {"security.protocol":"PLAINTEXTSASL"}
kafkaStream = KafkaUtils.createStream(ssc, "zookeeperHDF:2181", "pysparkclient1", {"mytopic": 1},kafkaParams)
kafkaStream.pprint()
kafkaStream.foreachRDD(process)
ssc.start()
ssc.awaitTermination()
Command line :
/usr/hdp/2.6.4.0-91/spark2/bin/spark-submit --keytab /etc/security/keytabs/clement.service.keytab --principal clement@MYDOMAIN.FR --files /home/spark/test/clement_client_jaas.conf --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=clement_client_jaas.conf" --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=clement_client_jaas.conf" --jars spark-streaming-kafka-0-8_2.11-2.2.0.jar,spark-streaming_2.11-2.2.0.2.6.4.0-91.jar,spark-streaming-kafka-assembly_2.11-1.6.3.jar consumer.py
clement_client_jaas.conf :
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
useKeyTab=true
principal="clement@MYDOMAIN.FR"
keyTab="clement.service.keytab"
renewTicket=true
storeKey=true
serviceName="kafka";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
useKeyTab=true
principal="clement@MYDOMAIN.FR"
keyTab="clement.service.keytab"
renewTicket=true
storeKey=true
serviceName="zookeeper";
};
My error :
18/11/08 10:26:57 WARN VerifiableProperties: Property security.protocol is not valid
18/11/08 10:26:57 WARN ClientCnxn: SASL configuration failed: javax.security.auth.login.LoginException: No key to store Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
18/11/08 10:26:57 WARN AppInfo$: Can't read Kafka version from MANIFEST.MF. Possible cause: java.lang.NullPointerException
Do you have any idea?
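As a hedged sanity check on the "No key to store" failure: with storeKey=true the Krb5LoginModule must be able to read the keytab, and the relative keyTab path in the JAAS file has to resolve from the working directory of the driver and of each YARN container; note that the command line above ships only the conf with --files, not the keytab itself. Something like this, run from the submit directory, confirms the driver-side paths:
import os

# Check that the files named in the JAAS config resolve relative to the
# working directory (executors only see files shipped with --files).
for f in ("clement_client_jaas.conf", "clement.service.keytab"):
    print("%s exists: %s readable: %s" % (f, os.path.exists(f), os.access(f, os.R_OK)))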
Labels: Apache Kafka, Apache Spark
07-24-2018
12:54 PM
Hello, I have a problem with the default HDFS umask and NiFi. NiFi creates files as if the umask were 022, while I defined umask 777 (fs.permissions.umask-mode), and there is no override in NiFi.
Where should I look?
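Hedged side note, worth verifying: NiFi's PutHDFS processor has its own "Permissions umask" property which, when set, takes precedence over fs.permissions.umask-mode. A minimal check of what mode NiFi-written files actually receive, with a placeholder HDFS path:
import subprocess

# A listing showing rw-r--r-- (644) means an effective umask of 022 was applied.
out = subprocess.check_output(["hdfs", "dfs", "-ls", "/data/nifi"])  # placeholder path
print(out)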
Labels: Apache Hadoop
07-18-2018
03:20 PM
Hello, I notice that the local users of my server show up in Ranger. Is it possible to remove these users from Ranger and make sure they are not synced again?
I use LDAP, so how did these users get here? I use HDP 2.6.4.
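Hedged background: which accounts usersync pulls in is governed by ranger.usersync.source.impl (a Unix builder reads /etc/passwd, an LDAP builder reads the directory), so local users in Ranger usually mean a Unix sync source was active at some point. Existing entries can be inspected, and then deleted, through Ranger's REST API; a sketch with placeholder URL and credentials:
import requests  # pip install requests

base = "http://HOST:6080"  # placeholder Ranger Admin URL
auth = ("admin", "admin")  # placeholder credentials

# List synced users; individual entries can then be removed with
# DELETE /service/xusers/users/{id} (they reappear while the sync
# source still includes them).
print(requests.get(base + "/service/xusers/users", auth=auth).json())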
Labels: Apache Ranger
07-17-2018
09:07 PM
Hello, I work with HDP 2.6.4. I want to use the Ranger Admin UI with SSL (LDAPS) authentication, is it possible? LDAP authentication is OK, openssl s_client connects fine, a Java SSL sample connection works, but nothing happens with the ldaps URL connection.
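One hedged avenue: unlike openssl s_client, Ranger's LDAPS connection validates the server certificate against a Java truststore, so the LDAP CA usually has to be imported into the truststore Ranger Admin is configured with (ranger.truststore.file). A sketch driving keytool, with placeholder paths, alias, and password:
import subprocess

subprocess.check_call([
    "keytool", "-importcert", "-noprompt",
    "-alias", "ldap-ca",                   # placeholder alias
    "-file", "/tmp/ldap-ca.pem",           # placeholder CA certificate
    "-keystore", "/etc/ranger/admin/conf/truststore.jks",  # placeholder; match ranger.truststore.file
    "-storepass", "changeit",              # placeholder password
])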
Labels: Apache Ranger
06-18-2018
02:10 PM
Hello,
is it possible to trace the queries executed on the cluster?
This would allow me to identify potentially risky queries (SELECT * ...). Thanks
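Two hedged starting points: the Ranger Hive plugin's audit log records authorized queries centrally, and HiveServer2 itself logs each statement it executes. A minimal scan of the usual HDP log location (path and message format may vary by version):
# Print executed statements from the HiveServer2 log (default HDP path; adjust).
with open("/var/log/hive/hiveserver2.log") as f:
    for line in f:
        if "Executing command(queryId=" in line:
            print(line.rstrip())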
Labels: Apache Hive
06-08-2018
07:51 AM
Hello, I encountered a problem after a reboot. Knox did not start because /var/run/knox is missing. Is it a bug?
/usr/hdp/current/knox-server/bin/gateway.sh stop
==> Can't find PID dir
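Hedged context: on most modern Linux distributions /var/run is tmpfs, so the PID directory vanishes at reboot; recreating it with the service account's ownership before starting the gateway is a common workaround (user and group names assume the default knox account):
import os, pwd, grp

path = "/var/run/knox"
if not os.path.isdir(path):
    os.makedirs(path)                       # recreate the PID dir lost at reboot
    os.chown(path,
             pwd.getpwnam("knox").pw_uid,   # assumes the default knox service user
             grp.getgrnam("knox").gr_gid)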
Labels: Apache Knox
06-06-2018
12:51 PM
Hello, I use Atlas with Knox. I connect without problems, but when I click on "logout", I am redirected to a 404.
Atlas URL : https://HOST:8443/gateway/default/atlas
Logout URL (404) : https://HOST:8443/gateway/default/logout.html
This is my "knox-topology":
<service>
<role>ATLAS</role>
<url>http://HOST:21000</url>
</service>
<service>
<role>ATLAS-API</role>
<url>http://HOST:21000</url>
</service>
Labels: Apache Atlas, Apache Knox
06-05-2018
01:15 PM
Hello, how can I plug two LDAP directories into Zeppelin? One LDAP for users and one LDAP for groups. Thanks for your help.
Labels: Apache Zeppelin
05-22-2018
12:15 PM
Hi. Version: HDP 2.6.4. These are my files:
Advanced knoxsso-topology
<topology>
<gateway>
<provider>
<role>webappsec</role>
<name>WebAppSec</name>
<enabled>true</enabled>
<param>
<name>xframe.options.enabled</name>
<value>true</value>
</param>
</provider>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>redirectToUrl</name>
<value>/gateway/knoxsso/knoxauth/login.html</value>
</param>
<param>
<name>restrictedCookies</name>
<value>rememberme,WWW-Authenticate</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapContextFactory</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.contextFactory</name>
<value>$ldapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},ou=Utilisateurs,o=domain,c=fr</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>uid=YPXXX001_Appli,ou=Technical Users,o=domain,c=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>YPXXX001_Appli</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldaps://HOST:1636</value>
</param>
<param>
<name>main.ldapRealm.authenticationCachingEnabled</name>
<value>false</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
</gateway>
<application>
<name>knoxauth</name>
</application>
<service>
<role>KNOXSSO</role>
<param>
<name>knoxsso.cookie.secure.only</name>
<value>false</value>
</param>
<param>
<name>knoxsso.token.ttl</name>
<value>30000</value>
</param>
<param>
<name>knoxsso.redirect.whitelist.regex</name>
<value>^https?:\/\/(localhost|127\.0\.0\.1|0:0:0:0:0:0:0:1|::1):[0-9].*$</value>
</param>
</service>
</topology>
Advanced topology
<topology>
<gateway>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},cn=users,cn=compat,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=domain,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://HOST:389</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>uid=bigdata,cn=sysaccounts,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=domain,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>bigdata</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
<provider>
<role>authorization</role>
<name>AclsAuthz</name>
<enabled>true</enabled>
</provider>
</gateway>
<service>
<role>WEBHDFS</role>
<url>http://HOST:50070/webhdfs</url>
</service>
<service>
<role>HIVE</role>
<url>http://{{hive_server_host}}:{{hive_http_port}}/{{hive_http_path}}</url>
</service>
<service>
<role>AMBARIUI</role>
<url>http://HOST:8080</url>
</service>
<service>
<role>AMBARI</role>
<url>http://HOST:8080</url>
</service>
<service>
<role>RANGERUI</role>
<url>http://HOST:6080</url>
</service>
</topology>
05-18-2018
09:52 AM
Hello,
I have an error when I try to connect to Ambari and Ranger through Knox.
2018-05-18 11:40:29,557 INFO hadoop.gateway (KnoxLdapRealm.java:getUserDn(691)) - Computed userDn: uid=clement,cn=users,cn=accounts,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=test,dc=fr using dnTemplate for principal: clement
2018-05-18 11:40:29,558 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: java.lang.NullPointerException
2018-05-18 11:40:29,558 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: java.lang.NullPointerException
2018-05-18 11:40:29,558 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: java.lang.NullPointerException
2018-05-18 11:40:29,558 ERROR hadoop.gateway (GatewayFilter.java:doFilter(145)) - Gateway processing failed: javax.servlet.ServletException: java.lang.NullPointerException
javax.servlet.ServletException: java.lang.NullPointerException
at org.apache.shiro.web.servlet.AdviceFilter.cleanup(AdviceFilter.java:196)
at org.apache.shiro.web.filter.authc.AuthenticatingFilter.cleanup(AuthenticatingFilter.java:155)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:148)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.filter.ResponseCookieFilter.doFilter(ResponseCookieFilter.java:50)
at org.apache.hadoop.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:61)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.filter.RedirectToUrlFilter.doFilter(RedirectToUrlFilter.java:45)
at org.apache.hadoop.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:61)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.filter.XForwardedHeaderFilter.doFilter(XForwardedHeaderFilter.java:30)
at org.apache.hadoop.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:61)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.webappsec.filter.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:58)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.GatewayFilter.doFilter(GatewayFilter.java:139)
at org.apache.hadoop.gateway.GatewayFilter.doFilter(GatewayFilter.java:91)
at org.apache.hadoop.gateway.GatewayServlet.service(GatewayServlet.java:141)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.apache.hadoop.gateway.trace.TraceHandler.handle(TraceHandler.java:51)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.apache.hadoop.gateway.filter.CorrelationHandler.handle(CorrelationHandler.java:39)
at org.eclipse.jetty.servlets.gzip.GzipHandler.handle(GzipHandler.java:529)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.apache.hadoop.gateway.filter.PortMappingHelperHandler.handle(PortMappingHelperHandler.java:92)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.websocket.server.WebSocketHandler.handle(WebSocketHandler.java:112)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
at javax.naming.InitialContext.getURLScheme(InitialContext.java:294)
at javax.naming.InitialContext.getURLOrDefaultInitCtx(InitialContext.java:343)
at javax.naming.directory.InitialDirContext.getURLOrDefaultInitDirCtx(InitialDirContext.java:106)
at javax.naming.directory.InitialDirContext.search(InitialDirContext.java:267)
at org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm.rolesFor(KnoxLdapRealm.java:281)
at org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm.getRoles(KnoxLdapRealm.java:244)
at org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm.queryForAuthorizationInfo(KnoxLdapRealm.java:230)
at org.apache.shiro.realm.ldap.JndiLdapRealm.doGetAuthorizationInfo(JndiLdapRealm.java:313)
at org.apache.shiro.realm.AuthorizingRealm.getAuthorizationInfo(AuthorizingRealm.java:341)
at org.apache.shiro.realm.AuthorizingRealm.hasRole(AuthorizingRealm.java:573)
at org.apache.shiro.authz.ModularRealmAuthorizer.hasRole(ModularRealmAuthorizer.java:374)
at org.apache.shiro.mgt.AuthorizingSecurityManager.hasRole(AuthorizingSecurityManager.java:153)
at org.apache.shiro.subject.support.DelegatingSubject.hasRole(DelegatingSubject.java:224)
at org.apache.hadoop.gateway.filter.ShiroSubjectIdentityAdapter.doFilter(ShiroSubjectIdentityAdapter.java:69)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
... 58 more
2018-05-18 11:40:29,559 ERROR hadoop.gateway (GatewayServlet.java:service(146)) - Gateway processing failed: javax.servlet.ServletException: java.lang.NullPointerException
javax.servlet.ServletException: java.lang.NullPointerException
[identical stack trace as above]
Can you help me? Thank you in advance.
Labels: Apache Knox, Apache Ranger
05-17-2018
07:32 AM
Hi @Felix Albani, thank you!
05-16-2018
03:35 PM
Hello, I enabled the HDFS, Hive and NiFi plugins in Ranger.
Unfortunately, when restarting the components I get this error for HDFS and Hive:
2018-05-16 17:30:33,100 - Will retry 35 time(s), caught exception: Connection failed to Ranger Admin. Reason - [Errno 111] Connection refused.. Sleeping for 8 sec(s)
2018-05-16 17:30:41,109 - Will retry 34 time(s), caught exception: Connection failed to Ranger Admin. Reason - [Errno 111] Connection refused.. Sleeping for 8 sec(s)
2018-05-16 17:30:49,118 - Will retry 33 time(s), caught exception: Connection failed to Ranger Admin. Reason - [Errno 111] Connection refused.. Sleeping for 8 sec(s)
NiFi, on the other hand, starts correctly.
Have I forgotten something?
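A hedged first check for "Connection refused": confirm Ranger Admin is actually up and listening at the URL the plugins use (ranger.policymgr.external.url; the default port is 6080). A minimal probe with a placeholder host:
import socket

s = socket.socket()
s.settimeout(5)
try:
    s.connect(("HOST", 6080))   # placeholder Ranger Admin host, default port
    print("Ranger Admin port reachable")
except socket.error as e:
    print("Not reachable: %s" % e)
finally:
    s.close()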
Labels: Apache Ranger
04-30-2018
09:01 AM
Hello, I have installed HDP 2.6.4 and upgraded to HDF 3.1.1 to use NiFi, Schema Registry and SAM. I don't understand why Schema Registry and SAM are not compatible in HDF 3.1.1 on an HDP 2.6.4 cluster. Do I have to create two clusters (HDP + HDF) to use these services?
Labels: