Member since
04-30-2018
20
Posts
0
Kudos Received
0
Solutions
10-16-2019
10:26 PM
Hello, were you able to resolve this? If so, can you please share how.
08-02-2018
06:56 AM
Hello, I have 1 Ambari for HDP and 1 Ambari for Views. We need to access the Tez View through Ambari Views, over Knox, but I have a problem: the URL in the AJAX query is invalid. I draw your attention to the fact that [callerId:"] is present in the URL, and it is wrong. This problem appears only when going through Knox, during asynchronous (AJAX) calls, for example when I view tasks in the Tez View datatable. The Tez View works with the other Ambari (without Knox), and the Hive View and File View work with Knox in Ambari Views. My problem is only with the Tez View over Knox.
Generated URL:
http://HOST:8080/api/v1/views/TEZ/versions/0.7.0.2.6.4.0-91/instances/Tez/resources/atsproxy/ws/v1/timeline/TEZ_DAG_ID?limit=9007199254740991&primaryFilter=callerId:"hive_20180730172908_06ded5e9-a551-43ba-a48e-0c270c84d924"&_=1533103629725
Error trace:
2018-08-01 08:07:13,686 ERROR hadoop.gateway (GatewayServlet.java:service(146)) - Gateway processing failed: javax.servlet.ServletException: java.lang.IllegalArgumentException: Illegal character in query at index 189: http://HOST:8080/api/v1/views/TEZ/versions/0.7.0.2.6.4.0-91/instances/Tez/resources/atsproxy/ws/v1/timeline/TEZ_DAG_ID?limit=9007199254740991&primaryFilter=callerId:"hive_20180730171134_36a47347-b0e7-4ddc-bfb2-5a362a2f8c5c"&_=1533103629729
2018-08-01 08:07:13,686 WARN servlet.ServletHandler (ServletHandler.java:doHandle(620)) - javax.servlet.ServletException: java.lang.IllegalArgumentException: Illegal character in query at index 189: http://HOST:8080/api/v1/views/TEZ/versions/0.7.0.2.6.4.0-91/instances/Tez/resources/atsproxy/ws/v1/timeline/TEZ_DAG_ID?limit=9007199254740991&primaryFilter=callerId:"hive_20180730171134_36a47347-b0e7-4ddc-bfb2-5a362a2f8c5c"&_=1533103629729
at org.apache.hadoop.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:70)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.ranger.authorization.knox.RangerPDPKnoxFilter.doFilter(RangerPDPKnoxFilter.java:166)
at org.apache.ranger.authorization.knox.RangerPDPKnoxFilter.doFilter(RangerPDPKnoxFilter.java:110)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.filter.rewrite.api.UrlRewriteServletFilter.doFilter(UrlRewriteServletFilter.java:60)
at org.apache.hadoop.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:61)
at org.apache.hadoop.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:332)
at org.apache.hadoop.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:232)
at org.apache.hadoop.gateway.filter.AnonymousAuthFilter$1.run(AnonymousAuthFilter.java:76)
at java.security.AccessController.doPrivileged(Native Method)
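For what it's worth, the "Illegal character in query" is the raw double quote (and colon) in the primaryFilter value; once those are percent-encoded the query string is legal. A minimal sketch of just the encoding step (the DAG id below is a shortened, made-up example):

```shell
# The raw filter value contains ':' and '"', which are illegal in an
# unencoded query string; percent-encode them (%3A and %22).
raw='callerId:"hive_20180730_example"'
encoded=$(printf '%s' "$raw" | sed -e 's/:/%3A/g' -e 's/"/%22/g')
echo "$encoded"
```

This only illustrates the encoding itself; in practice the fix would have to happen wherever the URL is built or rewritten, not by hand.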
07-27-2018
02:51 PM
Hello, I see a difference between fsck and dfsadmin:
[hdfs@host:~]$ hdfs dfsadmin -report | grep corrupt
Blocks with corrupt replicas: 4
[hdfs@host:~]$ hdfs fsck / | grep Corrupt
Connecting to namenode via http://host:50070/fsck?ugi=hdfs&path=%2F
Corrupt blocks: 0
07-24-2018
02:25 PM
@Clément Dumont Have you tried using the CLI to write to HDFS and check whether the umask values are effectively applied?
hdfs dfs -put /etc/hosts /<directory to store the file>/hosts
hdfs dfs -ls /<directory to store the file>/hosts
If the permissions are set according to the defined umask, you might want to look at setting a umask value on the NiFi PutHDFS processor: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.5.0/org.apache.nifi.processors.hadoop.PutHDFS/index.html
Note: If you are using Ambari, after changing the umask value all DataNodes and NameNodes must be restarted for the change to be applied.
If Ambari is not used, the umask value must be changed in the hdfs-site.xml conf file on all DataNodes and NameNodes, and all must be restarted for the change to take effect.
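As a quick way to reason about what a given umask will produce: HDFS applies its umask with the same arithmetic as a POSIX umask (new files start from 666, new directories from 777). A minimal local sketch, using a shell umask as a stand-in for the HDFS setting (umask 027 is just an example value):

```shell
# umask arithmetic: file mode = 666 & ~umask, directory mode = 777 & ~umask.
# Demonstrated locally with umask 027 as a stand-in for the HDFS umask.
tmpdir=$(mktemp -d)
file_mode=$( (cd "$tmpdir" && umask 027 && touch f && stat -c '%a' f) )
dir_mode=$( (cd "$tmpdir" && umask 027 && mkdir d && stat -c '%a' d) )
echo "file: $file_mode, dir: $dir_mode"   # file: 640, dir: 750
rm -rf "$tmpdir"
```

If the modes you see with `hdfs dfs -ls` after a `-put` match this arithmetic, the cluster-side umask is being applied and the remaining question is NiFi's behaviour.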
07-18-2018
03:30 PM
@Clément Dumont Do you have Ranger usersync enabled? If so, to configure user sync for Unix see https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_security/content/ranger_user_sync_unix.html, and to configure AD sync see https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_security/content/ranger_user_sync_ldap_ad.html. Once set up properly, only the groups you specify will be synced. Sunile
-
When an "Answer" addresses/solves your question, please select "Accept" beneath that answer. This encourages user participation in this forum.
07-18-2018
03:55 AM
1 Kudo
@Guillaume Dalibart @Clément Dumont Yes, it is possible to configure the Ranger Admin UI with LDAPS - just make sure you add the LDAP certificate to the Ranger Admin UI truststore! The easiest way is to add the certificate to the Java cacerts. HTH
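A sketch of that cacerts import, shown as a dry run; the certificate path, alias, and JAVA_HOME are assumptions, and the command is echoed rather than executed so you can review it first:

```shell
# Dry run of importing the LDAP server certificate into the JDK cacerts
# truststore. Paths and alias below are placeholders — adjust, then remove
# the echo to actually execute. 'changeit' is the default cacerts password.
JAVA_HOME=/usr/lib/jvm/java            # adjust to your JDK
CERT=/tmp/ldap-server.crt              # LDAP certificate, exported beforehand
cmd="keytool -importcert -noprompt -alias ldaps-ca -file $CERT -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit"
echo "$cmd"
```

Restart Ranger Admin afterwards so the JVM picks up the updated truststore.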
06-19-2018
12:04 AM
Hi @Clément Dumont! In Hive you can try to investigate the query by looking for a parameter called hive.query.string. This works for Hive on Tez as well as Hive on MR (in the latter case you'd see it on the YARN Web UI). Hope this helps!
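One way to pull the parameter out once you have a configuration dump for the job; the XML snippet and the query in it are made up for illustration:

```shell
# Hypothetical excerpt of a job configuration dump; on a real cluster you
# would fetch this from the job's configuration page or the timeline server.
conf='<property><name>hive.query.string</name><value>select count(*) from t1</value></property>'

# Extract the value of hive.query.string from the XML fragment.
query=$(printf '%s' "$conf" | sed -n 's/.*<name>hive.query.string<\/name><value>\(.*\)<\/value>.*/\1/p')
echo "$query"
```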
06-12-2018
04:20 PM
Hello, I want to install 2 distinct clusters (1 HDP, 1 HDF) and would like to share a single Ranger.
Is it possible to use the Ranger of the HDP cluster to work with NiFi on the HDF cluster?
That would save me from managing rights in two places.
Regards,
06-08-2018
09:39 AM
@Clément Dumont "/var/run/knox" is mounted as tmpfs, so it may be deleted by the operating system after a host reboot. If you hit this issue on an OS like RHEL 7, try changing the Knox PID dir to another path: Ambari UI --> Knox --> Configs --> Advanced --> "Advanced knox-env" --> "Knox PID dir". The "Knox PID dir" has the default value "/var/run/knox"; you can change it to something else. In Red Hat Enterprise Linux 7, the /run directory is a temporary file storage system (tmpfs) that bind mounts the /var/run directory. https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html/migration_planning_guide/sect-red_hat_enterprise_linux-migration_planning_guide-file_system_layout
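As an alternative to moving the PID dir (an assumption on my part, not part of the answer above), RHEL 7's systemd-tmpfiles can recreate /var/run/knox on every boot. A sketch, written to a temp directory here instead of the real /etc/tmpfiles.d/knox.conf:

```shell
# tmpfiles.d rule: recreate /var/run/knox (mode 0755, owner knox:knox)
# at boot. Written to a temp dir for illustration; the real location
# would be /etc/tmpfiles.d/knox.conf.
dest=$(mktemp -d)
printf 'd /var/run/knox 0755 knox knox -\n' > "$dest/knox.conf"
cat "$dest/knox.conf"
```

After installing the real file, `systemd-tmpfiles --create` (or a reboot) applies the rule.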
06-05-2018
01:33 PM
@Clément Dumont Zeppelin currently supports 2 Shiro implementations: 1. org.apache.zeppelin.realm.LdapRealm, which is the one you should use if running HDP 2.6 onwards. 2. org.apache.zeppelin.server.ActiveDirectoryGroupRealm, which supports only AD; from HDP 2.6 you should use (1). Neither implementation supports authenticating users from one LDAP and fetching groups from a different LDAP, so to answer your question, this is not supported as of now. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
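For reference, a hedged sketch of the shiro.ini lines for option (1); the LDAP URL and DN layout below are assumptions, and the fragment is written to a temp file here purely for illustration:

```shell
# Sketch of a shiro.ini [main] section for org.apache.zeppelin.realm.LdapRealm.
# Host, port, and DN values are placeholders — substitute your own directory.
ini=$(mktemp)
cat > "$ini" <<'EOF'
[main]
ldapRealm = org.apache.zeppelin.realm.LdapRealm
ldapRealm.contextFactory.url = ldap://ldap-host.example.com:389
ldapRealm.contextFactory.authenticationMechanism = simple
ldapRealm.userDnTemplate = uid={0},ou=people,dc=example,dc=com
ldapRealm.searchBase = dc=example,dc=com
EOF
count=$(grep -c '^ldapRealm' "$ini")
echo "$count realm properties written to $ini"
```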
05-22-2018
12:15 PM
Hi, Version: HDP 2.6.4. These are my files: Advanced knoxsso-topology
<topology>
<gateway>
<provider>
<role>webappsec</role>
<name>WebAppSec</name>
<enabled>true</enabled>
<param><name>xframe.options.enabled</name><value>true</value></param>
</provider>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>redirectToUrl</name>
<value>/gateway/knoxsso/knoxauth/login.html</value>
</param>
<param>
<name>restrictedCookies</name>
<value>rememberme,WWW-Authenticate</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapContextFactory</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.contextFactory</name>
<value>$ldapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},ou=Utilisateurs,o=domain,c=fr</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>uid=YPXXX001_Appli,ou=Technical Users,o=domain,c=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>YPXXX001_Appli</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldaps://HOST:1636</value>
</param>
<param>
<name>main.ldapRealm.authenticationCachingEnabled</name>
<value>false</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
</gateway>
<application>
<name>knoxauth</name>
</application>
<service>
<role>KNOXSSO</role>
<param>
<name>knoxsso.cookie.secure.only</name>
<value>false</value>
</param>
<param>
<name>knoxsso.token.ttl</name>
<value>30000</value>
</param>
<param>
<name>knoxsso.redirect.whitelist.regex</name>
<value>^https?:\/\/(localhost|127\.0\.0\.1|0:0:0:0:0:0:0:1|::1):[0-9].*$</value>
</param>
</service>
</topology>
Advanced topology
<topology>
<gateway>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},cn=users,cn=compat,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=domain,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://HOST:389</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>uid=bigdata,cn=sysaccounts,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=domain,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>bigdata</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
<provider>
<role>authorization</role>
<name>AclsAuthz</name>
<enabled>true</enabled>
</provider>
</gateway>
<service>
<role>WEBHDFS</role>
<url>http://HOST:50070/webhdfs</url>
</service>
<service>
<role>HIVE</role>
<url>http://{{hive_server_host}}:{{hive_http_port}}/{{hive_http_path}}</url>
</service>
<service>
<role>AMBARIUI</role>
<url>http://HOST:8080</url>
</service>
<service>
<role>AMBARI</role>
<url>http://HOST:8080</url>
</service>
<service>
<role>RANGERUI</role>
<url>http://HOST:6080</url>
</service>
</topology>
05-17-2018
01:45 PM
Hello,
We are trying to set up Knox in SSO mode as a gateway (the default mode works). Here is the error in the gateway.log logs for https://HOST:8443/gateway/knoxsso/ambari/ :
2018-05-17 15:17:45,622 WARN hadoop.gateway (GatewayFilter.java:doFilter(162)) - Failed to match path /ambari/
And for the GET request https://HOST:8443/gateway/knoxsso/knoxauth/login.html :
2018-05-17 15:20:15,978 ERROR hadoop.gateway (GatewayServlet.java:service(146)) - Gateway processing failed: javax.servlet.ServletException: java.lang.NullPointerException
javax.servlet.ServletException: java.lang.NullPointerException
at org.apache.shiro.web.servlet.AdviceFilter.cleanup(AdviceFilter.java:196)
The configuration files are as follows: Advanced topology
<topology>
<gateway>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://{{knox_host_name}}:33389</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
<provider>
<role>authorization</role>
<name>AclsAuthz</name>
<enabled>true</enabled>
</provider>
</gateway>
<service>
<role>NAMENODE</role>
<url>hdfs://{{namenode_host}}:{{namenode_rpc_port}}</url>
</service>
<service>
<role>JOBTRACKER</role>
<url>rpc://{{rm_host}}:{{jt_rpc_port}}</url>
</service>
<service>
<role>WEBHDFS</role>
{{webhdfs_service_urls}}
</service>
<service>
<role>WEBHCAT</role>
<url>http://{{webhcat_server_host}}:{{templeton_port}}/templeton</url>
</service>
<service>
<role>OOZIE</role>
<url>http://{{oozie_server_host}}:{{oozie_server_port}}/oozie</url>
</service>
<service>
<role>WEBHBASE</role>
<url>http://{{hbase_master_host}}:{{hbase_master_port}}</url>
</service>
<service>
<role>HIVE</role>
<url>http://{{hive_server_host}}:{{hive_http_port}}/{{hive_http_path}}</url>
</service>
<service>
<role>RESOURCEMANAGER</role>
<url>http://{{rm_host}}:{{rm_port}}/ws</url>
</service>
<service>
<role>DRUID-COORDINATOR-UI</role>
{{druid_coordinator_urls}}
</service>
<service>
<role>DRUID-COORDINATOR</role>
{{druid_coordinator_urls}}
</service>
<service>
<role>DRUID-OVERLORD-UI</role>
{{druid_overlord_urls}}
</service>
<service>
<role>DRUID-OVERLORD</role>
{{druid_overlord_urls}}
</service>
<service>
<role>DRUID-ROUTER</role>
{{druid_router_urls}}
</service>
<service>
<role>DRUID-BROKER</role>
{{druid_broker_urls}}
</service>
<service>
<role>ZEPPELINUI</role>
{{zeppelin_ui_urls}}
</service>
<service>
<role>ZEPPELINWS</role>
{{zeppelin_ws_urls}}
</service>
<service>
<role>AMBARIUI</role>
<url>http://HOST:8080</url>
</service>
</topology>
Advanced knoxsso-topology
<topology>
<gateway>
<provider>
<role>webappsec</role>
<name>WebAppSec</name>
<enabled>true</enabled>
<param><name>xframe.options.enabled</name><value>true</value></param>
</provider>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>redirectToUrl</name>
<value>/gateway/knoxsso/knoxauth/login.html</value>
</param>
<param>
<name>restrictedCookies</name>
<value>rememberme,WWW-Authenticate</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapContextFactory</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.contextFactory</name>
<value>$ldapContextFactory</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},cn=users,cn=accounts,dc=pocbigdata,dc=hpmetier,dc=sf,dc=intra,dc=toto,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://HOST:389</value>
</param>
<param>
<name>main.ldapRealm.authenticationCachingEnabled</name>
<value>false</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
<param>
<name>main.ldapRealm.userSearchAttributeName</name>
<value>uid</value>
</param>
<param>
<name>main.ldapRealm.authorizationEnabled</name>
<value>true</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemUsername</name>
<value>uid=bigdata,cn=sysaccounts,dc=pocbigdata,dc=toto,dc=sf,dc=intra,dc=toto,dc=fr</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.systemPassword</name>
<value>bigdata</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
</gateway>
<application>
<name>knoxauth</name>
</application>
<service>
<role>KNOXSSO</role>
<param>
<name>knoxsso.cookie.secure.only</name>
<value>false</value>
</param>
<param>
<name>knoxsso.token.ttl</name>
<value>30000</value>
</param>
<param>
<name>knoxsso.redirect.whitelist.regex</name>
<value>^https?:\/\/(localhost|127\.0\.0\.1|0:0:0:0:0:0:0:1|::1):[0-9].*$</value>
</param>
</service>
</topology>
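A quick local way to see how the default knoxsso.redirect.whitelist.regex above behaves: it only admits localhost-style hosts, so redirects back to a real gateway hostname are rejected until the regex is widened. The URLs below are examples; the backslash-escaped slashes from the topology are dropped, since extended regex syntax does not require them:

```shell
# Default KnoxSSO redirect whitelist (de-escaped for grep -E).
regex='^https?://(localhost|127\.0\.0\.1|0:0:0:0:0:0:0:1|::1):[0-9].*$'

# Report whether a candidate redirect URL would pass the whitelist.
check() {
  if printf '%s\n' "$1" | grep -Eq "$regex"; then
    echo "ALLOWED: $1"
  else
    echo "BLOCKED: $1"
  fi
}

check "https://localhost:8443/gateway/knoxsso/knoxauth/login.html"
check "https://HOST:8443/gateway/knoxsso/knoxauth/login.html"
```

The first URL passes, the second is blocked, which is why a whitelist covering the gateway's actual hostname is usually needed for SSO flows.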
05-17-2018
07:32 AM
Hi @Felix Albani, thank you!
05-22-2018
03:01 PM
1 Kudo
Hi, if you use the latest Ambari you can install HDF 3.1.1 on an HDP 2.6.5 cluster.