05-03-2019
01:01 PM
Hi Manjunath, you don't need to run all the steps on the Ambari server; you can even use your own existing OpenLDAP. Just make sure you have the krb5.conf across all the machines. And if you have secure LDAP, make sure to import the certificate into the Ambari truststore and/or the Java truststore of the OS where Ambari is running. Thanks
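For example, a minimal sketch of importing the LDAP CA certificate into the JDK truststore that Ambari uses (the JDK path, alias, and certificate file name are assumptions; adjust them for your host):
# assumption: Ambari runs on the JDK under /usr/jdk64; default truststore password is "changeit"
keytool -importcert -trustcacerts \
  -keystore /usr/jdk64/jdk1.8.0_112/jre/lib/security/cacerts \
  -storepass changeit -alias ldapRootCA -file /path/to/cacert.crt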
06-29-2018
01:59 PM
2 Kudos
Short Description:
Here are the steps to use OpenLDAP as the backend for Kerberos.
Article
This procedure enables Kerberos security for a cluster; it was tested with Ambari 2.6.2.0 and HDP 2.6.5 on CentOS 7.
1-Installation of OpenLDAP
yum install openldap-servers openldap-clients
2- Kerberos uses only LDAPS, so we have to configure SSL for OpenLDAP (you can use OpenSSL; I used tinycert.org to generate my certificates)
# mkdir /etc/openldap/cacerts (copy the root certificate, the certificate for the OpenLDAP server, and the private keys here)
# chown ldap:ldap /etc/openldap/cacerts/*
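If you prefer OpenSSL over tinycert.org, a minimal self-signed certificate sketch (the CN must match your LDAP host; the file names match step 5):
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=zrg-druid3.field.hortonworks.com" \
  -keyout /etc/openldap/cacerts/LDAP.key -out /etc/openldap/cacerts/LDAP.pem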
3- Edit /etc/sysconfig/slapd and set :
SLAPD_URLS="ldapi:/// ldap:/// ldaps:///"
SLAPD_LDAPS=yes
4- Import the root certificate into the Java keystore of the server:
keytool -importcert -keystore /usr/jdk64/jdk1.8.0_112/jre/lib/security/cacerts -alias RootCaLdap -trustcacerts -file cacert.crt
5- Edit /etc/openldap/slapd.d/cn\=config.ldif and set :
olcTLSCACertificatePath: /etc/openldap/cacerts/
olcTLSCertificateFile: /etc/openldap/cacerts/LDAP.pem
olcTLSCertificateKeyFile: /etc/openldap/cacerts/LDAP.key
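Hand-editing files under /etc/openldap/slapd.d can trigger the checksum errors mentioned in step 9; an alternative sketch that applies the same settings through ldapmodify (the file name tls.ldif is made up):
# cat tls.ldif
dn: cn=config
changetype: modify
replace: olcTLSCACertificatePath
olcTLSCACertificatePath: /etc/openldap/cacerts/
-
replace: olcTLSCertificateFile
olcTLSCertificateFile: /etc/openldap/cacerts/LDAP.pem
-
replace: olcTLSCertificateKeyFile
olcTLSCertificateKeyFile: /etc/openldap/cacerts/LDAP.key

ldapmodify -Y EXTERNAL -H ldapi:/// -f tls.ldif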
6-Edit /etc/openldap/ldap.conf and set :
TLS_CACERTDIR /etc/openldap/cacerts
SASL_NOCANON on
TLS_REQCERT allow
URI ldaps://zrg-druid3.field.hortonworks.com:636
BASE dc=field,dc=hortonworks,dc=com
TLS_REQUIRE never
7- Start OpenLDAP
systemctl enable slapd
systemctl start slapd
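To verify that slapd is up and listening on the LDAP/LDAPS ports, something like:
ss -tlnp | grep -E ':(389|636)'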
8- Set up the LDAP admin password
slappasswd -h {SSHA} -s "yourpassword"
Output
{SSHA}d/thexcQUuSfe3rx3gRaEhHpNJ52N8D3
9- We need to update /etc/openldap/slapd.d/cn=config/olcDatabase={2}hdb.ldif (DON'T CHANGE IT MANUALLY, IT WILL CAUSE CHECKSUM ERRORS). We will create admin.ldif and run ldapmodify to make the changes.
# cat admin.ldif
dn: olcDatabase={2}hdb,cn=config
changetype: modify
replace: olcSuffix
olcSuffix: dc=field,dc=hortonworks,dc=com

dn: olcDatabase={2}hdb,cn=config
changetype: modify
replace: olcRootDN
olcRootDN: cn=Manager,dc=field,dc=hortonworks,dc=com

dn: olcDatabase={2}hdb,cn=config
changetype: modify
replace: olcRootPW
olcRootPW: {SSHA}****************************************** (paste here the password hash generated above)
Then run ldapmodify to apply the changes:
ldapmodify -Y EXTERNAL -H ldapi:/// -f admin.ldif
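To confirm the changes took effect, a quick check:
ldapsearch -Y EXTERNAL -H ldapi:/// -b "olcDatabase={2}hdb,cn=config" olcSuffix olcRootDN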
Make changes to the /etc/openldap/slapd.d/cn=config/olcDatabase={1}monitor.ldif file (again, do not edit it manually) to restrict monitor access to the LDAP root (admin) user only:
cat monitor.ldif
dn: olcDatabase={1}monitor,cn=config
changetype: modify
replace: olcAccess
olcAccess: {0}to * by dn.base="gidNumber=0+uidNumber=0,cn=peercred,cn=external,cn=auth" read by dn.base="cn=Manager,dc=field,dc=hortonworks,dc=com" read by * none
ldapmodify -Y EXTERNAL -H ldapi:/// -f monitor.ldif
10- Setup LDAP database
cp /usr/share/openldap-servers/DB_CONFIG.example /var/lib/ldap/DB_CONFIG
chown ldap:ldap /var/lib/ldap/*
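Optionally sanity-check the slapd configuration before loading schemas (a quick dry run):
slaptest -u -F /etc/openldap/slapd.d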
Add the cosine and nis LDAP schemas.
ldapadd -Y EXTERNAL -H ldapi:/// -f /etc/openldap/schema/cosine.ldif
ldapadd -Y EXTERNAL -H ldapi:/// -f /etc/openldap/schema/nis.ldif
ldapadd -Y EXTERNAL -H ldapi:/// -f /etc/openldap/schema/inetorgperson.ldif
Generate a base.ldif file for your domain and load it with ldapadd.
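A minimal base.ldif consistent with the entries shown in the ldapsearch output below (a sketch; adjust the objectClasses and OUs to your needs):
dn: dc=field,dc=hortonworks,dc=com
dc: field
objectClass: top
objectClass: domain

dn: cn=ldapadm,dc=field,dc=hortonworks,dc=com
objectClass: organizationalRole
cn: ldapadm
description: LDAP Manager

dn: ou=Hadoop,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Hadoop

dn: ou=Group,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Group

dn: ou=Hamid,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Hamid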
ldapadd -x -W -D "cn=ldapadm,dc=field,dc=hortonworks,dc=com" -f base.ldif
Enter LDAP Password:
adding new entry "dc=field,dc=hortonworks,dc=com"
adding new entry "cn=ldapadm,dc=field,dc=hortonworks,dc=com"
adding new entry "ou=Hadoop,dc=field,dc=hortonworks,dc=com"
adding new entry "ou=Group,dc=field,dc=hortonworks,dc=com"
adding new entry "ou=Hamid,dc=field,dc=hortonworks,dc=com"
ldapsearch -x -H 'ldaps://zrg-druid1.field.hortonworks.com:636' -D "cn=ldapadm,dc=field,dc=hortonworks,dc=com" -W
Enter LDAP Password:
# extended LDIF
# LDAPv3
# base <dc=field,dc=hortonworks,dc=com> (default) with scope subtree
# filter: (objectclass=*)
# requesting: ALL
# field.hortonworks.com
dn: dc=field,dc=hortonworks,dc=com
dc: field
objectClass: top
objectClass: domain
# ldapadm, field.hortonworks.com
dn: cn=ldapadm,dc=field,dc=hortonworks,dc=com
objectClass: organizationalRole
cn: ldapadm
description: LDAP Manager
# Hadoop, field.hortonworks.com
dn: ou=Hadoop,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Hadoop
# Group, field.hortonworks.com
dn: ou=Group,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Group
# Hamid, field.hortonworks.com
dn: ou=Hamid,dc=field,dc=hortonworks,dc=com
objectClass: organizationalUnit
ou: Hamid
# search result
search: 2
result: 0 Success
# numResponses: 6
# numEntries: 5
11-Install Kerberos
yum install krb5-server krb5-server-ldap openldap-clients krb5-workstation
On all the other nodes:
yum install openldap-clients krb5-workstation
12- Load the Kerberos schema into OpenLDAP
12-1- Prepare kerberos.ldif
cp /usr/share/doc/krb5-server-ldap-1.15.1/kerberos.schema /etc/openldap/schema
Create /tmp/schema_convert.conf and add these lines:
include /etc/openldap/schema/core.schema
include /etc/openldap/schema/collective.schema
include /etc/openldap/schema/corba.schema
include /etc/openldap/schema/cosine.schema
include /etc/openldap/schema/duaconf.schema
include /etc/openldap/schema/dyngroup.schema
include /etc/openldap/schema/inetorgperson.schema
include /etc/openldap/schema/java.schema
include /etc/openldap/schema/misc.schema
include /etc/openldap/schema/nis.schema
include /etc/openldap/schema/openldap.schema
include /etc/openldap/schema/ppolicy.schema
include /etc/openldap/schema/kerberos.schema
mkdir /tmp/ldif_output
slapcat -f /tmp/schema_convert.conf -F /tmp/ldif_output/ -n0 -s "cn={12}kerberos,cn=schema,cn=config" > /tmp/cn\=kerberos.ldif
12-2- Edit the generated /tmp/cn\=kerberos.ldif file so that the entry header reads:
dn: cn=kerberos,cn=schema,cn=config
objectClass: olcSchemaConfig
cn: kerberos
And remove the following lines from the end of the file (a scripted sketch follows the list):
structuralObjectClass:
entryUUID:
creatorsName:
createTimeStamp:
entryCSN:
modifiersName:
modifyTimestamp:
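These deletions can be scripted; a hedged sed sketch (verify the resulting file before loading it):
sed -i '/^structuralObjectClass:/d; /^entryUUID:/d; /^creatorsName:/d; /^createTimeStamp:/d; /^entryCSN:/d; /^modifiersName:/d; /^modifyTimestamp:/d' /tmp/cn\=kerberos.ldif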
Load the Kerberos schema into OpenLDAP:
ldapmodify -Y EXTERNAL -H ldapi:/// -f /tmp/cn\=kerberos.ldif
12-3-Add an index for Kerberos
cat index.ldif
dn: olcDatabase={2}hdb,cn=config
changetype: modify
add: olcDbIndex
olcDbIndex: krbPrincipalName eq,pres,sub
ldapmodify -Y EXTERNAL -H ldapi:/// -f index.ldif
13- Set up /etc/krb5.conf
includedir /etc/krb5.conf.d/
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log
[libdefaults]
dns_lookup_realm = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
udp_preference_limit = 1
default_realm = FIELD.HORTONWORKS.COM
default_ccache_name = FILE:/tmp/krb5cc_%{uid}
[realms]
FIELD.HORTONWORKS.COM = {
kdc = zrg-druid3.field.hortonworks.com
admin_server = zrg-druid3.field.hortonworks.com
database_module = openldap_ldapconf
}
[domain_realm]
.field.hortonworks.com = FIELD.HORTONWORKS.COM
field.hortonworks.com = FIELD.HORTONWORKS.COM
zrg-druid3.field.hortonworks.com = FIELD.HORTONWORKS.COM
zrg-druid2.field.hortonworks.com = FIELD.HORTONWORKS.COM
zrg-druid1.field.hortonworks.com = FIELD.HORTONWORKS.COM
[appdefaults]
pam = {
debug = false
ticket_lifetime = 3600
renew_lifetime = 3600
forwardable = true
krb4_convert = false
}
[dbmodules]
openldap_ldapconf = {
db_library = kldap
ldap_kerberos_container_dn = cn=kerberos,dc=field,dc=hortonworks,dc=com
ldap_kdc_dn = cn=manager,dc=field,dc=hortonworks,dc=com
ldap_kadmind_dn = cn=manager,dc=field,dc=hortonworks,dc=com
ldap_service_password_file = /etc/krb5.d/stash.keyfile
ldap_servers = ldapi:///
ldap_conns_per_server = 5
}
14-Create the Kerberos subtree in LDAP
Populate OpenLDAP with the base Kerberos entries:
kdb5_ldap_util -D cn=manager,dc=field,dc=hortonworks,dc=com create -subtrees cn=kerberos,dc=field,dc=hortonworks,dc=com -r FIELD.HORTONWORKS.COM -s -H ldaps://zrg-druid3.field.hortonworks.com
15- Create the stash file containing the password of the LDAP admin user that the KDC and kadmind bind as (cn=manager, per the [dbmodules] section above):
kdb5_ldap_util -D cn=manager,dc=field,dc=hortonworks,dc=com stashsrvpw -f /etc/krb5.d/stash.keyfile cn=manager,dc=field,dc=hortonworks,dc=com
Restart KDC and Kadmin
systemctl restart krb5kdc
systemctl restart kadmin
16- Prepare Kerberos for Ambari
Create the KDC admin principal:
kadmin.local -q "addprinc admin/admin"
Confirm that this admin principal has permissions in the KDC ACL:
vim /var/kerberos/krb5kdc/kadm5.acl (change EXAMPLE.COM to your realm)
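For example, with the realm used in this article, the ACL line would look like:
*/admin@FIELD.HORTONWORKS.COM *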
systemctl restart krb5kdc
systemctl restart kadmin
17- Enable Security for the Cluster
After Kerberizing the cluster, check with ldapsearch that all the principals have been created.
Test accessing HDFS.
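A quick smoke test (principal and host names are assumptions based on the realm above):
# verify that principals were written to LDAP
ldapsearch -x -H ldaps://zrg-druid3.field.hortonworks.com:636 -D "cn=ldapadm,dc=field,dc=hortonworks,dc=com" -W "(krbPrincipalName=*)" krbPrincipalName
# authenticate and list HDFS
kinit admin/admin@FIELD.HORTONWORKS.COM
hdfs dfs -ls /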
http://web.mit.edu/kerberos/krb5-latest/doc/admin/conf_ldap.html
https://www.thegoldfish.org/2017/02/openldap-fix-a-incorrect-checksum/
https://www.itzgeek.com/how-tos/linux/centos-how-tos/step-step-openldap-server-configuration-centos-7-rhel-7.html
https://help.ubuntu.com/lts/serverguide/kerberos-ldap.html.en
https://www.thegeekstuff.com/2015/02/openldap-add-users-groups/
06-05-2018
02:54 AM
Hi Thiago, thanks for the article. Just wondering, do we need to wait six hours to get the first segment? Thanks very much.
"granularitySpec":{ "type":"uniform", "segmentGranularity":"six_hour",
(Attachments: druid-tq.png, druid-tq1.png, druid-tranquility.png)
03-12-2018
12:45 PM
I changed the segment granularity to YEAR and it worked fine: "druid.segment.granularity" = "YEAR" (attachment: druid-benchmarking-results.png)
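For context, a minimal sketch of a Hive table using the Druid storage handler with that property (the table and column names are made up):
CREATE TABLE druid_benchmark
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.segment.granularity" = "YEAR")
AS SELECT cast(order_ts AS timestamp) AS `__time`, total FROM orders;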
03-12-2018
12:18 PM
Hi, when loading data into Hive using the Druid storage handler, my table properties have "druid.segment.granularity" = "MONTH". It causes an exception while loading data; it seems a segment of length one month covers only 30 days, but some months have 31 days. Caused by: java.util.concurrent.ExecutionException: org.apache.hive.druid.io.druid.java.util.common.IAE: interval[1992-10-01T00:00:00.000+01:00/1992-10-30T23:00:00.000Z] does not encapsulate the full range of timestamps[1992-10-01T00:00:00.000+01:00, 1992-10-31T00:00:00.000Z]
at org.apache.hive.druid.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
at org.apache.hive.druid.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
at org.apache.hive.druid.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
at org.apache.hadoop.hive.druid.io.DruidRecordWriter.pushSegments(DruidRecordWriter.java:165)
Labels: Apache Hive
01-11-2018
02:27 PM
Hi Timothy, I got an error while upserting to Phoenix. I configured JDBC to secure HBase and tested it manually, and it works fine. Could you refer me to an example of a JDBC connection to Phoenix? I used:
jdbc:phoenix:hostname:/hbase-secure:hbase-silence@rRELAM.COM:/etc/security/keytabs/hbase.headless.keytab
/usr/hdp/current/phoenix-client/bin/sqlline.py xx.xx.com:2181:hbase-silence@RELAM:/home/nifi/hbase.headless.keytab works fine from the command line. Thanks
(Attachments: error-putsql1.png, jdbc-phoenix.png)
11-21-2016
12:08 PM
Glad that helped!
11-21-2016
11:40 AM
Are you using SQL Standard-based Hive authorization? It looks like you didn't configure it: [ERROR] java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory. Please provide the following properties:
hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory
hive.security.authorization.enabled=true
hive.security.authenticator.manager=org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator
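In hive-site.xml form, a sketch of the same settings:
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
</property>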
11-16-2016
04:17 PM
Try restarting ambari-agent, or provide the YARN NodeManager log: /var/log/hadoop-yarn/yarn/yarn-yarn-nodemanager-xxx.com.log
11-15-2016
05:03 PM
From your log, you have some corrupted files. Double-check with the command hdfs fsck /. Please find here how to fix that: http://stackoverflow.com/questions/19205057/how-to-fix-corrupt-hdfs-files/19216037#19216037
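A hedged sketch of the checks (the -delete flag removes the corrupted files, so use it with care):
# list only the corrupt file blocks
hdfs fsck / -list-corruptfileblocks
# locate the affected files and their blocks
hdfs fsck / -files -blocks -locations
# last resort: delete the corrupted files
hdfs fsck / -delete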