Disabling Kerberos
Created on ‎10-03-2014 03:06 AM - edited ‎09-16-2022 02:09 AM
Hi all, these are the configuration changes I made to disable Kerberos on the cluster (a quick verification sketch follows the list):
- ZooKeeper -> enableSecurity (Enable Kerberos Authentication) -> false
- HDFS -> hadoop.security.authentication -> simple
- HDFS -> hadoop.security.authorization -> false
- HDFS -> dfs.datanode.address -> from 1004 (for Kerberos) to 50010 (default)
- HDFS -> dfs.datanode.http.address -> from 1006 (for Kerberos) to 50075 (default)
- HDFS -> Data Directory Permissions -> from 700 to 755
- HBase -> hbase.security.authentication -> simple
- HBase -> hbase.security.authorization -> false
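For anyone repeating these steps, here is a minimal sketch (not part of the original post) of how the deployed client configuration could be sanity-checked afterwards. The file paths and the 0.0.0.0 bind address for the DataNode ports are assumptions; adjust them for your deployment.

```python
# Minimal sketch: verify that deployed client configs reflect the
# non-Kerberos values listed above. Paths and the 0.0.0.0 bind address
# for the DataNode ports are assumptions, not taken from the post.
import xml.etree.ElementTree as ET

EXPECTED = {
    "/etc/hadoop/conf/core-site.xml": {
        "hadoop.security.authentication": "simple",
        "hadoop.security.authorization": "false",
    },
    "/etc/hadoop/conf/hdfs-site.xml": {
        "dfs.datanode.address": "0.0.0.0:50010",
        "dfs.datanode.http.address": "0.0.0.0:50075",
    },
    "/etc/hbase/conf/hbase-site.xml": {
        "hbase.security.authentication": "simple",
        "hbase.security.authorization": "false",
    },
}

def read_props(path):
    """Parse a Hadoop-style *-site.xml file into a {name: value} dict."""
    props = {}
    for prop in ET.parse(path).getroot().findall("property"):
        name = prop.findtext("name")
        if name is not None:
            props[name] = prop.findtext("value")
    return props

for path, expected in EXPECTED.items():
    try:
        actual = read_props(path)
    except (OSError, ET.ParseError) as exc:
        print(f"{path}: could not read ({exc})")
        continue
    for name, want in expected.items():
        got = actual.get(name, "<unset>")
        status = "OK" if got == want else "CHECK"
        print(f"{status:5} {path}: {name} = {got} (expected {want})")
```

If any line prints CHECK, redeploying the client configuration from Cloudera Manager and rerunning the check is a reasonable next step.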
Created ‎10-09-2014 10:54 AM
You would work your way back through the security guide's discussion of enabling Kerberos:
Note that if HBase, NameNode HA, or JobTracker HA was configured after enabling security, the cleanup can be difficult: the znode paths within ZooKeeper might require manual removal of the ACL entries.
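To illustrate that point, here is a hypothetical sketch using the kazoo Python client; it is not from this thread. The ensemble address is a placeholder, and /hbase and /hadoop-ha are only the usual default znode parents for HBase and NameNode HA, so the paths on your cluster may differ. It lists the current ACLs and leaves the actual reset commented out.

```python
# Hypothetical sketch: inspect (and optionally reset) znode ACLs left over
# from a Kerberized setup. Host and znode paths below are assumptions;
# verify them against your own cluster before running anything.
from kazoo.client import KazooClient
from kazoo.security import OPEN_ACL_UNSAFE

ZK_HOSTS = "zk-host-1:2181"               # assumption: your ZooKeeper ensemble
ZNODE_PARENTS = ["/hbase", "/hadoop-ha"]  # common defaults for HBase and NN HA

zk = KazooClient(hosts=ZK_HOSTS)
zk.start()
try:
    for parent in ZNODE_PARENTS:
        if not zk.exists(parent):
            print(f"{parent}: does not exist, skipping")
            continue
        acls, _stat = zk.get_acls(parent)
        print(f"{parent}: current ACLs = {acls}")
        # Uncomment only after confirming you really want open ACLs here;
        # this removes the SASL entries that the Kerberized setup left behind.
        # zk.set_acls(parent, OPEN_ACL_UNSAFE)
finally:
    zk.stop()
```

Child znodes may carry their own SASL ACLs, so a real cleanup may need to walk the tree rather than touch only the parents.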
Todd
Created ‎04-15-2015 02:46 PM
Hello Alessio,
I am facing the same problem. Could you please outline the steps you took for YARN and ZooKeeper?
Thanks
Deepak
Created ‎02-16-2016 02:41 PM
The link provided is now broken. Is there an update to it?
Created on ‎02-17-2016 04:12 AM - edited ‎02-17-2016 04:18 AM
Try this one:
Cy Jervis, Manager, Community Program
Created on ‎09-16-2015 09:08 AM - edited ‎09-16-2015 09:12 AM
I have the same problem, but after a clean reinstall of the cluster (using parcels). The error appears when an Oozie Java action writes to HDFS (run from Hue). Kerberos was previously enabled on this cluster, and I cleaned all (I hope) directories from the previous installation.
Here is the detailed message:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.JavaMain], main() threw exception, java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "hadoop-05.xxx.xx/172.19.x.xxx"; destination host is: "hadoop-02.xxx.xx":8020; org.apache.oozie.action.hadoop.JavaMainException: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "hadoop-05.xxx.xx/172.19.x.xxx"; destination host is: "hadoop-02.xxx.xx":8020;
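That message generally means the client side still believes Kerberos is required while the NameNode has already fallen back to simple authentication, so a stale client configuration (for example a gateway or Oozie host that never received the updated core-site.xml) is a likely suspect. As a purely illustrative aid (not from this thread), a sketch like the following could scan deployed configuration directories for leftover Kerberos references; the directory list is an assumption.

```python
# Hypothetical diagnostic sketch: scan deployed client configuration
# directories for properties that still reference Kerberos.
# The directories below are assumptions; adjust for your deployment.
import os
import xml.etree.ElementTree as ET

CONF_DIRS = ["/etc/hadoop/conf", "/etc/oozie/conf", "/etc/hive/conf"]

for conf_dir in CONF_DIRS:
    if not os.path.isdir(conf_dir):
        continue
    for fname in sorted(os.listdir(conf_dir)):
        if not fname.endswith("-site.xml"):
            continue
        path = os.path.join(conf_dir, fname)
        try:
            root = ET.parse(path).getroot()
        except ET.ParseError:
            continue
        for prop in root.findall("property"):
            name = prop.findtext("name") or ""
            value = prop.findtext("value") or ""
            # Flag anything that still points at Kerberos after the rollback.
            if "kerberos" in value.lower() or name.endswith(".kerberos.principal"):
                print(f"{path}: {name} = {value}")
```

Anything it prints is a candidate for redeploying the client configuration or cleaning up by hand.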
Created ‎02-11-2016 01:55 PM
Now the cluster does not use Kerberos; HDFS, Hive, and Impala work fine, but the option to enable Kerberos is still greyed out.
Any thoughts?
T
