Member since 08-11-2016 · 20 Posts · 2 Kudos Received · 0 Solutions
08-02-2017 06:45 AM
@Sanjib Behera If the cluster is Kerberos enabled, you need to pass the SASL security protocol to the console consumer, e.g. "kafka-console-consumer.sh --zookeeper abc00691239901.cde.com:6667 --topic test --from-beginning --security-protocol PLAINTEXTSASL". For the Storm spout you also need a JAAS file configured.
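For reference, a typical JAAS file for a Kerberized Kafka client looks roughly like the sketch below; the keytab path, principal, and realm are placeholders and must be adjusted for your cluster:

    KafkaClient {
       com.sun.security.auth.module.Krb5LoginModule required
       useKeyTab=true
       keyTab="/etc/security/keytabs/storm.service.keytab"   // placeholder keytab path
       storeKey=true
       useTicketCache=false
       serviceName="kafka"
       principal="storm@EXAMPLE.COM";                        // placeholder principal/realm
    };

The Storm worker JVM is then pointed at this file with -Djava.security.auth.login.config=/path/to/jaas.conf (path is an example). If the spout also talks to ZooKeeper with Kerberos, a similar "Client" section with serviceName="zookeeper" is usually added to the same file.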
09-15-2016 01:32 PM
1 Kudo
@Sanjib Behera After enabling Kerberos via Ambari, some of the UIs are configured to require Kerberos authentication while others are not. I am not sure why they are not all changed, but that is the way it is for now. The Hadoop UIs (HDFS, YARN, etc.), for example, do not have Kerberos enabled by default, though there are directions on how to enable it manually. That said, once a (web-based) UI requires Kerberos authentication, you cannot simply point your web browser at it. A few additional steps are needed to enable the browser to send Kerberos tokens, and each browser has its own set of instructions. See https://ping.force.com/Support/PingFederate/Integrations/How-to-configure-supported-browsers-for-Kerberos-NTLM for instructions. In general you need to do the following:
1. Configure your local machine to communicate with the relevant KDC.
2. On your local machine, kinit (or use a similar facility) as some Kerberos identity.
3. Open your web browser (you may need to close and re-open it for it to pick up the Kerberos ticket cache).
4. Update the settings in your web browser to enable Kerberos authentication (see the link above).
5. Browse to the protected URL.
A quick command-line check is sketched after this list.
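As a quick sanity check before touching browser settings, you can confirm that the ticket and SPNEGO negotiation work from the command line; the principal, hostname, and port below are examples only:

    # Obtain a Kerberos ticket for your identity (principal/realm are examples)
    kinit myuser@EXAMPLE.COM

    # Confirm the ticket is present in the local ticket cache
    klist

    # Test SPNEGO against the protected UI; "-u :" tells curl to use the ticket cache
    curl --negotiate -u : http://namenode.example.com:50070/

If curl returns the page, the remaining work is browser configuration, e.g. adding the UI hosts to network.negotiate-auth.trusted-uris in Firefox's about:config (other browsers take the trusted hosts from the OS or command-line flags, as described in the link above).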