<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster - Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289007#M213974</link>
    <description>Feed for the Cloudera Community "Support Questions" thread "Secure Webhdfs in Hadoop Hortonworks Cluster".</description>
    <pubDate>Mon, 03 Feb 2020 16:55:51 GMT</pubDate>
    <dc:creator>asmarz</dc:creator>
    <dc:date>2020-02-03T16:55:51Z</dc:date>
    <item>
      <title>Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288360#M213595</link>
      <description>&lt;P&gt;Dear community,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have installed a Hadoop cluster on 8 servers using Ambari Hortonworks.&lt;/P&gt;&lt;P&gt;I am able to access WebHDFS using the IP address and the default port 50070 without authentication.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How can I secure WebHDFS?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;P.S. I did not enable Kerberos via Ambari &amp;gt; Enable Kerberos; should I do it?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any suggestion will be appreciated.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Asma&lt;/P&gt;</description>
      <pubDate>Mon, 27 Jan 2020 16:48:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288360#M213595</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-01-27T16:48:30Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288407#M213627</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70953"&gt;@asmarz&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please refer to the following docs to learn how to enable SPNEGO authentication. Once you have enabled Kerberos for your cluster, you can also enable SPNEGO authentication. The docs explain&amp;nbsp;&lt;SPAN&gt;how to configure HTTP authentication for Hadoop components in a Kerberos environment.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;By default, access to the HTTP-based services and UIs for the cluster is not configured to require authentication.&amp;nbsp;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;1.&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/authe_spnego_enabling_spnego_authentication_for_hadoop.html" target="_blank"&gt;https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/authe_spnego_enabling_spnego_authentication_for_hadoop.html&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;2.&amp;nbsp;&lt;A href="https://docs.cloudera.com/HDPDocuments/Ambari-2.7.5.0/managing-and-monitoring-ambari/content/amb_start_kerberos_wizard_from_ambari_web.html" target="_blank"&gt;https://docs.cloudera.com/HDPDocuments/Ambari-2.7.5.0/managing-and-monitoring-ambari/content/amb_start_kerberos_wizard_from_ambari_web.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 28 Jan 2020 08:56:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288407#M213627</guid>
      <dc:creator>jsensharma</dc:creator>
      <dc:date>2020-01-28T08:56:31Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288715#M213823</link>
      <description>&lt;P&gt;Thank you for your help.&lt;/P&gt;&lt;P&gt;I tried to restart the Ambari server, but in vain.&lt;/P&gt;&lt;P&gt;I got this error:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;2020-01-30 18:20:21,866 INFO [main] KerberosChecker:64 - Checking Ambari Server Kerberos credentials.&lt;BR /&gt;2020-01-30 18:20:22,052 ERROR [main] KerberosChecker:120 - Client not found in Kerberos database (6)&lt;BR /&gt;2020-01-30 18:20:22,052 ERROR [main] AmbariServer:1119 - Failed to run the Ambari Server&lt;BR /&gt;org.apache.ambari.server.AmbariException: Ambari Server Kerberos credentials check failed.&lt;BR /&gt;Check KDC availability and JAAS configuration in /etc/ambari-server/conf/krb5JAASLogin.conf&lt;BR /&gt;at org.apache.ambari.server.controller.utilities.KerberosChecker.checkJaasConfiguration(KerberosChecker.java:121)&lt;BR /&gt;at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:1110)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The JAASLogin is configured like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;com.sun.security.jgss.krb5.initiate {&lt;BR /&gt;com.sun.security.auth.module.Krb5LoginModule required&lt;BR /&gt;renewTGT=false&lt;BR /&gt;doNotPrompt=true&lt;BR /&gt;useKeyTab=true&lt;BR /&gt;keyTab="/etc/security/ambariservername.keytab"&lt;BR /&gt;principal="ambariservername@REALM.COM"&lt;BR /&gt;storeKey=true&lt;BR /&gt;useTicketCache=false;&lt;BR /&gt;};&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I tried to follow these links:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/kerberos_optional_install_a_new_mit_kdc.html" target="_blank"&gt;https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/kerberos_optional_install_a_new_mit_kdc.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/set_up_kerberos_for_ambari_server.html" target="_blank"&gt;https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/authentication-with-kerberos/content/set_up_kerberos_for_ambari_server.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any suggestion please?&lt;/P&gt;&lt;P&gt;&lt;span class="lia-unicode-emoji" title=":disappointed_face:"&gt;😞&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2020 17:27:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288715#M213823</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-01-30T17:27:44Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288733#M213839</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70953"&gt;@asmarz&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As we can see, the error is:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt; Failed to run the Ambari Server
org.apache.ambari.server.AmbariException: Ambari Server Kerberos credentials check failed.

Check KDC availability and JAAS configuration in /etc/ambari-server/conf/krb5JAASLogin.conf&lt;/LI-CODE&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;1. Can you please let us know how you enabled Kerberos for the Ambari Server? Via the Kerberos wizard, or manually?&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;2. Do you have ambari-agent installed &lt;EM&gt;on the Ambari server host&lt;/EM&gt;? And do you have the Kerberos clients installed on the Ambari server host?&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# yum info krb5-libs 
# yum info krb5-workstation&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;3. Do you have the correct KDC/AD address defined inside the file :&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# ps -ef | grep AmbariServer | grep --color krb5.conf

# cat /etc/krb5.conf&lt;/LI-CODE&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;4. Are you able to do "kinit" to get a valid Kerberos ticket using the same details mentioned in the file "&lt;SPAN&gt;/etc/ambari-server/conf/krb5JAASLogin.conf&lt;/SPAN&gt;"?&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# kinit -kt /etc/security/ambariservername.keytab ambariservername@REALM.COM
# klist&lt;/LI-CODE&gt;</description>
      <pubDate>Fri, 31 Jan 2020 00:04:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288733#M213839</guid>
      <dc:creator>jsensharma</dc:creator>
      <dc:date>2020-01-31T00:04:18Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288846#M213898</link>
      <description>&lt;P&gt;Thanks a lot &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have configured the cluster with Kerberos using Active Directory,&lt;/P&gt;&lt;P&gt;but I got some issues when connecting.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[root@server keytabs]# hdfs dfs -ls /&lt;BR /&gt;20/01/31 16:31:19 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;ls: DestHost:destPort namenode:8020 , LocalHost:localPort ambari/ip:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any idea please?&lt;/P&gt;&lt;P&gt;It looks like port 8020 is also blocked.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;Asma&lt;/P&gt;</description>
      <pubDate>Fri, 31 Jan 2020 16:31:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288846#M213898</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-01-31T16:31:20Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288865#M213912</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70953"&gt;@asmarz&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;To verify port access, please check from the Ambari host whether the NameNode address and port are accessible:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# nc -v $ACTIVE_NAMENODE_FQDN 8020
(OR)
# telnet $ACTIVE_NAMENODE_FQDN 8020&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;The error which you posted usually indicates that before running the mentioned HDFS command you did not get a valid Kerberos ticket using the "kinit" command.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;20/01/31 16:31:19 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;/LI-CODE&gt;&lt;P&gt;&lt;FONT color="#993300"&gt;&lt;STRONG&gt;.&lt;BR /&gt;Most probable cause of the above WARNING:&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;If the port is accessible, then please check if you are able to run the same hdfs command after getting a valid Kerberos ticket.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# klist -kte /etc/security/ambariservername.keytab
# kinit -kt /etc/security/ambariservername.keytab ambariservername@REALM.COM
# klist
# export HADOOP_ROOT_LOGGER=DEBUG,console
# hdfs dfs -ls /&lt;/LI-CODE&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;And then try the same command using the "hdfs" headless keytab&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# kdestroy
# klist -kte /etc/security/keytabs/hdfs.headless.keytab
# kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-ker1latest@EXAMPLE.COM
# klist
# export HADOOP_ROOT_LOGGER=DEBUG,console
# hdfs dfs -ls /&lt;/LI-CODE&gt;&lt;P&gt;&lt;STRONG&gt;*NOTE:*&lt;/STRONG&gt; the "hdfs-ker1latest@EXAMPLE.COM" principal name may be different in your case, so replace it with your own hdfs keytab principal.&lt;/P&gt;&lt;P&gt;Please share the output of the above commands.&lt;BR /&gt;Also verify that all your cluster nodes have correct FQDNs.&lt;/P&gt;</description>
      <pubDate>Fri, 31 Jan 2020 23:25:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288865#M213912</guid>
      <dc:creator>jsensharma</dc:creator>
      <dc:date>2020-01-31T23:25:42Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288986#M213960</link>
      <description>&lt;P&gt;Thanks a lot&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now the problem for hdfs is fixed; however, when I try to launch a script from an edge node, I am getting the same issue.&lt;/P&gt;&lt;P&gt;/usr/hdp/3.1.4.0-315/spark2/bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://edgenode.servername:7077 --num-executors 4 --driver-memory 512m --executor-memory 512m --executor-cores 1 /usr/hdp/3.1.4.0-315/spark2/examples/jars/spark-examples_2.11-2.3.2.3.1.4.0-315.jar&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Results:&lt;/P&gt;&lt;P&gt;20/02/03 15:13:41 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20200203151341-0000/79 is now RUNNING&lt;BR /&gt;20/02/03 15:13:41 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@69cac930{/metrics/json,null,AVAILABLE,@Spark}&lt;BR /&gt;20/02/03 15:13:42 WARN Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;20/02/03 15:13:42 ERROR SparkContext: Error initializing SparkContext.&lt;BR /&gt;java.io.IOException: DestHost:destPort namenode.servername:8020 , LocalHost:localPort edgenodeaddress:0. 
Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:423)&lt;BR /&gt;at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)&lt;BR /&gt;at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1502)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1444)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1354)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)&lt;BR /&gt;at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:900)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)&lt;BR /&gt;at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)&lt;BR /&gt;at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1660)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1577)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1574)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1589)&lt;BR /&gt;at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)&lt;BR /&gt;at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:522)&lt;BR /&gt;at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)&lt;BR /&gt;at scala.Option.getOrElse(Option.scala:121)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)&lt;BR /&gt;at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)&lt;BR /&gt;at org.apache.spark.examples.SparkPi.main(SparkPi.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:758)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:721)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:814)&lt;BR /&gt;at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:411)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.getConnection(Client.java:1559)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1390)&lt;/P&gt;</description>
      <pubDate>Mon, 03 Feb 2020 14:23:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/288986#M213960</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-02-03T14:23:34Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289007#M213974</link>
      <description>&lt;P&gt;Actually, for more details:&lt;/P&gt;&lt;P&gt;On my Ambari server machine I have this ticket:&lt;/P&gt;&lt;P&gt;[root@ambariserver ~]# klist&lt;BR /&gt;Ticket cache: FILE:/tmp/krb5cc_0&lt;BR /&gt;Default principal: spark-analytics_hadoop@REALM.COM&lt;/P&gt;&lt;P&gt;Valid starting Expires Service principal&lt;BR /&gt;02/03/2020 13:31:21 02/03/2020 23:31:21 krbtgt/REALM.COM@REALM.COM&lt;BR /&gt;renew until 02/10/2020 13:31:21&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I connect with the spark user:&lt;/P&gt;&lt;P&gt;HADOOP_ROOT_LOGGER=DEBUG,console /usr/hdp/3.1.4.0-315/spark2/bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://Edgenode:7077 --num-executors 4 --driver-memory 512m --executor-memory 512m --executor-cores 1 /usr/hdp/3.1.4.0-315/spark2/examples/jars/spark-examples_2.11-2.3.2.3.1.4.0-315.jar&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;=&amp;gt; OK&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now, if I connect from the Edge Node:&lt;/P&gt;&lt;P&gt;[root@EdgeNode~]# klist&lt;BR /&gt;Ticket cache: FILE:/tmp/krb5cc_0&lt;BR /&gt;Default principal: spark/EdgeNode@REALM.COM&lt;/P&gt;&lt;P&gt;Valid starting Expires Service principal&lt;BR /&gt;02/03/2020 16:52:12 02/04/2020 02:52:12 krbtgt/REALM.COM@REALM.COM&lt;BR /&gt;renew until 02/10/2020 16:52:12&lt;/P&gt;&lt;P&gt;But when I connect with the user spark:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;HADOOP_ROOT_LOGGER=DEBUG,console /usr/hdp/3.1.4.0-315/spark2/bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://Edgenode:7077 --num-executors 4 --driver-memory 512m --executor-memory 512m --executor-cores 1 /usr/hdp/3.1.4.0-315/spark2/examples/jars/spark-examples_2.11-2.3.2.3.1.4.0-315.jar&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;=&amp;gt; I get this error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;20/02/03 17:53:01 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@69cac930{/metrics/json,null,AVAILABLE,@Spark}&lt;BR /&gt;20/02/03 17:53:01 WARN Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;20/02/03 17:53:01 ERROR SparkContext: Error initializing SparkContext.&lt;BR /&gt;java.io.IOException: DestHost:destPort NameNode:8020 , LocalHost:localPort EdgeNode/10.48.142.32:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:4&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Did I miss something please?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Users launch these commands from their laptops:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;cluster = RxSpark(sshHostname = "&lt;SPAN&gt;EdgeNode&lt;/SPAN&gt;", sshUsername = "username")&lt;BR /&gt;rxSetComputeContext(cluster)&lt;BR /&gt;source = c("~/AirlineDemoSmall.csv")&lt;BR /&gt;dest_file = "/share"&lt;BR /&gt;&lt;BR /&gt;rxHadoopMakeDir(dest_file)&lt;/P&gt;&lt;P&gt;They are getting the same issue.&lt;/P&gt;&lt;P&gt;On all cluster nodes, hdfs dfs -ls / works well.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please advise&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;Asma&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 03 Feb 2020 16:55:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289007#M213974</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-02-03T16:55:51Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289008#M213975</link>
      <description>&lt;P&gt;Should I create a principal for each user in the AD?&lt;/P&gt;&lt;P&gt;We are using Active Directory users.&lt;/P&gt;&lt;P&gt;If yes, how so?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Many thanks&lt;/P&gt;&lt;P&gt;Asma&lt;/P&gt;</description>
      <pubDate>Mon, 03 Feb 2020 17:23:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289008#M213975</guid>
      <dc:creator>asmarz</dc:creator>
      <dc:date>2020-02-03T17:23:38Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289019#M213985</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70953"&gt;@asmarz&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Good to know that your original issue is resolved. However, for any subsequent, slightly different issue it is always better to open a new community thread; that way the readers of this thread can easily find one error/issue with one solution. Multiple issues in a single thread can confuse readers.&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;If your question is answered, please make sure to mark the answer as the accepted solution.&lt;BR /&gt;If you find a reply useful, say thanks by clicking on the thumbs up button.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 03 Feb 2020 21:03:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289019#M213985</guid>
      <dc:creator>jsensharma</dc:creator>
      <dc:date>2020-02-03T21:03:08Z</dc:date>
    </item>
    <item>
      <title>Re: Secure Webhdfs in Hadoop Hortonworks Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289043#M214000</link>
      <description>&lt;OL&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Set the value of the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;dfs.webhdfs.enabled&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;property in&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;hdfs-site.xml&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;to&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;true.&lt;/SPAN&gt;&lt;DIV class="itemgroup info"&gt;&lt;PRE&gt;&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;dfs.webhdfs.enabled&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;true&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt; 
&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;&lt;/PRE&gt;&lt;/DIV&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Create an HTTP service user principal.&lt;/SPAN&gt;&lt;DIV class="itemgroup info"&gt;&lt;PRE&gt;kadmin: addprinc -randkey HTTP/$&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;Fully_Qualified_Domain_Name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;@$&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;Realm_Name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;.COM&lt;/PRE&gt;&lt;/DIV&gt;&lt;DIV class="itemgroup info"&gt;where:&lt;UL&gt;&lt;LI&gt;Fully_Qualified_Domain_Name: Host where the NameNode is deployed.&lt;/LI&gt;&lt;LI&gt;Realm_Name: Name of your Kerberos realm.&lt;/LI&gt;&lt;/UL&gt;&lt;/DIV&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Create a keytab file for the HTTP principal.&lt;/SPAN&gt;&lt;DIV class="itemgroup info"&gt;&lt;PRE&gt;kadmin: xst -norandkey -k &lt;SPAN class="hljs-regexp"&gt;/etc/security/spnego.service.keytab HTTP/&lt;/SPAN&gt;$&amp;lt;Fully_Qualified_Domain_Name&amp;gt;&lt;/PRE&gt;&lt;/DIV&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Verify that the keytab file and the principal are associated with the correct service.&lt;/SPAN&gt;&lt;DIV class="itemgroup info"&gt;&lt;PRE&gt;&lt;SPAN class="hljs-attribute"&gt;klist&lt;/SPAN&gt; –k -t /etc/security/spnego.service.keytab&lt;/PRE&gt;&lt;/DIV&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Add the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;dfs.web.authentication.kerberos.principal&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;and&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;dfs.web.authentication.kerberos.keytab&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;properties to&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;hdfs-site.xml.&lt;/SPAN&gt;&lt;DIV class="itemgroup info"&gt;&lt;PRE&gt;&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;dfs.web.authentication.kerberos.principal&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;HTTP/$&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;Fully_Qualified_Domain_Name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;@$&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;Realm_Name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;.COM&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
&lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;dfs.web.authentication.kerberos.keytab&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;name&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
  &lt;SPAN class="hljs-tag"&gt;&amp;lt;&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;/etc/security/spnego.service.keytab&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;value&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;
&lt;SPAN class="hljs-tag"&gt;&amp;lt;/&lt;SPAN class="hljs-name"&gt;property&lt;/SPAN&gt;&amp;gt;&lt;/SPAN&gt;&lt;/PRE&gt;&lt;/DIV&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN class="ph cmd"&gt;Restart the NameNode and the DataNodes.&lt;/SPAN&gt;&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Tue, 04 Feb 2020 07:43:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Secure-Webhdfs-in-Hadoop-Hortonworks-Cluster/m-p/289043#M214000</guid>
      <dc:creator>willieken49</dc:creator>
      <dc:date>2020-02-04T07:43:39Z</dc:date>
    </item>
  </channel>
</rss>

