
distcp secure to insecure cluster


Hi,

I have an issue with distcp authentication between a Kerberos-secured cluster (HDP 2.6.1.0-129) and an unsecured cluster (HDP 3.0.1.0-187).

The compatibility matrix for these versions says they should interoperate.

I am running the following from the secure cluster:

 

hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true swebhdfs://FQDN(secure cluster):50470/tmp/test.sh swebhdfs://FQDN(insecure cluster):50470/tmp/test.sh
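As a quicker sanity check than a full distcp run, listing the remote path directly with the same fallback flag should exercise the same authentication path (FQDN is a placeholder, as above):

hdfs dfs -D ipc.client.fallback-to-simple-auth-allowed=true -ls swebhdfs://FQDN(insecure cluster):50470/tmp/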

 

Both ends have TLS enabled, and I have a truststore configured and working.

I can see packets leaving the client and arriving at the server using tcpdump.
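Roughly, those checks look like this (hostname is a placeholder):

# Watch traffic towards the remote NameNode's HTTPS port
tcpdump -nn -i any host FQDN(insecure cluster) and port 50470

# Inspect the certificate chain presented on the secure port
openssl s_client -connect FQDN(insecure cluster):50470 </dev/null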

 

The error returned by the distcp command above is:

19/10/15 09:58:48 ERROR tools.DistCp: Exception encountered
org.apache.hadoop.security.AccessControlException: Authentication required
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:460)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:114)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:750)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:592)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:622)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:618)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:1524)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:333)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getAuthParameters(WebHdfsFileSystem.java:557)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toUrl(WebHdfsFileSystem.java:578)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractFsPathRunner.getUrl(WebHdfsFileSystem.java:852)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:745)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:592)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:622)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:618)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:1004)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1020)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1696)
at org.apache.hadoop.tools.GlobbedCopyListing.doBuildListing(GlobbedCopyListing.java:77)
at org.apache.hadoop.tools.CopyListing.buildListing(CopyListing.java:86)
at org.apache.hadoop.tools.DistCp.createInputFileListing(DistCp.java:398)
at org.apache.hadoop.tools.DistCp.createAndSubmitJob(DistCp.java:190)
at org.apache.hadoop.tools.DistCp.execute(DistCp.java:155)
at org.apache.hadoop.tools.DistCp.run(DistCp.java:128)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.tools.DistCp.main(DistCp.java:462)

The NameNode log on the client (secure) side shows successful Kerberos authentication.

The HDFS NameNode log on the server (insecure cluster) shows the following warning:

WARN namenode.FSNamesystem (FSNamesystem.java:getDelegationToken(5611)) - trying to get DT with no secret manager running
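That warning appears to mean the insecure NameNode is being asked for a delegation token it cannot issue (no secret manager runs under simple authentication). A sketch of a direct check on its SWebHDFS endpoint, without any Kerberos credentials (hostname is a placeholder; -k skips certificate verification):

curl -sk "https://FQDN(insecure cluster):50470/webhdfs/v1/tmp/test.sh?op=GETFILESTATUS&user.name=hdfs"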

 

Has anyone come across this issue before? 

1 ACCEPTED SOLUTION


The problem was iptables on the datanodes. Once the rules were flushed, the following command worked a treat.

 

hadoop distcp  -D ipc.client.fallback-to-simple-auth-allowed=true hdfs://active_namenode:8020/tmp/test.txt swebhdfs://active_namenode:50470/tmp/test.txt
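For reference, the check and flush on each datanode were roughly as follows (run as root; flushing everything is heavy-handed, and opening just the required HDFS/WebHDFS ports would be tidier):

# List the current rules with packet counters to spot the drops
iptables -L -n -v

# Flush all rules
iptables -F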

Apologies for my confusion. 
