Running distcp between two clusters: one Kerberized and the other not
Created 09-25-2015 05:58 PM
hadoop distcp -i -log /tmp/ hdfs://xxx:8020/apps/yyyy hdfs://xxx_cid/tmp/
In this case, "xxx" is the insecure cluster, while "xxx_cid" is the secure cluster.
We are launching the job from the Kerberos cluster, with the appropriate kinit for the user and getting the following error:
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "xxx/10.x.x.x"; destination host is: "xxx":8020;
...
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
I thought that by launching the job from the secure cluster we could avoid any access issues, but it appears the processes are kicked off from the "source" cluster, which in this case is the insecure one.
Any ideas on getting around this?
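For reference, a minimal pre-flight check from the secure cluster before launching the copy (the principal and realm below are placeholders, not taken from this thread):
kinit myuser@EXAMPLE.COM            # obtain a Kerberos ticket for the submitting user (placeholder principal)
klist                               # confirm a valid TGT is present
hadoop fs -ls hdfs://xxx_cid/tmp/   # sanity-check access to the secure cluster's destination path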
Created 11-18-2015 10:25 PM
I recommend not setting this in core-site.xml, and instead setting it on the command line invocation specifically for the DistCp command that needs to communicate with the unsecured cluster. Setting it in core-site.xml means that all RPC connections for any application are eligible for fallback to simple authentication. This potentially expands the attack surface for man-in-the-middle attacks.
Here is an example of overriding the setting on the command line while running DistCp:
hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true hdfs://nn1:8020/foo/bar hdfs://nn2:8020/bar/foo
The command must be run while logged into the secured cluster, not the unsecured cluster.
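For contrast, this is the core-site.xml form being discouraged above; putting it there would allow fallback for every client RPC connection on the cluster rather than just this one DistCp run:
<property>
  <name>ipc.client.fallback-to-simple-auth-allowed</name>
  <value>true</value>
</property>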
Created 03-16-2016 04:35 PM
Getting the below error after running the command "hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true hdfs://nn1:8020/foo/bar hdfs://nn2:8020/bar/foo":
java.io.EOFException: End of File Exception between local host is: ***; destination host is: ***
Please suggest.
Created on 09-30-2024 07:40 AM - edited 09-30-2024 07:41 AM
The insecure cluster was blocked from RPC communication, so we used the webhdfs protocol instead:
hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://nn1:50070/foo/bar hdfs://nn2:8020/bar/foo
Created 03-16-2016 04:44 PM
Getting the below error after running the command "hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true hdfs://nn1:8020/foo/bar hdfs://nn2:8020/bar/foo":
java.io.EOFException: End of File Exception between local host is: ***; destination host is: ***
Please suggest.
Created 05-13-2016 02:09 PM
I have found that using webhdfs instead of hdfs for the nonsecure host gets around this error, e.g.,
"hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://nn1:8020/foo/bar hdfs://nn2:8020/bar/foo"
Created 05-25-2016 01:12 PM
Can you try this:
hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://insecureCluster webhdfs://secureCluster
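Spelled out with placeholder hosts, paths, and default webhdfs HTTP ports (all assumptions to be adjusted for your clusters), that would look something like:
hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://insecure-nn:50070/src/dir webhdfs://secure-nn:50070/dst/dir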
Created 03-07-2017 08:28 PM
We had a similar issue.
Most likely you have the following property in /etc/hadoop/conf/hdfs-site.xml:
<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>true</value>
</property>
Remove this property or set it to "false". It should help.
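To see which value a node is actually picking up from its configuration files, hdfs getconf can be used, for example:
hdfs getconf -confKey dfs.namenode.acls.enabled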
Created 04-04-2017 07:03 PM
I am doing distcp from an insecure to a secure Hadoop cluster and am getting the error "SIMPLE authentication is not enabled". Can anyone suggest a fix?
hdfs@master02:~> hadoop distcp -Dipc.client.fallback-to-simple-auth-allowed=true hdfs://HDP23:8020/test01.txt hdfs://HDP24:8020/
17/04/05 00:09:28 ERROR tools.DistCp: Invalid arguments: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
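Per the earlier advice in this thread, this error usually means the client making the call has no Kerberos credentials; running the same command from a node of the secure cluster (assumed here to be HDP24) after kinit should avoid it. A minimal sketch with a placeholder principal:
kinit hdfs@EXAMPLE.COM    # placeholder principal/realm for the secure cluster
hadoop distcp -Dipc.client.fallback-to-simple-auth-allowed=true hdfs://HDP23:8020/test01.txt hdfs://HDP24:8020/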
