Member since: 11-21-2019
Posts: 31 | Kudos Received: 1 | Solutions: 0

08-05-2021 10:05 AM

Hi Community Team,

I'm configuring DistCp between secure clusters in different Kerberos realms.
Source = HDP 2.6.5
Target = CDP Private Cloud Base 7.1.5

The setup is underway, but I have doubts about how two of the steps are actually performed; the links below show little detail.

1) First doubt (what is the step-by-step procedure for this?):
https://docs.cloudera.com/cdp-private-cloud-base/7.1.5/scaling-namespaces/topics/hdfs-distcp-truststore-properties.html

2) Second doubt (what is the step-by-step procedure for this?):
https://docs.cloudera.com/cdp-private-cloud-base/7.1.5/scaling-namespaces/topics/hdfs-distcp-set-hadoop-conf.html

Thanks for the help.
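For reference, a minimal sketch of what these two steps typically involve in Hadoop generally. The property names are the standard Hadoop ssl-client settings and the HADOOP_CONF_DIR environment variable; the paths, password, and hostnames are placeholders, not values taken from the linked documentation.

# 1) Truststore properties can be set in ssl-client.xml or passed to
#    DistCp on the command line (location and password are placeholders):
hadoop distcp \
  -Dssl.client.truststore.location=/path/to/truststore.jks \
  -Dssl.client.truststore.password=changeit \
  hdfs://source-nn:8020/src hdfs://target-nn:8020/dst

# 2) Point the client at a configuration directory that describes the
#    remote cluster before running DistCp (directory is a placeholder):
export HADOOP_CONF_DIR=/path/to/distcp-conf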
						
					
06-03-2021 11:45 AM

@Tylenol, thank you very much for your help.
						
					
06-03-2021 11:44 AM

@vidanimegh, thank you very much for your help.
						
					
05-11-2021 10:46 AM

Hi @Tylenol,

Console command:
hadoop distcp hdfs://server4.localdomain:8020/tmp/distcp_test.txt hdfs://server8.local:8020/tmp

NOTE: server4 (source, HDP, Kerberos) and server8 (target, CDP, non-Kerberos) are the NameNodes.

ERROR:
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_11259
21/05/11 13:34:10 ERROR tools.DistCp: Invalid arguments:
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "server4.localdomain/10.x.x.x"; destination host is: "server8.local":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:818)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
    at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2165)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1442)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1447)
    at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:227)
    at org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.tools.DistCp.main(DistCp.java:462)
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:787)
    at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 22 more
Invalid arguments: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "server4.localdomain/10.x.x.x.x"; destination host is: "server8.local":8020;

Thanks!
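The message itself names the mismatch: the Kerberized client refuses the non-Kerberized server's SIMPLE authentication. A minimal sketch of the commonly used workaround, assuming the command is run from the secure (HDP) side; ipc.client.fallback-to-simple-auth-allowed is the standard Hadoop client property that explicitly permits the fallback:

# Allow the secure client to fall back to SIMPLE auth on the insecure target
hadoop distcp \
  -D ipc.client.fallback-to-simple-auth-allowed=true \
  hdfs://server4.localdomain:8020/tmp/distcp_test.txt \
  hdfs://server8.local:8020/tmp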
						
					
05-11-2021 08:38 AM

Hi @Tylenol,

It is not a migration. It is a new installation, and I need to copy data from the HDP cluster to the CDP cluster.

Thanks
						
					
05-11-2021 06:26 AM

Source server = Kerberos.

Console command:
hadoop distcp hdfs://svr2.localdomain:1019/tmp/distcp_test.txt hdfs://svr1.local:9866/tmp

ERROR:
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_11259
>>>DEBUG <CCacheInputStream> client principal is useradm/admin@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> server principal is krbtgt/LOCALDOMAIN@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> key type: 18
>>>DEBUG <CCacheInputStream> auth time: Tue May 11 08:24:10 VET 2021
>>>DEBUG <CCacheInputStream> start time: Tue May 11 08:24:10 VET 2021
>>>DEBUG <CCacheInputStream> end time: Wed May 12 08:24:10 VET 2021
>>>DEBUG <CCacheInputStream> renew_till time: Tue May 18 08:24:10 VET 2021
>>> CCacheInputStream: readFlags() FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream> client principal is useradm/admin@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/LOCALDOMAIN@LOCALDOMAIN@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Wed Dec 31 20:00:00 VET 1969
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Wed Dec 31 20:00:00 VET 1969
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags()
21/05/11 09:15:33 WARN ipc.Client: Exception encountered while connecting to the server : java.io.EOFException
21/05/11 09:15:33 ERROR tools.DistCp: Invalid arguments:
java.io.IOException: Failed on local exception: java.io.IOException: java.io.EOFException; Host Details : local host is: "svr1.localdomain/10.x.x.x"; destination host is: "svr2.locall":9866;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:818)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
    at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2165)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1442)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1447)
    at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:227)
    at org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.tools.DistCp.main(DistCp.java:462)
Caused by: java.io.IOException: java.io.EOFException
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:720)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:683)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:770)
    at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 22 more
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:392)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:367)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:595)
    at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:397)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:762)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:758)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:757)
    ... 25 more
Invalid arguments: Failed on local exception: java.io.IOException: java.io.EOFException; Host Details : local host is: "svr1.localdomain/10.x.x.x"; destination host is: "svr2.locall":9866;
usage: distcp OPTIONS [source_path...] <target_path>

More information:
1) Port 9866 (dfs.datanode.address) - open
2) Port 1019 (dfs.datanode.address) - open
3) svr1.localdomain = Kerberos-enabled (source - source files)
4) svr2.locall = non-Kerberos (target - destination files)

Thanks
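Worth noting: 1019 and 9866 are dfs.datanode.address values, i.e. DataNode data-transfer ports, whereas the hdfs:// URIs given to DistCp are resolved against the NameNode RPC port (8020 by default). A sketch of the same copy addressed to the default NameNode ports, hostnames as above:

# Address the NameNodes' RPC port, not the DataNode transfer ports
hadoop distcp \
  hdfs://svr2.localdomain:8020/tmp/distcp_test.txt \
  hdfs://svr1.local:8020/tmp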
						
					
05-09-2021 04:42 PM

Hi Community,

I am trying to copy HDFS data from an HDP 2.6.x cluster (Kerberized) to a CDP Private Cloud Base 7.1.5 cluster (not Kerberized), and it gives me an error; I have tried other ports as well.

The command I run in the console:
hadoop distcp hdfs://svr2.localdomain:1019/tmp/distcp_test.txt hdfs://svr1.local:9866/tmp/

What could be the origin of the fault?

Thank you.

ERROR:
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_11259
>>>DEBUG <CCacheInputStream> client principal is useradm/admin@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> server principal is krbtgt/LOCALDOMAIN@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> key type: 18
>>>DEBUG <CCacheInputStream> auth time: Sun May 09 18:39:04 VET 2021
>>>DEBUG <CCacheInputStream> start time: Sun May 09 18:39:04 VET 2021
>>>DEBUG <CCacheInputStream> end time: Mon May 10 18:39:04 VET 2021
>>>DEBUG <CCacheInputStream> renew_till time: Sun May 16 18:39:04 VET 2021
>>> CCacheInputStream: readFlags() FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream> client principal is useradm/admin@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/LOCALDOMAIN@LOCALDOMAIN@LOCALDOMAIN
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Wed Dec 31 20:00:00 VET 1969
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Wed Dec 31 20:00:00 VET 1969
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags()
21/05/09 19:23:36 WARN ipc.Client: Exception encountered while connecting to the server : java.io.EOFException
21/05/09 19:23:36 ERROR tools.DistCp: Invalid arguments:
java.io.IOException: Failed on local exception: java.io.IOException: java.io.EOFException; Host Details : local host is: "svr2.localdomain/10.x.x.x"; destination host is: "svr1.local":9866;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:818)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
    at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2165)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1442)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1438)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1447)
    at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:227)
    at org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.tools.DistCp.main(DistCp.java:462)
Caused by: java.io.IOException: java.io.EOFException
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:720)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:683)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:770)
    at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 22 more
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:392)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:367)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:595)
    at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:397)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:762)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:758)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:757)
    ... 25 more
Invalid arguments: Failed on local exception: java.io.IOException: java.io.EOFException; Host Details : local host is: "server2.localdomain/10.x.x.x"; destination host is: "svr1.local":9866;

Thanks!
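Putting the two notes above together, a hedged sketch of what this command usually looks like for a secure-to-insecure copy: NameNode RPC ports instead of DataNode ports, plus the standard SIMPLE-auth fallback property, run after obtaining a ticket.

kinit useradm/admin@LOCALDOMAIN   # principal as shown in the log above
hadoop distcp \
  -D ipc.client.fallback-to-simple-auth-allowed=true \
  hdfs://svr2.localdomain:8020/tmp/distcp_test.txt \
  hdfs://svr1.local:8020/tmp/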
						
					
05-02-2021 02:04 PM
1 Kudo

1) Check which scheduler type is set in your YARN config.
   * By default, CDP PvC Base 7.x uses the Capacity Scheduler.

2) If it is the Fair Scheduler, change it to the Capacity Scheduler, restart the services, and check whether the Impala error is gone.
   * This did not apply in my case.

3) If it is the Fair Scheduler and changing the scheduler is not something you want to do, configure queue placement policies instead, restart the services, and check whether the Impala errors are gone.

   3.1) On all DataNodes, apply the following configuration (queue placement policies) in the
        "Impala Daemon Fair Scheduler Advanced Configuration Snippet (Safety Valve)":

<?xml version="1.0"?>
<allocations>
  <queue name="root">
    <minResources>10000 mb,0vcores</minResources>
    <maxResources>90000 mb,0vcores</maxResources>
    <maxRunningApps>50</maxRunningApps>
    <weight>2.0</weight>
    <schedulingPolicy>fair</schedulingPolicy>
    <queue name="default">
      <aclSubmitApps>root</aclSubmitApps>
      <minResources>5000 mb,0vcores</minResources>
    </queue>
  </queue>

  <user name="root">
    <maxRunningApps>30</maxRunningApps>
  </user>
  <userMaxAppsDefault>5</userMaxAppsDefault>

  <queuePlacementPolicy>
    <rule name="specified" />
    <rule name="primaryGroup" create="false" />
    <rule name="default" />
  </queuePlacementPolicy>
</allocations>

Thank you very much for your support, @vidanimegh!
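As a side note on the placement policy itself: the rules are evaluated in order, so "specified" sends a job to the queue named at submission time (if any), "primaryGroup" then maps it to a queue named after the submitting user's primary group (create="false" means the rule is skipped when that queue does not exist), and "default" finally routes everything else to root.default.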
						
					