Member since 03-17-2017

Posts: 32
Kudos Received: 1
Solutions: 3

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 2797 | 04-18-2017 06:16 AM |
| | 42283 | 04-10-2017 08:29 AM |
| | 34084 | 04-03-2017 07:53 AM |

04-08-2019 11:03 AM

I am also facing the same error. May I know where you increased the memory?

08-17-2017 10:44 AM

I added the following:

```java
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromKeytab("myId@OurCompany.ORG", "/myPathtoMyKeyTab/my.keytab");
```

I was able to connect and get a list of the files in the HDFS directory; however, the write operation failed with the following exception:

```
java.io.IOException: Connection reset by peer
        at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:197)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
        at java.io.FilterInputStream.read(FilterInputStream.java:83)
        at java.io.FilterInputStream.read(FilterInputStream.java:83)
        at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2270)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1701)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1620)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:772)
17/08/17 13:31:49 WARN hdfs.DFSClient: Abandoning BP-2081783877-10.91.61.102-1496699348717:blk_1074056717_315940
17/08/17 13:31:49 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.91.61.106:50010,DS-caf46aea-ebbb-4d8b-8ded-2e476bb0acee,DISK]
```

Any ideas? Pointers and help are appreciated.
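
Putting the pieces together, here is a minimal runnable sketch of the approach described above: keytab login, then a small HDFS write. The principal and keytab path are the ones quoted in the post; the configuration key and target path are illustrative assumptions. Since listing files succeeds (NameNode only) but the block write is reset, this pattern often points at connectivity between the client and the DataNode port (10.91.61.106:50010 in the log) rather than at Kerberos itself.

```java
// A minimal sketch: keytab login, then a small HDFS write.
// The principal and keytab path are copied from the post above;
// the target path is a placeholder.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHdfsWrite {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the Hadoop client to authenticate with Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");

        // Log in from the keytab before any FileSystem call.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "myId@OurCompany.ORG", "/myPathtoMyKeyTab/my.keytab");

        // Listing goes through the NameNode only; writing also opens a
        // connection to a DataNode, which is where the reset occurs.
        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/tmp/kerberos-write-test.txt"))) {
            out.writeUTF("hello hdfs");
        }
    }
}
```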

04-18-2017 06:16 AM

By adding the annotation to my model (document), the problem was resolved:

```java
@SolrDocument(solrCoreName = "party_name")
public class PartyName {
    ....
}
```
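
For readers who find this later, here is a slightly fuller sketch of such a model, assuming Spring Data Solr (the library `@SolrDocument` belongs to); the `id` and `name` fields are illustrative and not from the original post.

```java
// A fuller sketch of the annotated model, assuming Spring Data Solr.
// The id and name fields are illustrative placeholders.
import org.springframework.data.annotation.Id;
import org.springframework.data.solr.core.mapping.Indexed;
import org.springframework.data.solr.core.mapping.SolrDocument;

// Binds the model to the "party_name" core, so repository operations
// target that core instead of the default one.
@SolrDocument(solrCoreName = "party_name")
public class PartyName {

    @Id
    @Indexed(name = "id", type = "string")
    private String id;

    @Indexed(name = "name", type = "string")
    private String name;

    // Getters and setters omitted for brevity.
}
```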

04-11-2017 07:22 AM

Fixing the ZooKeeper string as suggested, to "host1:port,host2:port,host3:port/solr", resolved the problem. Thanks.
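
As an illustration, a minimal sketch of connecting with a chrooted ZooKeeper string like the one above, assuming SolrJ 6.x; the hosts, ports, and collection name are placeholders.

```java
// A minimal connection sketch with the corrected ZooKeeper string,
// assuming SolrJ 6.x. Hosts, ports, and the collection name are
// placeholders; the trailing /solr chroot is the part that mattered.
import org.apache.solr.client.solrj.impl.CloudSolrClient;

public class SolrCloudConnect {
    public static void main(String[] args) throws Exception {
        String zkHost = "host1:2181,host2:2181,host3:2181/solr";

        try (CloudSolrClient client = new CloudSolrClient.Builder()
                .withZkHost(zkHost)
                .build()) {
            client.setDefaultCollection("party_name");
            client.ping(); // fails fast if the chroot or hosts are wrong
        }
    }
}
```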

04-11-2017 06:22 AM

I moved it for you. 🙂

04-03-2017 07:53 AM

Your comment gave me the clue: when I generated the script, I missed the clause that follows ROW FORMAT DELIMITED, namely FIELDS TERMINATED BY ','. So the correct create statement is:

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS ccce_apl(
    APL_LNK INT,
    UPDT_DTTM CHAR(26),
    UPDT_USER CHAR(8),
    RLS_ORDR_MOD_CD CHAR(12),
    RLS_ORDR_MOD_TXT VARCHAR(255)
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/hdfs/data-lake/master/criminal/csv/ccce_apl';
```

Thanks.

03-31-2017 02:42 PM

A simple Solr indexing example using the post tool:

http://www.solrtutorial.com/solr-in-5-minutes.html

Or you can use SolrJ:

http://www.solrtutorial.com/solrj-tutorial.html

You can even try indexing using Flume and the morphline Solr sink:

https://www.cloudera.com/documentation/enterprise/5-5-x/topics/search_tutorial.html
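
To make the SolrJ option concrete, here is a minimal indexing sketch in the spirit of the tutorial linked above; the Solr URL, core name, and field values are placeholder assumptions.

```java
// A minimal SolrJ indexing sketch. The Solr URL, core name, and
// field values are placeholders, not from the original post.
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class SolrjIndexingExample {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder(
                "http://localhost:8983/solr/mycore").build()) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-1");
            doc.addField("title", "hello solr");

            client.add(doc);    // send the document
            client.commit();    // make it searchable
        }
    }
}
```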

03-23-2017 06:47 AM

Just to close this thread: after a bit more investigation, we found that, despite the error message received, the table was actually populated with the data. Thanks for your help.