Member since 01-09-2014

283 Posts · 70 Kudos Received · 50 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 2272 | 06-19-2019 07:50 AM |
| | 3504 | 05-01-2019 08:07 AM |
| | 3553 | 04-10-2019 08:49 AM |
| | 3662 | 03-20-2019 09:30 AM |
| | 2845 | 01-23-2019 10:58 AM |
12-16-2015 07:11 AM
I solved the problem. I had to create a custom Java interceptor (based on the one you sent me), compile it with Maven, and copy it into the flume-ng directory. Thanks pdvorak for all the help 🙂
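For anyone following along, a custom interceptor is typically wired into the agent configuration like this (the agent, source, interceptor, and class names below are hypothetical placeholders, not taken from this thread):

```properties
# Hypothetical names: agent a1, source s1, interceptor i1,
# and com.example.MyInterceptor are placeholders.
a1.sources.s1.interceptors = i1
# Flume instantiates interceptors through their Builder class
a1.sources.s1.interceptors.i1.type = com.example.MyInterceptor$Builder
```

The compiled jar must also be on the agent's classpath, e.g. dropped into the Flume lib directory as described above, or packaged under plugins.d.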
12-16-2015 01:12 AM
I didn't find any OutOfMemory errors in the indicated logs (I did a grep). However, changing the heap helped, so it really was a heap problem 🙂 Thank you! Alina
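For reference, the Flume agent heap is usually raised in flume-env.sh; the sizes below are illustrative only, not the values used in this thread:

```shell
# flume-env.sh -- example heap settings (values are illustrative)
export JAVA_OPTS="-Xms512m -Xmx2048m"
```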
10-21-2015 01:38 PM
Thanks, that got me a bit closer. I discovered that there was no core created on solrserver2, so I created one and restarted both servers. Now I am getting: Node: solrserver2:8983_solr is not live! But I see it started in the manager, and there's nothing in the logs either. What am I missing? Your help probably saved a couple of days of reading and frustration!
10-14-2015 08:05 PM
I am able to connect to IBM MQ using the steps mentioned here. But when Flume tries to consume any messages from the queue, it throws the following exception:

com.ibm.msg.client.jms.DetailedMessageFormatException: JMSCC0053: An exception occurred deserializing a message, exception: 'java.lang.ClassNotFoundException: null class'. It was not possible to deserialize the message because of the exception shown.

1) I am using all the IBM MQ client jars. Flume starts without any exception, but the exception occurs when it tries to consume messages.
2) I am putting a custom message [a Serializable object] into the queue, which Flume needs to consume.
3) Flume 1.5.0-cdh5.4.1
4) MQ version 8.x

My agent configuration:

a1.sources=fe_s1
a1.channels=c1
a1.sinks=k1
a1.sources.s1.type=jms
a1.sources.s1.channels=c1
a1.sources.s1.initialContextFactory=com.sun.jndi.fscontext.RefFSContextFactory
a1.sources.s1.connectionFactory=FLUME_CF
a1.sources.s1.destinationName=MY.Q
a1.sources.s1.providerURL=file:///home/JNDI-Directory
a1.sources.s1.destinationType=QUEUE
a1.sources.s1.transportType=1
a1.sources.s1.userName=mqm
a1.sources.s1.batchSize=1
a1.channels.c1.type=memory
a1.channels.c1.capacity=10000
a1.channels.c1.transactionCapacity=100
a1.sinks.k1.type=logger
a1.sinks.k1.channel=c1
09-29-2015 01:31 AM · 1 Kudo
The problem was solved by changing the source from spooldir to http. I think there is a problem with the spooldir source.
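For context, a minimal HTTP source definition looks like the following (the agent name, component names, and port are hypothetical; this is a sketch of the switch described above, not the poster's actual configuration):

```properties
# Hypothetical agent a1: an HTTP source replacing a spooldir source
a1.sources = r1
a1.channels = c1
a1.sources.r1.type = http
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1
```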
08-05-2015 12:07 PM
Kafka 1.3.1 is a maintenance release, and here is the list of fixed issues:
http://www.cloudera.com/content/cloudera/en/documentation/cloudera-kafka/latest/topics/kafka_fixed_issues.html

HTH!

-PD
06-11-2015 01:58 PM · 1 Kudo
The latest Kafka parcels can be found here:
http://archive-primary.cloudera.com/kafka/parcels/latest/

Adding that URL to your parcel repository list will make them available to download in Cloudera Manager.
05-06-2015 04:10 AM
Thanks for the second command; it's very useful and a lot simpler than netstat for identifying this. Cheers!
04-30-2015 09:04 AM
I just did a rollback, which resolved the issue. We have been testing Apache Hadoop as well, which is why my answer was delayed. Thanks
04-23-2015 06:10 PM
Thanks for your reply. Can we run a shell script when CDH raises an alert?