Member since 05-30-2016
- 14 Posts
- 6 Kudos Received
- 3 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 22530 | 05-25-2018 03:18 PM |
| | 62587 | 03-29-2018 04:13 PM |
| | 1878 | 04-14-2017 09:30 PM |
			
    
	
		
		
05-25-2018 03:18 PM
2 Kudos
@Khouloud Landari In HDP, Superset is installed inside a Python virtual environment. To install psycopg2, run the following command: `/usr/hdp/current/superset/bin/pip install psycopg2`. Alternatively, we package Superset with PyGreSQL, and you can change the connection URI to use that: `postgresql+pygresql://user:password@host:port/dbname`
						
					
			
    
	
		
		
03-29-2018 04:13 PM
@jmedel The root cause of the failure is an invalid parse spec: `java.lang.IllegalArgumentException: Instantiation of [simple type, class io.druid.data.input.impl.DelimitedParseSpec] value failed: If columns field is not set, the first row of your data must have your header and hasHeaderRow must be set to true.` Check whether your input file has a header row. If it does, set hasHeaderRow = true in the parseSpec; otherwise, specify the list of columns in the parseSpec so that Druid knows which columns are present in the file.
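The two alternatives can be sketched as delimited parseSpec fragments (a sketch following the Druid ingestion docs; the timestamp column and dimension names are hypothetical placeholders). With a header row in the file:

```json
{
  "format": "tsv",
  "hasHeaderRow": true,
  "timestampSpec": { "column": "ts", "format": "auto" },
  "dimensionsSpec": { "dimensions": [] }
}
```

Without a header row, list the columns explicitly instead:

```json
{
  "format": "tsv",
  "columns": ["ts", "page", "user"],
  "timestampSpec": { "column": "ts", "format": "auto" },
  "dimensionsSpec": { "dimensions": ["page", "user"] }
}
```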
						
					
			
    
	
		
		
03-14-2018 02:25 PM
Please also share the spec file (hadoop_index_spec.json) and the complete YARN application logs.
						
					
			
    
	
		
		
03-14-2018 02:06 PM
Great article, showcasing various technologies playing together. You could also simplify the above flow a bit by skipping Tranquility: Twitter -> NiFi -> Kafka -> Druid. Druid supports ingesting data directly from Kafka (without Tranquility, too). See http://druid.io/docs/latest/development/extensions-core/kafka-ingestion.html
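For reference, a minimal Kafka supervisor spec for that simplified flow might look like this (a sketch based on the linked docs; the datasource name, topic, and broker address are hypothetical):

```json
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "tweets",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": { "column": "timestamp", "format": "auto" },
        "dimensionsSpec": { "dimensions": [] }
      }
    },
    "granularitySpec": { "type": "uniform", "segmentGranularity": "HOUR", "queryGranularity": "NONE" }
  },
  "ioConfig": {
    "topic": "tweets",
    "consumerProperties": { "bootstrap.servers": "kafkahost:6667" }
  }
}
```

You would submit this spec to the Overlord's supervisor endpoint to start the ingestion.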
						
					
			
    
	
		
		
12-07-2017 03:47 PM
From the description, it seems that port forwarding on the VM is not set up properly. Please make sure port forwarding is configured correctly.
						
					
			
    
	
		
		
04-14-2017 09:30 PM
3 Kudos
Thanks for reporting this. It seems like an issue with the packaging on CentOS 7. We are working on fixing it. In the meantime, can you try changing the Superset database type to 'mysql' or 'postgresql'? I verified that it works with MySQL.