Member since 04-22-2016
931 Posts
46 Kudos Received
26 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1848 | 10-11-2018 01:38 AM |
|  | 2209 | 09-26-2018 02:24 AM |
|  | 2239 | 06-29-2018 02:35 PM |
|  | 2909 | 06-29-2018 02:34 PM |
|  | 6087 | 06-20-2018 04:30 PM |
09-28-2016 03:14 PM · 2 Kudos

@Sami Ahmad, in terms of supported configurations, Spark versions 1.6.2 and 2.0 (in technical preview) can be installed together on HDP 2.5.0. See the following for more info: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_spark-component-guide/content/install-spark-over-ambari.html
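Not part of the original reply, but for context: once both versions are installed, a quick check from spark-shell confirms which one a session is actually using. Per the linked guide, the client scripts select the install via the SPARK_MAJOR_VERSION environment variable; the snippet below is a hypothetical sanity check, not something from the original post.

```scala
// Run inside spark-shell after exporting SPARK_MAJOR_VERSION (1 or 2).
// Confirms which of the two installed builds this session picked up.
println(s"Spark version: ${sc.version}")   // e.g. "1.6.2" or "2.0.0"
```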
						
					
		
09-30-2016 05:55 PM

@Sami Ahmad The upgrade addressed it, but I guess we still don't know the cause.
						
					
		
09-19-2016 08:57 PM

Could you please post the code snippet where you are trying to load/read the data? It looks like there is something wrong in the file URI; it starts with C://.
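For illustration (not from the original thread), here is a minimal Scala sketch of what well-formed file URIs look like when reading data with Spark; the app name, paths, host, and port are all placeholders. A Windows-style C:// prefix is not a scheme Hadoop's FileSystem can resolve, which is the usual cause of this kind of error.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical example app; names and paths are placeholders.
object UriCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("UriCheck"))

    // Local file: use the file:// scheme with an absolute path
    // (three slashes total), not a Windows-style C:// prefix.
    val local = sc.textFile("file:///tmp/input.txt")

    // HDFS file: an explicit hdfs:// URI, or a bare absolute path that
    // resolves against fs.defaultFS. Host and port are placeholders.
    val remote = sc.textFile("hdfs://namenode:8020/user/sami/input.txt")

    println(s"local lines: ${local.count()}, hdfs lines: ${remote.count()}")
    sc.stop()
  }
}
```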
						
					
		
09-15-2016 07:51 PM · 1 Kudo

This build.sbt fixed the issue, and the package now compiles fine:

```
[root@hadoop1 TwitterPopularTags]# more build.sbt
name := "TwitterPopularTags"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```
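For reference, a minimal sketch of the kind of streaming job this build compiles, patterned on the standard Spark TwitterPopularTags example; the batch interval, window length, and the property-based twitter4j OAuth setup are assumptions, not details from the original post.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

object TwitterPopularTags {
  def main(args: Array[String]): Unit = {
    // twitter4j reads OAuth credentials from system properties
    // (twitter4j.oauth.consumerKey, .consumerSecret, .accessToken,
    // .accessTokenSecret); set them before starting the job.
    val conf = new SparkConf().setAppName("TwitterPopularTags")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Receive the public sample stream and pull out hashtags.
    val stream = TwitterUtils.createStream(ssc, None)
    val hashTags = stream.flatMap(_.getText.split(" ")).filter(_.startsWith("#"))

    // Count tags over a sliding 60-second window, most frequent first.
    val topCounts = hashTags.map((_, 1))
      .reduceByKeyAndWindow(_ + _, Seconds(60))
      .map { case (tag, count) => (count, tag) }
      .transform(_.sortByKey(ascending = false))

    topCounts.foreachRDD { rdd =>
      println("\nPopular tags in the last 60 seconds:")
      rdd.take(10).foreach { case (count, tag) => println(s"$tag ($count tweets)") }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```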
						
					
		
09-15-2016 02:22 PM · 2 Kudos

@Sami Ahmad I would put your Flume-related questions in Data Ingestion and Streaming.
						
					
		
09-13-2016 06:30 PM · 2 Kudos

This will give you the commands to control your services manually: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_HDP_Reference_Guide/content/ch_controlling_hdp_svcs_manually.html

It should work for whatever recent version of HDP you are running.
						
					
		
09-14-2016 03:35 AM · 2 Kudos

Gouri, you were right: it was a privileges issue on the Linux /tmp/hive folder; I had been changing the permissions on the HDFS /tmp/hive folder instead. I can access beeline now and can connect to Hive. I have other issues, though, for which I will open a new post. Thanks for your help.
						
					
		
09-02-2016 02:01 PM

Thanks Pierre, now it's beginning to make some sense: the 5 GenerateFF processors are there to take care of the 5 countries, I guess. I want to read my own log file; which processor would I use? I want to start with a simple task: read my log file, parse out some values using the Regexp language, and then save the parsed values to Hive.
						
					