Member since 12-19-2016

149 Posts
15 Kudos Received
2 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4778 | 04-04-2017 03:01 PM |
|  | 2306 | 01-17-2017 10:44 AM |
04-03-2017 01:21 PM
Unable to install docker-engine; getting the error below. Docker is not working after yum install and yum upgrade. I did the following to completely uninstall Docker, and after that tried a fresh installation, but it throws the errors below:

1. yum remove docker
2. rm -rf /etc/yum.repos.d/docker.repo
3. rm -rf /var/lib/docker*
4. rm -rf /var/cache/yum/x86_64/7Server/docker*

10-74-58-106:/root # yum install docker-engine
Loaded plugins: langpacks, product-id, rhnplugin, search-disabled-repos
This system is receiving updates from RHN Classic or Red Hat Satellite.
Resolving Dependencies
--> Running transaction check
---> Package docker-engine.x86_64 0:17.03.1.ce-1.el7.centos will be installed
--> Processing Dependency: docker-engine-selinux >= 17.03.1.ce-1.el7.centos for package: docker-engine-17.03.1.ce-1.el7.centos.x86_64
--> Running transaction check
---> Package docker-engine-selinux.noarch 0:17.03.1.ce-1.el7.centos will be installed
--> Processing Dependency: selinux-policy-base >= 3.13.1-102 for package: docker-engine-selinux-17.03.1.ce-1.el7.centos.noarch
--> Processing Dependency: selinux-policy-targeted >= 3.13.1-102 for package: docker-engine-selinux-17.03.1.ce-1.el7.centos.noarch
--> Processing Conflict: docker-engine-selinux-17.03.1.ce-1.el7.centos.noarch conflicts docker-selinux
--> Finished Dependency Resolution
Error: Package: docker-engine-selinux-17.03.1.ce-1.el7.centos.noarch (dockerrepo)
           Requires: selinux-policy-base >= 3.13.1-102
           Installed: selinux-policy-targeted-3.13.1-60.el7_2.7.noarch (@rhel7-x86_64-2016-07)
               selinux-policy-base = 3.13.1-60.el7_2.7
           Available: selinux-policy-minimum-3.12.1-153.el7.noarch (rhel7-x86_64-2016-07)
               selinux-policy-base = 3.12.1-153.el7
           Available: selinux-policy-minimum-3.12.1-153.el7_0.10.noarch (rhel7-x86_64-2016-07)
               selinux-policy-base = 3.12.1-153.el7_0.10
           Available: selinux-policy-minimum-3.12.1-153.el7_0.11.noarch (rhel7-x86_64-2016-07)
               selinux-policy-base = 3.12.1-153.el7_0.11
           Available: selinux-policy-minimum-3.12.1-153.el7_0.12.noarch (rhel7-x86_64-2016-07)
               selinux-policy-base = 3.12.1-153.el7_0.12
           Available: selinux-policy-minimum-3.12.1-153.el7_0.13.noarch (rhel7-x86_64-2016-07)
               selinux-policy-base = 3.12.1-153.el7_0.13
           Available: selinux-policy-minimum-3.13.1-23.el7.noarch (rhel7-x86_64-2016-07)
Error: Package: docker-engine-selinux-17.03.1.ce-1.el7.centos.noarch (dockerrepo)
           Requires: selinux-policy-targeted >= 3.13.1-102
           Installed: selinux-policy-targeted-3.13.1-60.el7_2.7.noarch (@rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-60.el7_2.7
           Available: selinux-policy-targeted-3.12.1-153.el7.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.12.1-153.el7
           Available: selinux-policy-targeted-3.12.1-153.el7_0.10.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.12.1-153.el7_0.10
           Available: selinux-policy-targeted-3.12.1-153.el7_0.11.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.12.1-153.el7_0.11
           Available: selinux-policy-targeted-3.12.1-153.el7_0.12.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.12.1-153.el7_0.12
           Available: selinux-policy-targeted-3.12.1-153.el7_0.13.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.12.1-153.el7_0.13
           Available: selinux-policy-targeted-3.13.1-23.el7.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7
           Available: selinux-policy-targeted-3.13.1-23.el7_1.7.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.7
           Available: selinux-policy-targeted-3.13.1-23.el7_1.8.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.8
           Available: selinux-policy-targeted-3.13.1-23.el7_1.13.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.13
           Available: selinux-policy-targeted-3.13.1-23.el7_1.17.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.17
           Available: selinux-policy-targeted-3.13.1-23.el7_1.18.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.18
           Available: selinux-policy-targeted-3.13.1-23.el7_1.21.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-23.el7_1.21
           Available: selinux-policy-targeted-3.13.1-60.el7.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-60.el7
           Available: selinux-policy-targeted-3.13.1-60.el7_2.3.noarch (rhel7-x86_64-2016-07)
               selinux-policy-targeted = 3.13.1-60.el7_2.3
Error: docker-engine-selinux conflicts with docker-selinux-1.10.3-44.el7.x86_64
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest
While starting Docker I am also facing issues:

ip-10-74-58-106:/root # systemctl start docker
Job for docker.service failed because the control process exited with error code. See "systemctl status docker.service" and "journalctl -xe" for details.

Can someone please help me here? If possible, please share the link to download a dockerized version of the latest HDP.
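A hedged sketch of one way past the two errors in the log above. The package names come straight from the yum output; whether your RHN/Satellite channel actually offers the required selinux-policy version is an assumption, not something the log confirms:

```shell
# The log shows two blockers:
#  - installed selinux-policy-base 3.13.1-60 is older than the required 3.13.1-102
#  - docker-engine-selinux conflicts with the old docker-selinux-1.10.3 package

# 1) Bring the SELinux policy packages up to date first:
yum update -y selinux-policy selinux-policy-targeted

# 2) Remove the conflicting Red Hat docker-selinux package:
yum remove -y docker-selinux

# 3) Retry the install:
yum install -y docker-engine
```

If the channel does not carry selinux-policy >= 3.13.1-102, the system would need newer RHEL 7 updates before docker-engine 17.03 can install.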
03-27-2017 02:51 PM
Looking for Scala certification. Can someone suggest a good online training portal that will take care of certification as well?
Labels: Certification, Training
03-06-2017 09:14 AM
							 @Adnan Alvee   
<console>:31: error: object HiveContext in package hive cannot be accessed in package org.apache.spark.sql.hive
         HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc.sc())
 
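The error above is what the Scala REPL reports when a Java-style declaration is pasted into it: `HiveContext sqlContext = ...` makes Scala resolve `HiveContext` as a (private) companion object. A hedged sketch of the Scala form, assuming Spark 1.6 and the shell's existing SparkContext `sc`:

```scala
// Java-style "Type name = new ..." is not valid Scala; use a val binding instead.
// In spark-shell (Spark 1.6):
//   val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
// Shape of the change, side by side:
val javaStyle  = "HiveContext sqlContext = new HiveContext(sc.sc())"
val scalaStyle = "val sqlContext = new HiveContext(sc)"
```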
03-03-2017 03:29 PM
How can I save a DataFrame into a Hive table or SQL table using Scala?

scala> input.printSchema()
root
 |-- UID: decimal(38,10) (nullable = false)
 |-- DTIME: date (nullable = false)
 |-- TARGET: string (nullable = false)
 |-- RAVG: decimal(38,10) (nullable = true)
 |-- RMIN: decimal(38,10) (nullable = true)
 |-- RMAX: decimal(38,10) (nullable = true)
 |-- RSTD: decimal(38,10) (nullable = true)
 |-- SUCCESSES: decimal(38,10) (nullable = true)
 |-- FAILURES: decimal(38,10) (nullable = true)
 |-- LOCID: decimal(38,10) (nullable = true)
 |-- FNAME: string (nullable = true)
 |-- LD_DT: date (nullable = true)
 |-- RECDDELFLG: string (nullable = true)
 |-- DT_DIM_SEQ: decimal(38,10) (nullable = false)
 
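A minimal sketch of one way to do this in Spark 1.6, assuming a HiveContext is available and the DataFrame above is registered as a temp table. The database and table names below are hypothetical placeholders:

```scala
// Hypothetical helper: build a CREATE TABLE ... AS SELECT statement
// from a registered temp table name.
def ctasSql(targetTable: String, tempTable: String): String =
  s"CREATE TABLE $targetTable AS SELECT * FROM $tempTable"

// In spark-shell (Spark 1.6), either of these should persist the DataFrame:
//   input.write.mode(org.apache.spark.sql.SaveMode.Overwrite).saveAsTable("mydb.input_copy")
// or, after input.registerTempTable("input_tmp"):
//   hiveContext.sql(ctasSql("mydb.input_copy", "input_tmp"))
```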
Labels: Apache Spark
02-16-2017 09:56 AM
@melek Here I am trying for a single function that will read all the files in a dir and take action w.r.t. their type. Each file will go through an if condition: if (csv) then split with comma, else pipe.
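The delimiter branch described above can be sketched as follows; the helper name is mine, and the Spark usage is shown in comments assuming Spark 1.6:

```scala
// Hypothetical helper: pick the split pattern from the file extension.
// ".csv" files split on comma; everything else is assumed pipe-delimited.
def delimiterFor(path: String): String =
  if (path.toLowerCase.endsWith(".csv")) "," else "\\|"

// In spark-shell, each file in the directory could then be handled uniformly:
//   val files = Seq("/data/dev/spark/a.csv", "/data/dev/spark/b.psv")
//   val frames = files.map(p => sc.textFile(p).map(_.split(delimiterFor(p))))
```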
02-16-2017 09:11 AM
		1 Kudo
I have one CSV (comma-separated) file and one PSV (pipe-separated) file in the same dir, /data/dev/spark. How can I read each file and convert it to its own DataFrame using Scala?
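One hedged sketch for Spark 1.6; the file names, case-class name, and field count below are hypothetical — only the two-delimiter idea comes from the question:

```scala
// Hypothetical line parser: split on the delimiter that matches the format.
// Note "|" must be regex-escaped for String.split.
def parseLine(line: String, isCsv: Boolean): Array[String] =
  if (isCsv) line.split(",") else line.split("\\|")

// In spark-shell (Spark 1.6), each file becomes its own DataFrame, e.g.:
//   case class Row3(c1: String, c2: String, c3: String)   // field count assumed
//   val csvDF = sc.textFile("file:///data/dev/spark/data.csv")
//     .map(l => parseLine(l, isCsv = true)).map(a => Row3(a(0), a(1), a(2))).toDF
//   val psvDF = sc.textFile("file:///data/dev/spark/data.psv")
//     .map(l => parseLine(l, isCsv = false)).map(a => Row3(a(0), a(1), a(2))).toDF
```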
Labels: Apache Spark
02-10-2017 03:46 PM
@Joe Widen I am unable to run a select query on my input_file_temp; at the same time I can do it on gsam_temp, which is the DF I made from the SQL table. If I could run queries on both DFs, it would be much easier for me to finish it off. Here is the complete code:

import sqlContext.implicits._
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.sql.functions.broadcast
import org.apache.spark.sql.types._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
val hiveContext = new HiveContext(sc);
val sqlContext = new org.apache.spark.sql.SQLContext(sc);
// Loading DB table in to dataframe
val gsam = hiveContext.read.format("jdbc").option("driver","oracle.jdbc.driver.OracleDriver").option("url","jdbc:oracle:thin:NPIDWDEV/sysdwnpi@scan-nsgnp.ebiz.verizon.com:1521/nsgdev").option("dbtable", "GSAM_REF").load();
gsam.registerTempTable("gsam_temp")
// Create case class to load input file from local or hdfs
case class f1(
  ckt_id:String,
  location:String,
  usage:String,
  port:String,
  machine:String
)
val input_file = sc.textFile("file:///data04/dev/v994292/spark/input_file.txt").map(_.split("\\|")).map(x => f1(x(0).toString,x(1).toString,x(2).toString,x(3).toString,x(4).toString)).toDF
input_file.registerTempTable("input_file_tmp")
 
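Two things worth double-checking in the code above (a hedged observation, not a confirmed diagnosis): the temp table is registered as `input_file_tmp` while the failing query targets `input_file_temp`, and temp-table lookups are exact-match on the registered name. Also, temp tables are scoped to the context that registered them, and this snippet creates both a HiveContext and a SQLContext:

```scala
// The name used at registration must match the name used in the query exactly:
val registeredName = "input_file_tmp"   // from registerTempTable above
// In spark-shell:
//   sqlContext.sql(s"SELECT * FROM $registeredName").show()
// whereas "SELECT * FROM input_file_temp" would fail with a table-not-found error.
// Likewise, gsam_temp was registered via hiveContext; to join both tables in one
// query, register both DataFrames on the same context.
```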
02-10-2017 12:09 PM
@Binu Mathew Thanks for the Python code. I am trying to do it in both Scala and Python for knowledge purposes. I am using Spark 1.6.2. Yes, I have created a SQLContext.
02-10-2017 12:05 PM
@Joe Widen I am running it in the spark-shell and I am getting the same error!
02-10-2017 12:00 PM
@Joe Widen Thank you, sir. But I think if we do a join on a larger dataset, memory issues will happen. So in such a case, can we use an if/else or lookup function here? My aim is to match the input_file DF with the gsam DF, and if CCKT_NO = ckt_id and SEV_LVL = 3, then print the complete row for that ckt_id.
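Regarding the memory concern: when one side (here the gsam DF) is small enough to fit in memory, a broadcast join ships only the small table to each executor and avoids shuffling the large input_file DF. A hedged sketch for Spark 1.6 — column names come from the question, the column types are assumed:

```scala
// The row-level condition being expressed (SEV_LVL assumed numeric):
def severeMatch(ccktNo: String, cktId: String, sevLvl: Int): Boolean =
  ccktNo == cktId && sevLvl == 3

// In spark-shell (Spark 1.6), the same condition as a broadcast join + filter:
//   import org.apache.spark.sql.functions.broadcast
//   input_file.join(broadcast(gsam), input_file("ckt_id") === gsam("CCKT_NO"))
//     .filter(gsam("SEV_LVL") === 3)
//     .show()
```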