Member since 08-07-2018

23 Posts
0 Kudos Received
1 Solution

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 1938 | 08-09-2018 11:20 AM |
			
    
	
		
		
09-17-2018 10:41 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
@Sudharsan Ganeshkumar The optimal number of mappers depends on many variables: you need to take into account your database type, the hardware used for your database server, and the impact on other requests your database needs to serve. There is no single optimal number of mappers that works for all scenarios; instead, you are encouraged to experiment to find the right degree of parallelism for your environment and use case. It is better to start with a small number of mappers and slowly ramp up than to start with a large number and work your way down. When you run sqoop import with the -m 1 option, one mapper is launched; if this parameter is not specified, Sqoop runs 4 mappers by default.
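As a sketch of what that looks like in practice (the JDBC URL, database, table, username, and split column below are hypothetical placeholders, not anything from this thread), parallelism is controlled with -m / --num-mappers:

```shell
# Hypothetical connection details -- substitute your own JDBC URL, table, and credentials.
# Start conservatively (2 mappers), then increase while watching database load.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --split-by order_id \
  -m 2 \
  --target-dir /user/etl/orders
```

Here -m 2 splits the import into two parallel map tasks partitioned along order_id; omitting -m falls back to Sqoop's default of 4 mappers.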
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-31-2018 03:08 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
@Sudharsan Ganeshkumar Snapshots are stored in a .snapshot directory under the same path. For example, if you take a snapshot of /user/root, it is stored in the /user/root/.snapshot directory. An example is given below.

[hdfs@sandbox ~]$ hdfs dfsadmin -allowSnapshot /user/root/testsnaps
Allowing snaphot on /user/root/testsnaps succeeded
[root@sandbox ~]# hdfs dfs -createSnapshot /user/root/testsnaps snap1
Created snapshot /user/root/testsnaps/.snapshot/snap1
[gulshad@sandbox ~]$ hdfs dfs -createSnapshot /user/gulshad
Created snapshot /user/gulshad/.snapshot/s20180831-145829.441

To get all snapshottable directories, run the command below.

[root@sandbox ~]# sudo -su hdfs hdfs lsSnapshottableDir
drwxr-xr-x 0 root    hdfs 0 2018-07-26 05:29 3 65536 /user/root/testsnaps
drwxr-xr-x 0 hdfs    hdfs 0 2018-08-01 14:58 1 65536 /proj/testsnap
drwxr-xr-x 0 gulshad hdfs 0 2018-08-01 14:58 1 65536 /user/gulshad
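Once a directory has more than one snapshot, hdfs snapshotDiff can compare two of them. A minimal sketch, assuming the testsnaps directory above and two hypothetical snapshot names:

```shell
# Compare two snapshots of the same snapshottable directory.
# "snap1" and "snap2" are hypothetical snapshot names -- use your own.
hdfs snapshotDiff /user/root/testsnaps snap1 snap2
# The report marks each changed path with +, -, M, or R
# (created, deleted, modified, renamed).
```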
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-21-2018 12:03 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
@Sudharsan Ganeshkumar If the above has helped, please take a moment to log in and click the "accept" link on the answer.
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-16-2018 10:04 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
It's defining a column name in the filter condition. So in your case it means nothing other than the column named Age.
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-16-2018 11:02 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
@Sudharsan Ganeshkumar You are not seeing anything because you are running the command as the root user! You will have to switch to the hive user and use hive or beeline:

# su - hive
$ hive

Then at the prompt run the create statement:

hive> CREATE TABLE IF NOT EXISTS emp ( eid int, name String,
salary String, destination String)
COMMENT 'Employee details'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

And then run:

hive> SHOW TABLES LIKE 'emp';

HTH
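The same works non-interactively through beeline; the JDBC URL below is the common sandbox default and may differ on your cluster:

```shell
# Sandbox-default JDBC URL -- adjust the host, port, and user for your cluster.
beeline -u jdbc:hive2://localhost:10000 -n hive -e "SHOW TABLES LIKE 'emp';"
```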
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-16-2018 07:10 AM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
In beeline or the CLI, after creating a table you can run either SHOW CREATE TABLE or DESCRIBE FORMATTED to find the table's path in HDFS. After exiting beeline or the CLI, you can use the command below to see the table folder and the files inside it:

hadoop fs -ls -R <tablePath>
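A minimal sketch of those steps, assuming a hypothetical table named emp (the warehouse path shown is only the HDP default; use whatever LOCATION the first two commands print):

```shell
# Both statements print the table's HDFS location ("emp" is a hypothetical table name).
hive -e "DESCRIBE FORMATTED emp;"   # look for the "Location:" row
hive -e "SHOW CREATE TABLE emp;"    # look for the LOCATION clause in the DDL

# Back at the shell, list the table directory recursively,
# substituting the location printed above (HDP default shown):
hadoop fs -ls -R /apps/hive/warehouse/emp
```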
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
08-09-2018 05:41 PM
	
	
	
	
	
	
	
	
	
	
	
	
	
	
		
	
				
		
			
					
				
		
	
		
					
@Sudharsan Ganeshkumar If my answer has helped you, please remember to log in and mark it as accepted.
						
					
				
			
			
			
			
			
			
			
			
			
		