Member since: 05-19-2016

216 Posts | 20 Kudos Received | 4 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4932 | 05-29-2018 11:56 PM |
|  | 7951 | 07-06-2017 02:50 AM |
|  | 4425 | 10-09-2016 12:51 AM |
|  | 4730 | 05-13-2016 04:17 AM |

09-02-2016 10:51 AM
If this is from Hue: Hue can execute multiple statements in sequence until it reaches the first query that returns results (such as a SELECT). For example, the following is executed entirely in a single go:

drop table if exists foo;
create table if not exists foo (code string, description string, salary int);
insert into foo select code, description, salary from sample s where s.salary > 50000 and s.salary < 100000;
select * from foo where salary < 75000;

The following will stop after the SELECT, so the final drop table will not be executed:

drop table if exists foo;
create table if not exists foo (code string, description string, salary int);
insert into foo select code, description, salary from sample_07 s where s.salary > 50000 and s.salary < 100000;
select * from foo where salary < 75000;
drop table foo;

But if you use Beeline to execute a file containing multiple SELECT queries, it should work without pausing.
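For illustration, a minimal sketch of the Beeline alternative; the JDBC URL and file name here are assumptions, not values from the thread:

```
# Sketch: run a file of statements with Beeline, which does not pause
# after the first result-returning query. URL and file name are assumptions.
beeline -u jdbc:hive2://localhost:10000/default -f statements.sql
```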
						
					
08-04-2016 07:52 AM
Hi, when I run `sqoop job --show Jobname`, it shows an IllegalArgumentException.
						
					
07-27-2016 09:05 PM
							 @sim6  Sorry, I cannot make that time. Could you please paste the information here? 
						
					
07-06-2016 03:08 PM
The column named `delete` in your retailSKU table is actually a reserved word, and that's the central point of the issue. When Sqoop builds out the query, it currently does not escape column names with backticks (``), which is necessary whenever a column name collides with a reserved word (https://dev.mysql.com/doc/refman/5.7/en/keywords.html).

Would it be possible for you to alter the column name on the source retailSKU table, or pass a custom query that uses the right escape syntax via --query? See http://archive.cloudera.com/cdh5/cdh/5/sqoop/SqoopUserGuide.html#_free_form_query_imports

Edit: Just noticed you've mentioned that free-form works just fine.
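As a hedged sketch of such a free-form import with the reserved column backtick-escaped; the connection string, other column names, and target directory are assumptions:

```
# Hypothetical free-form import escaping the reserved column name `delete`.
# Connection details, column list, and paths are assumptions.
# Single quotes keep the shell from expanding $CONDITIONS, which Sqoop requires in --query.
sqoop import \
  --connect jdbc:mysql://dbhost/retail \
  --username retail_user -P \
  --query 'SELECT sku_id, sku_name, `delete` FROM retailSKU WHERE $CONDITIONS' \
  --split-by sku_id \
  --target-dir /user/retail/retailSKU
```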
						
					
05-23-2016 05:31 AM
1 Kudo
Welcome to the Cloudera Community, @sim6. Am I correct in thinking you posted the questions here because you were wondering how to use Hue for the three issues? Otherwise, it looks like you have a few different questions that may be better answered outside of the Hue board. In that case, I would suggest creating a new post for each question in the forum board that best suits it.

Question one may be a fit for the Data Ingestion board, which covers Sqoop. Question two would be a match for the Batch Processing board, which covers Oozie. Question three seems to be something for the Hadoop Concepts board.

I hope this helps. 🙂
						
					
05-14-2016 02:18 PM
@Neeraj Sabharwal: But it also gives this in the error before the permission denied. I see this person had the same problem: https://community.hortonworks.com/questions/18261/tez-session-is-already-shutdown-failed-2-times-due.html. I would not really like to switch the engine to MR.

Exception in thread "main" java.lang.RuntimeException: org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1463224371637_0088 failed 2 times due to AM Container for appattempt_1463224371637_0088_000002 exited with exitCode: -1000
2016-05-14 19:45:56,425 INFO  [Thread-30] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Exception in thread "main" java.lang.RuntimeException: org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1463224371637_0088 failed 2 times due to AM Container for appattempt_1463224371637_0088_000002 exited with  exitCode: -1000
41399 [Thread-30] INFO  org.apache.sqoop.hive.HiveImport  - For more detailed output, check application tracking page:http://warehouse.swtched.com:8088/cluster/app/application_1463224371637_0088Then, click on links to logs of each attempt.
2016-05-14 19:45:56,425 INFO  [Thread-30] hive.HiveImport (LoggingAsyncSink.java:run(85)) - For more detailed output, check application tracking page:http://warehouse.swtched.com:8088/cluster/app/application_1463224371637_0088Then, click on links to logs of each attempt.
41399 [Thread-30] INFO  org.apache.sqoop.hive.HiveImport  - Diagnostics: Permission denied: user=oozie, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/f1b2db7d-0836-4330-849c-dc3e4d6dc2d1/hive-hcatalog-core.jar":yarn:hdfs:drwx------
2016-05-14 19:45:56,425 INFO  [Thread-30] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Diagnostics: Permission denied: user=oozie, access=EXECUTE, inode="/tmp/hive/yarn/_tez_session_dir/f1b2db7d-0836-4330-849c-dc3e4d6dc2d1/hive-hcatalog-core.jar":yarn:hdfs:drwx------
41399 [Thread-30] INFO  org.apache.sqoop.hive.HiveImport  - 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)

I have checked and Tez is running. What could be the problem here? I checked the YARN logs for the application ID and got this:

ERROR org.apache.sqoop.tool.ImportTool  - Encountered IOException running import job: java.io.IOException: Hive exited with status 1
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:394)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:344)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:245)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:228)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:283)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
2016-05-14 19:45:56,806 ERROR [main] tool.ImportTool (ImportTool.java:run(613)) - Encountered IOException running import job: java.io.IOException: Hive exited with status 1
 
						
					
05-13-2016 06:38 PM
1 Kudo
@simran kaur You should suffix your target-dir name with the date (for example, rather than naming the target-dir emp_snapshot, name it emp_snapshot_05132016). Then on the next run, once you have an updated copy of the data, remove the old one. It should be straightforward. An additional benefit is that you will know from the name when the last snapshot completed and was imported.
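A minimal sketch of that naming scheme, assuming a MySQL source; the connection string, table name, and paths are illustrative assumptions:

```
# Hypothetical snapshot import with a date-suffixed target-dir.
# Connection string, table, and paths are assumptions.
SNAPSHOT_DATE=$(date +%m%d%Y)   # e.g. 05132016
sqoop import \
  --connect jdbc:mysql://dbhost/corp \
  --username sqoop_user -P \
  --table emp \
  --target-dir /user/hive/emp_snapshot_${SNAPSHOT_DATE}

# After the new snapshot lands, remove the previous one, for example:
# hdfs dfs -rm -r /user/hive/emp_snapshot_<previous-date>
```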
						
					
06-28-2016 08:30 AM
							 Removing the sqoop-site.xml from that folder also worked for me. 
						
					
02-16-2018 04:23 PM
I am really late on this answer, but since I recently faced this issue myself, I am going to answer to help somebody else out. The solution is to omit the leading sqoop keyword when passing the command in the <command> tag in workflow.xml. Pass the command this way:

<command>import --connect "jdbc:mysql://localhost;database=US_DB" --username root -P --table employee --hbase-table hb_emp --column-family cfemp --incremental append --check-column empid --last-value 304</command>
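For context, a hedged sketch of where that <command> sits inside an Oozie Sqoop action; the action name, transitions, and property placeholders are assumptions, not values from this thread:

```
<action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- No leading "sqoop" keyword inside <command> -->
        <command>import --connect "jdbc:mysql://localhost;database=US_DB" --username root -P --table employee --hbase-table hb_emp --column-family cfemp --incremental append --check-column empid --last-value 304</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
```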
						
					