Member since 05-19-2016

216 Posts
20 Kudos Received
4 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 4932 | 05-29-2018 11:56 PM |
| | 7952 | 07-06-2017 02:50 AM |
| | 4425 | 10-09-2016 12:51 AM |
| | 4732 | 05-13-2016 04:17 AM |
			
    
	
		
		
07-18-2016 11:00 PM

@yshi: Hey, I was hoping to hear back on the other issue I had created 🙂. Is there a way I can bump that question on the forum to get some attention to it?
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-13-2016 11:31 AM

Thank you. I see that, but I have always had this issue, and it is likely to happen again when increasing the number of clusters. Could it be some configuration or something else that is causing this particular issue?

Also, I look forward to a response on the other question. Thank you 🙂 @yshi
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-06-2016 07:09 AM

Why do I get this error for a standard Sqoop import from MySQL?

Error: java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'delete, edited, qty, created_date, sold, revenue, sold_date, image, special_pric' at line 1
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
	at com.mysql.jdbc.Util.getInstance(Util.java:386)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
	at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
	at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
	at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2283)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
	... 12 more
16/07/06 18:31:20 INFO mapreduce.Job: Task Id : attempt_1463739226103_5482_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
	(stack trace identical to the one above, printed again for the retried attempt)

The datatypes in the MySQL table are varchar, int, decimal, timestamp, and date. I don't see anything problematic here.

My sqoop import looks like this:

sqoop import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://IP/erp --username root --password 'PASSWORD' --table table_name

My table schema looks like this:

CREATE TABLE `retailSKU` (
  `id` int(12) NOT NULL,
  `sku` varchar(255) NOT NULL,
  `new_price` decimal(12,4) NOT NULL,
  `price` decimal(12,4) NOT NULL,
  `cost` decimal(12,4) NOT NULL,
  `new_status` int(1) NOT NULL,
  `status` int(1) NOT NULL,
  `liquidation` int(1) NOT NULL,
  `delete` int(1) NOT NULL,
  `edited` int(1) NOT NULL,
  `qty` int(12) NOT NULL,
  `created_date` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `sold` int(12) NOT NULL,
  `revenue` decimal(12,4) NOT NULL,
  `sold_date` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
  `image` varchar(255) NOT NULL,
  `special_price` decimal(12,4) DEFAULT '0.0000',
  `new_special_price` decimal(12,4) NOT NULL,
  `name` varchar(255) NOT NULL,
  `fab` varchar(255) DEFAULT NULL,
  `meter` varchar(255) DEFAULT NULL,
  `production_date` date DEFAULT NULL,
  `fab_qty` int(10) DEFAULT '0',
  PRIMARY KEY (`id`),
  KEY `id` (`id`,`sku`,`name`,`created_date`,`production_date`,`sold_date`,`qty`,`revenue`,`status`,`sold`,`special_price`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1

What could be causing the problem here?

Note: A free-form query import works just fine for the table, though.
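For reference, the working free-form variant can be written like this; a minimal sketch assuming the same connection settings (the --target-dir path is illustrative):

# Free-form query import: Sqoop requires the $CONDITIONS token so it can
# inject per-mapper split predicates; the target directory is a hypothetical
# example path, not one confirmed in this thread.
sqoop import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://IP/erp \
  --username root --password 'PASSWORD' \
  --query 'SELECT * FROM retailSKU WHERE $CONDITIONS' \
  --split-by id --target-dir /user/hue/erp/retailSKU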
						
					
				
			
			
			
			
			
			
			
			
			
		
		
			
				
						
Labels:
- Apache Hadoop
- Apache Sqoop
- MapReduce
- Security
			
    
	
		
		
07-05-2016 10:00 PM

Awesome. Using -Dmapred.reduce.tasks=1 worked. I used it in my job with the --meta-connect argument as well, and it still works 🙂 I would really like to understand what goes on behind the scenes. Could you please explain why exactly I needed to use this argument? @yshi

Also, I posted an issue here and it seems not to have gained any attention. Could you please have a look at it and let me know whether this is supported at all?

http://community.cloudera.com/t5/Data-Ingestion-Integration/change-sqoop-metastore-timezone-to-GMT/m-p/42596/highlight/true#M1701

Other than that, thank you for keeping up all this time and helping out 🙂. Happy holiday 🙂 🙂
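For reference, generic -D options have to come right after the tool name, before the tool-specific arguments; a minimal sketch of the invocation, assuming the same metastore used elsewhere in this thread (FQDN and JOB_NAME are placeholders):

# Generic Hadoop -D options must precede the tool-specific arguments.
# FQDN and JOB_NAME are placeholders, as in the rest of the thread.
sqoop job -Dmapred.reduce.tasks=1 \
  --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop \
  --exec JOB_NAME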
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 09:58 PM

@yshi: I ran sqoop job --show test_table and I get the following results:

verbose = false
incremental.last.value = 2016-07-05 05:29:33.0
db.connect.string = jdbc:mysql://10.10.11.3/testdb?zeroDateTimeBehavior=convertToNull
codegen.output.delimiters.escape = 0
codegen.output.delimiters.enclose.required = false
codegen.input.delimiters.field = 0
split.limit = null
hbase.create.table = false
db.require.password = false
hdfs.append.dir = false
db.table = test_table
codegen.input.delimiters.escape = 0
db.password = St@N!$$r3789L0vE
accumulo.create.table = false
import.fetch.size = null
codegen.input.delimiters.enclose.required = false
db.username = root
reset.onemapper = false
codegen.output.delimiters.record = 10
import.max.inline.lob.size = 16777216
hbase.bulk.load.enabled = false
hcatalog.create.table = false
db.clear.staging.table = false
incremental.col = updated_at
codegen.input.delimiters.record = 0
enable.compression = false
hive.overwrite.table = false
hive.import = false
codegen.input.delimiters.enclose = 0
accumulo.batch.size = 10240000
hive.drop.delims = false
customtool.options.jsonmap = {}
codegen.output.delimiters.enclose = 0
hdfs.delete-target.dir = false
codegen.output.dir = .
codegen.auto.compile.dir = true
relaxed.isolation = false
mapreduce.num.mappers = 4
accumulo.max.latency = 5000
import.direct.split.size = 0
codegen.output.delimiters.field = 44
export.new.update = UpdateOnly
incremental.mode = DateLastModified
hdfs.file.format = TextFile
codegen.compile.dir = /tmp/sqoop-root/compile/58af2027880638f91ea875a1c51b5de8
direct.import = false
db.split.column = id
hdfs.target.dir = /user/hue/ERP/testinc
hive.fail.table.exists = false
merge.key.col = id
jdbc.driver.class = com.mysql.jdbc.Driver
db.batch = false

The value of merge.key.col, as expected, is id. But I still do not see the other attribute you asked for. @yshi
						
					
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 06:39 AM

Thank you. I do get the logs now, but none of the jobs have the two properties you asked for. Shall I share the rest of the properties from the logs? @yshi

Also, from the command line I execute the job as both the root user and the hue user, and the execution succeeds in both cases without any problem. My job is submitted to Oozie as the hue user only, and that one fails.
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 06:15 AM

The database has all its dates in GMT, while Sqoop automatically uses the local time zone (Asia/Kolkata) for incremental updates. It probably picks that up from the JVM, but I need it to use GMT for certain jobs and local time for others. How do I go about this?

The link https://community.cloudera.com/t5/Data-Ingestion-Integration/Sqoop-s-metastore-timezone/td-p/16306 discusses the same issue. Is there an actual workaround for this? The solution given in that thread did not really work for me.

Here's what I have for a sqoop job:

sqoop job -D oracle.sessionTimeZone=GMT -D mapred.child.java.opts=" -Duser.timezone=GMT" --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop --create JOB_NAME -- import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://IP/DB?zeroDateTimeBehavior=convertToNull --username root --password 'PASSWORD' --table TABLE_NAME --incremental lastmodified --check-column updated_at --last-value 0 --merge-key entity_id --split-by entity_id --target-dir LOCATION_SPECIFIED --hive-database Magento --hive-drop-import-delims --null-string '\\N' --null-non-string '\\N' --fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N' --input-fields-terminated-by '\001'

Logs:

5459 [uber-SubtaskRunner] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
5497 [uber-SubtaskRunner] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version: 1.4.6-cdh5.7.0
5817 [uber-SubtaskRunner] WARN  org.apache.sqoop.tool.BaseSqoopTool  - Setting your password on the command-line is insecure. Consider using -P instead.
5832 [uber-SubtaskRunner] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
5859 [uber-SubtaskRunner] WARN  org.apache.sqoop.ConnFactory  - Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
5874 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Using default fetchSize of 1000
5874 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
6306 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM sales_flat_order AS t WHERE 1=0
6330 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM sales_flat_order AS t WHERE 1=0
6434 [uber-SubtaskRunner] INFO  org.apache.sqoop.orm.CompilationManager  - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/hadoop-mapreduce
9911 [uber-SubtaskRunner] INFO  org.apache.sqoop.orm.CompilationManager  - Writing jar file: /tmp/sqoop-yarn/compile/51c9a7f9e76b0547825eb7a852721bf9/sales_flat_order.jar
9928 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM sales_flat_order AS t WHERE 1=0
9941 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.ImportTool  - Incremental import based on column updated_at
9941 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.ImportTool  - Lower bound value: '0'
9941 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.ImportTool  - Upper bound value: '2016-06-30 11:40:36.0'
9943 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of sales_flat_order
9962 [uber-SubtaskRunner] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM sales_flat_order AS t WHERE 1=0
10007 [uber-SubtaskRunner] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
10672 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.db.DBInputFormat  - Using read commited transaction isolation
10674 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat  - BoundingValsQuery: SELECT MIN(entity_id), MAX(entity_id) FROM sales_flat_order WHERE ( updated_at >= '0' AND updated_at < '2016-06-30 11:40:36.0' )
11667 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.db.IntegerSplitter  - Split size: 86592; Num splits: 4 from: 1 to: 346372
Heart beat
42986 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 300.3027 MB in 32.9683 seconds (9.1088 MB/sec)
42995 [uber-SubtaskRunner] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 339510 records.
43008 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.ImportTool  - Saving incremental import state to the metastore
43224 [uber-SubtaskRunner] INFO  org.apache.sqoop.tool.ImportTool  - Updated data for job: sales_flat_order 
						
					
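One untested variant (an assumption on my part, not a confirmed fix): mapred.child.java.opts is the deprecated Hadoop 1 key, and the YARN-era per-task equivalents are mapreduce.map.java.opts and mapreduce.reduce.java.opts, so the timezone override could also be passed like this:

# Hypothetical variant using the YARN-era per-task keys instead of the
# deprecated mapred.child.java.opts; not confirmed in this thread to
# resolve the issue. Remaining import arguments as in the command above.
sqoop job -D oracle.sessionTimeZone=GMT \
  -D mapreduce.map.java.opts="-Duser.timezone=GMT" \
  -D mapreduce.reduce.java.opts="-Duser.timezone=GMT" \
  --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop --create JOB_NAME -- import ...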
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 05:56 AM

To enable the logs, I added it to my job:

sqoop job -Dmapreduce.map.log.level=DEBUG --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop --verbose --create test_table -- import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://ip3/testdb?zeroDateTimeBehavior=convertToNull --username root --password 'password' --table test_table --merge-key id --split-by id --target-dir location --incremental lastmodified --last-value 0 --check-column updated_at

But trying to get the logs, I still get:

/tmp/logs/root/logs/application_1463739226103_5116 does not exist.
Log aggregation has not completed or is not enabled.

As a side note: I have also tried to change the time zone of the sqoop job in a similar manner using the -D option, and that did not work either. Am I adding it to the sqoop job in an incorrect way? @yshi
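For reference, that "does not exist" message is what the YARN log fetcher prints when aggregated logs are unavailable; a minimal sketch of retrieving them once yarn.log-aggregation-enable is on, using the application ID from the message above:

# Fetch aggregated container logs for a finished application;
# requires yarn.log-aggregation-enable=true on the cluster.
yarn logs -applicationId application_1463739226103_5116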
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 04:47 AM

@Markus Kemper: That's right. But then we already have the file in the target dir when running the import a second time. With incremental lastmodified and an existing target dir, it throws an error saying to either import using incremental append or specify a merge key.

Using the sqoop merge command in a separate job sounds okay, but I would like --merge-key to work, since it's available 😄

Also, the link https://community.hortonworks.com/questions/10710/sqoop-incremental-import-working-fine-now-i-want-k.html suggests using --merge-key with the import command only, and it has been confirmed that it works.

This link http://stackoverflow.com/questions/34400973/apache-sqoop-incremental-import?rq=1 also confirms that it works.

Also, it works when I run the sqoop job through the CLI. It gives this problem only when running it through Oozie.
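For completeness, the standalone merge mentioned above would look roughly like this; a sketch only, with illustrative HDFS paths and the codegen jar/class that a prior sqoop import or codegen run would have produced:

# Hypothetical standalone merge job: rows from --new-data overwrite rows in
# --onto that share the same --merge-key value, written to --target-dir.
# Paths and the jar/class name are examples, not values from this thread.
sqoop merge --new-data /user/hue/ERP/testinc_new --onto /user/hue/ERP/testinc \
  --target-dir /user/hue/ERP/testinc_merged \
  --jar-file test_table.jar --class-name test_table --merge-key id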
				
			
			
			
			
			
			
			
			
			
		
			
    
	
		
		
07-04-2016 12:47 AM

Hey, I tried collecting the diagnostics and got an error: @yshi

A server error has occurred. Send the following information to Cloudera.
Path: http://warehouse.swtched.com:7180/cmf/services/51/yarnDiagnosticsCollection
Version: Cloudera Express 5.7.0 (#76 built by jenkins on 20160401-1334 git: ec0e7e69444280aa311511998bd83e8e6572f61c)

java.lang.NullPointerException:
at YarnController.java line 463 in com.cloudera.server.web.cmf.YarnController collectYarnApplicationDiagnostics()

Stack Trace:
YarnController.java line 463 in com.cloudera.server.web.cmf.YarnController collectYarnApplicationDiagnostics()
<generated> line -1 in com.cloudera.server.web.cmf.YarnController$$FastClassByCGLIB$$ac91d355 invoke()
MethodProxy.java line 191 in net.sf.cglib.proxy.MethodProxy invoke()
Cglib2AopProxy.java line 688 in org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation invokeJoinpoint()
ReflectiveMethodInvocation.java line 150 in org.springframework.aop.framework.ReflectiveMethodInvocation proceed()
MethodSecurityInterceptor.java line 61 in org.springframework.security.access.intercept.aopalliance.MethodSecurityInterceptor invoke()
ReflectiveMethodInvocation.java line 172 in org.springframework.aop.framework.ReflectiveMethodInvocation proceed()
Cglib2AopProxy.java line 621 in org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor intercept()
<generated> line -1 in com.cloudera.server.web.cmf.YarnController$$EnhancerByCGLIB$$f035552a collectYarnApplicationDiagnostics()
NativeMethodAccessorImpl.java line -2 in sun.reflect.NativeMethodAccessorImpl invoke0()
NativeMethodAccessorImpl.java line 57 in sun.reflect.NativeMethodAccessorImpl invoke()
DelegatingMethodAccessorImpl.java line 43 in sun.reflect.DelegatingMethodAccessorImpl invoke()
Method.java line 606 in java.lang.reflect.Method invoke()
HandlerMethodInvoker.java line 176 in org.springframework.web.bind.annotation.support.HandlerMethodInvoker invokeHandlerMethod()
AnnotationMethodHandlerAdapter.java line 436 in org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter invokeHandlerMethod()
AnnotationMethodHandlerAdapter.java line 424 in org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter handle()
DispatcherServlet.java line 790 in org.springframework.web.servlet.DispatcherServlet doDispatch()
DispatcherServlet.java line 719 in org.springframework.web.servlet.DispatcherServlet doService()
FrameworkServlet.java line 669 in org.springframework.web.servlet.FrameworkServlet processRequest()
FrameworkServlet.java line 585 in org.springframework.web.servlet.FrameworkServlet doPost()
HttpServlet.java line 727 in javax.servlet.http.HttpServlet service()
HttpServlet.java line 820 in javax.servlet.http.HttpServlet service()
ServletHolder.java line 511 in org.mortbay.jetty.servlet.ServletHolder handle()
ServletHandler.java line 1221 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
UserAgentFilter.java line 78 in org.mortbay.servlet.UserAgentFilter doFilter()
GzipFilter.java line 131 in org.mortbay.servlet.GzipFilter doFilter()
ServletHandler.java line 1212 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
JAMonServletFilter.java line 48 in com.jamonapi.http.JAMonServletFilter doFilter()
ServletHandler.java line 1212 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
JavaMelodyFacade.java line 109 in com.cloudera.enterprise.JavaMelodyFacade$MonitoringFilter doFilter()
ServletHandler.java line 1212 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
FilterChainProxy.java line 311 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
FilterSecurityInterceptor.java line 116 in org.springframework.security.web.access.intercept.FilterSecurityInterceptor invoke()
FilterSecurityInterceptor.java line 83 in org.springframework.security.web.access.intercept.FilterSecurityInterceptor doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
ExceptionTranslationFilter.java line 113 in org.springframework.security.web.access.ExceptionTranslationFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
SessionManagementFilter.java line 101 in org.springframework.security.web.session.SessionManagementFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
AnonymousAuthenticationFilter.java line 113 in org.springframework.security.web.authentication.AnonymousAuthenticationFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
RememberMeAuthenticationFilter.java line 146 in org.springframework.security.web.authentication.rememberme.RememberMeAuthenticationFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
SecurityContextHolderAwareRequestFilter.java line 54 in org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
RequestCacheAwareFilter.java line 45 in org.springframework.security.web.savedrequest.RequestCacheAwareFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
AbstractAuthenticationProcessingFilter.java line 182 in org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
LogoutFilter.java line 105 in org.springframework.security.web.authentication.logout.LogoutFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
SecurityContextPersistenceFilter.java line 87 in org.springframework.security.web.context.SecurityContextPersistenceFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
ConcurrentSessionFilter.java line 125 in org.springframework.security.web.session.ConcurrentSessionFilter doFilter()
FilterChainProxy.java line 323 in org.springframework.security.web.FilterChainProxy$VirtualFilterChain doFilter()
FilterChainProxy.java line 173 in org.springframework.security.web.FilterChainProxy doFilter()
DelegatingFilterProxy.java line 237 in org.springframework.web.filter.DelegatingFilterProxy invokeDelegate()
DelegatingFilterProxy.java line 167 in org.springframework.web.filter.DelegatingFilterProxy doFilter()
ServletHandler.java line 1212 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
CharacterEncodingFilter.java line 88 in org.springframework.web.filter.CharacterEncodingFilter doFilterInternal()
OncePerRequestFilter.java line 76 in org.springframework.web.filter.OncePerRequestFilter doFilter()
ServletHandler.java line 1212 in org.mortbay.jetty.servlet.ServletHandler$CachedChain doFilter()
ServletHandler.java line 399 in org.mortbay.jetty.servlet.ServletHandler handle()
SecurityHandler.java line 216 in org.mortbay.jetty.security.SecurityHandler handle()
SessionHandler.java line 182 in org.mortbay.jetty.servlet.SessionHandler handle()
SecurityHandler.java line 216 in org.mortbay.jetty.security.SecurityHandler handle()
ContextHandler.java line 767 in org.mortbay.jetty.handler.ContextHandler handle()
WebAppContext.java line 450 in org.mortbay.jetty.webapp.WebAppContext handle()
HandlerWrapper.java line 152 in org.mortbay.jetty.handler.HandlerWrapper handle()
StatisticsHandler.java line 53 in org.mortbay.jetty.handler.StatisticsHandler handle()
HandlerWrapper.java line 152 in org.mortbay.jetty.handler.HandlerWrapper handle()
Server.java line 326 in org.mortbay.jetty.Server handle()
HttpConnection.java line 542 in org.mortbay.jetty.HttpConnection handleRequest()
HttpConnection.java line 945 in org.mortbay.jetty.HttpConnection$RequestHandler content()
HttpParser.java line 756 in org.mortbay.jetty.HttpParser parseNext()
HttpParser.java line 218 in org.mortbay.jetty.HttpParser parseAvailable()
HttpConnection.java line 404 in org.mortbay.jetty.HttpConnection handle()
SelectChannelEndPoint.java line 410 in org.mortbay.io.nio.SelectChannelEndPoint run()
QueuedThreadPool.java line 582 in org.mortbay.thread.QueuedThreadPool$PoolThread run()
						
					