Member since 06-13-2016

41 Posts, 11 Kudos Received, 1 Solution

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 1928 | 06-15-2016 08:42 AM |

10-13-2016 07:45 AM
I use "http.context.identifier" as the "Correlation Attribute Identifier", but I still get error 2: Failed to respond to HTTP request for StandardFlowFileRecord because FlowFile had an 'http.context.identifier' attribute of but could not find an HTTP Response Object for this identifier.

10-13-2016 02:37 AM
I uploaded the template. I am testing port 8013 via SoapUI with multiple threads. I cannot use "http.context.identifier" as the "Correlation Attribute Identifier" because I need to use the fltDate field. hx.xml
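One direction, sketched under assumptions rather than offered as a verified fix: MergeContent can correlate the two API results on fltDate via its Correlation Attribute Name property, and its Attribute Strategy would then need to be "Keep All Unique Attributes" so the merged FlowFile still carries http.context.identifier for HandleHttpResponse (losing that attribute in the merge would explain error 1 from the original post). Rendered as JSON purely for illustration; these are MergeContent property names, set in the processor's Properties tab, not a real NiFi config format:

{
  "processor": "MergeContent",
  "_note": "illustrative sketch only, not an actual NiFi configuration file",
  "properties": {
    "Merge Strategy": "Bin-Packing Algorithm",
    "Correlation Attribute Name": "fltDate",
    "Minimum Number of Entries": "2",
    "Maximum Number of Entries": "2",
    "Attribute Strategy": "Keep All Unique Attributes"
  }
}

This assumes both API result FlowFiles carry the same fltDate value for a given request; if fltDate is not unique per in-flight request, concurrent requests can still collide in one bin.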
						
					

10-12-2016 03:18 PM
I used HandleHttpRequest / HandleHttpResponse: when I get an HTTP request, I call 2 other APIs at the same time to get results, then merge the 2 API results. If I run this service with 1 thread it is OK, but I get 2 errors when I run it with more than 1 thread:

1. because FlowFile did not have an 'http.context.identifier' attribute
2. Failed to respond to HTTP request for StandardFlowFileRecord because FlowFile had an 'http.context.identifier' attribute of but could not find an HTTP Response Object for this identifier

Any suggestions? Thanks in advance.
						
					
Labels: Apache NiFi
			
    
	
		
		

10-09-2016 02:30 PM (1 Kudo)
I use the ConvertJSONToAvro + PutHiveStreaming processors.

JSON:

{ "name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10" }

Avro schema:

{ "name" : "newsInfo", "type" : "record", "fields" : [{"name" : "name", "type" : "string"}, {"name" : "num", "type" : "int"}, {"name" : "score", "type" : "double"}, {"name" : "newtime", "type" : "string"}] }

but got this error:

2016-10-09 10:28:05,503 WARN [put-hive-streaming-0] org.apache.hive.hcatalog.data.JsonSerDe Error [org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@3c18563c; line: 1, column: 28]] parsing json text [{"name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10"}].
2016-10-09 10:28:05,503 ERROR [Timer-Driven Process Thread-5] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=d50d1499-3137-1226-89c0-86dfeac7bf2c] Error writing record to Hive Streaming transaction
2016-10-09 10:28:05,505 ERROR [Timer-Driven Process Thread-5] o.a.n.processors.hive.PutHiveStreaming
org.apache.hive.hcatalog.streaming.SerializationError: {metaStoreUri='thrift://hive1.wdp:9083', database='newsinfo', table='test1', partitionVals=[] } SerializationError
  at org.apache.nifi.util.hive.HiveWriter.write(HiveWriter.java:119) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:480) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1880) ~[na:na]
  at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1851) ~[na:na]
  at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:394) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
  at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) ~[na:na]
  at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) ~[na:na]
  at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) ~[na:na]
  at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) ~[na:na]
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_101]
  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[na:1.8.0_101]
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[na:1.8.0_101]
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[na:1.8.0_101]
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[na:1.8.0_101]
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[na:1.8.0_101]
  at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_101]
Caused by: org.apache.hive.hcatalog.streaming.SerializationError: Unable to convert byte[] record into Object
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:117) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.write(StrictJsonWriter.java:78) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.write(HiveEndPoint.java:632) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:113) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:110) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_101]
  ... 3 common frames omitted
Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@3c18563c; line: 1, column: 28]
  at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:179) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:115) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  ... 8 common frames omitted
Caused by: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@3c18563c; line: 1, column: 28]
  at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserBase._parseNumericValue(JsonParserBase.java:766) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserBase.getIntValue(JsonParserBase.java:622) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.apache.hive.hcatalog.data.JsonSerDe.extractCurrentField(JsonSerDe.java:279) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.data.JsonSerDe.populateRecord(JsonSerDe.java:218) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:174) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  ... 9 common frames omitted
						
					
Labels: Apache Hive, Apache NiFi
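The root cause is visible in the trace above: PutHiveStreaming hands the record to Hive's StrictJsonWriter/JsonSerDe, and the table's numeric columns make JsonSerDe call numeric accessors, which fail on the quoted values "2" and "3.4" (column 28 in the parse error is exactly where the value of "num" begins). A minimal sketch of the record shape that would parse, assuming the Hive table is declared as (name string, num int, score double, newtime string):

{ "name": "张三", "num": 2, "score": 3.4, "newtime": "2016-03-01 10:10:10" }

In other words, the numbers need to arrive as JSON numeric tokens, not strings; changing the Avro schema alone does not change the quoting in the source JSON.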
			
    
	
		
		

10-09-2016 02:20 PM
I used InferAvroSchema + ConvertJSONToAvro + PutHiveStreaming.

JSON:

{ "name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10" }

inferred.avro.schema:

{ "type" : "record", "name" : "test", "fields" : [ { "name" : "name", "type" : "string", "doc" : "Type inferred from '\"张三\"'" }, { "name" : "num", "type" : "string", "doc" : "Type inferred from '\"2\"'" }, { "name" : "score", "type" : "string", "doc" : "Type inferred from '\"3.4\"'" }, { "name" : "newtime", "type" : "string", "doc" : "Type inferred from '\"2016-03-01 10:10:10\"'" } ] }

Then I set Record schema to ${inferred.avro.schema} in ConvertJSONToAvro, but PutHiveStreaming gives this error:

2016-10-09 09:58:48,360 WARN [put-hive-streaming-0] org.apache.hive.hcatalog.data.JsonSerDe Error [org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]] parsing json text [{"name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10"}].
2016-10-09 09:58:48,360 ERROR [Timer-Driven Process Thread-9] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=d50d1499-3137-1226-89c0-86dfeac7bf2c] Error writing record to Hive Streaming transaction
2016-10-09 09:58:48,363 ERROR [Timer-Driven Process Thread-9] o.a.n.processors.hive.PutHiveStreaming
org.apache.hive.hcatalog.streaming.SerializationError: {metaStoreUri='thrift://hive1.wdp:9083', database='newsinfo', table='test1', partitionVals=[] } SerializationError
  at org.apache.nifi.util.hive.HiveWriter.write(HiveWriter.java:119) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:480) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1880) ~[na:na]
  at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1851) ~[na:na]
  at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:394) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
  at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) ~[na:na]
  at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) ~[na:na]
  at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) ~[na:na]
  at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) ~[na:na]
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_101]
  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[na:1.8.0_101]
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[na:1.8.0_101]
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[na:1.8.0_101]
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[na:1.8.0_101]
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[na:1.8.0_101]
  at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_101]
Caused by: org.apache.hive.hcatalog.streaming.SerializationError: Unable to convert byte[] record into Object
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:117) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.write(StrictJsonWriter.java:78) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.write(HiveEndPoint.java:632) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:113) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:110) ~[nifi-hive-processors-1.0.0.jar:1.0.0]
  at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_101]
  ... 3 common frames omitted
Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]
  at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:179) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:115) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
  ... 8 common frames omitted
Caused by: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]
  at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserBase._parseNumericValue(JsonParserBase.java:766) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.codehaus.jackson.impl.JsonParserBase.getIntValue(JsonParserBase.java:622) ~[jackson-core-asl-1.9.13.jar:1.9.13]
  at org.apache.hive.hcatalog.data.JsonSerDe.extractCurrentField(JsonSerDe.java:279) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.data.JsonSerDe.populateRecord(JsonSerDe.java:218) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:174) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
  ... 9 common frames omitted
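Note that InferAvroSchema typed every field as string here precisely because every JSON value is quoted, so the inferred schema cannot supply the int/double typing the Hive table expects. A hedged sketch of the schema that would match a table declared as (name string, num int, score double, newtime string); the record name is arbitrary:

{
  "type" : "record",
  "name" : "newsInfo",
  "fields" : [
    { "name" : "name", "type" : "string" },
    { "name" : "num", "type" : "int" },
    { "name" : "score", "type" : "double" },
    { "name" : "newtime", "type" : "string" }
  ]
}

Even with this schema, the input values themselves must be unquoted JSON numbers for the conversion and the downstream JsonSerDe step to succeed (see the record sketch in the previous post).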
						
					

10-09-2016 10:38 AM
I use the ConvertJSONToAvro processor.

JSON:

{
  "name": "张三",
  "num": "2",
  "score": "3.4",
  "newtime": "2016-03-01 10:10:10"
}

Avro schema:

{
  "name" : "newsInfo",
  "type" : "record",
  "fields" : [{"name" : "name", "type" : "string"},
    {"name" : "num", "type" : "int"},
    {"name" : "score", "type" : "double"},
    {"name" : "newtime", "type" : "long", "logicalType" : "timestamp"}]
}

but got this error: Failed to convert 1/1 records from JSON to Avro
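This is the same quoting problem, surfacing at the ConvertJSONToAvro stage: the Kite-based converter will not coerce the strings "2", "3.4", or a date string into int, double, or long. A minimal sketch of input that would satisfy the schema above, with the numbers unquoted and newtime supplied as epoch milliseconds (1456827010000 is 2016-03-01 10:10:10 UTC; adjust for your timezone):

{ "name": "张三", "num": 2, "score": 3.4, "newtime": 1456827010000 }

Also note that Avro's standard logical type name is timestamp-millis; the plain "timestamp" in the schema is assumed here to be intended as that.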
						
					

10-02-2016 02:09 PM (1 Kudo)
					
I want to move data from HDFS to Hive using NiFi's PutHiveStreaming. Does anyone have an example?
						
					
Labels: Apache Hadoop, Apache Hive, Apache NiFi
			
    
	
		
		

10-02-2016 03:30 AM
How do I handle Chinese characters with this method?
						
					

09-08-2016 08:51 AM
I want to use the InvokeHTTP PUT method. How do I set the request body? Thanks in advance.
						
					
Labels: Apache NiFi
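For PUT (and POST), InvokeHTTP sends the content of the incoming FlowFile as the request body, so the body is whatever an upstream processor such as GenerateFlowFile or ReplaceText writes into the FlowFile; InvokeHTTP's Content-Type property (default ${mime.type}) sets the matching header. A hypothetical JSON body staged upstream; the field names are illustrative only, not from the original post:

{ "fltDate": "2016-09-08", "status": "updated" }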
			
    
	
		
		

08-31-2016 05:24 AM
But we operate in different groups.
						
					