<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: NiFi: Applying an Avro Schema in ConvertCSVToAvro in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/NiFi-Applying-an-Avro-Schema-in-ConvertCSVToAvro/m-p/148555#M111081</link>
    <description>&lt;P&gt;I used InferAvroSchema + ConvertJSONToAvro + PutHiveStreaming.&lt;/P&gt;&lt;P&gt;Input JSON:&lt;/P&gt;&lt;P&gt;{ "name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10" }&lt;/P&gt;&lt;P&gt;inferred.avro.schema:&lt;/P&gt;&lt;P&gt;
{ "type" : "record", "name" : "test", "fields" : [ { "name" : "name", "type" : "string", "doc" : "Type inferred from '\"张三\"'" }, { "name" : "num", "type" : "string", "doc" : "Type inferred from '\"2\"'" }, { "name" : "score", "type" : "string", "doc" : "Type inferred from '\"3.4\"'" }, { "name" : "newtime", "type" : "string", "doc" : "Type inferred from '\"2016-03-01 10:10:10\"'" } ] }&lt;/P&gt;&lt;P&gt;I then set&lt;/P&gt;&lt;P&gt;Record schema   ${inferred.avro.schema}&lt;/P&gt;&lt;P&gt;in ConvertJSONToAvro,&lt;/P&gt;&lt;P&gt;but PutHiveStreaming fails with this error:&lt;/P&gt;&lt;P&gt;2016-10-09 09:58:48,360 WARN [put-hive-streaming-0] org.apache.hive.hcatalog.data.JsonSerDe Error [org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors&lt;/P&gt;&lt;P&gt; at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]] parsing json text [{"name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10"}].&lt;/P&gt;&lt;P&gt;2016-10-09 09:58:48,360 ERROR [Timer-Driven Process Thread-9] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=d50d1499-3137-1226-89c0-86dfeac7bf2c] Error writing record to Hive Streaming transaction&lt;/P&gt;&lt;P&gt;2016-10-09 09:58:48,363 ERROR [Timer-Driven Process Thread-9] o.a.n.processors.hive.PutHiveStreaming &lt;/P&gt;&lt;P&gt;org.apache.hive.hcatalog.streaming.SerializationError: {metaStoreUri='thrift://hive1.wdp:9083', database='newsinfo', table='test1', partitionVals=[] } SerializationError&lt;/P&gt;&lt;P&gt;at org.apache.nifi.util.hive.HiveWriter.write(HiveWriter.java:119) ~[nifi-hive-processors-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:480) ~[nifi-hive-processors-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1880) ~[na:na]&lt;/P&gt;&lt;P&gt;at 
org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1851) ~[na:na]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:394) ~[nifi-hive-processors-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) ~[na:na]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) ~[na:na]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) ~[na:na]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) ~[na:na]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;Caused by: org.apache.hive.hcatalog.streaming.SerializationError: Unable to convert byte[] record into Object&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:117) 
~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.streaming.StrictJsonWriter.write(StrictJsonWriter.java:78) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.write(HiveEndPoint.java:632) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:113) ~[nifi-hive-processors-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at org.apache.nifi.util.hive.HiveWriter$1.call(HiveWriter.java:110) ~[nifi-hive-processors-1.0.0.jar:1.0.0]&lt;/P&gt;&lt;P&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_101]&lt;/P&gt;&lt;P&gt;... 3 common frames omitted&lt;/P&gt;&lt;P&gt;Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors&lt;/P&gt;&lt;P&gt; at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:179) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.streaming.StrictJsonWriter.encode(StrictJsonWriter.java:115) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;... 
8 common frames omitted&lt;/P&gt;&lt;P&gt;Caused by: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors&lt;/P&gt;&lt;P&gt; at [Source: java.io.ByteArrayInputStream@7fbad804; line: 1, column: 28]&lt;/P&gt;&lt;P&gt;at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433) ~[jackson-core-asl-1.9.13.jar:1.9.13]&lt;/P&gt;&lt;P&gt;at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521) ~[jackson-core-asl-1.9.13.jar:1.9.13]&lt;/P&gt;&lt;P&gt;at org.codehaus.jackson.impl.JsonParserBase._parseNumericValue(JsonParserBase.java:766) ~[jackson-core-asl-1.9.13.jar:1.9.13]&lt;/P&gt;&lt;P&gt;at org.codehaus.jackson.impl.JsonParserBase.getIntValue(JsonParserBase.java:622) ~[jackson-core-asl-1.9.13.jar:1.9.13]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.data.JsonSerDe.extractCurrentField(JsonSerDe.java:279) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.data.JsonSerDe.populateRecord(JsonSerDe.java:218) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:174) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]&lt;/P&gt;&lt;P&gt;... 9 common frames omitted&lt;/P&gt;</description>
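The root cause is visible in the trace: InferAvroSchema typed every field as "string" (the JSON values are all quoted), while the Hive table evidently declares num and score as numeric columns, so Hive's JsonSerDe calls a numeric accessor on a string token and throws. A minimal sketch of the needed fix, assuming the Hive columns are num INT and score FLOAT (the table DDL is not shown in the post): coerce the quoted fields to numbers before the record reaches PutHiveStreaming, or equivalently change their types in the Avro schema from "string" to "int" / "float".

```python
import json

# Record as emitted by the flow in the post: every value is a string,
# because InferAvroSchema inferred types from quoted JSON values.
raw = '{"name": "张三", "num": "2", "score": "3.4", "newtime": "2016-03-01 10:10:10"}'
record = json.loads(raw)

# Hypothetical fix: coerce the fields the Hive table (assumed here to
# declare num INT, score FLOAT) expects as numeric, so JsonSerDe sees
# real JSON numbers instead of strings.
record["num"] = int(record["num"])
record["score"] = float(record["score"])

print(json.dumps(record, ensure_ascii=False))
# → {"name": "张三", "num": 2, "score": 3.4, "newtime": "2016-03-01 10:10:10"}
```

Equivalently, hand ConvertJSONToAvro an edited copy of the inferred schema with "type": "int" for num and "type": "float" for score instead of ${inferred.avro.schema}; either way, the JSON handed to Hive Streaming must carry unquoted numbers for numeric columns.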
    <pubDate>Sun, 09 Oct 2016 21:20:08 GMT</pubDate>
    <dc:creator>121285904</dc:creator>
    <dc:date>2016-10-09T21:20:08Z</dc:date>
  </channel>
</rss>