Support Questions
Find answers, ask questions, and share your expertise
Error flume MorphlineSolrSink readJson java.lang.NoSuchFieldError: USE_DEFAULTS

I am trying to read JSON from an Avro source and sink it to Solr. When I used readLine {} and stored the payload as a string, it worked, but when I try readJson {} it throws the following error.
CDH: CDH-5.9.0-1.cdh5.9.0.p0.23

Error:

2017-01-26 06:35:38,604 ERROR org.apache.flume.lifecycle.LifecycleSupervisor: Unable to start SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@513a4842 counterGroup:{ name:null counters:{} } } - Exception follows.
java.lang.NoSuchFieldError: USE_DEFAULTS
at com.fasterxml.jackson.annotation.JsonInclude$Value.<clinit>(JsonInclude.java:204)
at com.fasterxml.jackson.databind.cfg.MapperConfig.<clinit>(MapperConfig.java:44)
at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:558)
at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:483)
at org.kitesdk.morphline.json.ReadJsonBuilder$ReadJson.<init>(ReadJsonBuilder.java:88)
at org.kitesdk.morphline.json.ReadJsonBuilder.build(ReadJsonBuilder.java:55)
at org.kitesdk.morphline.base.AbstractCommand.buildCommand(AbstractCommand.java:307)
at org.kitesdk.morphline.base.AbstractCommand.buildCommandChain(AbstractCommand.java:254)
at org.kitesdk.morphline.stdlib.Pipe.<init>(Pipe.java:46)
at org.kitesdk.morphline.stdlib.PipeBuilder.build(PipeBuilder.java:40)
at org.kitesdk.morphline.base.Compiler.compile(Compiler.java:126)
at org.kitesdk.morphline.base.Compiler.compile(Compiler.java:55)
at org.apache.flume.sink.solr.morphline.MorphlineHandlerImpl.configure(MorphlineHandlerImpl.java:101)
at org.apache.flume.sink.solr.morphline.MorphlineSink.start(MorphlineSink.java:98)
at org.apache.flume.sink.DefaultSinkProcessor.start(DefaultSinkProcessor.java:46)
at org.apache.flume.SinkRunner.start(SinkRunner.java:79)
at org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run(LifecycleSupervisor.java:251)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
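For context, a NoSuchFieldError on USE_DEFAULTS usually points at mixed Jackson versions on the classpath rather than at the morphline itself (JsonInclude.Include.USE_DEFAULTS only exists in jackson-annotations 2.6 and later). A quick way to check is to list the Jackson jars Flume can see — the parcel path below is an assumption, adjust it to your install:

```shell
# Sketch, assuming the default CDH parcel layout; adjust the path for your install.
# Two different jackson-annotations versions side by side (e.g. a 2.3.x jar
# shipped with another component shadowing a 2.6+ jar) is the usual cause of
# NoSuchFieldError: USE_DEFAULTS.
list_jackson_jars() {
  # Print every Jackson jar under the given directory; the version is in the name.
  find "${1:-/opt/cloudera/parcels/CDH/lib/flume-ng/lib}" \
       -name 'jackson-*.jar' 2>/dev/null | sort
}

list_jackson_jars "$@"
```

If this prints more than one version of jackson-annotations or jackson-databind, the older jar is the likely culprit.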


Flume config:

agent.sources = AvroSource
agent.channels = memorychannel1
agent.sinks = solrSink

agent.channels.memorychannel1.type = memory
agent.channels.memorychannel1.capacity = 10000
agent.channels.memorychannel1.transactionCapacity = 200

# The Avro source
agent.sources.AvroSource.type     = avro
agent.sources.AvroSource.bind     = 0.0.0.0
agent.sources.AvroSource.port     = 4344
agent.sources.AvroSource.channels = memorychannel1
agent.sources.AvroSource.compression-type = deflate

agent.sinks.solrSink.type = org.apache.flume.sink.solr.morphline.MorphlineSolrSink
agent.sinks.solrSink.morphlineFile = /home/flume/morphline.conf

agent.sinks.solrSink.channel = memorychannel1

morphline.conf:

SOLR_LOCATOR : {
  collection : testCollection
  zkHost : "192.168.21.31:2181/solr"
}

morphlines : [
  {
    id : morphline1
    importCommands : ["com.cloudera.", "org.apache.solr.", "org.kitesdk.**"]
    commands : [
    # read the JSON blob
  {  readJson: {}  }
  {
    extractAvroPaths {
      flatten : false
      paths : {
        id : /id
        type : /type
        name : /name
        address : /address
      }
    }
  }

  {
    sanitizeUnknownSolrFields {
      # Location from which to fetch Solr schema
      solrLocator : ${SOLR_LOCATOR}

      # renameToPrefix : "ignored_"
    }
  }

  # load the record into a Solr server or MapReduce Reducer.
  {
    loadSolr {
      solrLocator : ${SOLR_LOCATOR}
    }
  }
 ]   } ]
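One more thing worth noting, separate from the classpath error: readJson parses the event body into a Jackson JsonNode, so the matching extraction command would presumably be extractJsonPaths rather than extractAvroPaths (which expects Avro records). A sketch of that step, keeping the same field names as above:

```
{
  extractJsonPaths {
    flatten : false
    paths : {
      id : /id
      type : /type
      name : /name
      address : /address
    }
  }
}
```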