java.lang.ClassCastException: org.apache.xerces.parsers.XIncludeAwareParserConfiguration cannot be cast to org.apache.xerces.xni.parser.XMLParserConfiguration
Labels: Apache NiFi
Created ‎06-22-2016 01:12 PM
Hi Everyone,
I'm facing a weird issue. In my project everything is custom NiFi code, and for XML parsing we use the DOM parser. After deploying the NAR file into the NiFi lib directory and restarting, I'm getting the error below. Can anybody give me a solution for this?
Custom processors:
- XmlValidation (DOM parser)
- CustomPutHdfs (HDFS)
java.lang.ClassCastException: org.apache.xerces.parsers.XIncludeAwareParserConfiguration cannot be cast to org.apache.xerces.xni.parser.XMLParserConfiguration
    at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source) ~[xercesImpl-2.9.1.jar:na]
    at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source) ~[xercesImpl-2.9.1.jar:na]
    at org.apache.xerces.jaxp.DocumentBuilderImpl.<init>(Unknown Source) ~[xercesImpl-2.9.1.jar:na]
    at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source) ~[xercesImpl-2.9.1.jar:na]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:968) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.nifi.hadoop.SecurityUtil.isSecurityEnabled(SecurityUtil.java:84) ~[nifi-hadoop-utils-0.6.0.jar:0.6.0]
    at org.apache.nifi.hadoop.KerberosProperties.validatePrincipalAndKeytab(KerberosProperties.java:121) ~[nifi-hadoop-utils-0.6.0.jar:0.6.0]
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.customValidate(AbstractHadoopProcessor.java:154) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
    at org.apache.nifi.components.AbstractConfigurableComponent.validate(AbstractConfigurableComponent.java:123) ~[nifi-api-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.StandardProcessorNode.isValid(StandardProcessorNode.java:911) ~[nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController.getProcessorStatus(FlowController.java:2529) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController.getGroupStatus(FlowController.java:2146) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController.getGroupStatus(FlowController.java:2162) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController.getGroupStatus(FlowController.java:2162) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController.getGroupStatus(FlowController.java:2162) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController$HeartbeatMessageGeneratorTask.createHeartbeatMessage(FlowController.java:3819) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.FlowController$HeartbeatMessageGeneratorTask.run(FlowController.java:3802) [nifi-framework-core-0.6.0.jar:0.6.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_71]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_71]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_71]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_71]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_71]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_71]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_71]
Created ‎06-22-2016 01:19 PM
Please refer to this screenshot for more information.
Created ‎06-22-2016 02:04 PM
I believe this is usually a conflict between versions of Xerces. There is a Xerces JAR in nifi-hadoop-libraries-nar; is your NAR declaring that as a dependency? If so, you might need to exclude Xerces from it or from your own dependencies, in order to ensure there is only a single version/implementation on the classpath.
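For reference, a minimal sketch of what such an exclusion could look like in the processor module's pom.xml. This is only an illustration, not a confirmed fix for your build: the hadoop-client dependency shown here is an assumption, so run mvn dependency:tree on your bundle first to see which dependency actually pulls xercesImpl in, and attach the exclusion to that one.

    <!-- Hypothetical sketch: exclude the bundled Xerces implementation from the
         dependency that drags it in, so only one XMLParserConfiguration
         implementation ends up on the NAR classpath. -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.2</version>
        <exclusions>
            <exclusion>
                <groupId>xerces</groupId>
                <artifactId>xercesImpl</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

After rebuilding the NAR, mvn dependency:tree should no longer list xercesImpl under your bundle, so only one parser implementation is visible at runtime.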
Created ‎06-22-2016 04:45 PM
Hi Mburgess,
Thanks for your response. It would be really helpful if you could explain how to exclude the particular (Xerces) JAR at NAR build time.
Thanks,
Viswanatha Reddy
Created ‎09-19-2016 04:58 PM
Did you get a solution for this? I am facing the exact same issue.
Created ‎09-04-2017 01:54 PM
I got the same error on Azure HDInsight today, and finally found the solution here:
http://kitmenke.com/blog/2017/08/05/classcastexception-submitting-spark-apps-to-hdinsight/
Hope it helps.
