<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: NIFI PutHDFS processor error while connecting to Azure DataLake - java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem (Azure HD Insight - HDI 3.6) in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/NIFI-PutHDFS-processor-error-while-connecting-to-Azure/m-p/190350#M152439</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/70364/bob999c.html" nodeid="70364"&gt;@Bob T&lt;/A&gt;&lt;P&gt;The link you included has instructions to put those jars in /usr/lib/hdinsight-datalake and then, in the processor configuration for FetchHDFS, set the "Additional Classpath Resources" property to "/usr/lib/hdinsight-datalake". You don't have to use that specific directory, but the jars must live in a directory NiFi can read, and NiFi needs read permission on each jar.&lt;/P&gt;&lt;P&gt;Also, please remove the jars you added to NiFi's lib directory. Adding jars directly to NiFi's lib directory can break NiFi itself or some of its components. NiFi creates a separate classloader for each component so that different components can use the dependency versions they need; a jar placed directly in the lib directory is loaded globally and may override a dependency for a component, causing it to fail.&lt;/P&gt;&lt;P&gt;Could you please perform those steps and try running the flow again?&lt;/P&gt;</description>
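The staging steps described above can be sketched as a short shell script. This is a minimal sketch, not part of the original answer: the jar names are placeholders, and it stages into a scratch directory so it runs without root; on the NiFi host you would use /usr/lib/hdinsight-datalake (or your chosen directory) instead.

```shell
# Directory that will be referenced by the processor's
# "Additional Classpath Resources" property. Scratch path here so the
# sketch is runnable; substitute /usr/lib/hdinsight-datalake in practice.
CP_DIR="${CP_DIR:-$(mktemp -d)/hdinsight-datalake}"
mkdir -p "$CP_DIR"

# Copy the Azure Data Lake client jars here (names are illustrative):
#   cp azure-data-lake-store-sdk-*.jar hadoop-azure-datalake-*.jar "$CP_DIR"/
touch "$CP_DIR/example-client.jar"   # stand-in jar so the sketch runs end to end

# NiFi needs to traverse the directory and read every jar in it:
chmod a+rx "$CP_DIR"
chmod a+r "$CP_DIR"/*.jar

ls -l "$CP_DIR"
```

After this, set "Additional Classpath Resources" on the processor to the directory path and restart the processor; NiFi loads those jars into that component's classloader only, which is why this is safer than dropping jars into NiFi's lib directory.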
    <pubDate>Thu, 12 Jul 2018 03:02:40 GMT</pubDate>
    <dc:creator>jts</dc:creator>
    <dc:date>2018-07-12T03:02:40Z</dc:date>
  </channel>
</rss>

