<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: NIFI PutHDFS processor error while connecting to Azure DataLake - java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem (Azure HD Insight - HDI 3.6) in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/NIFI-PutHDFS-processor-error-while-connecting-to-Azure/m-p/190352#M152441</link>
    <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/70364/bob999c.html" nodeid="70364"&gt;@Bob T&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;Could you please put the stack traces inside code blocks to make them a bit easier to read?&lt;/P&gt;&lt;P&gt;It looks like you are still having classpath problems. Assuming that NiFi's lib directory has been restored to the state of a "vanilla" install, I would verify that the additional jars you're adding are versions compatible with Hadoop 2.7.3, since that's the version of hadoop-client used by NiFi 1.5.&lt;/P&gt;&lt;P&gt;It would also help if you commented (using code blocks) with a listing of the nifi/lib dir, the /usr/lib/hdinsight-datalake dir, and the contents of (or a link to) the xml files you've listed in "Hadoop Configuration Resources", sanitized of any information you don't want to post publicly. &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;
    <pubDate>Thu, 12 Jul 2018 04:01:25 GMT</pubDate>
    <dc:creator>jts</dc:creator>
    <dc:date>2018-07-12T04:01:25Z</dc:date>
  </channel>
</rss>