<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: hash issue while using S3 as Storage Backend for HDFS in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/hash-issue-while-using-S3-as-Storage-Backend-for-HDFS/m-p/215417#M177327</link>
    <description>&lt;P&gt;You can try the changes below in your submit command, as the current form may be causing a different hash value to be calculated:&lt;/P&gt;&lt;P&gt;Submit command:&lt;/P&gt;&lt;P&gt;I believe you want to write abc.txt into the &lt;STRONG&gt;sample&lt;/STRONG&gt; folder of the s3a bucket &lt;STRONG&gt;hadoopsa&lt;/STRONG&gt;, and you have already set hadoopsa as your defaultFS.&lt;/P&gt;&lt;P&gt;So you should use one of the commands below:&lt;/P&gt;&lt;PRE&gt;hdfs dfs -put abc.txt /sample/ # the sample folder must already exist before running the command
OR
hdfs dfs -put abc.txt s3a://hadoopsa/sample/
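
# Optional sanity check (a sketch, not part of the original answer; assumes
# defaultFS was set in core-site.xml as described above): confirm that the
# relative path in the first command will resolve against the hadoopsa bucket.
hdfs getconf -confKey fs.defaultFS   # should print something like s3a://hadoopsa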
&lt;/PRE&gt;&lt;P&gt;In your original command, when you put the file directly to s3a://sample/, Hadoop treats &lt;STRONG&gt;sample&lt;/STRONG&gt; as the bucket name and tries to write to that bucket's base path.&lt;/P&gt;</description>
    <pubDate>Mon, 08 Oct 2018 18:36:56 GMT</pubDate>
    <dc:creator>ssulav</dc:creator>
    <dc:date>2018-10-08T18:36:56Z</dc:date>
  </channel>
</rss>

