Created on 10-27-2016 06:34 AM - edited 09-16-2022 03:45 AM
Hi, I am using HDP 2.1 and trying to do distcp from HDP 2.1 to s3a.
I configured the access key and secret key, but I am getting this error:
[hive@test-reair ~]$ hadoop fs -ls s3a://test-poc/
ls: No FileSystem for scheme: s3a
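For context, a rough sketch of how the s3a keys can be supplied on the command line (placeholder values; fs.s3a.access.key and fs.s3a.secret.key are the standard s3a property names, and they only take effect on Hadoop builds that actually ship the s3a filesystem):

# Sketch only - placeholder credentials
hadoop fs -Dfs.s3a.access.key=MY_ACCESS_KEY \
          -Dfs.s3a.secret.key=MY_SECRET_KEY \
          -ls s3a://test-poc/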
Urgent help needed.
Thanks in advance.
Created 10-27-2016 01:44 PM
Have you seen this HCC link? It may be helpful: https://community.hortonworks.com/questions/7165/how-to-copy-hdfs-file-to-aws-s3-bucket-hadoop-dist....
Also take a look at the official hadoop-aws docs: https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html
Created 10-28-2016 06:34 AM
@Michael Young
I have already gone through this link, but it did not help me.
Created 10-28-2016 07:30 PM
Can you try using s3n instead of s3a? Support for s3a came into Hadoop with version 2.6.0 (https://issues.apache.org/jira/browse/HADOOP-10400). HDP 2.5 provides s3a as a Tech Preview, while HDP 2.1 came with Hadoop 2.4.0 (http://hortonworks.com/blog/announcing-hdp-2-1-general-availability/).
See this article for configuring s3n: https://community.hortonworks.com/articles/7296/hdp-22-configuration-required-for-s3.html
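For example, a minimal sketch of a distcp to s3n with the credentials passed as -D properties (fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey are the standard s3n property names; the source path and destination prefix below are placeholders):

# Sketch only - placeholder credentials and paths
hadoop distcp -Dfs.s3n.awsAccessKeyId=MY_ACCESS_KEY \
              -Dfs.s3n.awsSecretAccessKey=MY_SECRET_KEY \
              hdfs:///apps/hive/warehouse/mytable s3n://test-poc/mytable

You can also set those two properties in core-site.xml instead of passing them on the command line.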
Created 11-02-2016 08:51 AM
Hi @Michael Young
I have two clusters. On one it is working, but on the second cluster, after doing the same configuration, it is not working with s3a.
Created 11-08-2016 07:58 AM
I have tried with s3n, and it is working fine.
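For reference, the working commands look roughly like this (the s3n credentials are assumed to be configured in core-site.xml; the source path is a placeholder):

hadoop fs -ls s3n://test-poc/
hadoop distcp /user/hive/data s3n://test-poc/data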
Created 11-08-2016 12:43 PM
Good to know, thank you.