Support Questions
Find answers, ask questions, and share your expertise

hadoop distcp not working

New Contributor


We are trying to run hadoop distcp from the command line to fetch files from an Amazon S3 bucket, using the s3a scheme, to an HDFS path. We are hitting the following issue:

16/03/27 13:02:39 ERROR tools.DistCp: Exception encountered
com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
	at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(...)
	...
	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(...)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(...)
	at org.apache.hadoop.fs.FileSystem.access$200(...)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(...)
	at org.apache.hadoop.fs.FileSystem$Cache.get(...)
	at org.apache.hadoop.fs.FileSystem.get(...)
	at org.apache.hadoop.fs.Path.getFileSystem(...)
	...

The required credentials are passed on the command line using the -Dfs.s3a.awsAccessKeyId and -Dfs.s3a.awsSecretAccessKey properties.

Is there any limitation with the s3a scheme? The s3n scheme seems to work fine.

This is on Hadoop 2.7.1 and Hortonworks HDP 2.3.




Re: hadoop distcp not working


Can you try adding a space between -D and the fs properties?
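For reference, the space-separated form would look something like this (a sketch only; the bucket name, key values, and paths are placeholders):

```shell
# -D followed by a space, then the property=value pair
hadoop distcp \
  -D fs.s3a.awsAccessKeyId=YOUR_ACCESS_KEY \
  -D fs.s3a.awsSecretAccessKey=YOUR_SECRET_KEY \
  s3a://my-bucket/source/ hdfs:///tmp/target/
```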

Re: hadoop distcp not working

Hello @Shanmuga Sundaram.

As per Apache documentation, the correct configuration properties are fs.s3a.access.key and fs.s3a.secret.key.

<property>
  <name>fs.s3a.access.key</name>
  <description>AWS access key ID. Omit for Role-based authentication.</description>
</property>

<property>
  <name>fs.s3a.secret.key</name>
  <description>AWS secret key. Omit for Role-based authentication.</description>
</property>
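Putting that together, a distcp invocation using the s3a property names from the Apache documentation might look like this (a sketch only; the bucket name, key values, and paths are placeholders):

```shell
# Pass the s3a credentials under their documented property names
hadoop distcp \
  -D fs.s3a.access.key=YOUR_ACCESS_KEY \
  -D fs.s3a.secret.key=YOUR_SECRET_KEY \
  s3a://my-bucket/source/ hdfs:///tmp/target/
```

Alternatively, these two properties can be set once in core-site.xml so they do not have to be passed on every command line.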