Reply
Explorer
Posts: 11
Registered: ‎01-13-2015

HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7

Hi,

I'm trying to create a table on s3a but I keep hitting the following error:

Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain)

 

I tried setting the s3a keys using the configuration object, but I might be hitting SPARK-11364:

conf.set("fs.s3a.access.key", accessKey)
conf.set("fs.s3a.secret.key", secretKey)
conf.set("spark.hadoop.fs.s3a.access.key", accessKey)
conf.set("spark.hadoop.fs.s3a.secret.key", secretKey)
val sc = new SparkContext(conf)
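If it really is SPARK-11364, would setting the keys directly on the SparkContext's Hadoop configuration after the context is created be a valid workaround? Roughly something like this (the key values are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder credentials -- substitute real values.
val accessKey = "YOUR_ACCESS_KEY"
val secretKey = "YOUR_SECRET_KEY"

val conf = new SparkConf().setAppName("s3a-test")
val sc = new SparkContext(conf)

// Set the keys on the Hadoop Configuration that Spark hands to the
// FileSystem layer, instead of going through the SparkConf.
sc.hadoopConfiguration.set("fs.s3a.access.key", accessKey)
sc.hadoopConfiguration.set("fs.s3a.secret.key", secretKey)
```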

 

I tried setting these properties in hdfs-site.xml, but I'm still getting this error.
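(From what I've read, the fs.s3a.* properties are normally read from core-site.xml rather than hdfs-site.xml; what I added looked roughly like the following, with placeholder values:)

```xml
<!-- core-site.xml -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```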

Finally, I tried setting the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, but with no luck.

 

Any ideas on how to resolve this issue?

 

Thank you.

Daniel

Posts: 1,903
Kudos: 435
Solutions: 305
Registered: ‎07-31-2013

Re: HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7

I can reproduce this across CDH 5 releases 5.5, 5.6, 5.7, and 5.8 (the latest of each), but only intermittently. The first run appears always to discover the credentials from the configs, but all subsequent runs fail to do so.

I'm looking further at this.
New Contributor
Posts: 2
Registered: ‎07-26-2016

Re: HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7

Has there been any resolution to this particular problem? I'm running into the same error (we are using CDH 5.5).

Explorer
Posts: 11
Registered: ‎01-13-2015

Re: HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7

Hi Harsh,
Were you able to solve this?

Thank you.
Daniel
Expert Contributor
Posts: 61
Registered: ‎02-03-2016

Re: HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7


Hi Harsh,

 

We are encountering the same problem. Our sister company puts a Parquet file (1.3 GB) in the Sydney, Australia region, and we pull it into our New Jersey Spark cluster just to read it into a DataFrame. We can read the schema and show some of the data, but as soon as we try to process it or run a count, we get the error below. Please let me know if you have any advice.

 

com.cloudera.com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
	at com.cloudera.com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:117)
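One detail that may matter in our case: the schema read happens on the driver, while the count scans the file on the executors, so I suspect the credentials are visible to the driver but not to the executors. Is there a recommended way to get them to the executors? For example, would passing them at submit time like this work (the key values, class name, and jar name are placeholders)?

```shell
spark-submit \
  --conf spark.hadoop.fs.s3a.access.key=YOUR_ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=YOUR_SECRET_KEY \
  --class com.example.ReadParquet app.jar
```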

 

Cheers,

Ben