
HiveContext: Unable to load AWS credentials from any provider in the chain on CDH 5.7


I'm trying to create a table on s3a but I keep hitting the following error:

Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: MetaException( Unable to load AWS credentials from any provider in the chain)


I tried setting the s3a keys using the configuration object, but I might be hitting SPARK-11364:

conf.set("fs.s3a.access.key", accessKey)
conf.set("fs.s3a.secret.key", secretKey)
val sc = new SparkContext(conf)
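For reference, if SPARK-11364 is the culprit, the variants I'd expect to work are either prefixing the keys with "spark.hadoop." (so Spark copies them into the Hadoop configuration that s3a actually reads) or setting them on the live Hadoop configuration after the context exists. A minimal sketch, with placeholder credential sources:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical placeholders -- substitute however you obtain credentials.
val accessKey = sys.env("AWS_ACCESS_KEY_ID")
val secretKey = sys.env("AWS_SECRET_ACCESS_KEY")

val conf = new SparkConf().setAppName("s3a-test")
// Option 1: the "spark.hadoop." prefix makes Spark copy the values into
// the Hadoop Configuration, where the s3a filesystem looks them up.
conf.set("spark.hadoop.fs.s3a.access.key", accessKey)
conf.set("spark.hadoop.fs.s3a.secret.key", secretKey)

val sc = new SparkContext(conf)
// Option 2: set the keys directly on the live Hadoop configuration,
// bypassing the SparkConf copy step entirely.
sc.hadoopConfiguration.set("fs.s3a.access.key", accessKey)
sc.hadoopConfiguration.set("fs.s3a.secret.key", secretKey)
```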


I tried setting these properties in hdfs-site.xml, but I'm still getting this error.
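For what it's worth, the s3a credential properties are normally picked up from core-site.xml rather than hdfs-site.xml. A sketch of the entries I used (property names are the standard Hadoop s3a ones; values are placeholders):

```xml
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```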

Finally, I tried setting the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, but with no luck.


Any ideas on how to resolve this issue?


Thank you.



Master Guru
I can reproduce this across CDH5 versions 5.5, 5.6, 5.7, and 5.8 (the latest of each), but only intermittently: the first run always appears to discover the credentials from the configs, but all subsequent runs fail to do so.

I'm looking further at this.

New Contributor

Has there been any resolution to this particular problem? I'm running into the same error (we are using CDH 5.5).

Hi Harsh,
Were you able to solve this?

Thank you.

Rising Star

Hi Harsh,


We are encountering the same problem. Our sister company puts a Parquet file (1.3 GB) in the Sydney, Australia region, and we pull it into our New Jersey Spark cluster just to read it into a DataFrame. We can read the schema and show some of the data, but when we try to process it or do a "count", we get the error: "Unable to load AWS credentials from any provider in the chain". Please let me know if you have any advice.