I'm trying to create a table on s3a but I keep hitting the following error:
Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain)
I tried setting the s3a keys using the configuration object, but I might be hitting SPARK-11364:
val sc = new SparkContext(conf)
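For reference, here is a minimal, self-contained version of what I'm attempting (the keys are placeholders). As I understand it, properties prefixed with spark.hadoop. get copied into the Hadoop configuration, and they can also be set directly on sc.hadoopConfiguration:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("s3a-test")
// Prefixing with spark.hadoop. should copy these into the Hadoop configuration
conf.set("spark.hadoop.fs.s3a.access.key", "<ACCESS_KEY>")
conf.set("spark.hadoop.fs.s3a.secret.key", "<SECRET_KEY>")

val sc = new SparkContext(conf)

// Alternatively, set them on the Hadoop configuration after the context is created
sc.hadoopConfiguration.set("fs.s3a.access.key", "<ACCESS_KEY>")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "<SECRET_KEY>")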
I also tried setting these properties in hdfs-site.xml, but I'm still getting the same error.
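The entries I added look like this (values redacted); from what I've read they are usually placed in core-site.xml rather than hdfs-site.xml, so that may be part of the problem:

<property>
  <name>fs.s3a.access.key</name>
  <value>ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>SECRET_KEY</value>
</property>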
Finally, I tried setting the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, but with no luck.
Any ideas on how to resolve this issue?
We are encountering the same problem. Our sister company drops a Parquet file (1.3 GB) into a bucket in the Sydney, Australia region, and we pull it into our New Jersey Spark cluster just to read it into a DataFrame. We can read the schema and show some of the data, but as soon as we try to process it or run a count, we get the error. Please let me know if you have any advice.
com.cloudera.com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain at com.cloudera.com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:117)
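One thing we are experimenting with: passing the credentials through spark.hadoop.* properties so they are visible on the executors as well as the driver, and pointing s3a at the Sydney regional endpoint. This is only a sketch; the bucket, path, and the ap-southeast-2 endpoint are assumptions on our side:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("cross-region-parquet")
  // spark.hadoop.* properties are copied into the Hadoop conf on both driver and executors
  .set("spark.hadoop.fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
  .set("spark.hadoop.fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))
  // Sydney bucket: point s3a at the regional endpoint (assuming ap-southeast-2)
  .set("spark.hadoop.fs.s3a.endpoint", "s3.ap-southeast-2.amazonaws.com")

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

// count() forces a full scan on the executors, which is where we hit the credentials error
val df = sqlContext.read.parquet("s3a://our-bucket/path/to/file.parquet")
println(df.count())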