Member since: 03-20-2017
Posts: 8
Kudos Received: 0
Solutions: 0
10-20-2017 08:18 AM
I am trying to access an S3 path using Spark. I have tried providing hadoop.security.credential.provider.path both from hive-site.xml and on the command line, but both times I get the error below:

ERROR ApplicationMaster: User class threw exception: java.lang.IllegalStateException: Failed to execute CommandLineRunner
java.lang.IllegalStateException: Failed to execute CommandLineRunner
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:779)
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:760)
    at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:747)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1162)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1151)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:637)
Caused by: org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on global-***********-app: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain: Unable to load AWS credentials from any provider in the chain
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:92)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:278)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:243)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:435)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:198)
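For context, a minimal sketch of one way the credential provider path can be supplied to a Spark-on-YARN job from the command line; the JCEKS path and application jar name are hypothetical placeholders, not values from the post above:

# Store the S3A keys in a JCEKS keystore on HDFS (hypothetical path; prompts for the key values)
hadoop credential create fs.s3a.access.key -provider jceks://hdfs/user/exampleuser/s3.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/user/exampleuser/s3.jceks

# Point the job's Hadoop configuration at that keystore; the spark.hadoop.* prefix
# forwards the property into the Hadoop Configuration used by the S3A filesystem
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf spark.hadoop.hadoop.security.credential.provider.path=jceks://hdfs/user/exampleuser/s3.jceks \
  example-app.jar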
Labels: Apache Spark
03-20-2017 06:45 AM
Thanks Jay.
03-20-2017 06:28 AM
Use case: Export data from the Hadoop cluster using Sqoop. Question: From which node does Sqoop export the data? Is it the NameNode or one of the DataNodes?
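For reference, a minimal sketch of the kind of export meant here; the connection string, table name, and export directory are hypothetical placeholders:

# Export an HDFS directory into a relational table (placeholder names throughout)
sqoop export \
  --connect jdbc:mysql://dbhost:3306/exampledb \
  --username exampleuser -P \
  --table example_table \
  --export-dir /user/exampleuser/example_table \
  -m 4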
Labels: Apache Sqoop