
ls: From option fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.AssumedRoleCredentialProvider not found


Hi!

I get this error after adding the parameters below to /etc/hadoop/conf/core-site.xml.

Hadoop is hosted on EC2 instances.


[ec2-user@ip ~]$ hadoop fs -ls s3a://hive-tables/

19/03/07 11:19:10 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties

19/03/07 11:19:10 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).

19/03/07 11:19:10 INFO impl.MetricsSystemImpl: s3a-file-system metrics system started


ls: From option fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.AssumedRoleCredentialProvider not found


 grep  -A 1 -i fs.s3a.  /etc/hadoop/conf/core-site.xml
      <name>fs.s3a.access.key</name>
      <value>1111111111111</value>
--
      <name>fs.s3a.assumed.role.arn</name>
      <value>arn:aws:iam::111111111111:role/hive-tables</value>
--
      <name>fs.s3a.assumed.role.credentials.provider</name>
      <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
      <final>true</final>
--
      <name>fs.s3a.assumed.role.session.duration</name>
      <value>30m</value>
--
      <name>fs.s3a.aws.credentials.provider</name>
      <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider,com.amazonaws.auth.EnvironmentVariableCredentialsProvider, com.amazonaws.auth.InstanceProfileCredentialsProvider,org.apache.hadoop.fs.s3a.AssumedRoleCredentialProvider,org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider</value>
      <final>true</final>
--
      <name>fs.s3a.fast.upload</name>
      <value>true</value>
--
      <name>fs.s3a.fast.upload.buffer</name>
      <value>disk</value>
--
      <name>fs.s3a.impl</name>
      <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
      <final>true</final>
--
      <name>fs.s3a.multipart.size</name>
      <value>67108864</value>
--
      <name>fs.s3a.secret.key</name>
      <value>111111</value>
--
      <name>fs.s3a.user.agent.prefix</name>
      <value>User-Agent: APN/1.0 Hortonworks/1.0 HDP/3.1.0.0-78</value>
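Note on the config above: the provider list in fs.s3a.aws.credentials.provider contains both org.apache.hadoop.fs.s3a.AssumedRoleCredentialProvider and org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider. In Hadoop 3.1 the class lives in the .auth subpackage, so the first entry (without .auth) does not exist, and that is the exact name in the ClassNotFoundException. A possible corrected property, keeping only the .auth form (sketch, assuming the other providers are wanted as-is):

```xml
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider,com.amazonaws.auth.EnvironmentVariableCredentialsProvider,com.amazonaws.auth.InstanceProfileCredentialsProvider,org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider</value>
</property>
```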


[hdfs@ip- ~]$ hadoop version

Hadoop 3.1.1.3.1.0.0-78

Source code repository git@github.com:hortonworks/hadoop.git -r e4f82af51faec922b4804d0232a637422ec29e64

Compiled by jenkins on 2018-12-06T12:26Z

Compiled with protoc 2.5.0

From source with checksum eab9fa2a6aa38c6362c66d8df75774

This command was run using /usr/hdp/3.1.0.0-78/hadoop/hadoop-common-3.1.1.3.1.0.0-78.jar
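Background on why one bad entry fails the whole `-ls`: S3A loads each comma-separated class in fs.s3a.aws.credentials.provider by reflection, so a single misspelled name aborts with ClassNotFoundException. A minimal, self-contained sketch of that lookup, using stand-in class names (java.lang.String for a class that exists, a hypothetical org.example.DoesNotExist mirroring the misspelled provider), since hadoop-aws is not on the local classpath here:

```java
// Sketch (not Hadoop's actual code) of reflective provider resolution:
// each configured class name is looked up on the classpath, and any
// unknown name surfaces as a ClassNotFoundException.
public class ProviderLookup {

    // Returns true if the named class can be loaded from the classpath.
    static boolean exists(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Stand-ins: the first name exists; the second mirrors
        // org.apache.hadoop.fs.s3a.AssumedRoleCredentialProvider,
        // which lacks the ".auth" package segment and is never found.
        String[] providers = { "java.lang.String", "org.example.DoesNotExist" };
        for (String name : providers) {
            System.out.println((exists(name) ? "found: " : "not found: ") + name);
        }
    }
}
```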

