hadoop credential list -provider jceks://hdfs@host:8020/tmp/s3a.jceks
Listing aliases for CredentialProvider: jceks://hdfs@host:8020/tmp/s3a.jceks
fs.s3a.secret.key
fs.s3a.session.token
fs.s3a.access.key
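For context, a JCEKS store with the three aliases listed above is typically populated with `hadoop credential create` (a sketch; the provider path matches the listing above, but the key/token values are placeholders you must supply):

```shell
# Store the S3A credentials in the JCEKS provider on HDFS.
# <ACCESS_KEY>, <SECRET_KEY>, <SESSION_TOKEN> are placeholders.
hadoop credential create fs.s3a.access.key    -value <ACCESS_KEY>    -provider jceks://hdfs@host:8020/tmp/s3a.jceks
hadoop credential create fs.s3a.secret.key    -value <SECRET_KEY>    -provider jceks://hdfs@host:8020/tmp/s3a.jceks
hadoop credential create fs.s3a.session.token -value <SESSION_TOKEN> -provider jceks://hdfs@host:8020/tmp/s3a.jceks
```

Running `hadoop credential list -provider jceks://hdfs@host:8020/tmp/s3a.jceks` afterwards should show all three aliases, as in the output above.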
beeline --hive-conf hadoop.security.credential.provider.path=jceks://hdfs@host:8020/tmp/s3a.jceks -u "jdbc:hive2://host:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"
Connecting to jdbc:hive2://host:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;
19/12/11 17:58:35 [main]: INFO jdbc.HiveConnection: Connected to host:10000
Connected to: Apache Hive (version 3.1.0.3.1.0.0-78)
Driver: Hive JDBC (version 3.1.0.3.1.0.0-78)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.0.3.1.0.0-78 by Apache Hive
0: jdbc:hive2://host:2181/defa> CREATE DATABASE IF NOT EXISTS test_Table1 LOCATION 's3a://s3aTestBucket/test/table1';
INFO : Compiling command(queryId=hive_20191211175847_aae528c7-90f7-4c90-a01e-ea3523df4592):
CREATE DATABASE IF NOT EXISTS test_Table1 LOCATION 's3a://s3aTestBucket/test/table1'
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20191211175847_aae528c7-90f7-4c90-a01e-ea3523df4592); Time taken: 20.214 seconds
INFO : Executing command(queryId=hive_20191211175847_aae528c7-90f7-4c90-a01e-ea3523df4592): CREATE DATABASE IF NOT EXISTS test_Table1 LOCATION 's3a://s3aTestBucket/test/table1'
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on s3aTestBucket: com.amazonaws.AmazonClientException: No AWS Credentials provided by TemporaryAWSCredentialsProvider : org.apache.hadoop.fs.s3a.CredentialInitializationException: Access key, secret key or session token is unset: No AWS Credentials provided by TemporaryAWSCredentialsProvider : org.apache.hadoop.fs.s3a.CredentialInitializationException: Access key, secret key or session token is unset
INFO : Completed executing command(queryId=hive_20191211175847_aae528c7-90f7-4c90-a01e-ea3523df4592); Time taken: 10.051 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on s3aTestBucket: com.amazonaws.AmazonClientException: No AWS Credentials provided by TemporaryAWSCredentialsProvider : org.apache.hadoop.fs.s3a.CredentialInitializationException: Access key, secret key or session token is unset: No AWS Credentials provided by TemporaryAWSCredentialsProvider : org.apache.hadoop.fs.s3a.CredentialInitializationException: Access key, secret key or session token is unset (state=08S01,code=1)
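One way to narrow this down is to check whether the credential store works for plain HDFS/S3A access outside of Hive; if the command below succeeds but the Beeline session still fails, the provider path is not reaching the HS2/HMS side (a diagnostic sketch; bucket and paths are taken from the failing query above):

```shell
# If this lists the bucket contents, the JCEKS store itself resolves fine
# and the problem is in how Hive propagates the setting.
hadoop fs \
  -Dhadoop.security.credential.provider.path=jceks://hdfs@host:8020/tmp/s3a.jceks \
  -ls s3a://s3aTestBucket/test/
```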
Created 12-12-2019 01:28 PM
Hi @sahilk
I worked on this last year, and at the time it turned out not to be supported, because HS2 and HMS generate the credentials themselves. Please see my blog post at the link below:
Access S3 in Hive Through hadoop.security.credential.provider.path
However, that was based on CDH; it looks like you are on HDP, so the behaviour might differ. Input from other Hive folks would help here.
Cheers
Eric
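If HS2/HMS ignore the per-session `--hive-conf` value, one commonly tried alternative (a sketch, assuming you can edit the service configuration and restart HiveServer2/HMS) is to set the provider path cluster-wide in `core-site.xml` so the server processes pick it up themselves:

```xml
<!-- core-site.xml on the HiveServer2 / Metastore hosts (hypothetical placement) -->
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://hdfs@host:8020/tmp/s3a.jceks</value>
</property>
```

Note this exposes the credential store to every job on the cluster, so whether it is acceptable depends on your security requirements.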