Created 08-15-2018 01:02 PM
Hi,
I followed the document below to create my hadoop.security.credential.provider.path file, and I am trying to pass it in the Beeline command string.
But whenever I try to create an external table over S3 data, it throws the error below.
I even followed the steps from https://community.hortonworks.com/questions/54123/how-to-pass-hadoopsecuritycredentialproviderpath-i...
I have whitelisted the property hadoop.security.credential.provider.path so that it can be set at connection time in the JDBC connection string between Beeline and HiveServer2. In hive-site:
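(The hive-site snippet did not come through in the post. For reference, on HDP the whitelist is typically extended with hive.security.authorization.sqlstd.confwhitelist.append; the exact property and value may differ per setup, and the value is a regex, so the dots are escaped:)

```xml
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>hadoop\.security\.credential\.provider\.path</value>
</property>
```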
I also tried passing the credential path as part of the JDBC string, as suggested in the forum answer above, but no luck; it still throws the same error. Can someone please help me?
bash-4.2$ beeline --hive-conf hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"
Beeline version 1.2.1000.2.6.3.0-235 by Apache Hive
0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8 (
  access string,
  action string,
  agenthost string,
  cliip string,
  clitype string,
  enforcer string,
  event_count bigint,
  event_dur_ms bigint,
  evttime timestamp,
  id string,
  logtype string,
  policy bigint,
  reason string,
  repo string,
  repotype bigint,
  reqdata string,
  requser string,
  restype string,
  resource string,
  result bigint,
  seq_num bigint,
  sess string
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION 's3a://bucketname/hivetest2';
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.io.InterruptedIOException: doesBucketExist on bucketname: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service) (state=08S01,code=1)
Created 08-15-2018 04:12 PM
@prashanth ramesh I see the Beeline example you have given above uses the --hive-conf approach.
When appending the credential path to the jdbc string, are you getting the exact same error as shown above?
beeline -u "jdbc:hive2://hs2_hostname:port/default;principal=my/principal@REALM?hadoop.security.credential.provider.path=jceks://hdfs@hostname/path/to/jceks"
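It is also worth double-checking that the jceks store actually contains the S3A aliases. As a sketch, assuming the same keystore path as in your Beeline command, the store can be populated and inspected with the standard hadoop credential CLI (you are prompted for each secret value):

```shell
# Add both S3A credential aliases to the keystore on HDFS
hadoop credential create fs.s3a.access.key -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks

# Confirm both aliases are present
hadoop credential list -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
```

If fs.s3a.access.key or fs.s3a.secret.key is missing from the list output, S3A falls back to the other credential providers named in the error (environment variables, instance profile), which would explain the failure.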