Created 08-15-2018 01:22 PM
Hi,
I followed the document below to create my hadoop.security.credential.provider.path file, and I am trying to pass it in the beeline command string.
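Roughly, the credential store was created like this (the key names are the standard s3a aliases; the actual values are redacted and the exact commands may have differed slightly):

# store the S3 access and secret keys as aliases in an HDFS-backed jceks file
hadoop credential create fs.s3a.access.key -value <ACCESS_KEY> -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
hadoop credential create fs.s3a.secret.key -value <SECRET_KEY> -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
# verify both aliases are present
hadoop credential list -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks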
But whenever I try to create an external table over the S3 data, it throws the error below.
I even followed the steps from https://community.hortonworks.com/questions/54123/how-to-pass-hadoopsecuritycredentialproviderpath-i...
I have whitelisted the hadoop.security.credential.provider.path property so that it can be set at connection time in the JDBC connection string between Beeline and HiveServer2.
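In hive-site, the whitelist entry is along these lines (a sketch; the exact value in my cluster may differ slightly):

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>hadoop\.security\.credential\.provider\.path</value>
</property>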
I also tried passing the credential path as part of the JDBC string, as suggested in the forum answer above (sketched after the session transcript below), but no luck; it still throws the same error. Can someone please help me?
bash-4.2$ beeline --hive-conf hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"
Beeline version 1.2.1000.2.6.3.0-235 by Apache Hive
0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8 (
    access string, action string, agenthost string, cliip string, clitype string,
    enforcer string, event_count bigint, event_dur_ms bigint, evttime timestamp,
    id string, logtype string, policy bigint, reason string, repo string,
    repotype bigint, reqdata string, requser string, restype string,
    resource string, result bigint, seq_num bigint, sess string)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION 's3a://bucketname/hivetest2';
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.io.InterruptedIOException: doesBucketExist on bucketname: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service) (state=08S01,code=1)
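For completeness, the JDBC-string variant I mentioned above looked roughly like this (a sketch; the hive conf is appended after the "?" in the connect string):

beeline -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks"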
Created 08-15-2018 02:42 PM
Is the fs.s3a.impl property set up as well? I noticed it wasn't mentioned in the doc you linked.
Also, can we verify that there are no typos inside the jceks file?
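For reference, if that property is set explicitly it would normally point at the S3A connector, something like this in core-site or the custom hive-site (just a sketch, not necessarily what your cluster needs):

<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>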
Created 08-15-2018 02:53 PM
@Jonathan Sneep Yes, I did try that option two days ago with "fs.s3a.impl=org.apache.hadoop.fs.s3native.NativeS3FileSystem"; it did not work, and it threw an error saying to pass in access.key and secret.key.
I followed this article https://support.hortonworks.com/s/article/External-Table-Hive-s3n-bucket
"Also can we verify that there are no typos inside the jceks file?" Yes, I am submitting Spark and HDFS distcp jobs with the same jceks file.
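For example, the distcp jobs pass the same provider path roughly like this (source and target paths here are placeholders):

hadoop distcp -Dhadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks /pedev/somedata s3a://bucketname/somedata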
Created 08-15-2018 03:04 PM
Did you add fs.s3a.access.key and fs.s3a.secret.key to the custom hive-site as well? I noticed a similar post here: https://stackoverflow.com/questions/50710582/write-to-s3-from-hive-fails
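For clarity, I meant adding entries along these lines to the custom hive-site (values are placeholders):

<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>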
Created 08-15-2018 03:11 PM
No, I don't want to add the keys to hive-site.xml or update core-site.xml with the credential path; that is why I am trying to pass hadoop.security.credential.provider.path as part of the beeline command line.