
Pass the hadoop.security.credential.provider.path to beeline string

Contributor

Hi,

I followed the document below to create my hadoop.security.credential.provider.path file, and I am trying to pass it from the beeline command string.

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/s3-credential-...
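
For reference, the jceks file in that document is created with the hadoop credential CLI. A minimal sketch, assuming the same provider path I use below (the key values are placeholders):

  # store the AWS keys under the standard s3a aliases inside the credential provider
  hadoop credential create fs.s3a.access.key -value <ACCESS_KEY> -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
  hadoop credential create fs.s3a.secret.key -value <SECRET_KEY> -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
  # verify both aliases were written
  hadoop credential list -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks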

But whenever I try to create an external table over S3 data, it throws the error below.

I even followed the steps from https://community.hortonworks.com/questions/54123/how-to-pass-hadoopsecuritycredentialproviderpath-i...

I have whitelisted the property hadoop.security.credential.provider.path so that it can be set at connection time in the JDBC connection string between beeline and HiveServer2. In hive-site:

  1. hive.security.authorization.sqlstd.confwhitelist.append = hadoop.security.credential.provider.path

I also tried passing the credential path as part of the JDBC string, as suggested in the forum answer above, but no luck; it still throws the same error. Can someone please help me?
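
For reference, the form suggested there puts the property in the hive conf section of the JDBC URL (the part after the "?"); roughly:

beeline -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks"

This should only take effect because the property is whitelisted as in step 1 above.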

bash-4.2$ beeline --hiveconf hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"

Beeline version 1.2.1000.2.6.3.0-235 by Apache Hive

0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8(access string, action string, agenthost string, cliip string, clitype string, enforcer string, event_count bigint, event_dur_ms bigint, evttime timestamp, id string, logtype string, policy bigint, reason string, repo string, repotype bigint, reqdata string, requser string, restype string, resource string, result bigint, seq_num bigint, sess string) ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe' STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' LOCATION 's3a://bucketname/hivetest2';

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.io.InterruptedIOException: doesBucketExist on bucketname: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service) (state=08S01,code=1)

4 REPLIES


Is the fs.s3a.impl property set up as well? I noticed it wasn't mentioned in the doc you linked.
Also, can we verify that there are no typos inside the jceks file?

Contributor

@Jonathan Sneep Yes, I did try that option two days ago ("fs.s3a.impl=org.apache.hadoop.fs.s3native.NativeS3FileSystem"); it did not work and threw an error saying to pass in access.key and secret.key.
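
(As an aside, that value maps the s3a scheme onto the s3n implementation class. If fs.s3a.impl is set at all, it normally points at the s3a connector's own class, e.g. in core-site.xml:

  <property>
    <name>fs.s3a.impl</name>
    <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  </property>

On recent Hadoop versions it usually does not need to be set explicitly.)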

I followed this article: https://support.hortonworks.com/s/article/External-Table-Hive-s3n-bucket

Regarding "can we verify that there are no typos inside the jceks file?": yes, I am submitting Spark and hdfs distcp jobs with the same jceks.
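
For example, the distcp jobs that work with this jceks look roughly like the following (the HDFS source path here is just an illustration):

  # copy from HDFS into the same bucket, resolving the AWS keys from the credential provider
  hadoop distcp -Dhadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks /pedev/somedata s3a://bucketname/hivetest2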

@prashanth ramesh

Did you add fs.s3a.access.key and fs.s3a.secret.key to the custom hive-site as well? I noticed a similar post here: https://stackoverflow.com/questions/50710582/write-to-s3-from-hive-fails
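
That would mean adding something like this to the custom hive-site (placeholder values shown):

  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>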

Contributor
@Jonathan Sneep

No, I don't want to add the keys to hive-site.xml or update core-site.xml with the credential path; that is why I am trying to pass hadoop.security.credential.provider.path as part of the beeline command line.
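
Since the property is already whitelisted, I would expect it to also be settable from inside an open beeline session, right before running the DDL; a sketch:

0: jdbc:hive2://hostname:2> set hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks;
0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8 ( ... ) LOCATION 's3a://bucketname/hivetest2';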