passing hadoop.security.credential.provider.path in beeline?

Hi,

I followed the document below to create my hadoop.security.credential.provider.path file, and I am trying to pass it from the beeline command string.

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/s3-credential-...
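For context, the provider file in that document is created with the `hadoop credential` CLI. A minimal sketch of what I ran (the keystore path matches my cluster below; the access/secret key values are placeholders, not real credentials):

```shell
# Store the S3A access and secret keys in a JCEKS keystore on HDFS.
# The <...> values are placeholders for the real AWS keys.
hadoop credential create fs.s3a.access.key \
    -value "<access-key>" \
    -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks

hadoop credential create fs.s3a.secret.key \
    -value "<secret-key>" \
    -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks

# Verify both aliases are present in the keystore.
hadoop credential list \
    -provider jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks
```

`hadoop credential list` shows both fs.s3a.access.key and fs.s3a.secret.key for me, so the keystore itself looks fine.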

But whenever I try to create an external table over S3 data, it throws the error below.

I even followed the steps from https://community.hortonworks.com/questions/54123/how-to-pass-hadoopsecuritycredentialproviderpath-i...

I have whitelisted the property hadoop.security.credential.provider.path so that it can be set at connection time in the JDBC connection string between beeline and HiveServer2. In hive-site:

  1. hive.security.authorization.sqlstd.confwhitelist.append = hadoop.security.credential.provider.path
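In hive-site.xml form, that whitelist entry looks something like this (a sketch; the value is a regex pattern matched against property names, and a plain property name works since `.` matches any character):

```xml
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>hadoop.security.credential.provider.path</value>
</property>
```

As far as I know, HiveServer2 has to be restarted after changing this for the whitelist to take effect.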

I also tried passing the credential path as part of the JDBC string, as suggested in the forum answer above, but no luck; it still throws the same error. Can someone please help me?

bash-4.2$ beeline --hive-conf hadoop.security.credential.provider.path=jceks://hdfs@clustername/pedev/user/myuser/myuser.jceks -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"

Beeline version 1.2.1000.2.6.3.0-235 by Apache Hive

0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8(access string, action string, agenthost string, cliip string, clitype string, enforcer string, event_count bigint, event_dur_ms bigint, evttime timestamp, id string, logtype string, policy bigint, reason string, repo string, repotype bigint, reqdata string, requser string, restype string, resource string, result bigint, seq_num bigint, sess string) ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe' STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' LOCATION 's3a://bucketname/hivetest2';

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.io.InterruptedIOException: doesBucketExist on bucketname: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service) (state=08S01,code=1)

1 REPLY

@prashanth ramesh I see the beeline example you have given above uses the --hive-conf approach.

When appending the credential path to the JDBC string, do you get the exact same error as shown above? For example:

beeline -u "jdbc:hive2://hs2_hostname:port/default;principal=my/principal@REALM?hadoop.security.credential.provider.path=jceks://hdfs@hostname/path/to/jceks"