
Pass the jceks credential file to the beeline connection string



I followed the document below to create my jceks file, and I am trying to pass it through the beeline command string.
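For context, a jceks credential file like the one described is normally created with the `hadoop credential` CLI. A minimal sketch, assuming a file stored in HDFS at a hypothetical path (the actual path and aliases from the original post are not known):

```shell
# Store the S3A access and secret keys in a jceks credential file on HDFS.
# The provider path below is a placeholder, not the poster's actual path.
hadoop credential create fs.s3a.access.key \
    -provider jceks://hdfs/user/hive/s3.jceks
hadoop credential create fs.s3a.secret.key \
    -provider jceks://hdfs/user/hive/s3.jceks

# Confirm both aliases were stored (helps rule out typos in the file):
hadoop credential list -provider jceks://hdfs/user/hive/s3.jceks
```

`hadoop credential create` prompts interactively for each secret value, which keeps the keys out of shell history.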

But whenever I try to create an external table from S3 data, it throws the error below.

I even followed the steps from this forum answer:

I have whitelisted the property in such a way that it can be set at connection time in the JDBC connection string between beeline and hiveserver2. In hive-site:

  1. =
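The whitelist entry above lost its property name and value in formatting. A common pattern for allowing a setting to be passed at connection time (an assumption here, since the original entry is not recoverable) is to append it to HiveServer2's SQL-standard-authorization whitelist in hive-site:

```xml
<!-- Hypothetical example: allow the credential-provider property to be set
     per-connection. The regex value escapes the dots in the property name. -->
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>hadoop\.security\.credential\.provider\.path</value>
</property>
```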

I also tried passing the credential path as part of the JDBC string, as suggested in the forum answer above, but no luck; it still throws the same error. Can someone please help me?

bash-4.2$ beeline --hive-conf -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"
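Note that beeline's flag is spelled `--hiveconf` and takes a `key=value` argument, which appears to have been lost in the command above. A sketch of the intended invocation, with a placeholder jceks path (the poster's actual path is unknown):

```shell
# Pass the credential provider path as a hive-conf setting at connect time.
# The jceks path is a placeholder; the property must be whitelisted on HiveServer2.
beeline --hiveconf hadoop.security.credential.provider.path=jceks://hdfs/user/hive/s3.jceks \
    -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;"
```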

Beeline version 1.2.1000. by Apache Hive

0: jdbc:hive2://hostname:2> CREATE EXTERNAL TABLE HDFSaudit_data8(access string, action string, agenthost string, cliip string, clitype string, enforcer string, event_count bigint, event_dur_ms bigint, evttime timestamp, id string, logtype string, policy bigint, reason string, repo string, repotype bigint, reqdata string, requser string, restype string, resource string, result bigint, seq_num bigint, sess string) ROW FORMAT SERDE '' STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat' OUTPUTFORMAT '' LOCATION 's3a://bucketname/hivetest2';

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException( doesBucketExist on bucketname: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service) (state=08S01,code=1)



Is the fs.s3a.impl property set up as well? I noticed it wasn't mentioned in the doc you linked.
Also, can we verify that there are no typos inside the jceks file?


@Jonathan Sneep Yes, I did try that option two days ago. "fs.s3a.impl=org.apache.hadoop.fs.s3native.NativeS3FileSystem" did not work; it threw an error saying to pass in access.key and secret.key.
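As a side note, the value tried above binds the `s3a://` scheme to the s3n implementation (`NativeS3FileSystem`), which would itself cause credential errors. If `fs.s3a.impl` is set at all, it should point at the S3A class:

```xml
<!-- The s3a scheme's own filesystem class; in recent Hadoop versions this is
     the default and the property usually does not need to be set explicitly. -->
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>
```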

I followed this article

Also, can we verify that there are no typos inside the jceks file? Yes; I am submitting Spark and HDFS distcp jobs with the same jceks, so the file itself works.

@prashanth ramesh

Did you add fs.s3a.access.key and fs.s3a.secret.key to the custom hive-site as well? I noticed a similar post here:

@Jonathan Sneep

No, I don't want to add the keys to hive-site.xml or update core-site.xml with the credential path; that is why I am trying to pass it as part of the beeline command line.
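For the per-connection approach described above, Hive's JDBC URL also accepts hive-conf settings directly, listed after a `?` separator (session variables come first, after `;`). A sketch with a placeholder jceks path, which only works once the property is whitelisted on HiveServer2:

```shell
# Hive JDBC URL format: jdbc:hive2://<hosts>/<db>;<session vars>?<hive conf list>
# The jceks path is a placeholder, not the poster's actual path.
beeline -u "jdbc:hive2://zookeeperlist:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?hadoop.security.credential.provider.path=jceks://hdfs/user/hive/s3.jceks"
```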