<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Question: Issue creating/accessing Hive external table with S3 location from Spark Thrift Server (Support Questions)</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Issue-creating-accessing-hive-external-table-with-s3/m-p/178874#M141120</link>
    <description>&lt;P&gt;I have configured the &lt;STRONG&gt;S3 keys&lt;/STRONG&gt; (access key and secret key) in a &lt;STRONG&gt;jceks&lt;/STRONG&gt; file using the &lt;STRONG&gt;hadoop credential&lt;/STRONG&gt; command. The commands used are below:&lt;/P&gt;&lt;P&gt;&lt;EM&gt;hadoop credential create fs.s3a.access.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;hadoop credential create fs.s3a.secret.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;Then I open a connection to the &lt;STRONG&gt;Spark Thrift Server&lt;/STRONG&gt; using beeline, passing the jceks file path in the connection string as below:&lt;/P&gt;&lt;P&gt;&lt;EM&gt;beeline -u "jdbc:hive2://hostname:10001/;principal=hive/_HOST@?hadoop.security.credential.provider.path=jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks;"&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;Now, when I try to create an external table with a location in S3, it fails with the exception below:&lt;/P&gt;&lt;P&gt;&lt;EM&gt;CREATE EXTERNAL TABLE IF NOT EXISTS test_table_on_s3 (col1 String, col2 String) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION 's3a://bucket_name/kalmesh/';&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Exception:&lt;/STRONG&gt; Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.nio.file.AccessDeniedException s3a://bucket_name/kalmesh: getFileStatus on s3a://bucket_name/kalmesh: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: request_id), S3 Extended Request ID: extended_request_id=) (state=,code=0)&lt;/P&gt;&lt;P&gt;However, the same statement works fine with the Hive Thrift Server.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;HDP version:&lt;/STRONG&gt; HDP 2.5&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Spark version:&lt;/STRONG&gt; 1.6&lt;/P&gt;</description>
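    <!-- A quick way to narrow down a 403 like the one above is to test the credential
         provider outside Spark entirely. A minimal sketch, assuming a Kerberized HDP
         cluster; nn_hostname and bucket_name are the placeholders from the post: -->

```shell
# Confirm both aliases actually exist in the jceks store:
hadoop credential list -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks

# Then check that the S3A filesystem itself can resolve the credentials,
# taking Spark Thrift Server out of the picture:
hadoop fs \
  -D hadoop.security.credential.provider.path=jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks \
  -ls s3a://bucket_name/kalmesh/
```

    <!-- If the `hadoop fs -ls` also returns 403 Forbidden, the problem is the keys or
         the store path, not Spark; if it succeeds, the provider path is likely not
         reaching the Spark Thrift Server's Hadoop configuration. -->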
    <pubDate>Fri, 19 May 2017 12:56:19 GMT</pubDate>
    <dc:creator>sam_kalmesh</dc:creator>
    <dc:date>2017-05-19T12:56:19Z</dc:date>
  </channel>
</rss>

