<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Hive with Google Cloud Storage in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211287#M78741</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;&lt;/A&gt;&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I was able to create hive external table pointing my storage as GCS. But it only works as hive superuser but doesn't work as a normal hive user meaning, hdpuser1 cannot create hive table it fails with above error, but if execute su - hive it works .&lt;/P&gt;&lt;P&gt;I am no sure how to rectify this.&lt;/P&gt;</description>
    <pubDate>Thu, 31 May 2018 20:54:30 GMT</pubDate>
    <dc:creator>sudarshan_shrid</dc:creator>
    <dc:date>2018-05-31T20:54:30Z</dc:date>
    <item>
      <title>Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211279#M78733</link>
      <description>&lt;P&gt;I have installed a &lt;CODE&gt;hadoop 2.6.5&lt;/CODE&gt; cluster in GCP on VM instances, used the GCS connector, and pointed HDFS at a gs bucket. I added the two entries below to &lt;CODE&gt;core-site.xml&lt;/CODE&gt;:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;google.cloud.auth.service.account.json.keyfile=&amp;lt;Path-to-the-JSON-file&amp;gt;
fs.gs.working.dir=/
&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Running &lt;CODE&gt;hadoop fs -ls gs://bucket/&lt;/CODE&gt; works fine, but when I create a Hive table&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;CREATE EXTERNAL TABLE test1256(name string, id int) LOCATION 'gs://bucket/';
&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I get the following error:&lt;/P&gt;&lt;BLOCKQUOTE&gt;
&lt;P&gt;Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hdpuser1, path="gs://bucket/":hive:hive:drwx------) (state=08S01,code=1)&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;Apart from the changes to core-site.xml, are there any changes to be made in hive-site.xml as well?&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-gcp.html" target="_blank"&gt;https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-gcp.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 25 May 2018 20:09:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211279#M78733</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-05-25T20:09:02Z</dc:date>
    </item>
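The permission string in the error above is the crux of the thread: the GCS connector fabricates POSIX-style owner and permission metadata for objects, and Hive's DDL check applies ordinary POSIX logic to it. A minimal illustrative sketch (not connector code; the function and names are made up) of why a reported hive:hive:drwx------ denies hdpuser1:

```python
# Illustrative sketch only: why Hive's check fails for hdpuser1 when the GCS
# connector reports the path as hive:hive:drwx------ (0o700), and why the
# fs.gs.reported.permissions=777 setting discussed later in the thread resolves it.

def can_read(reported_perms: int, owner: str, user: str) -> bool:
    """POSIX-style check: the owner uses the high triad, everyone else 'other'."""
    triad = (reported_perms // 64) % 8 if user == owner else reported_perms % 8
    return triad >= 4  # values 4..7 all have the read bit set

# With drwx------ (0o700), only the reported owner "hive" passes:
assert can_read(0o700, "hive", "hive")
assert not can_read(0o700, "hive", "hdpuser1")

# With reported permissions of 777, every user passes:
assert can_read(0o777, "hive", "hdpuser1")
```

Because the connector reports a fixed owner, any other user falls into the "other" triad, which is empty under 700; a permissive reported permission makes that triad pass.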
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211280#M78734</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;A href="@sudi ts"&gt; @sudi ts&lt;/A&gt;&lt;BR /&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;You need to copy the connector into the hadoop-client and hive-client location otherwise you will hit an error&lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;cp gcs-connector-latest-hadoop2.jar /usr/hdp/current/hadoop-client/lib/ 
cp gcs-connector-latest-hadoop2.jar /usr/hdp/current/hive-client/lib &lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;The below command should run successfully&lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;$ hdfs dfs -ls gs://bucket/ &lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;This should run fine, but the issue you are having is with permission for hdpuser1 you will need to correct by running &lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;$ hdfs dfs -chown hdpuser1 gs://bucket/ &lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;Now your create table should work, while logged in as hdpuser1 &lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;CREATE EXTERNAL TABLE test1256(name string,id int) LOCATION 'gs://bucket/'; &lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;Please let me know. If you found this answer addressed your question, please take a moment to log in and click the "&lt;STRONG&gt;Accept&lt;/STRONG&gt;" link on the answer.&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 26 May 2018 14:59:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211280#M78734</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2018-05-26T14:59:56Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211281#M78735</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Thanks a lot for the info, but I am still facing the same issue.&lt;/P&gt;&lt;P&gt;I did create the user in AD and have a valid ticket; the hdfs command works when accessing GCS, but I cannot create an external Hive table.&lt;/P&gt;</description>
      <pubDate>Mon, 28 May 2018 21:27:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211281#M78735</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-05-28T21:27:22Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211282#M78736</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;A href="https://community.hortonworks.com/questions/195766/@sudi%20ts"&gt;@sudi ts&lt;/A&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Can you share the latest error?&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 04:25:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211282#M78736</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2018-05-29T04:25:48Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211283#M78737</link>
      <description>&lt;P&gt;Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hdpuser1, path="gs://bucket/":hive:hive:drwx------) (state=08S01,code=1)&lt;/P&gt;&lt;P&gt;hdpuser1 is an AD user. Using the same user I can execute&lt;/P&gt;&lt;PRE&gt;$ hdfs dfs -ls gs://bucket/&lt;/PRE&gt;&lt;P&gt;but when I try to create an external table via beeline, it fails.&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 20:05:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211283#M78737</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-05-29T20:05:10Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211284#M78738</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;A href="https://community.hortonworks.com/questions/195766/@sudi%20ts"&gt;@sudi ts&lt;/A&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;This is clearly a permission issue  "&lt;STRONG&gt;Permission denied: user=hdpuser1, path="gs://bucket/":hive:hive:drwx------)"&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Have you tried using ACL's&lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;&lt;EM&gt;gsutil acl ch -u hdpuser1:WRITE gs://bucket/&lt;/EM&gt;&lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;And retry&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 30 May 2018 01:43:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211284#M78738</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2018-05-30T01:43:52Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211285#M78739</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I did try, but it still fails:&lt;/P&gt;&lt;P&gt;CommandException: hdpuser1:WRITE is not a valid ACL change&lt;BR /&gt;hdpuser1 is not a valid scope type&lt;/P&gt;&lt;P&gt;The GCS bucket has Storage Admin rights granted to the service account, and hadoop fs -ls gs://bucket/ works fine.&lt;/P&gt;</description>
      <pubDate>Wed, 30 May 2018 03:19:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211285#M78739</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-05-30T03:19:32Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211286#M78740</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;A href="https://community.hortonworks.com/questions/195766/@sudi%20ts"&gt;@sudi ts&lt;/A&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Do you have access to the GCP IAM console? When treating a service account as a resource, you can grant permission to a user to access that service account. You can grant the Owner, Editor, Viewer, or &lt;A href="https://cloud.google.com/iam/docs/service-accounts#the_service_account_user_role"&gt;Service Account User&lt;/A&gt; role to a user to access the service account.&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 30 May 2018 03:33:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211286#M78740</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2018-05-30T03:33:18Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211287#M78741</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;&lt;/A&gt;&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I was able to create hive external table pointing my storage as GCS. But it only works as hive superuser but doesn't work as a normal hive user meaning, hdpuser1 cannot create hive table it fails with above error, but if execute su - hive it works .&lt;/P&gt;&lt;P&gt;I am no sure how to rectify this.&lt;/P&gt;</description>
      <pubDate>Thu, 31 May 2018 20:54:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211287#M78741</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-05-31T20:54:30Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211288#M78742</link>
      <description>&lt;P&gt;Hi &lt;A href="https://community.hortonworks.com/questions/195766/hive-with-google-cloud-storage.html#"&gt;@sudi ts&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Can you share some more information about this deployment?&lt;/P&gt;&lt;P&gt;- Is doAs enabled (hive.server2.enable.doAs)?&lt;/P&gt;&lt;P&gt;- What is the authorization mechanism? Is the Ranger authorizer being used?&lt;/P&gt;&lt;P&gt;If you can pull a stack trace from the HiveServer2 logs, that will be very useful.&lt;/P&gt;&lt;P&gt;HDP-2.6.5 ships with the Google connector, so there is no need to replace any jars. The GCS connectivity is working, given that you can create this table when logged in as the hive user and can list files via hadoop fs -ls.&lt;/P&gt;&lt;P&gt;Cloud storage access control is generally handled via cloud-provider constructs such as IAM roles. Hadoop's notion of file owners and permissions doesn't capture this: the user returned by hadoop fs -ls will typically be the logged-in user, and the permissions don't indicate much.&lt;/P&gt;</description>
      <pubDate>Fri, 08 Jun 2018 09:15:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211288#M78742</guid>
      <dc:creator>sseth</dc:creator>
      <dc:date>2018-06-08T09:15:16Z</dc:date>
    </item>
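The doAs question above matters because it decides which user the file-system permission check is applied to, which would explain why su - hive works while hdpuser1 fails. A hedged sketch of the behavior (hive.server2.enable.doAs is a real Hive property; the function here is purely illustrative):

```python
# Illustrative sketch (not HiveServer2 code) of the hive.server2.enable.doAs
# behavior: with doAs enabled, HiveServer2 performs file-system access as the
# end user, so reported permissions are checked against hdpuser1, not hive.

def effective_user(logged_in_user: str, do_as_enabled: bool) -> str:
    service_user = "hive"  # the account HiveServer2 itself runs as
    return logged_in_user if do_as_enabled else service_user

assert effective_user("hdpuser1", True) == "hdpuser1"   # doAs: checks hit hdpuser1
assert effective_user("hdpuser1", False) == "hive"      # no doAs: checks hit hive
```

Under this reading, the thread's symptom (works as hive, fails as hdpuser1) is consistent with doAs being enabled and the connector reporting hive-owned, owner-only permissions.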
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211289#M78743</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/298/sseth.html" nodeid="298"&gt;@sseth&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The issue is resolved after adding the following property in core-site.xml:&lt;/P&gt;&lt;PRE&gt;fs.gs.reported.permissions=777&lt;/PRE&gt;&lt;P&gt;Normal users can now access Hive and create external tables pointing to a GCS location.&lt;/P&gt;</description>
      <pubDate>Fri, 08 Jun 2018 20:33:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211289#M78743</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-06-08T20:33:24Z</dc:date>
    </item>
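The resolution above amounts to adding one property stanza to core-site.xml. A small illustrative generator for it (ElementTree is used here purely so the XML is built programmatically; the property name and value come from the post above):

```python
# Illustrative sketch: emit the core-site.xml stanza for the fix described above.
import xml.etree.ElementTree as ET

def property_stanza(name: str, value: str) -> str:
    """Build a Hadoop-style property/name/value stanza and serialize it."""
    prop = ET.Element("property")
    ET.SubElement(prop, "name").text = name
    ET.SubElement(prop, "value").text = value
    return ET.tostring(prop, encoding="unicode")

stanza = property_stanza("fs.gs.reported.permissions", "777")
assert "fs.gs.reported.permissions" in stanza
```

The value is the octal permission string the connector will report for every object; a permissive value like 777 is what lets non-owner users pass Hive's POSIX-style check.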
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211290#M78744</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/298/sseth.html" nodeid="298"&gt;@sseth&lt;/A&gt; &lt;/P&gt;&lt;P&gt;I have downloaded the latest jar &lt;/P&gt;&lt;P&gt;&lt;A href="https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar"&gt;&lt;STRONG&gt;&lt;/STRONG&gt;&lt;/A&gt;&lt;A href="https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar" target="_blank"&gt;https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Tried creating the external table and its failing with following error:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;FAILED: HiveAccessControlException Permission denied: user [abcd] does not have [READ] privilege on [gs://hdp-opt1/forhive/languages] (state=42000,code=40000)&lt;/P&gt;&lt;P&gt;I have enabled Hive plugin and set the permission of 777 in coresite.xml&lt;/P&gt;&lt;P&gt;Where there any changes made to jar?? I also see few properties have changed in this link:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/gcp-cluster-config.html"&gt;https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/gcp-cluster-config.html&lt;/A&gt; &lt;/P&gt;&lt;P&gt;Is it mandatory to use the json key? If my vm instance has required permission to talk to gcs &lt;/P&gt;</description>
      <pubDate>Thu, 19 Jul 2018 21:23:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211290#M78744</guid>
      <dc:creator>sudarshan_shrid</dc:creator>
      <dc:date>2018-07-19T21:23:20Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211291#M78745</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;A href="https://community.hortonworks.com/questions/195766/@sudi%20ts"&gt;@sudi ts&lt;/A&gt; Were you able to resolve this issue?&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 19 Mar 2019 16:13:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/211291#M78745</guid>
      <dc:creator>agrawalreetika</dc:creator>
      <dc:date>2019-03-19T16:13:23Z</dc:date>
    </item>
    <item>
      <title>Re: Hive with Google Cloud Storage</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/376414#M78746</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;You need to set these three properties in hive-site.xml to get this working with Hive:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;&amp;lt;property&amp;gt;
&amp;lt;name&amp;gt;google.cloud.auth.service.account.json.keyfile&amp;lt;/name&amp;gt;
&amp;lt;value&amp;gt;/home/hadoop/keyfile.json&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;

&amp;lt;property&amp;gt;
&amp;lt;name&amp;gt;fs.gs.reported.permissions&amp;lt;/name&amp;gt;
&amp;lt;value&amp;gt;777&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;

&amp;lt;property&amp;gt;
&amp;lt;name&amp;gt;fs.gs.path.encoding&amp;lt;/name&amp;gt;
&amp;lt;value&amp;gt;/home/hadoop/&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;&lt;/LI-CODE&gt;&lt;P&gt;The same properties can be added to core-site.xml on Hadoop to have this working with HDFS.&lt;/P&gt;&lt;P&gt;In beeline, just execute this and it should work:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INSERT OVERWRITE DIRECTORY 'gs://bucket/table' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' SELECT * FROM table;&lt;/LI-CODE&gt;&lt;P&gt;Please upvote if you found this helpful!&lt;/P&gt;</description>
      <pubDate>Sat, 16 Sep 2023 15:51:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-with-Google-Cloud-Storage/m-p/376414#M78746</guid>
      <dc:creator>nanu</dc:creator>
      <dc:date>2023-09-16T15:51:30Z</dc:date>
    </item>
  </channel>
</rss>

