03-14-2018 03:57 PM
@Neeraj Sabharwal Very nice, complete article; it worked as described. However, I ran into a small issue: Hive throws a "Permission denied" error for the GCS data, even though `hdfs dfs -ls` on the bucket succeeds. The only difference I see from your trace is that the owner of the GCS path shows as "hdfs" for you but as "root" for me. I also tried making the file in the bucket a public link, with no luck. Any suggestions? Full trace below:

hive> !hdfs dfs -ls gs://bmasna-csv-test/ ;
18/03/14 15:50:39 INFO gcs.GoogleHadoopFileSystemBase: GHFS version: 1.7.0-hadoop2
18/03/14 15:50:40 WARN gcs.GoogleHadoopFileSystemBase: No working directory configured, using default: 'gs://bmasna-csv-test/'
Found 1 items
drwx------ - root root 0 2018-03-13 17:23 gs://bmasna-csv-test/data1
hive> CREATE EXTERNAL TABLE IF NOT EXISTS EMP(
> EmployeeID INT,FirstName STRING, Title STRING,
> State STRING)
> COMMENT 'Employee'
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ','
> STORED AS TEXTFILE
> LOCATION 'gs://bmasna-csv-test/data1/';
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=root, path="gs://bmasna-csv-test/data1":atxhive:atxhive:drwx------)
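If I am reading the trace right, the directory `gs://bmasna-csv-test/data1` is owned by `atxhive` with mode `drwx------`, so only the owner gets any access and `user=root` is denied. A quick sketch to double-check that reading of the mode bits (the helper `mode_to_octal` is just my own illustration, not part of Hive or the GCS connector):

```python
def mode_to_octal(mode):
    """Convert an ls-style mode string like 'drwx------' to its three octal digits."""
    bits = mode[1:]  # drop the file-type character ('d' for directory)
    digits = []
    for i in range(0, 9, 3):
        triad = bits[i:i + 3]
        # r=4, w=2, x=1; '-' contributes nothing
        digits.append(sum(w for c, w in zip(triad, (4, 2, 1)) if c != '-'))
    return digits

# Mode from the failing path in the trace above:
owner, group, other = mode_to_octal("drwx------")
print(owner, group, other)  # 7 0 0: only the owner (atxhive) can enter/read the dir
```

That would explain why the DDL fails for `root` even though the listing itself works.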