
Unable to store data into a Hive table from Spark using the Hive Warehouse Connector in HDP 3.0.1 from Eclipse


Tried the following configuration:

pom.xml:

<dependencies>
  <dependency>
    <groupId>com.hortonworks.hive</groupId>
    <artifactId>hive-warehouse-connector_2.11</artifactId>
    <version>1.0.0.7.1.1.0-565</version>
  </dependency>
</dependencies>

<repositories>
  <repository>
    <releases>
      <enabled>true</enabled>
    </releases>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
    <id>hortonworks.extrepo</id>
    <name>Hortonworks HDP</name>
    <url>http://repo.hortonworks.com/content/repositories/releases</url>
  </repository>
  <repository>
    <releases>
      <enabled>true</enabled>
    </releases>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
    <id>Cloudera.extrepo</id>
    <name>Cloudera HDP</name>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
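One thing I am not sure about: the 1.0.0.7.1.1.0-565 version above is a CDP 7.1.1 build, while the cluster runs HDP 3.0.1, so the dependency should presumably match the cluster build instead. A sketch of what I assume the HDP 3.0.1 coordinates would look like (the 3.0.1.0-187 build suffix is inferred from the HDP version string, not verified against the repository):

<!-- Assumed HDP 3.0.1 coordinates; the build suffix is an unverified guess -->
<dependency>
  <groupId>com.hortonworks.hive</groupId>
  <artifactId>hive-warehouse-connector_2.11</artifactId>
  <version>1.0.0.3.0.1.0-187</version>
</dependency>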

 

Code changes made:

import com.hortonworks.hwc.HiveWarehouseSession
import com.hortonworks.hwc.HiveWarehouseSession._

// Build the HWC session from the active SparkSession
val hive = HiveWarehouseSession.session(spark).build()
hive.showDatabases().show()
hive.setDatabase("default")
hive.createTable("newTable").ifNotExists().column("abc", "string").create()

// HIVE_WAREHOUSE_CONNECTOR (imported above) resolves to
// "com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector"
datasetFile.write.format(HIVE_WAREHOUSE_CONNECTOR)
  .option("database", "default")
  .option("table", "newTable")
  .save()
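If the write ever succeeds, I would expect to read the rows back through the same HWC session, along these lines (a sketch, assuming the table created above):

// Read back through HiveServer2 Interactive via the same HWC session
hive.executeQuery("SELECT * FROM newTable").show()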

 

Getting the below error:

Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [anonymous] does not have [USE] privilege on [default]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:483)
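My understanding (an assumption on my part, not something I have confirmed in the docs) is that HWC talks to HiveServer2 Interactive over JDBC, and because I never supply a user on that connection it logs in as anonymous, which is exactly the user Ranger rejects. A JDBC URL carrying an explicit user would presumably look like this (host and port are placeholders):

spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://<hs2-host>:10500/default;user=hive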

 

Tried the following fixes:

1) Apache Ranger policy change:

Added the anonymous user to the all-privileges section of the all-databases policy.

 

2) Set the below properties before creating the Spark session (a fuller builder sketch follows after this list):

conf.set("spark.hadoop.metastore.catalog.default","hive")
conf.set("spark.datasource.hive.warehouse.load.staging.dir","/tmp")

 

3) I am able to read the Hive tables from spark-shell (via spark.sql) when launched with the below command:

spark-shell --conf spark.hadoop.metastore.catalog.default=hive

But I am still unable to write data from Spark to the Hive table.
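For reference, this is the full SparkSession setup I believe the connector needs, pieced together from the HWC property names in the HDP 3 docs; every hostname, port, and the user below is a placeholder, not a value from my cluster:

import org.apache.spark.sql.SparkSession

// Sketch only: property names are from the HDP 3 HWC docs, values are placeholders
val spark = SparkSession.builder()
  .appName("hwc-write-test")
  // The user on this JDBC URL is the identity Ranger authorizes;
  // omitting it is what I suspect produces [anonymous]
  .config("spark.sql.hive.hiveserver2.jdbc.url",
    "jdbc:hive2://<hs2-host>:10500/default;user=hive")
  .config("spark.datasource.hive.warehouse.metastoreUri", "thrift://<metastore-host>:9083")
  .config("spark.datasource.hive.warehouse.load.staging.dir", "/tmp")
  .config("spark.hadoop.hive.llap.daemon.service.hosts", "@llap0")
  .config("spark.hadoop.hive.zookeeper.quorum", "<zk-host>:2181")
  .config("spark.hadoop.metastore.catalog.default", "hive")
  .getOrCreate()

The same settings can be passed to spark-shell with repeated --conf flags, plus --jars pointing at the HWC assembly shipped with the cluster (the path below is my assumption of the usual HDP location):

spark-shell --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hs2-host>:10500/default;user=hive"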

Kindly help me on this.
