Support Questions


Spark create table error: does not have privileges

New Contributor

Hello, we are using Spark for ETL and in some cases create tables from Spark. These CREATE TABLE statements fail with a permissions error, but the same query works fine when run in Hive. Please help us solve this problem.

CDH 5.8.3

Steps to reproduce:
- grant privileges in sentry:
    GRANT ALL ON DATABASE some_database TO ROLE some_role;
    GRANT ALL ON URI "/some/path" TO ROLE some_role WITH GRANT OPTION;
- grant privileges in hdfs
    hdfs dfs -chown -R some-user:some_role /some/path
- start spark-shell and run (triple quotes are needed for a multi-line string in Scala):
    sqlContext.sql("""CREATE EXTERNAL TABLE some_database.some_table( id decimal(38,12), ...)
      PARTITIONED BY (`deleted` string, `month` string)
      ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
      STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
      LOCATION '/some/path/some_table'""")
- get error:
    2017-08-15 09:11:17,755 ERROR [Driver] exec.DDLTask (DDLTask.java:failed(520)) - org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User some-user does not have privileges for CREATETABLE)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:764)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4082)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:306)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1782)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1539)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1318)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:486)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:475)
        at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:281)
        at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:228)
        at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:227)
        at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:270)
        at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:475)
        at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:465)
        at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:607)
        at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
        at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
        at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
        at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
        at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
        at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
        at ru.sberbank.bigdata.cloud.rb.internal.sources.history.SaveTableChanges.createResultTable(SaveTableChanges.java:104)
        at ru.sberbank.bigdata.cloud.rb.internal.sources.history.SaveTableChanges.run(SaveTableChanges.java:74)
        at ru.sberbank.bigdata.cloud.rb.internal.sources.history.SaveTableChanges.main(SaveTableChanges.java:180)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)
Caused by: MetaException(message:User some-user does not have privileges for CREATETABLE)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:30072)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:30040)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:29966)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1079)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1065)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2084)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:681)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:669)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
        at com.sun.proxy.$Proxy26.createTable(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2016)
        at com.sun.proxy.$Proxy26.createTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:758)
        ... 40 more

2017-08-15 09:11:17,756 ERROR [Driver] ql.Driver (SessionState.java:printError(939)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:User some-user does not have privileges for CREATETABLE)
2017-08-15 09:11:17,756 INFO  [Driver] log.PerfLogger (PerfLogger.java:PerfLogEnd(168)) - </PERFLOG method=Driver.execute start=1502777477667 end=1502777477756 duration=89 from=org.apache.hadoop.hive.ql.Driver>
2017-08-15 09:11:17,756 INFO  [Driver] ql.Driver (Driver.java:execute(1704)) - Completed executing command(queryId=); Time taken: 0.089 seconds
2017-08-15 09:11:17,756 INFO  [Driver] log.PerfLogger (PerfLogger.java:PerfLogBegin(127)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-08-15 09:11:17,756 INFO  [Driver] log.PerfLogger (PerfLogger.java:PerfLogEnd(168)) - </PERFLOG method=releaseLocks start=1502777477756 end=1502777477756 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2017-08-15 09:11:17,765 ERROR [Driver] client.ClientWrapper (Logging.scala:logError(74)) -

4 Replies

Champion

@makcuk

Are you using DataFrames? If so, it may be one of the known issues mentioned in the link below. Look for the topic "Tables saved with the Spark SQL DataFrame.saveAsTable method are not compatible with Hive":

https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_rn_spark_ki.html#concept_...

If so, the link has a workaround as well.

New Contributor

@saranvisa

Hello, I use sqlContext.sql (see the steps to reproduce above).

Champion

@makcuk

 

I went through your steps in detail and have a few questions/suggestions that may help:

1. The two grants (DATABASE to ROLE and URI to ROLE) are fine, but the ROLE must also be granted to a user or group. Please include this step.

2. You mentioned that "when we try the same query in Hive, it works well". How did you run it: via beeline, the hive CLI, or Hue?

3. I assume you have Kerberos enabled. If so, make sure the required principals are added for Hive and Spark.

4. If you create the table via the CLI or beeline, please check the default principal with klist.
5. If you try it via Hue, make sure the Sentry Service is enabled under CM -> Hue -> Configuration.
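For point 1, a minimal sketch of the missing grant, assuming `some_group` is a placeholder for the OS/LDAP group that `some-user` belongs to (Sentry attaches roles to groups, not directly to users):

```sql
-- Existing grants: privileges attached to the role
GRANT ALL ON DATABASE some_database TO ROLE some_role;
GRANT ALL ON URI "/some/path" TO ROLE some_role;

-- Missing step: attach the role to the user's group
GRANT ROLE some_role TO GROUP some_group;
```

Without the last statement, the role exists and carries privileges, but no user is ever resolved to it, so the metastore check fails.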

 

New Contributor

@saranvisa

Thanks for the answers!

1. Yes, we grant the role to the user.

2. We tried it in Hue.

5. Yes, the Sentry service is enabled.

I added GRANT ALL ON SERVER, and with that grant everything works fine...
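For reference, the server-level grant described above would look something like this; `server1` is an assumption (it is the common default Sentry server name, configured via `hive.sentry.server` in the cluster's Hive configuration):

```sql
GRANT ALL ON SERVER server1 TO ROLE some_role;
```

Note that ALL ON SERVER is far broader than the original database- and URI-level grants, so this works around the problem rather than explaining it; it may be worth narrowing the grant again once the root cause is found.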