
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory

Contributor

Hi, we have a HANA-Hadoop integrated implementation using SAP HANA Spark Controller.

When we create a data preview in HANA for a virtual table built on a Hive table, we get a ClassNotFoundException in the Spark Controller log.

While creating the data preview, HANA sends a SELECT query to Hive via Spark Controller.

Other functionality works: table details such as name and schema all show up in HANA. The only issue we are facing is that the data itself cannot be fetched.

Environment:

  • HANA SPS10
  • Spark 1.4.1.2.3
  • Spark Controller 1.5 Patch 0
  • Hive 1.2.1.2.3

Error log:

2016-11-21 06:09:23,632 [DEBUG] <?xml version="1.0"?><hana:queryplan xmlns:hana="http://www.sap.com/hana"><select><sql>SELECT "employee"."name", "employee"."dept", "employee"."level" FROM "big_poc"."employee" "employee" LIMIT 200 </sql><typeInfo><column><name>COL0</name><type>29</type></column><column><name>COL1</name><type>29</type></column><column><name>COL2</name><type>3</type></column></typeInfo><parameters/><relocators/></select></hana:queryplan>

2016-11-21 06:09:25,078 [ERROR] java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory
    at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:399)
    at org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:867)
    at org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:589)
    at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:174)
    at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:116)
    at org.apache.spark.sql.hive.client.ClientWrapper.org$apache$spark$sql$hive$client$ClientWrapper$$toQlTable(ClientWrapper.scala:237)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getAllPartitions$1.apply(ClientWrapper.scala:297)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getAllPartitions$1.apply(ClientWrapper.scala:296)
    at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:155)
    at org.apache.spark.sql.hive.client.ClientWrapper.getAllPartitions(ClientWrapper.scala:296)
    at org.apache.spark.sql.hive.client.HiveTable.getAllPartitions(ClientInterface.scala:74)
    at org.apache.spark.sql.hive.MetastoreRelation.<init>(HiveMetastoreCatalog.scala:645)
    at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:248)
    at org.apache.spark.sql.hive.hana.HanaESSQLContext$$anon$1.org$apache$spark$sql$hive$hana$HanaESCatalogNew$$super$lookupRelation(HanaESSQLContext.scala:27)
    at org.apache.spark.sql.hive.hana.HanaESCatalogNew$class.lookupRelation(HanaESCatalogNew.scala:27)
    at org.apache.spark.sql.hive.hana.HanaESSQLContext$$anon$1.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HanaESSQLContext.scala:27)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:165)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:165)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)
    at org.apache.spark.sql.hive.hana.HanaESSQLContext$$anon$1.lookupRelation(HanaESSQLContext.scala:27)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:222)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$7.applyOrElse(Analyzer.scala:233)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$7.applyOrElse(Analyzer.scala:229)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:221)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:242)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
    at scala.collection.AbstractIterator.to(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:272)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:227)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:242)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
    at scala.collection.AbstractIterator.to(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:272)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:227)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:212)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:229)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:219)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:61)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:59)
    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
    at scala.collection.immutable.List.foldLeft(List.scala:84)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:59)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:51)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:51)
    at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:933)
    at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:933)
    at org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:931)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
    at org.apache.spark.sql.hive.hana.HanaSQLContext$class.executeHANAQueryTask(HanaSQLContext.scala:115)
    at org.apache.spark.sql.hive.hana.HanaESSQLContext.executeHANAQueryTask(HanaESSQLContext.scala:23)
    at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anonfun$applyOrElse$8.apply(CommandRouter.scala:278)
    at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2$$anonfun$applyOrElse$8.apply(CommandRouter.scala:275)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at com.sap.hana.spark.network.CommandHandler$$anonfun$receive$2.applyOrElse(CommandRouter.scala:275)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at com.sap.hana.spark.network.CommandHandler.aroundReceive(CommandRouter.scala:162)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:376)
    at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:381)
    ... 91 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:154)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:142)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:366)
    ... 92 more


3 REPLIES

Cloudera Employee

Are you using SQL Standard Based Hive authorization?

It looks like it is not configured correctly:

[ERROR] java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory

Please set the following properties (a sample hive-site.xml sketch follows the list):

  • hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory
  • hive.security.authorization.enabled=true
  • hive.security.authenticator.manager=org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator
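
For reference, a minimal sketch of how these entries would look in hive-site.xml; the property names and values come from the list above, and the XML wrapper is just the standard Hadoop configuration format:

<!-- hive-site.xml entries for SQL Standard Based Hive authorization -->
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
</property>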

Contributor

Hi,

Changing the value of

hive.security.authorization.manager = org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider

worked.

We changed the hive-site.xml used by Spark Controller; a sketch of the entry is below.
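
A minimal sketch of the corresponding hive-site.xml entry on the Spark Controller side; the property name and value are from this thread, and the XML wrapper is the standard Hadoop configuration format:

<!-- hive-site.xml on the Spark Controller host; value from the fix described above -->
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider</value>
</property>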

The hive-site.xml on the Hive client side still has the proper authorization settings.

Issue Resolved.

Cloudera Employee

Glad that helped!
