Member since: 08-10-2016
Posts: 170
Kudos Received: 14
Solutions: 6

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 19906 | 01-31-2018 04:55 PM |
|  | 4269 | 11-29-2017 03:28 PM |
|  | 1889 | 09-27-2017 02:43 PM |
|  | 2044 | 09-12-2016 06:36 PM |
|  | 1979 | 09-02-2016 01:58 PM |
04-05-2018
02:21 PM
Using HDP 2.6.3 with LLAP in Zeppelin, with Hive impersonation = true and LLAP doAs = false. When I run:

insert into table myDatabase.test_sql values "test"

I get the following error message:

shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [test] does not have [CREATE] privilege on [default/tmp_0218b93f51bf49afb291f47ca315ee57]

I do not have permission to create tables in default, only in myDatabase. How can I change where this internal temp table is created, so that it writes to myDatabase (where I have full permissions) instead of default?
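For anyone reproducing this, the runnable form of the statement (Hive's INSERT ... VALUES wants a parenthesized row list; otherwise identical to the above):

-- Runnable form of the statement above; Hive requires parentheses
-- around each row in an INSERT ... VALUES clause.
INSERT INTO TABLE myDatabase.test_sql VALUES ('test');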
Labels:
- Apache Hive
- Apache Ranger
03-29-2018
09:12 PM
No idea why, but Livy stopped working today; it spits the following out non-stop. Anyone got an idea of where to look?

org.apache.zeppelin.livy.LivyException: Session 349 is finished, appId: application_1522011783190_0196, log: [
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866),
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66),
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:766),
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala),
18/03/29 17:06:54 INFO ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.ClassNotFoundException: org.apache.livy.rsc.driver.RSCDriverBootstrapper),
18/03/29 17:06:54 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: java.lang.ClassNotFoundException: org.apache.livy.rsc.driver.RSCDriverBootstrapper),
18/03/29 17:06:54 INFO ApplicationMaster: Deleting staging directory hdfs://lrdccdhm01.cloud.res.bngf.local:8020/user/andm013/.sparkStaging/application_1522011783190_0196,
18/03/29 17:06:55 INFO ShutdownHookManager: Shutdown hook called,
,
Failing this attempt. Failing the application.]

Livy logs:

18/03/29 17:07:35 INFO RSCClient: Failing pending job 94534a77-1855-45c9-bc18-8e53c8e42165 due to shutdown.
18/03/29 17:07:35 INFO InteractiveSession: Failed to ping RSC driver for session 348. Killing application.
18/03/29 17:07:35 INFO InteractiveSession: Stopping InteractiveSession 348...
18/03/29 17:07:35 WARN InteractiveSession: (Fail to get rsc uri,java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for context to start.)
18/03/29 17:07:38 INFO InteractiveSession: Stopped InteractiveSession 348.
18/03/29 17:08:09 ERROR RSCClient: Failed to connect to context.
java.util.concurrent.TimeoutException: Timed out waiting for context to start.
at org.apache.livy.rsc.ContextLauncher.connectTimeout(ContextLauncher.java:134)
at org.apache.livy.rsc.ContextLauncher.access$300(ContextLauncher.java:63)
at org.apache.livy.rsc.ContextLauncher$2.run(ContextLauncher.java:122)
at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
18/03/29 17:08:09 INFO RSCClient: Failing pending job 04982890-5fc8-4015-98a9-ad888b4e6b45 due to shutdown.
18/03/29 17:08:09 INFO InteractiveSession: Failed to ping RSC driver for session 349. Killing application.
18/03/29 17:08:09 INFO InteractiveSession: Stopping InteractiveSession 349...
18/03/29 17:08:09 WARN InteractiveSession: (Fail to get rsc uri,java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for context to start.)
18/03/29 17:08:09 INFO InteractiveSession: Stopped InteractiveSession 349.
Labels:
- Apache Spark
- Apache Zeppelin
03-27-2018
07:55 PM
I ran into this issue too. There clearly are daemons running; they just aren't able to do anything.
03-14-2018
05:36 PM
This doesn't work for HDP 2.6.3.
03-14-2018
05:15 PM
@Kshitij Badani How do we get full write access to LLAP in HDP 2.6.3? I'm happy to do the work to make this work; otherwise I'll have to tell my client to downgrade back to 2.6.2, and I'd prefer not to do that.
03-01-2018
02:53 AM
Could I use the 1.1.3-2.1 jar in Livy to get the feature I require?
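If it can, I assume it would be wired in the same way as the assembly jar in my interpreter settings below, via livy.spark.jars (path and jar name hypothetical):

livy.spark.jars /user/zeppelin/lib/spark-llap-assembly-1.1.3-2.1.jar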
03-01-2018
02:22 AM
@Kshitij Badani Thanks so much for replying, and for writing the original article. I confess I can't read the support matrix. I would have thought that, since I'm using Spark 2.2 and HDP 2.6.3 (which is admittedly not on the chart), I would get the equivalent of v1.1.3-2.1. I am sure you can read this table and understand it better than I can. Can you explain? I'm not questioning that you're right... I'm looking for understanding.
02-28-2018
04:49 PM
I am trying to get row-level security for Zeppelin. I followed:

https://community.hortonworks.com/articles/110093/using-rowcolumn-level-security-of-spark-with-zeppe.html
https://community.hortonworks.com/content/kbentry/101181/rowcolumn-level-security-in-sql-for-apache-spark-2.html
https://community.hortonworks.com/questions/132769/problem-with-zeppelin-sparklivy-and-llap-in-kerber.html

Huge thanks to @Dongjoon Hyun, @Kshitij Badani, and @Berry Österlund for their work in this area.

In Hive, I set "Run as end user instead of Hive user" to 'false'.

I am running a simple test in Zeppelin:

%livy2.spark
val wordsCounts = spark.sparkContext.parallelize(Seq(("a",1),("b",2))).toDF
wordsCounts.write.saveAsTable("ZeppelinTest")
I am now getting an error:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.security.AccessControlException: Permission denied: user=ingest, access=READ, inode="/apps/hive/warehouse":hive:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:252)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:428)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:304)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1956)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1940)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1914)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8792)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1466)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
);
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.getDatabase(HiveExternalCatalog.scala:189)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getDatabaseMetadata(SessionCatalog.scala:241)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.defaultTablePath(SessionCatalog.scala:443)
at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:609)
at org.apache.spark.sql.DataFrameWriter.createTable(DataFrameWriter.scala:419)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:354)
... 50 elided
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.security.AccessControlException: Permission denied: user=ingest, access=READ, inode="/apps/hive/warehouse":hive:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:252)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:428)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:304)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1956)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1940)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1914)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8792)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1466)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1305)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getDatabase$1.apply(HiveClientImpl.scala:349)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getDatabase$1.apply(HiveClientImpl.scala:355)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:291)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:232)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:231)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:274)
at org.apache.spark.sql.hive.client.HiveClientImpl.getDatabase(HiveClientImpl.scala:348)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getDatabase$1.apply(HiveExternalCatalog.scala:190)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getDatabase$1.apply(HiveExternalCatalog.scala:190)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
... 69 more
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: java.security.AccessControlException: Permission denied: user=edh_Ingest, access=READ, inode="/apps/hive/warehouse":hive:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:252)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:428)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:304)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1956)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1940)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1914)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8792)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1466)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_database_result$get_database_resultStandardScheme.read(ThriftHiveMetastore.java:15345)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_database_result$get_database_resultStandardScheme.read(ThriftHiveMetastore.java:15313)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_database_result.read(ThriftHiveMetastore.java:15244)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:654)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:641)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1158)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
at com.sun.proxy.$Proxy35.getDatabase(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1301)
... 79 more
My Livy interpreter settings:

livy.spark.hadoop.hive.llap.daemon.serivice.hosts @llap0
livy.spark.jars /user/zeppelin/lib/spark-llap-assembly-1.0.0.2.6.3.0-235.jar
livy.spark.jars.packages
livy.spark.sql.hive.hiveserver2.jdbc.url jdbc:hive2://hive.local:10500/
livy.spark.sql.hive.hiveserver2.jdbc.url.principal hive/_HOST@SOMETHING.LOCAL
livy.spark.sql.hive.llap true
livy.spark.yarn.security.credentials.hiveserver2.enabled true
zeppelin.interpreter.localRepo /usr/hdp/current/zeppelin-server/local-repo/2C8A4SZ9T_livy2
zeppelin.interpreter.output.limit 102400
zeppelin.livy.concurrentSQL false
zeppelin.livy.displayAppInfo true
zeppelin.livy.keytab /etc/security/keytabs/zeppelin.server.kerberos.keytab
zeppelin.livy.principal zeppelin@SOMETHING.LOCAL
zeppelin.livy.pull_status.interval.millis 1000
zeppelin.livy.session.create_timeout 120
zeppelin.livy.spark.sql.maxResult 1000
zeppelin.livy.url http://livy.local:8999
Versions:
Spark2 2.2.0
Zeppelin Notebook 0.7.3
Hive 1.2.1000
HDP 2.6.3

FYI, again: I have set "Run as end user instead of Hive user" to 'false'. Any ideas or thoughts would be appreciated.
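One variant I may try, on the assumption that the failure comes from resolving the default database's location (see the getDatabase/defaultTablePath frames in the stack trace); the database name mydb is hypothetical:

// Hypothetical sketch: qualify the table with a database the end user
// can access, so Spark resolves that database's location rather than
// the default database under /apps/hive/warehouse.
wordsCounts.write.saveAsTable("mydb.ZeppelinTest")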
02-16-2018
02:57 AM
Here you go: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_command-line-installation/content/determine-hdp-memory-config.html
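That page also documents the hdp-configuration-utils.py helper; a sample invocation (values illustrative; check the page for the exact flags):

# Hypothetical example: 16 cores, 64 GB of RAM, 4 data disks, HBase installed
python hdp-configuration-utils.py -c 16 -m 64 -d 4 -k True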
02-07-2018
09:04 PM
Maybe it's clearer to call it cluster-name instead of "identity-assertion". Apache Knox: https://{gateway-host}:{gateway-port}/{gateway-path}/{cluster-name}/webhdfs
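With concrete values plugged in (host, port, and topology name hypothetical), a WebHDFS call through Knox would look like:

https://knox.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS

Here "gateway" is the gateway path and "default" is the topology file name, which is what serves as the cluster name.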