Member since: 02-24-2016
Posts: 175
Kudos Received: 56
Solutions: 3
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1277 | 06-16-2017 10:40 AM |
| | 10972 | 05-27-2016 04:06 PM |
| | 1281 | 03-17-2016 01:29 PM |
01-23-2017
01:58 PM
@jzhang, checking the log, there is not much info: ERROR [2017-01-23 13:55:13,539] ({pool-2-thread-2} LivyHelper.java[createSession]:97) - sessionId:0.0 state is starting
ERROR [2017-01-23 13:55:14,569] ({pool-2-thread-2} LivyHelper.java[createSession]:97) - sessionId:0.0 state is starting
INFO [2017-01-23 13:55:18,001] ({pool-2-thread-2} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1485179674236 finished by scheduler org.apache.zeppelin.livy.LivySparkInterpreter187724336
And in the Notebook UI: java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier sqlContext found
sqlContext.sql("show databases").show()
01-23-2017
01:19 PM
hi @jzhang, Thanks for the quick response. We are interested in listing the databases, not only the tables. (This is because we want to make sure the logged-in user sees only the databases and tables he has access to.) "Show tables" currently lists only the tables which are registered as temp tables. For example, in the Zeppelin Notebook UI: %livy sqlContext.sql("Show tables").show()
+---------+-----------+
|tableName|isTemporary|
+---------+-----------+
| bank| true|
+---------+-----------+
Now for listing databases: %livy sqlContext.sql("Show databases").show()
java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier Show found
Show databases
^
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
at org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
Now from the spark-shell console (after obtaining a valid Kerberos ticket, for the same user who logged into the Zeppelin UI): scala> sqlContext.sql("show databases").show()
+-----------------+
| result|
+-----------------+
| default|
| prod_tst|
| hive01|
| test_db1|
+-----------------+
How can we make "show databases" work in the Zeppelin notebook? Thanks.
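A likely explanation for the spark-shell vs. Zeppelin difference: in Spark 1.x the plain SQLContext uses a simple SQL parser that does not understand SHOW DATABASES (hence the ``with'' expected failure), while a HiveContext's HiveQL parser does, and spark-shell on a Hive-enabled build typically hands you a HiveContext as sqlContext. A sketch of what one might try in a Zeppelin %livy paragraph, assuming Spark 1.x with Hive support on the Livy session's classpath (HiveContext and show() are standard Spark 1.x API; that this resolves the Livy case is an assumption to verify):

```
%livy
import org.apache.spark.sql.hive.HiveContext
val hiveCtx = new HiveContext(sc)     // sc is the session's SparkContext
hiveCtx.sql("show databases").show()  // HiveQL parser accepts SHOW DATABASES
```

If this lists the databases, the issue is which context (and therefore which parser) the Livy interpreter provides, not Kerberos or impersonation.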
01-23-2017
11:44 AM
Tagging: @azeltov
01-23-2017
11:38 AM
Hello Experts, after going through the Livy documentation available on the internet, it looks like Livy can impersonate the end user, so we decided to test it for our use case before setting it up in production. I could run a simple word-count job through the Livy interpreter and read and write data to HDFS. The next hurdle is to use it to run queries, list databases/tables, etc. I am trying to run the following (one at a time, but I see the same error) in a Zeppelin notebook in a kerberized environment, with %livy, %livy.spark, and %livy.sql:

sqlContext.sql("show databases") // doesn't work
show databases // doesn't work
sqlContext.sql("show databases").show() // doesn't work

Every time I see: java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier "use or select or sqlContext" found. Any idea if I am missing something here?

However, when I run the following I see the list of tables (but still can't list databases):
1) Read the data from HDFS using sc.textFile()
2) Define a case class
3) Parse the file from step #1 and build an RDD of case objects
4) Convert to a DataFrame and create a temp table
5) sqlContext.sql("show tables").toDF().show() // this lists only the temp table

I am interested in listing ALL THE DATABASES and ALL THE TABLES (temp and permanent). What am I missing here? Thanks
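On the impersonation side (separate from the parser error), the settings usually involved in making Livy impersonate the Zeppelin end user on a kerberized cluster are sketched below; property names can vary across HDP/Livy versions, so verify each against your release's documentation before relying on them:

```
# livy.conf on the Livy server (sketch)
livy.impersonation.enabled = true

# Hadoop core-site.xml: allow the livy service user to proxy end users
hadoop.proxyuser.livy.hosts = *
hadoop.proxyuser.livy.groups = *

# Zeppelin livy interpreter settings (values are placeholders)
zeppelin.livy.url = http://<livy-host>:8998
zeppelin.livy.principal = <zeppelin-service-principal>
zeppelin.livy.keytab = <path-to-zeppelin-keytab>
```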
Labels:
- Apache Spark
- Apache Zeppelin
11-16-2016
04:16 PM
Hi @Sunile Manjee, I am also looking for encryption at rest for Kafka messages, but I would prefer not to pass the messages on to HDFS. Consider our requirement to be more or less the "disk theft" scenario, where the disk contains credit card transaction logs for hundreds of credit card holders, or military data. Thanks in advance.
11-03-2016
12:02 PM
You are right, Arpit. It is a deeper issue, related to groups in LDAP.
10-27-2016
08:23 AM
@Ashnee Sharma No one deleted the groups. Digging further, I found this issue is different from deletion of the groups; it's something related to the initial infra setup 🙂 Anyway, accepting the answer as it can help with normal scenarios. Thank you.
10-26-2016
12:50 PM
Before accepting, I am trying to understand the root cause of why I do not have groups for the users on my cluster.
10-26-2016
10:52 AM
Thank you @Ashnee Sharma. Is it a workaround, or does the hdfs user need to be in the hdfs group? I also checked the log: for the spark user it says no groups found for user spark, and the same for the yarn user. Do you think we need to create groups for each user? Thank you.
10-26-2016
09:56 AM
1 Kudo
When I look at the NameNode logs, I continuously see this message repeating in hadoop-hdfs-namenode-<HOST_NAME>.log:

WARN security.UserGroupInformation (UserGroupInformation.java:getGroupNames(1573)) - No groups available for user hdfs

I checked on the NameNode: the user hdfs does exist on the host and is a member of the group hadoop.

$ id hdfs
uid=123(hdfs) gid=663(hadoop) groups=993(hadoop)

I am concerned for three reasons:
1) Useful information gets hidden under this spam
2) Possible performance hit due to the extra I/O
3) The logs grow unnecessarily and occupy disk space

What configuration do you think we are missing here? Thanks, SS
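For context on where the warning comes from: by default Hadoop resolves a user's groups with a shell-based mapping on the NameNode host, essentially what `id -Gn <user>` prints there; if that looks correct locally but the warning persists, the configured mapping class (hadoop.security.group.mapping, e.g. an LDAP-based mapping) is the next thing to check. A minimal sketch of the local check, assuming a Linux host (the user name hdfs is taken from the log above):

```shell
#!/bin/sh
# Sketch: see what a shell-based group lookup returns for a user,
# roughly the same information Hadoop's default mapping derives.
user=hdfs
if id "$user" >/dev/null 2>&1; then
  id -Gn "$user"                      # groups as the OS sees them
else
  echo "user $user not found locally" # the mapping would also find nothing
fi
```

On the cluster itself, `hdfs groups hdfs` asks the NameNode directly for its view of the mapping, which is the view the warning is about.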
Labels:
- Apache Hadoop