Member since: 08-13-2019
Posts: 84
Kudos Received: 233
Solutions: 15
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2101 | 02-28-2018 09:27 PM
 | 3190 | 01-25-2018 09:44 PM
 | 6248 | 09-21-2017 08:17 PM
 | 3584 | 09-11-2017 05:21 PM
 | 3238 | 07-13-2017 04:56 PM
02-28-2018
09:27 PM
1 Kudo
@Matt Andruff The operation you are trying to do is basically saving a temporary Spark table into Hive via Livy (i.e. a Spark app). If you check the second table in this support matrix, that is not a supported operation via the spark-llap connector: https://github.com/hortonworks-spark/spark-llap/wiki/7.-Support-Matrix#spark-shells-and-spark-apps However, such operations (i.e. creating a table) should be supported by the jdbc(spark1) interpreter, as mentioned in table 1 at the same link. jdbc(spark1) directs the query through the Spark Thrift Server, which runs as the 'hive' principal, as mentioned in the same wiki. If you still want the above operation to succeed via Livy, the user logged in to Zeppelin needs proper authorization on the Hive warehouse directory; only then will Spark be able to save the table into the Hive warehouse for you. Hope that helps.
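As a minimal sketch of granting that access from the command line, assuming HDFS ACLs are enabled, the default HDP warehouse path /apps/hive/warehouse, and a hypothetical Zeppelin user 'zepuser' (adjust both, and check your Ranger/HDFS policies, for your environment):

# run as the hdfs superuser; grants 'zepuser' rwx on the warehouse directory via an HDFS ACL
hdfs dfs -setfacl -m user:zepuser:rwx /apps/hive/warehouse
# optionally make newly created sub-directories inherit the same ACL
hdfs dfs -setfacl -m default:user:zepuser:rwx /apps/hive/warehouse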
01-25-2018
09:44 PM
1 Kudo
@Sridhar Reddy Since the Spark2 interpreter is in globally shared mode, there is only one Spark2 session (i.e. one Spark2 context) shared between all users and all notebooks in Zeppelin. A variable defined in one paragraph of one notebook may be accessed freely in other paragraphs of the same notebook, and, for that matter, in paragraphs of other notebooks as well. Attaching screenshots: screen-shot-2018-01-25-at-14317-pm.png, screen-shot-2018-01-25-at-14344-pm.png
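As a quick illustration (the variable name is just a placeholder), a value defined in a paragraph of one note:

%spark2
val sharedValue = 42

can then be read from a paragraph of a completely different note, because both run against the same shared Spark2 context:

%spark2
println(sharedValue)  // prints 42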
10-12-2017
09:47 PM
3 Kudos
Thanks @dbalasundaran for pointing to the article; this works for me. There is one caveat, though: if your cluster is Kerberos-enabled, one more step is required before installing the service in the last step. Send a POST request to "/credentials/kdc.admin.credential" with the data '{ "Credential" : { "principal" : "user@EXAMPLE.COM", "key" : "password", "type" : "temporary" } }'
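For reference, a hedged example of that POST using curl; the Ambari host, port, admin credentials and cluster name ('mycluster') below are placeholders for your own values:

curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d '{ "Credential" : { "principal" : "user@EXAMPLE.COM", "key" : "password", "type" : "temporary" } }' \
  http://ambari-host:8080/api/v1/clusters/mycluster/credentials/kdc.admin.credential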
10-12-2017
08:14 PM
5 Kudos
I want to install the 'Zeppelin' service via the Ambari REST API and have the Zeppelin server running on one particular node. How do I do it?
Labels:
- Apache Ambari
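For anyone landing on this question, a rough, hedged sketch of the usual Ambari REST sequence for adding a service and pinning its server component to one node (not necessarily the exact flow from the article mentioned in the reply above; 'mycluster', 'ambari-host' and 'node1.example.com' are placeholders, ZEPPELIN_MASTER is the Zeppelin server component name in HDP stacks so verify it against your stack definition, and the required configuration types may also need to be created before the install step):

# 1. register the service with the cluster
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  http://ambari-host:8080/api/v1/clusters/mycluster/services/ZEPPELIN
# 2. register the Zeppelin server component
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  http://ambari-host:8080/api/v1/clusters/mycluster/services/ZEPPELIN/components/ZEPPELIN_MASTER
# 3. place the component on the desired node
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  http://ambari-host:8080/api/v1/clusters/mycluster/hosts/node1.example.com/host_components/ZEPPELIN_MASTER
# 4. install the service (change the state to STARTED afterwards to start it)
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Install Zeppelin"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/mycluster/services/ZEPPELIN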
09-22-2017
05:53 PM
2 Kudos
@Shota Akhalaia My guess is that when you have /** = authc before /api/interpreter/** = authc, roles[admin], the authorization you grant only to 'admin' users for /api/interpreter/** gets overridden by /** = authc, which basically makes all APIs accessible to all roles. I tried it on my instance: putting /** = authc as the first line indeed makes the Interpreters page accessible to all users, whereas making it the last line restricts it to 'admin' users. The linked document also suggests placing it last. So please try this and let me know if it works:
[urls]
/api/interpreter/** = authc, roles[admin]
/api/configuration/** = authc, roles[admin]
/api/credential/** = authc, roles[admin]
/** = authc
#/** = anon
09-21-2017
08:17 PM
6 Kudos
@Shota Akhalaia Can you try configuring the [urls] section as shown in this example document: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.0/bk_zeppelin-component-guide/content/config-example.html ? I am just wondering whether the order of this line in shiro.ini matters: /** = authc
09-21-2017
06:35 PM
5 Kudos
@Sudheer Velagapudi If you look at the Zeppelin JDBC interpreter configuration, you will see these four properties: default.driver (your driver, e.g. org.postgresql.Driver), default.url (the JDBC connection URL), default.user, and default.password. Configure these four properties and then use %jdbc to run SQL queries. Please go through this page for more information: https://zeppelin.apache.org/docs/0.6.1/interpreter/jdbc.html
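For example, with PostgreSQL the interpreter settings might look like this (the host, database and credentials below are placeholders for your own values):

default.driver = org.postgresql.Driver
default.url = jdbc:postgresql://db-host:5432/mydb
default.user = myuser
default.password = mypassword

After saving and restarting the interpreter, a paragraph can then run plain SQL:

%jdbc
select * from my_table limit 10

Note that the JDBC driver JAR also has to be available to the interpreter, for example by adding it as an artifact/dependency in the interpreter settings.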
09-11-2017
05:21 PM
8 Kudos
@anjul tiwari
1. "When I provide only read permission to a user and share it in report mode, the user is able to view the notebook but is not allowed to run the paragraphs." That is expected behavior: a person who has 'Read Only' permission can view the notebook and its visualizations/tables, but cannot run the paragraphs or change the code.
2. "When I provide both read and write permissions, the user is not only allowed to run the code but is also able to view the code and change the mode as well." When you provide 'read' and 'write' permissions to a user, he will be able to change the code as well as run it, but he should not be allowed to change the 'mode'/'permissions'. Can you confirm that the `zeppelin.anonymous.allowed` and `zeppelin.notebook.public` properties are set to 'false'?
In any case, if I understand correctly, you want a mode where a person cannot read or modify the code but can still run it and visualize the results. That mode is not supported in Zeppelin currently.
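For reference, those two properties live in zeppelin-site.xml (or the corresponding Ambari configuration section) and look like this when disabled:

<property>
  <name>zeppelin.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>zeppelin.notebook.public</name>
  <value>false</value>
</property>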
07-13-2017
04:56 PM
6 Kudos
@shivanand khobanna Are you defining those variables with the %spark interpreter? In that case, note that the default mode of the %spark interpreter is 'Globally Shared'. In shared mode, a single JVM process and a single interpreter group serve all notes, so variables defined in one note become visible to all users and all notebooks. The behavior you are seeing is by design. You can change interpreter modes through the Interpreters page, but it is better to use the 'livy' interpreter, which uses 'Per User Scoped' mode by default on HDP-installed Zeppelin. That means each user who uses the %livy interpreter gets a separate YARN application, and hence a separate Spark context, which prevents variables defined by one user from being shared with another. Please check out this article for more information on the various Zeppelin interpreter modes and what each of them means: https://medium.com/@leemoonsoo/apache-zeppelin-interpreter-mode-explained-bae0525d0555
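A quick way to confirm the per-user isolation, as a rough sketch (the exact paragraph binding, %livy vs %livy.spark, depends on your Zeppelin version): have two different logged-in users run the same paragraph and compare the YARN application IDs they get back.

%livy.spark
sc.applicationId  // each logged-in user should see a different YARN application id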