Member since: 06-09-2016
Posts: 529
Kudos Received: 129
Solutions: 104
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1674 | 09-11-2019 10:19 AM |
| | 9200 | 11-26-2018 07:04 PM |
| | 2397 | 11-14-2018 12:10 PM |
| | 5102 | 11-14-2018 12:09 PM |
| | 3054 | 11-12-2018 01:19 PM |
07-24-2018
01:07 PM
@Sriram Could you share a screenshot of the Zeppelin JDBC (Hive) interpreter configuration? Also, if you can tail the HiveServer2 log that Zeppelin is configured to connect to and check what is happening (which user is being used and whether there are any Ranger issues), that would be helpful.
07-24-2018
12:51 PM
@Sriram If there is no policy for the zeppelin user or for the public group (to which zeppelin usually belongs), then I suggest you check which policy id is granting the access. You can check this in the Ranger Admin UI Access tab, as seen in the next image. If you click the policy id shown above, it will provide more details on the access; this way you will know why access is being granted to the zeppelin user. The above will show entries only if the Ranger plugin is correctly configured for Hive. If you don't see any entries, check the HiveServer2 logs and double-check that the Hive Ranger plugin is properly configured. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
07-24-2018
11:57 AM
@Sriram Unless you have configured impersonation for the JDBC interpreter, all access to Hive through the Zeppelin JDBC interpreter, for every user, will be performed as the zeppelin user. Therefore, please make sure your Zeppelin JDBC interpreter is configured for impersonation: https://community.hortonworks.com/articles/113228/how-to-enable-user-impersonation-for-jdbc-interpre-1.html HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
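As a hedged sketch of what such a setup can look like (the exact property names and values depend on your Zeppelin/HDP versions — the host, port, and paths below are placeholders, so please follow the linked article for your version):

```
# Zeppelin JDBC interpreter setting: pass the logged-in user as the HiveServer2 proxy user
hive.url = jdbc:hive2://hiveserver2-host:10000/default;hive.server2.proxy.user=${loggedInUser}

# core-site.xml: allow the zeppelin service user to impersonate end users
hadoop.proxyuser.zeppelin.groups = *
hadoop.proxyuser.zeppelin.hosts = *
```

With this in place, Ranger audit entries should show the actual logged-in user instead of zeppelin.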
07-20-2018
06:42 PM
@forest lin Then that is possibly a different issue. Initially you were getting java.lang.NoSuchMethodError: org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix$default$4()Lscala/Option; and, only for Spark 2, you are now getting User class threw exception: java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame. Please review the following link: https://community.hortonworks.com/content/supportkb/150292/errorexception-in-thread-main-javalangnoclassdeffo.html Also, I think it is best to take this error to a separate thread, as it is not the same as the initial problem, which was solved by adding the configuration I mentioned before. HTH
07-19-2018
11:07 PM
@Gaurav Parmar Check the Spark History Server UI to see which applications are run in local mode. In the image above, you can see that applications named local-* were launched in local mode and applications named application_* were launched with YARN as the master. If you'd like to switch the default from local to yarn, perhaps you can add export MASTER=yarn to spark-env so that users who forget to add --master yarn will run on YARN by default. Please let me know if this helps you. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
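A minimal sketch of that spark-env change (the file path varies by distribution — /etc/spark2/conf/spark-env.sh is a common HDP location, but treat it as an assumption for your cluster):

```shell
# Append to spark-env.sh so spark-submit / spark-shell default to YARN
# when the user omits --master yarn.
export MASTER=yarn
```

Users can still override this per job by passing --master local[*] explicitly.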
07-19-2018
05:39 PM
@Mani Please check these links and let me know if they help: https://stackoverflow.com/questions/39255973/split-1-column-into-3-columns-in-spark-scala https://stackoverflow.com/questions/39235704/split-spark-dataframe-string-column-into-multiple-columns HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
07-19-2018
04:03 PM
@David Pocivalnik Yes, that is correct! If you are satisfied with the answer, please remember to login and mark it as accepted.
07-19-2018
02:20 PM
@Harish Vaibhav Kali Please also try using the FQDN of the NiFi node instead of localhost! Thanks
07-19-2018
02:16 PM
@Harish Vaibhav Kali Perhaps not relevant, but please check the spaces in your command above, especially after -d, as I don't see a space there. I've personally run into problems with curl when spaces are missing or due to the order of the arguments passed. Try this:
curl --tlsv1.2 -ik \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer {token from above command}' \
  -d '{"id":"03bc6708-6a44-4069-8fe3-77ec056639e7","state":"STOPPED"}' \
  -X PUT 'https://localhost:9966/nifi-api/flow/process-groups/03bc6708-6a44-4069-8fe3-77ec056639e7'
HTH
07-19-2018
01:45 PM
1 Kudo
@Papil Patil The cache function is lazy, so in order to see the data cached you should actually perform an action that triggers execution of the DAG. For example:
df = spark.read \
    .format("jdbc") \
    .option("url", "---------------------------") \
    .option("driver", "com.sap.db.jdbc.Driver") \
    .option("CharSet", "iso_1") \
    .option("user", "---------------------------") \
    .option("password", "---------------------------") \
    .option("dbtable", "(select * from schema.table_name) tmp") \
    .load()
df.cache()
# this will trigger the DAG and you should see the data cached
count = df.count()
# next time it will just use the data in cache, so it should be faster to execute
count2 = df.count()
HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.