Member since: 11-07-2016
Posts: 637
Kudos Received: 253
Solutions: 144
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2225 | 12-06-2018 12:25 PM |
| 2281 | 11-27-2018 06:00 PM |
| 1777 | 11-22-2018 03:42 PM |
| 2832 | 11-20-2018 02:00 PM |
| 5135 | 11-19-2018 03:24 PM |
10-13-2017
06:01 AM
@Neha G, Did you create the admin principal? If not, create it with: kadmin.local -q "addprinc admin/admin" Ensure that the KDC ACL file includes an entry allowing the admin principal to administer the KDC for your specific realm. When using a realm other than EXAMPLE.COM, be sure there is an entry for the realm you are using; if it is missing, principal creation will fail. For example, for an admin/admin@HADOOP.COM principal, you should have the entry:
*/admin@HADOOP.COM * Thanks, Aditya
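As a sketch, the ACL entry above lives in the KDC's kadm5.acl file (the path varies by distribution; /var/kerberos/krb5kdc/kadm5.acl is common on RHEL-based systems and is an assumption here). The first field matches principals and the trailing * grants all admin permissions:

```
# kadm5.acl — grant full admin rights to any */admin principal in the realm
*/admin@HADOOP.COM *
```

After editing the ACL file, restart the kadmin service so the change takes effect.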
10-13-2017
04:39 AM
@Ashikin, Try setting the PYSPARK_DRIVER_PYTHON environment variable so that Spark uses the Anaconda/Miniconda interpreter. From the logs, it looks like Spark is using the bundled pyspark. Check this link for more info: https://spark.apache.org/docs/1.6.2/programming-guide.html#linking-with-spark Thanks, Aditya
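As a minimal sketch, assuming Anaconda is installed under /opt/anaconda (an assumed path; adjust to your install), the variables can be exported before launching pyspark:

```shell
# Make Spark launch the driver with the Anaconda interpreter
# instead of the bundled one. /opt/anaconda is an assumed install path.
export PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/python
# Optionally point the executors at the same interpreter as well.
export PYSPARK_PYTHON=/opt/anaconda/bin/python
```

Setting these in the shell profile of the user that launches Spark jobs keeps driver and executors on the same Python version.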
10-13-2017
04:37 AM
@Ashikin, Try setting the PYSPARK_DRIVER_PYTHON environment variable so that Spark uses the Anaconda/Miniconda interpreter. From the logs, it looks like Spark is using the bundled pyspark. Thanks, Aditya
10-13-2017
04:00 AM
@Sen Ke, Glad that it worked for you. Can you please accept the answer and start a new thread for this, so that this thread does not drift from the main topic? Just tag me in the new thread along with the gateway.log file. My guess is that user1 is not added under "Advanced users-ldif" in the Knox advanced config. Thanks, Aditya
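For reference, a users-ldif entry for the missing user would look roughly like this sketch, following the layout of Knox's bundled demo-LDAP users (the DN base and password here are assumptions; match them to your existing entries):

```
# Hypothetical entry for user1 in Knox "Advanced users-ldif"
dn: uid=user1,ou=people,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: person
objectclass: organizationalPerson
objectclass: inetOrgPerson
cn: user1
sn: user1
uid: user1
userPassword: user1-password
```

After saving the config change in Ambari, restart the Knox demo LDAP so the new entry is picked up.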
10-12-2017
05:00 PM
@Sai Sandeep, I'm not sure of any other solution right now. I will definitely post one if I find it. Meanwhile, can you please give this a try and see if it works?
10-12-2017
04:45 PM
@Sai Sandeep, There are multiple reasons why this may happen. You can try the following: 1) Refresh the page and check if it works. 2) Restart the Ambari server and the Hive service. 3) If the above two don't work, try restarting YARN. To restart the Ambari server, run the following on the Ambari server host: ambari-server restart Additional reference: https://community.hortonworks.com/questions/1288/with-hive-view-what-causes-h100-unable-to-submit-s.html Thanks, Aditya
10-12-2017
08:10 AM
@Sen Ke, Was this issue resolved?
10-12-2017
05:18 AM
@Ashikin, Do you have pandas installed? If not, install it by running: conda install pandas Other useful libraries to install, if you want them, are matplotlib and numpy. Thanks, Aditya
10-11-2017
01:16 PM
1 Kudo
@Gagandeep Singh Chawla, Try running this in the R shell: as.data.frame(installed.packages()[,c(1,3:4)]) Please check the attached screenshot for reference. Thanks, Aditya screen-shot-2017-10-11-at-64511-pm.png
10-11-2017
12:33 PM
1 Kudo
@btandel @Saurabh, Did you try setting "livy.spark.yarn.appMasterEnv.PYSPARK3_PYTHON" in the Livy interpreter settings? Use %livy.python3 in your paragraph. Thanks, Aditya
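As a sketch, the interpreter property would look like the fragment below in Zeppelin's Livy interpreter settings; the interpreter path on the right is an assumption and should point at the Python 3 binary actually installed on your YARN nodes:

```
# Zeppelin Livy interpreter property (value path is hypothetical)
livy.spark.yarn.appMasterEnv.PYSPARK3_PYTHON = /opt/anaconda3/bin/python
```

After saving, restart the Livy interpreter in Zeppelin and start the paragraph with %livy.python3.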