Member since: 09-17-2015
Posts: 436
Kudos Received: 736
Solutions: 81
My Accepted Solutions
| Views | Posted |
|---|---|
| 5202 | 01-14-2017 01:52 AM |
| 7537 | 12-07-2016 06:41 PM |
| 8964 | 11-02-2016 06:56 PM |
| 2872 | 10-19-2016 08:10 PM |
| 7355 | 10-19-2016 08:05 AM |
02-03-2016
04:44 PM
1 Kudo
Can you double-check that the ports Zeppelin requires are open (e.g. by trying to telnet from your laptop)? I installed Zeppelin last night via Ambari on a 2.3.4 cluster and it worked fine. You can confirm Zeppelin is listening on that port by running the below from the node it's running on: `netstat -tulpn | grep 9995`
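If telnet isn't handy on your laptop, a minimal sketch using bash's `/dev/tcp` pseudo-device can do the same reachability check (the host `localhost` and port `9995` below are just examples; substitute your Zeppelin node):

```shell
# Sketch: test whether a TCP port accepts connections.
# "localhost" and 9995 are example values, not taken from your cluster.
check_port() {
  # /dev/tcp is a bash feature; timeout guards against a hanging connect
  timeout 2 bash -c ": < /dev/tcp/$1/$2" 2>/dev/null
}

if check_port localhost 9995; then
  echo "port open"
else
  echo "port closed or filtered"
fi
```

If the port reports closed, check firewall rules between your laptop and the node as well as whether the Zeppelin process is actually up.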
02-02-2016
06:19 AM
3 Kudos
You can also create Ranger policies by issuing grant/revoke commands via the hbase shell. Check slide 25 here from @sneethiraj: http://www.slideshare.net/Hadoop_Summit/securing-hadoop-with-apache-ranger So if the customer can get a list of the grant commands they previously issued, they could possibly re-run them and have the Ranger policies created automatically.
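For illustration, the grants in question look like the below when run inside `hbase shell` (the user and table names here are hypothetical); with the Ranger HBase plugin enabled, each grant should surface as a Ranger policy:

```
# hypothetical user/table names; run inside `hbase shell`
grant 'someuser', 'RWCA', 'some_table'
revoke 'someuser', 'some_table'
```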
02-01-2016
07:15 AM
1 Kudo
@Henry Sowell could you check that on the node where Nifi is set up, the Java location Ambari uses to start Nifi (usually /usr/java/default) exists and has the appropriate version of Java?
02-01-2016
06:59 AM
3 Kudos
@Sourygna Luangsay It seems like you took the HDP 2.3.2 sandbox (which comes with Spark 1.4.1) and upgraded to 2.3.4 (which has Spark 1.5.1). However, Zeppelin on 2.3.2 was compiled with Spark 1.4.1, which is why it outputs sc.version = 1.4.1. Since Zeppelin on the 2.3.2 sandbox was deployed via the Ambari service, you can just follow the steps below to delete it: https://github.com/hortonworks-gallery/ambari-zeppelin-service#remove-zeppelin-service Then follow the steps here to re-install it, and it should install Zeppelin bits compiled with Spark 1.5.1: https://github.com/hortonworks-gallery/ambari-zeppelin-service#setup-the-ambari-service
01-30-2016
06:56 PM
3 Kudos
@Rainer Geissendoerfer Others have encountered similar issues with the Nifi service (usually on Java 8) as well, but I have not been able to consistently reproduce it yet. From what I have seen of this issue, if you run the steps to start Nifi manually via the CLI, it works (but for some reason not from Ambari). You can try the below manual commands to start Nifi and populate the pid file so that Ambari can track its status:

```
su - nifi
export JAVA_HOME=/usr/java/default
/opt/nifi-1.1.1.0-12/bin/nifi.sh start >> /var/log/nifi/nifi-setup.log
grep pid /opt/nifi-1.1.1.0-12/bin/nifi.pid | sed 's/pid=\(.*\)/\1/' > /var/run/nifi/nifi.pid

# run below as root
chown nifi:nifi /var/run/nifi/nifi.pid
```
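For reference, the `grep | sed` step above just strips the `pid=` prefix so Ambari sees a bare process id. A minimal sketch (the sample file below stands in for `bin/nifi.pid`, whose contents are assumed to look like `pid=12345`):

```shell
# Illustrative only: /tmp/nifi.pid.sample stands in for bin/nifi.pid;
# the "pid=12345" format is an assumption about that file's contents.
printf 'pid=12345\n' > /tmp/nifi.pid.sample

# Keep the pid line and strip the "pid=" prefix, leaving just the number
grep pid /tmp/nifi.pid.sample | sed 's/pid=\(.*\)/\1/'
# prints 12345
```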
If you are encountering the same issue, could you provide the below details so we can try to reproduce it:

- Java version
- OS/version
- Ambari + HDP version (is this the sandbox?)
- Are you installing the Nifi service on the same node as Ambari?
01-29-2016
01:52 AM
1 Kudo
Glad to hear it!
01-28-2016
05:47 PM
1 Kudo
The Zeppelin Ambari service downloads a version of Zeppelin prebuilt with the version of Spark found in your HDP installation. This version probably changed when you upgraded the cluster, which is what is causing your error. The easiest thing would be to uninstall and re-install the Zeppelin service. Step-by-step instructions here: https://github.com/hortonworks-gallery/ambari-zeppelin-service#remove-zeppelin-service In the future, enhancements will be made to allow the service to support upgrades.
01-27-2016
04:55 PM
2 Kudos
@Randy Gelhausen has also automated setup of Python 3 and numerous libraries/modules via his Jupyter Ambari service https://community.hortonworks.com/content/repo/4565/jupyter-service.html
01-26-2016
06:50 AM
4 Kudos
A couple of options:

1. From Ambari, to smoke test components one at a time, you can select "Run Service Check" from the "Service Actions" menu for that component.
2. You can also invoke the smoke test via the API: https://cwiki.apache.org/confluence/display/AMBARI/Running+Service+Checks
3. You can manually run the validation checks provided in the doc: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/rpm_validating_the_core_hadoop_installation.html
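As a sketch of the API option: a service check is triggered by POSTing a request to `/api/v1/clusters/<cluster>/requests` (e.g. with curl, authenticating as an Ambari admin and setting the `X-Requested-By: ambari` header). A payload for an HDFS check looks roughly like the below; the shape follows the wiki page linked above, and the context string is free text, while service/command names vary per service:

```json
{
  "RequestInfo": {
    "context": "HDFS Service Check",
    "command": "HDFS_SERVICE_CHECK"
  },
  "Requests/resource_filters": [
    { "service_name": "HDFS" }
  ]
}
```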
01-23-2016
01:21 AM
1 Kudo
Yes, both AD and IPA provide an integrated KDC/LDAP experience, which is great for most cases. The problem with FreeIPA is that Ambari doesn't natively support it yet, so you have to use the manual option in the security wizard, where you have to manually create principals and distribute keytabs (a JIRA has been logged on this). But every so often there are customers who require some corner-case setup which doesn't work. I am guessing @Predrag Minovic is running into one of those.