Member since: 06-09-2016
Posts: 529
Kudos Received: 129
Solutions: 104
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1788 | 09-11-2019 10:19 AM |
| | 9427 | 11-26-2018 07:04 PM |
| | 2562 | 11-14-2018 12:10 PM |
| | 5563 | 11-14-2018 12:09 PM |
| | 3245 | 11-12-2018 01:19 PM |
06-11-2018
01:37 PM
1 Kudo
@Snehal S The keystore file path is already set to gateway.jks by default, and you should not change it. Once you complete the configuration described in the link you shared, you need to import: 1. the client's public certificate into the Knox truststore (on the Knox server machine), and 2. the Knox public certificate into the client's truststore (on the client machine). After steps 1 and 2, and assuming the rest of the configuration is correct, this should work. HTH
06-11-2018
01:31 PM
@Dmitro Vasilenko I recommend reviewing the Livy logs under /var/log/livy2 to understand why the application submission is failing. HTH
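For example (log file names vary by install, and the grep pattern is only a starting point):

```shell
# List the most recent Livy2 logs, then scan them for errors and exceptions:
LOG_DIR=/var/log/livy2
ls -lt "$LOG_DIR" 2>/dev/null | head -n 5
grep -riE 'error|exception' "$LOG_DIR" 2>/dev/null | tail -n 20
```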
06-08-2018
06:00 PM
@Anpan K Yes, Knox should be accessible to external users, so it has to be installed on a node that can be accessed from outside the cluster, such as an edge node. This node can still be managed by Ambari. HTH
06-07-2018
07:12 PM
@Anpan K Those are utility packages required when you install HDP. The repository file should contain both the HDP-UTILS and HDP-2.6.x repos. - If you think this has helped, please take a moment to login and click the "accept" link on the answer.
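An HDP repo file with both entries typically looks like the following sketch (the baseurl values and version numbers are placeholders; use the URLs from your own Ambari-generated /etc/yum.repos.d/hdp.repo):

```ini
[HDP-2.6]
name=HDP-2.6
baseurl=http://<your-repo-host>/HDP/centos7/2.x/updates/2.6.x.0
enabled=1
gpgcheck=1

[HDP-UTILS-1.1.0.22]
name=HDP-UTILS
baseurl=http://<your-repo-host>/HDP-UTILS-1.1.0.22/repos/centos7
enabled=1
gpgcheck=1
```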
06-07-2018
06:57 PM
@Anpan K You should have only one repo file with enabled=1; otherwise this can lead to issues. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
06-06-2018
08:02 PM
1 Kudo
@Rahul Kumar I think @Vinicius Higa Murakami is correct. If you take a closer look, you are running as user rahul, and the log file under the logs directory probably belongs to a different user, or you don't have write permission on it. If you are starting from the command line, always double-check which user you are running as and switch to the appropriate service account. HTH
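A quick sanity check before starting the service from the command line (the logs path and the zeppelin service account below are illustrative):

```shell
# Which user am I running as?
id -un
# Who owns the logs directory, and can I write to it?
ls -ld logs 2>/dev/null
if [ -w logs ]; then
  echo "logs/ is writable by $(id -un)"
else
  echo "not writable: switch to the service account, e.g.: su - zeppelin"
fi
```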
06-06-2018
07:53 PM
@RAUI Another option is to build a Spark Streaming application that pulls those files directly from HDFS and processes them. https://spark.apache.org/docs/latest/streaming-programming-guide.html#file-streams HTH
06-06-2018
07:44 PM
@raghavendra v The error is due to the yum command failing at the OS level. Try running the same command yourself on the host where you plan to install Hive: yum -d 0 -e 0 -y install mysql-community-server and see exactly why it fails. If it cannot find the mysql-community-server package, you may need to add the repository for this to work: wget https://dev.mysql.com/get/mysql57-community-release-el7-11.noarch.rpm
rpm -ivh mysql57-community-release-el7-11.noarch.rpm HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
06-06-2018
02:08 PM
2 Kudos
@Dmitro Vasilenko Yes, you can use the Zeppelin REST API together with the InvokeHTTP processor.
Zeppelin REST API: https://zeppelin.apache.org/docs/0.7.0/rest-api/rest-notebook.html#run-all-paragraphs
InvokeHTTP processor: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.5.0/org.apache.nifi.processors.standard.InvokeHTTP/index.html
HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
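For the "run all paragraphs" endpoint, the InvokeHTTP processor would be configured roughly like this (host, port, and note id are placeholders; check the port your Zeppelin instance actually uses):

```
HTTP Method : POST
Remote URL  : http://<zeppelin-host>:<port>/api/notebook/job/<noteId>

# Equivalent curl call for testing the endpoint outside NiFi:
#   curl -X POST http://<zeppelin-host>:<port>/api/notebook/job/<noteId>
```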