Member since: 06-09-2016
Posts: 529
Kudos Received: 129
Solutions: 104

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1737 | 09-11-2019 10:19 AM |
| | 9343 | 11-26-2018 07:04 PM |
| | 2492 | 11-14-2018 12:10 PM |
| | 5341 | 11-14-2018 12:09 PM |
| | 3156 | 11-12-2018 01:19 PM |
05-30-2018
02:40 PM
@Sparsh Singhal The following link shows the supported authentication mechanisms and contains links to the configuration steps: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_security/content/authentication_providers.html HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
05-30-2018
02:28 PM
2 Kudos
The following steps will help you install Miniconda, Conda, or Anaconda using wget.

1. Locate the version you need to install by opening https://repo.continuum.io/miniconda/ in your browser. For example, let's say you decided to use https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
2. Download the installer:
```
yum install wget bzip2
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
```
3. Run the installer and follow the steps:
```
chmod +x Miniconda3-latest-Linux-x86_64.sh
./Miniconda3-latest-Linux-x86_64.sh
```

Note: I recommend selecting an install location like /opt/miniconda3 or similar that can be shared and used by different users.
05-30-2018
02:24 PM
1 Kudo
@Victor I usually install it following these steps:

1. Locate the version you need to install by opening https://repo.continuum.io/miniconda/ in your browser. For example, let's say you decided to use https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
2. Download the installer:
```
yum install wget bzip2
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
chmod +x Miniconda3-latest-Linux-x86_64.sh
```
3. Run the installer and follow the steps:
```
./Miniconda3-latest-Linux-x86_64.sh
```

Note: I usually select an install location like /opt/miniconda3 or similar that can be shared and used by different users. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
05-30-2018
01:27 PM
2 Kudos
In order to use a Python version other than the operating system default with the Zeppelin PySpark interpreter, you need to add the corresponding setting on the interpreter configuration page. Before doing so, install the Python version you would like to use alongside the operating system one, in a separate folder. Important: Changing the default OS Python could impact other services and applications running on the machine, hence we don't recommend doing so. Once you have installed the Python version you wish to use, set the configuration to point to the path of that Python binary, then save and restart the interpreter and test the changes. Important: The new Python version needs to be present on the Zeppelin machine and on all cluster worker nodes. Use the same directory everywhere and make sure access permissions on the Python directory are correctly set.
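A minimal sketch of the interpreter setting this refers to, assuming the commonly used zeppelin.pyspark.python property of the Spark interpreter group; the install path shown is a hypothetical example:

```
# Spark interpreter group property (property name assumed; adjust the path to your install)
zeppelin.pyspark.python = /opt/python3/bin/python3
```

Save and restart the interpreter after adding the property so that PySpark paragraphs pick up the new binary.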
05-30-2018
11:50 AM
1 Kudo
@Victor I recently helped another user configure Zeppelin with Anaconda Python 3. Please check the following link: https://community.hortonworks.com/questions/188038/version-of-python-of-pyspark-for-spark2-and-zeppel.html?childToView=194004#answer-194004 HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
05-30-2018
02:31 AM
1 Kudo
@Developer Developer There is no simple out-of-the-box solution for this, as far as I know. Have you considered using multiple threads on the driver side to do this? The following threads discuss using Future to do just that; perhaps this could help you (a short PySpark sketch follows this answer). https://stackoverflow.com/questions/31912858/processing-multiple-files-as-independent-rdds-in-parallel https://stackoverflow.com/questions/46981424/how-to-process-multiple-dataframes-concurrently-in-spark HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
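The linked threads use Scala Futures; a rough PySpark equivalent is a driver-side thread pool that submits several independent DataFrame jobs concurrently. This is a minimal sketch with hypothetical input paths, not a drop-in solution:

```python
from concurrent.futures import ThreadPoolExecutor
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parallel-dataframes").getOrCreate()

# Hypothetical input paths; each read/count becomes an independent Spark job
paths = ["/data/input1", "/data/input2", "/data/input3"]

def count_rows(path):
    # Each call triggers its own job; the driver thread only submits and waits
    return path, spark.read.parquet(path).count()

# Threads on the driver overlap the job submissions; executors do the real work
with ThreadPoolExecutor(max_workers=len(paths)) as pool:
    for path, rows in pool.map(count_rows, paths):
        print(path, rows)
```

Setting spark.scheduler.mode=FAIR can also help the concurrent jobs share executors more evenly.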
05-29-2018
02:10 PM
1 Kudo
@Sungwoo Park Try installing anaconda3 under /opt/anaconda3 instead of under /root, and add the corresponding configuration to your interpreter. You can then verify which Python the interpreter uses from a notebook paragraph (a sketch follows this answer). Important: Since Zeppelin runs the spark2 interpreter in yarn-client mode by default, you need to make sure the configured Python binary (/root/anaconda3/bin/python3 in your current setup) is installed on the Zeppelin machine and on all cluster worker nodes. Additional resources: https://community.hortonworks.com/content/supportkb/146508/how-to-use-alternate-python-version-for-spark-in-z.html HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
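As a quick check once the interpreter is configured, a hedged sketch of a test paragraph (the exact output depends on your Anaconda install):

```python
%pyspark
import sys
# Should print the path and version of the configured Anaconda Python
print(sys.executable)
print(sys.version)
```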
05-28-2018
05:39 PM
@Aditya Jadia This is probably related to a firewall/communication issue between the cluster worker nodes and the email server. To find out exactly what went wrong, fetch the application logs:
```
yarn logs -applicationId <appId>
```
Also test the email server from within the cluster worker nodes with a different tool (like telnet) to check connectivity. HTH *** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
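If telnet is not available on the worker nodes, a minimal Python sketch can check connectivity instead; the mail server host and port below are hypothetical placeholders:

```python
import socket

# Hypothetical mail server endpoint; replace with your environment's values
SMTP_HOST = "mail.example.com"
SMTP_PORT = 25

try:
    # A successful connect means this node can reach the mail server
    conn = socket.create_connection((SMTP_HOST, SMTP_PORT), timeout=5)
    conn.close()
    print("Connection to %s:%d succeeded" % (SMTP_HOST, SMTP_PORT))
except socket.error as err:
    print("Connection to %s:%d failed: %s" % (SMTP_HOST, SMTP_PORT, err))
```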
05-28-2018
02:18 PM
@bharat sharma If the above answer helped address your question, please take a moment to login and click the "accept" link on the answer.
05-28-2018
02:17 PM
@Tamil Selvan K If the above answer helped address your question, please take a moment to login and click the "accept" link on the answer.