Member since
01-15-2019
60
Posts
37
Kudos Received
2
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2975 | 07-20-2021 01:05 AM
 | 16565 | 11-28-2019 06:59 AM
10-10-2018
02:20 PM
Hi @Jeff Storck, thanks for telling me this. > To create an HDF cluster with Cloudbreak, a KDC must be configured. After configuring a test KDC, CB 2.7 can provision HDF 3.1 successfully.
10-08-2018
06:45 AM
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'loginIdentityProvider': FactoryBean threw exception on object creation; nested exception is java.lang.Exception: The specified login identity provider 'kerberos-provider' could not be found.
An HDF 3.1 cluster built by CloudBreak 2.7 fails with the error above: 'kerberos-provider' could not be found. The cause is a problem in HDF 3.1's blueprint: nifi.security.user.login.identity.provider is set to kerberos-provider, but that provider does not exist in login-identity-providers.xml, or is commented out.
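For reference, a minimal sketch of the kerberos-provider entry that needs to exist (uncommented) in login-identity-providers.xml; the realm and expiration values below are placeholders, not taken from the cluster above:

```xml
<!-- login-identity-providers.xml: this provider must exist so that
     nifi.security.user.login.identity.provider=kerberos-provider resolves -->
<provider>
    <identifier>kerberos-provider</identifier>
    <class>org.apache.nifi.kerberos.KerberosProvider</class>
    <property name="Default Realm">EXAMPLE.COM</property>
    <property name="Authentication Expiration">12 hours</property>
</provider>
```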
09-30-2018
03:46 AM
Got error [User home directory not found] when creating a Workflow Manager view in Ambari. Resolution:
sudo -u hdfs hdfs dfs -mkdir /user/admin
sudo -u hdfs hdfs dfs -chown admin /user/admin
04-30-2018
12:49 PM
It works! Thanks for your info. yum install -y epel-release
03-22-2018
09:19 AM
Latest NiFi monitoring docs (HDF 3.1.1): https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.1/bk_getting-started-with-apache-nifi/content/monitoring-nifi.html
03-21-2018
01:51 PM
The Zeppelin 0.7.0 documentation says the library should use 0.13 (reference: How to execute HIVE LLAP queries from Zeppelin in HDP 2.6.1, Hive Interpreter for Apache Zeppelin). However, that is too old; LLAP needs a newer version. Properties:
hive2.driver = org.apache.hive.jdbc.HiveDriver
hive2.url = < can be obtained from Ambari; check HiveServer2 Interactive JDBC URL >
e.g.
hive2.url = jdbc:hive2://zzeng-hdp-3.example.com:2181,zzeng-hdp-1.example.com:2181,zzeng-hdp-2.example.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2
Dependencies :
org.apache.hive:hive-jdbc:2.2.0
org.apache.hadoop:hadoop-common:2.6.0
With this setting, I can access Hive LLAP now:
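As a usage sketch: with the properties above registered under the hive2 prefix of the jdbc interpreter, a note paragraph selects that connection by prefix (the table name below is hypothetical):

```
%jdbc(hive2)
SELECT COUNT(*) FROM sample_table;
```

Depending on how the interpreter is registered in your Zeppelin instance, it may instead be exposed directly as %hive2.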
03-21-2018
01:31 PM
Before reading this post, I tried the official document and failed: https://zeppelin.apache.org/docs/0.7.0/interpreter/hive.html Then I found my problem: I had to set the correct properties with the interpreter name.
03-14-2018
03:52 PM
1 Kudo
This article describes how to install and configure Sparkmagic to run on HDP 2.5 against Livy Server and Spark 1.6.2. Reference: Using Jupyter with Sparkmagic and Livy Server on HDP 2.5 in HCC
1) Install Jupyter (http://jupyter.org/install)
###### Init venv
## First time only
sudo yum install python-pip python-dev python-virtualenv -y
mkdir ~/jupyter_env
## From the second time on, start here
virtualenv --system-site-packages ~/jupyter_env
source ~/jupyter_env/bin/activate
curl -O https://bootstrap.pypa.io/get-pip.py
sudo python get-pip.py
sudo easy_install -U pip
python -m pip install --upgrade pip
# for jupyter, it need gcc
sudo yum install gcc -y
sudo pip install jupyter notebook ipython
# jupyter notebook prints an access URL like:
# http://zzeng-hdp-ambari:8888/tree?token=17dfdcb7525ff7470a637752450bbd586f607eddccc86a7f
2) Use Jupyter to connect to Spark via Livy
https://community.hortonworks.com/articles/70501/using-jupyter-with-sparkmagic-and-livy-server-on-h.html
# "Failed building wheel for pykerberos" -> install krb5-devel first
sudo yum install krb5-devel -y
sudo -H pip install sparkmagic
sudo pip install hdijupyterutils
sudo pip install autovizwidget
sudo pip install sparkmagic
pip show sparkmagic
pip show autovizwidget
cd /usr/lib/python2.7/site-packages
jupyter-kernelspec install --user sparkmagic/kernels/sparkkernel
jupyter-kernelspec install --user sparkmagic/kernels/pysparkkernel
sudo -H jupyter nbextension enable --py --sys-prefix widgetsnbextension
3) Start Notebook
cd ~/
jupyter notebook --ip=0.0.0.0
4) Connect from Jupyter to the remote Spark cluster
Readme: https://github.com/jupyter-incubator/sparkmagic
In[ ]: %load_ext sparkmagic.magics
In[ ]: %manage_spark
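If you want %manage_spark to offer a default endpoint, Sparkmagic reads ~/.sparkmagic/config.json. A minimal sketch (the hostname is a placeholder; 8998 is Livy's default port):

```json
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://livy-server-host:8998"
  },
  "kernel_scala_credentials": {
    "username": "",
    "password": "",
    "url": "http://livy-server-host:8998"
  }
}
```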
02-24-2018
02:42 PM
2 Kudos
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. Error: Nothing to do
Ambari got the error above. Try installing MySQL directly:
[centos@zzeng-hdp-2 ~]$ sudo yum install mysql-community-release
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* base: mirror.web-ster.com
* extras: mirror.hostduplex.com
* updates: mirrors.sonic.net
No package mysql-community-release available.
Error: Nothing to do
https://bugs.mysql.com/bug.php?id=81037
Try MySQL 5.7:
[centos@zzeng-hdp-2 ~]$ sudo rpm -ivh http://dev.mysql.com/get/mysql57-community-release-el7-8.noarch.rpm
Retrieving http://dev.mysql.com/get/mysql57-community-release-el7-8.noarch.rpm
warning: /var/tmp/rpm-tmp.hGWJu5: Header V3 DSA/SHA1 Signature, key ID 5072e1f5: NOKEY
Preparing... ################################# [100%]
Updating / installing...
1:mysql57-community-release-el7-8 ################################# [100%]
[centos@zzeng-hdp-2 ~]$
This still didn't resolve the problem. In the end, I used MySQL 5.6 ( http://dev.mysql.com/get/mysql-community-release-el7-5.noarch.rpm ):
[centos@zzeng-hdp-2 ~]$ sudo yum remove mysql57-community-release
Loaded plugins: fastestmirror
Resolving Dependencies
--> Running transaction check
---> Package mysql57-community-release.noarch 0:el7-8 will be erased
--> Finished Dependency Resolution
Dependencies Resolved
============================================================================================================================================================================================================================================
Package Arch Version Repository Size
============================================================================================================================================================================================================================================
Removing:
mysql57-community-release noarch el7-8 installed 8.2 k
Transaction Summary
============================================================================================================================================================================================================================================
Remove 1 Package
Installed size: 8.2 k
Is this ok [y/N]: y
Downloading packages:
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Erasing : mysql57-community-release-el7-8.noarch 1/1
warning: /etc/yum.repos.d/mysql-community.repo saved as /etc/yum.repos.d/mysql-community.repo.rpmsave
Verifying : mysql57-community-release-el7-8.noarch 1/1
Removed:
mysql57-community-release.noarch 0:el7-8
Complete!
[centos@zzeng-hdp-2 ~]$ sudo rpm -ivh mysql-community-release-el7-5.noarch.rpm
Preparing... ################################# [100%]
Updating / installing...
1:mysql-community-release-el7-5 ################################# [100%]
[centos@zzeng-hdp-2 ~]$
02-02-2018
09:28 PM
9 Kudos
resource_management.core.exceptions.Fail: Failed to download file from http://ambari1.hdp.hadoop:8080/resources//mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
The following commands on the Ambari Server fixed the problem (Ref: https://discuss.pivotal.io/hc/en-us/articles/115001611807-Hive-Services-Fail-to-Start-giving-Error-HTTP-Error-404-Not-Found- ):
sudo yum install mysql-connector-java*
ls -al /usr/share/java/mysql-connector-java.jar
cd /var/lib/ambari-server/resources/
ln -s /usr/share/java/mysql-connector-java.jar mysql-connector-java.jar
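The fix works because ambari-server serves files under /var/lib/ambari-server/resources at the /resources/ URL, so a symlink there is enough. A throwaway, self-contained sketch of the same symlink pattern (paths in /tmp, not the real Ambari directories):

```shell
# Demonstration only: the real dirs are /usr/share/java and
# /var/lib/ambari-server/resources.
rm -rf /tmp/resdemo && mkdir -p /tmp/resdemo && cd /tmp/resdemo
touch mysql-connector-java-real.jar          # stand-in for the installed jar
ln -s mysql-connector-java-real.jar mysql-connector-java.jar
readlink mysql-connector-java.jar            # -> mysql-connector-java-real.jar
```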