Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2724 | 04-27-2020 03:48 AM |
| | 5284 | 04-26-2020 06:18 PM |
| | 4449 | 04-26-2020 06:05 PM |
| | 3576 | 04-13-2020 08:53 PM |
| | 5377 | 03-31-2020 02:10 AM |
06-07-2018
07:05 AM
@Sabarigirivasan Kuttuva If this answers your query/issue, then please mark this HCC thread as answered by clicking on the "Accept" link on the correct answer. That way it will help other HCC users quickly find the answers.
06-07-2018
05:38 AM
1 Kudo
@Sabarigirivasan Kuttuva The java.home set inside ambari.properties is incorrect. It is pointing to the "java" binary instead of the directory where the JDK is installed:
java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.ppc64le/jre/bin/java
Can you please try this path instead:
java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.ppc64le
Please also check that the JDK's "bin" directory contains the "java" binary:
# ls -l /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.ppc64le/bin/java
java.home should be the JDK directory itself, i.e. the directory that contains "bin", not the "java" binary inside it. Example:
# grep 'java.home' /etc/ambari-server/conf/ambari.properties
java.home=/usr/jdk64/jdk1.8.0_112
stack.java.home=/usr/jdk64/jdk1.8.0_112
[root@latest1 ~]# ls -l /usr/jdk64/jdk1.8.0_112
total 25916
drwxr-xr-x. 2 root root 4096 Sep 23 2016 bin
-r--r--r--. 1 root root 3244 Sep 23 2016 COPYRIGHT
drwxr-xr-x. 4 root root 4096 Sep 23 2016 db
drwxr-xr-x. 3 root root 4096 Sep 23 2016 include
-rwxr-xr-x. 1 root root 5094021 Sep 22 2016 javafx-src.zip
drwxr-xr-x. 5 root root 4096 Sep 23 2016 jre
drwxr-xr-x. 5 root root 4096 Sep 23 2016 lib
-r--r--r--. 1 root root 40 Sep 23 2016 LICENSE
drwxr-xr-x. 4 root root 44 Sep 23 2016 man
-r--r--r--. 1 root root 159 Sep 23 2016 README.html
-rw-r--r--. 1 root root 526 Sep 23 2016 release
-rw-r--r--. 1 root root 21112816 Sep 23 2016 src.zip
-rwxr-xr-x. 1 root root 110114 Sep 22 2016 THIRDPARTYLICENSEREADME-JAVAFX.txt
-r--r--r--. 1 root root 177094 Sep 23 2016 THIRDPARTYLICENSEREADME.txt
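If you want to correct the value in place, a minimal sketch (assuming the OpenJDK directory above is the one you want, and taking a backup of ambari.properties first) would be:
# cp /etc/ambari-server/conf/ambari.properties /etc/ambari-server/conf/ambari.properties.bak
# sed -i 's|^java.home=.*|java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.ppc64le|' /etc/ambari-server/conf/ambari.properties
# ambari-server restart
If a stack.java.home line is present, it may need the same correction.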
06-06-2018
09:48 PM
@Sami Ahmad Can you please try the following command to set up the classpath? (Please note that the command uses backticks, not single quotes.) Example:
-----------
# javap -classpath `hadoop classpath`:`hbase classpath`:.: RetriveData
# javap -classpath `hadoop classpath`:`hbase classpath`:.: com.test.your.packagename.RetriveData
For Testing:
------------
# javap -classpath `hadoop classpath`:`hbase classpath` org.apache.hadoop.hbase.HBaseConfiguration
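As a hedged follow-up (assuming the end goal is to actually run the class, not just inspect it with javap), the same backtick expansion works with java; the package name below is only the illustrative one from above:
# java -cp `hadoop classpath`:`hbase classpath`:. com.test.your.packagename.RetriveData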
06-06-2018
08:14 AM
1 Kudo
@Manish Roy
1- Please suggest how I can integrate existing HDP cluster with HDF.
>>>> If you already have an HDP cluster configured and you want to install HDF services on the same cluster using Ambari, then you simply have to install the HDF mpack via Ambari as described in the following doc: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.1/bk_installing-hdf-on-hdp/content/ch_install-mpack.html
You can find the mpack for your OS in the following link: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.1/bk_release-notes/content/ch_hdf_relnotes.html#repo-location
2- How to manage both HDP & HDF cluster from single Ambari server.
>>>>> Once the HDF mpack is installed and the Ambari server is restarted, you should be able to see the HDF services in the Ambari UI when clicking on the "Add Service" button, as described in: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.1/bk_installing-hdf-on-hdp/content/ch_add-hdf-to-hdp.html
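For reference, a minimal sketch of the mpack installation step itself (the tarball path and <version> below are placeholders; download the mpack for your OS from the release notes link first):
# ambari-server install-mpack --mpack=/tmp/hdf-ambari-mpack-<version>.tar.gz --verbose
# ambari-server restart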
06-01-2018
11:04 AM
1 Kudo
@Victor You can get a detailed explanation of the YARN "Cluster Memory" in the following thread: https://community.hortonworks.com/questions/144325/why-my-cluster-memory-is-less-even-though-physical.html
Regarding the 0 (zero) memory utilization: until you run a job you won't see any memory being utilized. So please try running a job like the following and then refresh the Ambari YARN dashboard.
# su - spark
# export SPARK_MAJOR_VERSION=2
# cd /usr/hdp/current/spark2-client
# ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 examples/jars/spark-examples*.jar 10
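As an optional check (assuming the ResourceManager UI runs on its default port 8080-style default of 8088 and <rm-host> is your ResourceManager host), the same cluster memory numbers can be pulled from the ResourceManager REST API while the job is running:
# curl http://<rm-host>:8088/ws/v1/cluster/metrics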
06-01-2018
10:58 AM
1 Kudo
@Victor If your VM has enough resources, then there is no harm in setting the values to the recommended ones.
06-01-2018
10:07 AM
1 Kudo
@Kai Chaza For an Ambari-managed cluster you are not supposed to edit files like "/etc/hive/conf/" or "/etc/spark/conf/" manually on the filesystem. This is because when those services are restarted from the Ambari UI (API calls), or the VM is rebooted (which also restarts those services), Ambari resets the content of those files from the configuration stored in the Ambari DB. So you should always make the changes via Ambari, either using the Ambari UI or the Ambari API calls (or the configs.py script). https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.5/bk_ambari-operations/content/modify_hdfs_configurations.html
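As an illustration only (the credentials, host, cluster name, property and value below are placeholders, and the exact flags can vary slightly between Ambari versions), a single property can be changed with the bundled configs.py script like this:
# /var/lib/ambari-server/resources/scripts/configs.py -u admin -p admin -l <ambari-host> -t 8080 -n <cluster-name> -a set -c hive-site -k hive.exec.parallel -v true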
06-01-2018
08:35 AM
1 Kudo
@Ian Bradshaw If you want to SSH into your HDP sandbox, then you will have to use SSH port 2222 (instead of 22). So when you set up the SSH session in PuTTY, you will need to specify port 2222.
SSH Username: root
SSH Password: hadoop
Based on the API call output that you shared, it looks like your Ambari server is running fine and responding on port 8080: http://127.0.0.1:8080/api/v1/services/AMBARI/components/AMBARI_SERVER
So the same Ambari admin credentials should work from a browser as well. Have you tried an Incognito (Private) browser window, just to rule out a browser caching issue, or a different browser?
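From a plain terminal (instead of PuTTY), and assuming the same 127.0.0.1 port forwarding as in your API call, the equivalent would simply be:
# ssh root@127.0.0.1 -p 2222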
05-31-2018
02:18 PM
1 Kudo
@yazeed salem If you make changes in the script/conf files manually, then upon every restart of NiFi (via Ambari UI/API call) the content of the file will be overwritten. So you should check the template and make the changes from the Ambari UI instead:
Ambari UI --> Nifi --> Configs --> Advanced --> "Advanced nifi-bootstrap-env" --> "Template for bootstrap.conf"
Try changing the template snippet as follows.
From:
# JVM memory settings
java.arg.2=-Xms{{nifi_initial_mem}}
java.arg.3=-Xmx{{nifi_max_mem}}
To:
# JVM memory settings
java.arg.2=-Xms1G
java.arg.3=-Xmx1G
Then restart NiFi.
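A quick way to confirm the new heap settings took effect (assuming NiFi is running on the same host and the JVM flags appear on its command line):
# ps -ef | grep -i nifi | grep Xmx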
05-31-2018
12:14 PM
1 Kudo
@Rahul Kumar There is a typo in your command. You are running it as:
# cd /home/rahul/nifi-1.6.0/bin
# . nifi.sh start
Whereas you should run the command as follows:
# cd /home/rahul/nifi-1.6.0/bin
# ./nifi.sh start
NOTE: "./nifi.sh". There is no space between the DOT and "nifi.sh".
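If NiFi still does not come up, a hedged next step (assuming the default layout under /home/rahul/nifi-1.6.0) is to check the status and the application log:
# ./nifi.sh status
# tail -f /home/rahul/nifi-1.6.0/logs/nifi-app.log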