Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2452 | 04-27-2020 03:48 AM |
| | 4890 | 04-26-2020 06:18 PM |
| | 3977 | 04-26-2020 06:05 PM |
| | 3221 | 04-13-2020 08:53 PM |
| | 4928 | 03-31-2020 02:10 AM |
03-10-2017
09:27 AM
@Pradeep kumar - When you run "ambari-server setup" to upgrade Java to 1.8, what error are you getting?

- Do you see the JDK 1.8 tar.gz file on your Ambari Server host?

# ls -l /var/lib/ambari-server/resources/jdk-8*
-rw-r--r--. 1 root root 181238643 Aug 18 2016 /var/lib/ambari-server/resources/jdk-8u60-linux-x64.tar.gz

- If you are managing the cluster via Ambari, then the "ambari-server setup" option is the better way to upgrade Java. However, you can take a look at the following link to see how to point to the upgraded Java manually: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_command-line-installation/content/meet-min-system-requirements.html
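If you have already installed a JDK yourself, "ambari-server setup" can also be pointed at a custom location. A minimal sketch, assuming the JDK lives at /usr/jdk64/jdk1.8.0_60 (the path here is just an example) and that your Ambari version supports the -j option:

# ambari-server setup -j /usr/jdk64/jdk1.8.0_60
# ambari-server restart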
03-09-2017
05:04 PM
2 Kudos
@vrathod The following Ambari API can be used to see the list of services required by a particular service such as "Hive":

http://localhost:8080/api/v1/stacks/HDP/versions/2.5/services/HIVE

required_services: [
  "ZOOKEEPER",
  "HDFS",
  "YARN",
  "TEZ",
  "PIG",
  "SLIDER"
],
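For a quick check from the command line, the same endpoint can be queried with curl. A sketch, assuming the default admin/admin credentials; the StackServices/required_services field path is my assumption for this Ambari version (it matches the field shown in the output above):

# curl -u admin:admin 'http://localhost:8080/api/v1/stacks/HDP/versions/2.5/services/HIVE?fields=StackServices/required_services'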
03-09-2017
04:57 PM
1 Kudo
@vrathod Ambari services use the "metainfo.xml" file, which provides information about the dependencies (the list of components that a given component depends on): https://cwiki.apache.org/confluence/display/AMBARI/Writing+metainfo.xml

For example, to see which components the "HIVE_SERVER" component depends on, look at: https://github.com/apache/ambari/blob/release-2.4.2/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/metainfo.xml#L64-L87
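The same dependency information is also exposed through the stacks API, so you do not have to read the XML from the source tree. A sketch, assuming default admin/admin credentials and that your Ambari version exposes the dependencies field on stack components:

# curl -u admin:admin 'http://localhost:8080/api/v1/stacks/HDP/versions/2.5/services/HIVE/components/HIVE_SERVER?fields=dependencies'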
03-09-2017
04:37 PM
@Rick Turner In ambari-server.log you will find the complete stack trace of the error; can you please share it? The "Caused by:" section of the error can give us more details.
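To pull out just the relevant section, something like the following should work, assuming the default log location on the Ambari Server host:

# grep -B 2 -A 20 'Caused by' /var/log/ambari-server/ambari-server.log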
03-09-2017
04:17 PM
@Sedat Kestepe
For the Pig View, the HCat proxy settings are needed. Please see the following link for more details on this: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.2.0/bk_ambari-views/content/setup_WebHCat_proxy_user_pig_view.html

We must set up an HDFS proxy user for WebHCat and a WebHCat proxy user for the Ambari Server daemon account. To set up the HDFS proxy user for WebHCat:

1. In Ambari Web, browse to Services > HDFS > Configs.
2. Under the Advanced tab, navigate to the Custom core-site section.
3. Click Add Property... to add the following custom properties:

hadoop.proxyuser.hcat.groups=*
hadoop.proxyuser.hcat.hosts=*
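For the second part (the WebHCat proxy user for the Ambari Server daemon account), the linked page describes adding analogous properties to the Custom webhcat-site section under Services > Hive > Configs. A sketch of what that looks like, assuming the Ambari Server daemon runs as root (substitute your actual daemon account in the property names):

webhcat.proxyuser.root.groups=*
webhcat.proxyuser.root.hosts=*

After changing these, restart the affected services when Ambari prompts you.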
03-09-2017
03:22 PM
@Alex Raj It looks like your Java path has an extra URL-encoded character "%3A" (which decodes to ":") appended to it: /usr/java/jdk1.7.0_67/bin/java%3A. Please check where it was added and then remove the trailing ":" from "/usr/java/jdk1.7.0_67/bin/java:".
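As a throwaway check (not specific to Ambari), you can confirm what the encoded sequence decodes to:

# echo '/usr/java/jdk1.7.0_67/bin/java%3A' | sed 's/%3A/:/g'
/usr/java/jdk1.7.0_67/bin/java: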
03-09-2017
02:32 PM
@Inam Ur Rehman
Unlike your original query, which was related to the Sandbox-based MySQL (which uses 'hadoop' as the root user password by default), your current query is specific to MySQL itself rather than to HDP Sandbox settings. You should refer to the standard MySQL documentation to learn how to reset the root password: https://dev.mysql.com/doc/refman/5.7/en/resetting-permissions.html

If you use the default Sandbox, the MySQL root user password will be 'hadoop'.
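In outline, the procedure on that page for MySQL 5.7 looks roughly like the following. Treat this only as a sketch, under the assumption of a systemd-managed server, and follow the official page above for your exact version:

# systemctl stop mysqld
# mysqld_safe --skip-grant-tables &
# mysql -u root
mysql> FLUSH PRIVILEGES;
mysql> ALTER USER 'root'@'localhost' IDENTIFIED BY 'NewPassword123!';
mysql> exit
# systemctl restart mysqld

Here 'NewPassword123!' is a placeholder; pick your own password.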
03-09-2017
02:28 PM
@Alex Raj In your shell script, try adding the following entries to tell it where Java is present:

export JAVA_HOME=/PATH/To/jdk1.8.0_67
$JAVA_HOME/bin/java -cp $LOCAL_DIR/libs/integration-tools.jar com.audit.reporting.GenerateExcelReport $LOCAL_DIR/input.txt $LOCAL_DIR/
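Before running the script, it may also be worth confirming that the exported JAVA_HOME actually points at a working JDK:

# $JAVA_HOME/bin/java -version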
03-09-2017
12:20 PM
@Inam Ur Rehman Make sure that you are entering the correct password. Otherwise, try the following:

# mysql -u root -padmin123
mysql> use hadoop_test;
03-09-2017
11:14 AM
@Viswa One of the reasons can be special (non-printable) characters in the command, because the error you are getting can occur when the "hdfs" command is not able to locate the class related to its argument. Example:

# hdfs dsfadmin -safemode enter
Error: Could not find or load main class dsfadmin

In this example I intentionally made a spelling mistake in the 'dfsadmin' argument to illustrate. In your case it might be due to some hidden special characters, so rather than copying and pasting the command into the terminal, try typing it manually. Also check the output of the following to confirm the command is recognized:

# hdfs --help | grep dfsadmin
dfsadmin             run a DFS admin client
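If you want to confirm whether hidden characters are present, one generic (non-HDFS-specific) check is to paste the command into a file and inspect it:

# cat > pasted_cmd.txt    (paste the command here, then press Ctrl-D)
# cat -A pasted_cmd.txt   (non-printable characters show up as ^X and M- sequences)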