Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2720 | 04-27-2020 03:48 AM |
| | 5280 | 04-26-2020 06:18 PM |
| | 4445 | 04-26-2020 06:05 PM |
| | 3570 | 04-13-2020 08:53 PM |
| | 5377 | 03-31-2020 02:10 AM |
06-30-2017
03:01 AM
@zahain Can you please try it like this? First check whether the `hadoop classpath` command returns proper results:

# hadoop classpath

If yes, then set the CLASSPATH as follows. Please NOTE that the value is wrapped in backticks (`), not single quotes ('):

# export CLASSPATH=`hadoop classpath`

Also note that the "HADOOP_CLASSPATH" variable is normally used by the standard Hadoop scripts; for your standalone Java code you should set the value in the CLASSPATH variable instead.
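As a minimal sketch of the full flow, assuming a hypothetical standalone class named MyHdfsClient (not from this thread):

```bash
# Verify the Hadoop classpath, export it, then compile and run a
# standalone Java client. MyHdfsClient is a hypothetical example class.
hadoop classpath                      # should print the Hadoop jar/conf paths
export CLASSPATH=`hadoop classpath`   # backticks, not single quotes
javac MyHdfsClient.java               # javac/java read the CLASSPATH variable
java MyHdfsClient
```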
06-30-2017
01:11 AM
1 Kudo
@Mohit Varshney While configuring SMTP notifications from the Ambari UI, did you select the "Start TLS" checkbox?
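If you want to verify the setting outside the UI, the Ambari REST API exposes the configured notification targets; this is a sketch assuming default admin credentials and the standard JavaMail property name for the TLS flag:

```bash
# List alert notification targets and look for the Start TLS property.
# Host, port, and credentials are placeholders for your cluster.
curl -u admin:admin -H 'X-Requested-By: ambari' \
  'http://ambari-host:8080/api/v1/alert_targets?fields=AlertTarget/properties' \
  | grep -i starttls
```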
06-29-2017
05:58 PM
@John Bowler Please try cleaning the "/var/lib/ambari-server/resources/views/work/HIVE*" directories and then restarting the ambari-server:

# rm -rf /var/lib/ambari-server/resources/views/work/HIVE*

- On restart, the Ambari server will re-extract the Hive View JAR to the work directory.
- Also try from other browsers (like Chrome and Firefox).
- If you notice any error in ambari-server.log or in the view logs, then please share them: /var/log/ambari-server/hive**/*.log
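Put together, the cleanup might look like this (paths from the steps above; ambari-server stop/start are the standard CLI commands):

```bash
# Stop Ambari, clear the extracted Hive View work directories, then
# restart so the view JAR is re-extracted cleanly.
ambari-server stop
rm -rf /var/lib/ambari-server/resources/views/work/HIVE*
ambari-server start

# Watch the server log for view-related errors while it comes back up.
tail -f /var/log/ambari-server/ambari-server.log
```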
06-29-2017
05:54 PM
1 Kudo
@John Bowler I faced this issue when I used the IE11 browser, because it caches the current view of directories. Hence, when we create new directories in the File View, it still shows the old listing even though the directory is actually created in HDFS.

Disabling the browser cache in IE11: https://library.netapp.com/ecmdocs/ECMLP2411996/html/GUID-1E34D79A-2EB8-4A0D-AA47-E7BB62B8C77B.html

Please disable the browser cache in IE11 and then try again, or use any other browser like Chrome / Firefox.
06-29-2017
02:05 PM
@Darko Milovanovic After deleting, you will see the config group name shown as "Deleted" in the Ambari UI; that is an issue reported as part of https://issues.apache.org/jira/browse/AMBARI-20435

In order to clean up the deleted config group completely, you will need to find the config_group id, run the following SQL queries, and then restart "ambari-server":

delete from ambari."serviceconfighosts" where service_config_id in (select service_config_id from serviceconfig where group_id=<REPLACE>);
delete from ambari."serviceconfigmapping" where service_config_id in (select service_config_id from serviceconfig where group_id=<REPLACE>);
delete from ambari.serviceconfig where group_id=<REPLACE>;

**NOTE:** Please collect an Ambari DB dump before modifying the DB manually. Replace the <REPLACE> value with the config_group id which you want to clean up.
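For a PostgreSQL-backed Ambari (the default embedded DB), taking the dump and finding the id might look like this; the configgroup table name is an assumption about the Ambari schema, so verify it first:

```bash
# Back up the Ambari database before any manual modification.
pg_dump -U ambari ambari > /tmp/ambari_db_backup.sql

# Look up the id of the config group you want to clean up.
# NOTE: "configgroup" is an assumed table name; verify in your schema.
psql -U ambari -d ambari \
  -c 'select group_id, group_name from ambari.configgroup;'
```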
06-29-2017
10:27 AM
2 Kudos
@D Giri Please refer to the following doc, "Enable User Home Directory Creation":
https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.0.3/bk_ambari-administration/content/create_user_home_directory.html

Edit "/etc/ambari-server/conf/ambari.properties" and add the following:

ambari.post.user.creation.hook=/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh

For a kerberized environment you must modify the kinit file path in the default user creation hook script: /var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh
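A minimal sketch of the change; the ambari.post.user.creation.hook.enabled flag is taken from the linked doc for Ambari 2.5, so treat it as an assumption to verify:

```bash
# Enable the post-user-creation hook and restart Ambari to apply it.
# The ".enabled" property is an assumption based on the linked doc.
echo 'ambari.post.user.creation.hook.enabled=true' >> /etc/ambari-server/conf/ambari.properties
echo 'ambari.post.user.creation.hook=/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh' >> /etc/ambari-server/conf/ambari.properties
ambari-server restart
```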
06-29-2017
06:59 AM
@zahain It might be an HDFS client jar / classpath issue. Can you please check whether your classpath is pointing to the correct Hadoop jars? You can also use the "hadoop classpath" command output to find the classpath to be used. Example:

# javac -cp `hadoop classpath` -d . TikaMapreduce.java
# java -cp `hadoop classpath`:. tikka.com.TikaMapreduce

You can also add your own JAR directories to the classpath, like: -cp `hadoop classpath`:$YOUR/LIB/PATH:.
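Putting both together, with /opt/myapp/lib as a placeholder for your own JAR directory:

```bash
# Compile against the Hadoop jars, then run with an extra JAR directory
# on the classpath; /opt/myapp/lib is a placeholder path.
javac -cp "$(hadoop classpath)" -d . TikaMapreduce.java
java -cp "$(hadoop classpath):/opt/myapp/lib/*:." tikka.com.TikaMapreduce
```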
06-29-2017
02:28 AM
1 Kudo
@Aaron Norton Are you using Red Hat Enterprise Linux? Can you please check whether an "hs_err_pid*.log" file is generated for your DataNode? This file is created when the JVM crashes. As it is a jsvc crash (and if you have recently upgraded your Red Hat kernel), it might be related to:
https://issues.apache.org/jira/browse/HDFS-12029
https://issues.apache.org/jira/browse/DAEMON-363

If this is the case, then increasing the stack size of your DataNode (to something higher, like -Xss2048k) might help it come up fine.
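A sketch of where the flag could go, assuming the stack size is passed through HADOOP_DATANODE_OPTS in hadoop-env.sh; a secure DataNode started via jsvc may read a different variable, so verify for your setup:

```bash
# In hadoop-env.sh: raise the JVM thread stack size for the DataNode.
# HADOOP_DATANODE_OPTS is the usual hook; adjust if your secure-DN
# (jsvc) startup reads a different variable.
export HADOOP_DATANODE_OPTS="-Xss2048k ${HADOOP_DATANODE_OPTS}"
```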
06-27-2017
07:46 PM
@Manoj Dixit Please check whether the MD5 sum of your downloaded sandbox matches the one mentioned on the site: https://hortonworks.com/downloads/ Example:

# md5sum HDP_2.6_virtualbox_05_05_2017_14_46_00_hdp.ova

Usually this happens when the downloaded file is corrupted or incompletely downloaded. Also, if you have the OVA file then you do not need to extract it; simply import it into your VirtualBox. https://hortonworks.com/tutorial/hortonworks-sandbox-guide/section/1/
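To compare against the published sum automatically, something like this works; the hash below is a placeholder, so substitute the value from the downloads page:

```bash
# md5sum -c reads "<hash>  <filename>" lines and verifies the file.
# Replace the placeholder hash with the one published on the site.
echo '<published-md5-from-site>  HDP_2.6_virtualbox_05_05_2017_14_46_00_hdp.ova' | md5sum -c -
```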
06-27-2017
05:51 AM
@ed day Good to see you resolved your query. However, the exception mentioned in your question is different, and the accepted answer does not really address the query/exception that was asked. Regarding the admin user directory "/user/admin", please refer to the Hortonworks documentation, which will help other users understand why we need to create the "/user/admin" directory and assign it ownership "admin:hadoop": https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-views/content/ch_using_falcon_view.html
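For reference, the corresponding HDFS commands (run as the hdfs superuser) are along these lines:

```bash
# Create the admin user's home directory and assign it to admin:hadoop,
# as described in the linked documentation.
sudo -u hdfs hdfs dfs -mkdir -p /user/admin
sudo -u hdfs hdfs dfs -chown admin:hadoop /user/admin
```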