Member since: 06-08-2016
Posts: 33
Kudos Received: 10
Solutions: 1
My Accepted Solutions

Title | Views | Posted |
---|---|---|
 | 4744 | 06-29-2016 09:32 PM |
03-14-2017 07:09 PM
Ugh, that would be highly annoying. Is there no other known way to remove the old version? I pretty much cssh'd into all the worker nodes, ran rpm -qa | grep 2.4, and removed every package from HDP 2.4. Yet when I go to deregister the old distribution in Ambari, it still reports the nodes as having the software installed even though it's been obliterated.
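Is there a way to see what Ambari itself still thinks is installed? I've been poking at the REST API along these lines (credentials, host, and cluster name are placeholders):

curl -u admin:admin -H 'X-Requested-By: ambari' \
  http://AMBARI_HOST:8080/api/v1/clusters/MYCLUSTER/stack_versions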
03-14-2017 03:07 PM
I tried adding the .py script manually into the usual custom-actions location, but no cigar either...
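For reference, what I did was roughly this (assuming the script is named after the action; the path is the stock Ambari custom-actions directory):

cp remove_previous_stacks.py /var/lib/ambari-server/resources/custom_actions/scripts/
ambari-server restart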
03-14-2017 03:06 PM
I'm on the latest HDP/Ambari and I get the following error:

"status" : 500,
"message" : "An internal system exception occurred: Action remove_previous_stacks does not exist"
01-04-2017 06:42 PM
Before you read further, keep in mind that I am a sysadmin and not too familiar with development-related configuration. After changing the replication factor on our Hadoop system to 2, I'm enforcing it on the existing files with the following command, run as the hdfs user from my primary NameNode (HA configuration):

hdfs setrep -R -w 2 /

After a while I get this error:

java.lang.OutOfMemoryError: Java heap space

For Java heap configuration in Ambari's YARN section there are quite a few settings: ResourceManager Java heap size, NodeManager Java heap size, AppTimelineServer Java heap size, and YARN Java heap size, all set to 1024 MB. To be honest, I've done some of the official Hortonworks administration courses, but I'm not sure what a proper heap size is for each of the configs above so my developers can run jobs, nor what the correlation between them is. Could the error above also be related to the host itself? Please help, thanks! At the time of writing, I am generating a log to capture the entire error and will post a clean output here.
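Follow-up thought: since setrep is a client-side command, I wonder if it's the CLI's own JVM heap that is running out rather than any of the YARN daemons. A sketch of what I'm about to try (the 4 GB figure is an arbitrary guess):

# Raise the heap for the HDFS client JVM only, then rerun the command
export HADOOP_CLIENT_OPTS="-Xmx4g"
hdfs setrep -R -w 2 /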
Labels:
- Apache Ambari
- Apache Hadoop
- Apache YARN
- HDFS
09-29-2016 06:28 PM
So I got the job to "start":

[ INFO ] [main] Task Id : attempt_1475173438027_0002_m_000000_0, Status : FAILED
Error: Java heap space
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
[ INFO ] [main] Task Id : attempt_1475173438027_0002_m_000000_1, Status : FAILED
Error: Java heap space
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
[ INFO ] [main] Task Id : attempt_1475173438027_0002_m_000000_2, Status : FAILED
Error: Java heap space
[ INFO ] [main] map 100% reduce 100%
[ INFO ] [main] Job job_1475173438027_0002 failed with state FAILED due to: Task failed task_1475173438027_0002_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
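Every attempt dies with "Error: Java heap space" in the mapper, so I'm guessing the per-job container settings are what need raising. Something like this is what I plan to try next (the jar, class, and paths are placeholders, the values are guesses, -Xmx is kept a bit below the container size, and the -D flags only take effect if the driver uses ToolRunner):

hadoop jar <job.jar> <main-class> \
  -Dmapreduce.map.memory.mb=2048 \
  -Dmapreduce.map.java.opts=-Xmx1638m \
  <input> <output>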
09-29-2016 05:26 PM
2 Kudos
I'm trying to finish this tutorial, https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_hdfs_admin_tools/content/ch04.html, to manually compress my existing data on HDFS, but the file called "hadoop-examples-1.1.0-SNAPSHOT.jar" seems to be non-existent. Is the documentation outdated, maybe?
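In case it helps, this is how I've been hunting for the jar on the nodes (on HDP 2.x I'd expect it somewhere under /usr/hdp, but I'm not finding that exact filename):

find /usr/hdp -name "hadoop*examples*.jar" 2>/dev/null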
09-29-2016 05:02 PM
1 Kudo
So I followed this guide: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_hdfs_admin_tools/content/ch04.html. LZO is enabled and I restarted all the services. Now I'm trying to apply LZO compression to the existing data to gain some space, but for the life of me I cannot find the infamous examples jar file. I tried the command on just about every hadoop-examples*.jar file I could find and got no result. I'm no programming/big-data expert by any means... At this point my goal is just to get the data compressed; the cluster is used mainly for long-term storage after the data is analysed by other systems.
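For reference, the pattern I'm attempting looks roughly like this, with the jar path swapped for wherever the examples jar actually lives (the path and the HDFS directories are placeholders; the LzoCodec class comes from the tutorial):

hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar sort \
  "-Dmapreduce.output.fileoutputformat.compress=true" \
  "-Dmapreduce.output.fileoutputformat.compress.codec=com.hadoop.compression.lzo.LzoCodec" \
  -outKey org.apache.hadoop.io.Text \
  -outValue org.apache.hadoop.io.Text \
  /user/hdfs/input /user/hdfs/output-lzo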
Labels:
- Apache Hadoop
09-29-2016 04:52 PM
Is there no actual step-by-step guide on how to do that? Been searching for weeks; nothing concrete so far.
09-29-2016 03:58 PM
I changed to the right directory location, but I still get the same error 😕