Member since: 06-08-2016
Posts: 33
Kudos Received: 10
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4751 | 06-29-2016 09:32 PM |
09-29-2016
03:54 PM
Yeah, I've been trying to run the JAR file above... which essentially runs it on pre-existing data, but it's failing miserably 😕
09-29-2016
03:53 PM
I went through the tutorial above for HDP 2.4.2 without success...
09-29-2016
03:49 PM
So I tried as root:

[root@nn samples]# hadoop jar hadoop-examples.jar sort \
  "-Dmapred.compress.map.output=true" \
  "-Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  "-Dmapred.output.compress=true" \
  "-Dmapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  -outKey org.apache.hadoop.io.Text \
  -outValue org.apache.hadoop.io.Text \
  input output
WARNING: Use "yarn jar" to launch YARN applications.

Then I tried with yarn, running as root:

Exception in thread "main" java.lang.ClassNotFoundException: sort
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

Then I did sudo su - yarn:

[yarn@nn ~]$ yarn jar hadoop-examples.jar sort \
  "-Dmapred.compress.map.output=true" \
  "-Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  "-Dmapred.output.compress=true" \
  "-Dmapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  -outKey org.apache.hadoop.io.Text \
  -outValue org.apache.hadoop.io.Text \
  input output
Not a valid JAR: /home/yarn/hadoop-examples.jar

So far manually trying to run that job is a no-go 😕
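Two separate things may be going on in the attempts above. "Not a valid JAR: /home/yarn/hadoop-examples.jar" is just a path problem: after `sudo su - yarn` the shell starts in /home/yarn, so a bare jar name no longer resolves. The `ClassNotFoundException: sort` is consistent with the jar's manifest having no Main-Class, in which case `yarn jar` treats the first argument as a class name to load, so a fully qualified driver class is the safer invocation. A dry-run sketch (the jar path is a guess to adapt; the script only prints the command):

```shell
#!/bin/sh
set -eu
# Hypothetical jar location; substitute the absolute path of the jar you
# actually found on your node (relative names break after `sudo su - yarn`).
JAR=/usr/hdp/2.4.2.0-258/knox/samples/hadoop-examples.jar

# If the jar's manifest lacks a Main-Class, `yarn jar` loads the first
# argument as a class, so pass the fully qualified Sort driver instead of
# the short alias "sort".
MAIN=org.apache.hadoop.examples.Sort

OPTS="-Dmapred.compress.map.output=true"
OPTS="$OPTS -Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec"
OPTS="$OPTS -Dmapred.output.compress=true"
OPTS="$OPTS -Dmapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec"

# Dry run: inspect the echoed command, then execute it on a cluster node.
CMD="yarn jar $JAR $MAIN $OPTS -outKey org.apache.hadoop.io.Text -outValue org.apache.hadoop.io.Text input output"
echo "$CMD"
```

If the short alias does work on your build, keep it; the fully qualified class form just removes the dependency on the manifest.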
09-29-2016
03:44 PM
What I see is:

/usr/hdp/2.2.6.0-2800/knox/samples/hadoop-examples.jar
/usr/hdp/2.4.0.0-169/knox/samples/hadoop-examples.jar
/usr/hdp/2.4.2.0-258/knox/samples/hadoop-examples.jar
/usr/lib/hue/apps/jobsub/data/examples/hadoop-examples.jar
/usr/lib/hue/apps/oozie/examples/lib/hadoop-examples.jar
09-29-2016
03:41 PM
hadoop-examples-1.1.0-SNAPSHOT.jar: I don't seem to have the above file at all on either of my nn and snn, or on the other masters. Option I: to use GzipCodec with a one-time-only job:

hadoop jar hadoop-examples-1.1.0-SNAPSHOT.jar sort \
  "-Dmapred.compress.map.output=true" \
  "-Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  "-Dmapred.output.compress=true" \
  "-Dmapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec" \
  -outKey org.apache.hadoop.io.Text \
  -outValue org.apache.hadoop.io.Text \
  input output
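The 1.1.0-SNAPSHOT name comes from an old Apache build; on an HDP stack the equivalent classes ship in the stack's own jars, usually under /usr/hdp/current/hadoop-mapreduce-client/. A small sketch for locating whichever examples jar a node actually has (the search roots are assumptions, adjust for your layout):

```shell
#!/bin/sh
set -eu
# Print every Hadoop examples jar found under the given root directories.
# Roots that don't exist are skipped; permission errors are ignored.
find_examples_jar() {
  for d in "$@"; do
    if [ -d "$d" ]; then
      find "$d" -name 'hadoop*examples*.jar' 2>/dev/null || true
    fi
  done
}

# Usual HDP and legacy install roots (assumptions; add your own as needed).
find_examples_jar /usr/hdp /usr/lib
```

Any jar this prints can then be passed to `hadoop jar` by absolute path in place of hadoop-examples-1.1.0-SNAPSHOT.jar.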
09-29-2016
03:39 PM
Basically I'm looking for "block level" compression of pre-existing data. I went through all the settings and LZO is now enabled; I'm just not sure how to compress the existing data. Mind you, I'm a SysOps and not a DevOps person, so dealing with programming languages is not my forte.
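HDFS has no transparent compress-in-place, so the usual non-programming route is to rewrite the existing data through a map-only job with output compression enabled, verify the compressed copy, then swap the directories. A dry-run sketch using Hadoop streaming with `cat` as an identity mapper; the streaming jar path, the directories, and the LzopCodec class (from the hadoop-lzo package) are assumptions to adapt:

```shell
#!/bin/sh
set -eu
# Hypothetical paths; adjust to your HDP version and data layout.
STREAMING_JAR=/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar
SRC=/data/raw        # existing uncompressed HDFS directory
DST=/data/raw-lzo    # compressed copy the job writes

OPTS="-Dmapred.output.compress=true"
OPTS="$OPTS -Dmapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec"
OPTS="$OPTS -Dmapred.reduce.tasks=0"

# Map-only identity job: every record is read and rewritten, so the output
# files come out LZO-compressed.  Printed as a dry run; execute the echoed
# command on a cluster node, check $DST, then retire $SRC.
CMD="hadoop jar $STREAMING_JAR $OPTS -input $SRC -output $DST -mapper /bin/cat"
echo "$CMD"
```

One caveat worth checking in the hadoop-lzo docs: plain .lzo output is not splittable until it is indexed (hadoop-lzo ships an indexer for that), so large files should be indexed after the rewrite.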
09-27-2016
06:37 PM
Okay, so once the above is done, I still see 80% of the space in use... Shouldn't that initiate block-level compression of the data on HDFS? If not, how is it done... if it's possible at all? Also, I can't find that hadoop-examples.jar mentioned in their tutorial...
09-08-2016
07:52 PM
3 Kudos
So I've enabled LZO compression as per the Hortonworks guide... I've got 120 TB of storage capacity so far and a de facto replication factor of 3. My data usage is at 75%, and my manager is starting to wonder if LZO can be used to compress the file system "a la Windows", where the file system is compressed but the data stays accessible "as per usual" through the DFS path? Any hint would be greatly appreciated...
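For context on those numbers: with replication factor 3, raw usage is three times the logical data size, so the situation can be sanity-checked with a little arithmetic (a throwaway sketch using the figures from the post, no cluster needed):

```shell
#!/bin/sh
set -eu
# Figures from the post above: 120 TB raw capacity, replication factor 3,
# 75% of raw space in use.
RAW_TB=120
REPL=3
USED_PCT=75

LOGICAL_TB=$((RAW_TB / REPL))                        # unique-data capacity
USED_LOGICAL_TB=$((RAW_TB * USED_PCT / 100 / REPL))  # unique data stored
echo "logical capacity: ${LOGICAL_TB} TB, logical data stored: ${USED_LOGICAL_TB} TB"
```

So roughly 30 TB of unique data sits behind that 75%, and every terabyte that compression saves logically frees about three terabytes of raw space, which is why rewriting the data compressed (rather than hoping for NTFS-style transparent compression, which HDFS does not offer) moves the needle quickly.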
Labels:
- Apache Hadoop
07-26-2016
12:50 PM
Interesting. I'm no DB guru, still learning my way around Postgres, but I'll check it out, as I'm sure the value is there and this will nag me far into the future. What would the exact query to do that be, really?
07-11-2016
01:30 PM
Has there been any progress on that issue so far? I've tried so many approaches that I've resorted to writing a script that checks the node status every minute...