Member since: 04-07-2017
Posts: 80
Kudos Received: 33
Solutions: 0
02-28-2016
05:23 AM
2 Kudos
Thank you! I am happy to have joined this network; the level of support being provided keeps me motivated!
02-27-2016
10:43 AM
2 Kudos
Hi, I am executing the MapReduce program below, which uses KeyValueTextInputFormat, and I get 0 records from the reducer. Could you help me identify the reason? Attachments: urlcount.txt, urlcountm.txt, urlcountr.txt
Data:
http://url12.com 36
http://url11.com 4
http://url20.com 36
http://url1.com 256
http://url1.com 267
Thanks in advance!
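For reference, one likely cause (an assumption based on the data above, not a confirmed diagnosis): KeyValueTextInputFormat splits each line into key and value at a tab by default, so with space-separated input the whole line becomes the key and the value stays empty. A minimal sketch under that assumption (UrlCount, UrlMapper, and UrlReducer are illustrative names, not the attached code):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class UrlCount {

    // KeyValueTextInputFormat hands the mapper the part of each line before
    // the separator as the key and the part after it as the value.
    public static class UrlMapper extends Mapper<Text, Text, Text, IntWritable> {
        private final IntWritable count = new IntWritable();

        @Override
        protected void map(Text key, Text value, Context context)
                throws IOException, InterruptedException {
            count.set(Integer.parseInt(value.toString().trim()));
            context.write(key, count);
        }
    }

    // Sums the counts per URL.
    public static class UrlReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The default separator is a tab; the sample data uses a space,
        // so without this setting the value side of each record is empty.
        conf.set("mapreduce.input.keyvaluelinerecordreader.key.value.separator", " ");
        Job job = Job.getInstance(conf, "url count");
        job.setJarByClass(UrlCount.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        job.setMapperClass(UrlMapper.class);
        job.setReducerClass(UrlReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}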
Labels: Apache Hadoop
02-23-2016
12:45 AM
1 Kudo
I was able to execute the MapReduce job successfully. The call job.setJarByClass(WordCount.class) was missing from the driver, so the framework was unable to find the Mapper class. Thanks!
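For anyone hitting the same issue, a minimal driver sketch showing where the call belongs (WordCountDriver is an illustrative name; with no mapper or reducer set, Hadoop falls back to the identity classes):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        // Without this call the submitted JAR is not associated with the job,
        // so the cluster cannot load the Mapper class at runtime.
        job.setJarByClass(WordCountDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}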
02-21-2016
07:09 AM
Thanks. It's working fine now.
02-21-2016
07:07 AM
Thank you. It is working fine now.
02-21-2016
06:03 AM
2 Kudos
Hi, could you help me resolve this error?
[root@sandbox ~]# yarn jar /tmp/MRJar/WordCount.jar com.denmark.danskeBank.vo.WordCount /tmp/data/hamlet.txt /tmp/output
Not a valid JAR: /tmp/MRJar/WordCount.jar
This is what I have done:
1. Created the WordCount.jar file in Eclipse with Hadoop 1.x jars.
2. Uploaded it to the HDFS dir /tmp/MRJar.
3. Got the error above. Then I tried:
[root@sandbox ~]# hadoop fs -copyToLocal /tmp/MRJar/WordCount.jar /MapReduce
16/02/21 05:46:07 WARN hdfs.DFSClient: DFSInputStream has been closed already
I also tried the steps given to run through Gradle. While executing ~/gradle-1.9/bin/gradle clean jar, I got an error:
[cascade@sandbox part2]$ ~/gradle-1.9/bin/gradle clean jar
ERROR: JAVA_HOME is set to an invalid directory: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.91.x8
I want to create my own MapReduce job and run it for practice; I am not a Java developer. Could you guide me? Thanks.
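For reference, one likely cause (an assumption based on the error, not a confirmed diagnosis): yarn jar reads the JAR from the local filesystem, and since the file was uploaded to HDFS, the local path /tmp/MRJar/WordCount.jar points at nothing valid. A sketch of that workaround, with illustrative local paths:
[root@sandbox ~]# hadoop fs -copyToLocal /tmp/MRJar/WordCount.jar /tmp/WordCount.jar
[root@sandbox ~]# yarn jar /tmp/WordCount.jar com.denmark.danskeBank.vo.WordCount /tmp/data/hamlet.txt /tmp/output
For the Gradle failure, JAVA_HOME has to point at an existing JDK directory; the exact directory name under /usr/lib/jvm varies, so check it with ls /usr/lib/jvm before exporting JAVA_HOME.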
Labels: Apache Hadoop
02-21-2016
02:20 AM
1 Kudo
I tried both and still get the org.apache.hcatalog.pig.HCatLoader error. The script's arguments field is set to -useHCatalog. Thanks!
02-21-2016
02:19 AM
1 Kudo
Thank you. I have tried with MapReduce2 and Tez; MapReduce2 shows green.
The heap size is the same as in the attachments: job-1456016544519-0006-logs.txt, job-1456016544519-0007-logs.txt
It fails with "Could not resolve org.apache.hcatalog.pig.HCatLoader". The argument -useHCatalog is set.
ERROR org.apache.pig.PigServer - exception during parsing: Error during parsing. Could not resolve org.apache.hcatalog.pig.HCatLoader using imports: [, java.lang., org.apache.pig.builtin., org.apache.pig.impl.builtin.]
Failed to parse: Pig script failed to parse:
<file script.pig, line 1, column 29> pig script failed to validate: org.apache.pig.backend.executionengine.ExecException: ERROR 1070: Could not resolve org.apache.hcatalog.pig.HCatLoader using imports: [, java.lang., org.apache.pig.builtin., org.apache.pig.impl.builtin.]
at org.apache.pig.parser.QueryParser
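For reference (an inference from the error, not a confirmed fix): org.apache.hcatalog.pig.HCatLoader is the legacy package name, and on recent HDP releases the loader ships under org.apache.hive.hcatalog.pig.HCatLoader, which is the form already used in the script quoted in the post below:
a = LOAD 'geolocation' USING org.apache.hive.hcatalog.pig.HCatLoader();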
02-20-2016
12:36 PM
1 Kudo
Hi, could you help me get a Pig script to run with a successful status? As mentioned in the Lab 3 exercise, the geolocation table was created in Hive, and the Pig script contains only a load transformation:
a = LOAD 'geolocation' using org.apache.hive.hcatalog.pig.HCatLoader();
DUMP a LIMIT 1;
I executed the script (not on Tez) with the argument -useHCatalog. In the Resource Manager UI the status shows SUCCEEDED with 2 containers allocated, but in the Pig UI the status is still RUNNING; it neither errors nor returns a result tab. What might be the issue? How can I check whether the metadata exists in HCatalog?
The WebHCat server is in started status. Where should I look for heap memory? Thanks in advance for the support. Attachments: hive.png, service.png, pig-script.png, pig-status.png, resourcemanagerui.png
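One thing worth noting (an observation about standard Pig Latin syntax, not a confirmed cause of the hang): LIMIT is a separate relational operator rather than a modifier of DUMP, so the usual form of the script would be:
a = LOAD 'geolocation' USING org.apache.hive.hcatalog.pig.HCatLoader();
b = LIMIT a 1;
DUMP b;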
Labels: Apache Pig
02-16-2016
12:37 AM
4 Kudos
Hi, could you help me start working with HDP? I have downloaded the Hortonworks Sandbox with HDP 2.3.2_1, started the Sandbox, and got the address http://127.0.0.1:8888. I opened http://127.0.0.1:8888 in Google Chrome and got the (get started, try, what's new) page. Then I opened the SSH client at http://127.0.0.1:4200/ with the root username and changed the password. I typed ifconfig and got the IP address 10.0.2.15. When I open http://10.0.2.15:8080 in the web browser, I get the message "This webpage is not available" (sanbox-login.png). I checked the status of Ambari and it is running (in the attachment). Could you help me resolve this so I can start working with the lab exercise? Please reply if I need to provide any details. I am able to log in to http://127.0.0.1:8080 with username and password admin; this works, but not from my own session. Thank you.
Labels: Apache Ambari