Member since: 05-29-2017
Posts: 408
Kudos Received: 123
Solutions: 9
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2786 | 09-01-2017 06:26 AM |
 | 1698 | 05-04-2017 07:09 AM |
 | 1460 | 09-12-2016 05:58 PM |
 | 2061 | 07-22-2016 05:22 AM |
 | 1626 | 07-21-2016 07:50 AM |
02-29-2016
11:45 AM
3 Kudos
When I try to run distcp between two High Availability clusters, it fails with the error below.
[s0998@test ~]$ hadoop distcp hdfs://HDPINFHA/user/s0998/sampleTest.txt hdfs://HDPTSTHA/user/root/
16/02/29 06:32:38 ERROR tools.DistCp: Invalid arguments:
java.lang.IllegalArgumentException: java.net.UnknownHostException: HDPTSTHA
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:406)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:216)
at org.apache.hadoop.tools.DistCp.run(DistCp.java:116)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.tools.DistCp.main(DistCp.java:430)
Caused by: java.net.UnknownHostException: HDPTSTHA
I have already configured the nameservices by following the URL below, yet the target nameservice is still not resolved.
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.0-Win/bk_HDP_RelNotes_Win/content/behav-changes-230_Win.html
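Since HDPTSTHA is a logical HA nameservice, the client running distcp has to know about both nameservices. A minimal sketch of the check involved, assuming the default client config directory /etc/hadoop/conf:

# List the nameservices the client configuration defines -- HDPTSTHA should appear here.
hdfs getconf -confKey dfs.nameservices

# If HDPTSTHA is missing, the remote cluster's HA settings (dfs.ha.namenodes.HDPTSTHA,
# dfs.namenode.rpc-address.HDPTSTHA.*, dfs.client.failover.proxy.provider.HDPTSTHA)
# have to be added to the client-side hdfs-site.xml before distcp can resolve the logical name.
grep -c "HDPTSTHA" /etc/hadoop/conf/hdfs-site.xml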
Labels:
- Apache Hadoop
02-26-2016
12:27 PM
@Zack Riesland: I don't think there is a straightforward way to move ZooKeeper through Ambari the way you can with other services, but the link below may help you do it manually. I have not tested it myself. https://community.hortonworks.com/questions/4272/process-for-moving-hdp-services-manually.html
02-26-2016
10:44 AM
1 Kudo
@Artem Ervits: I have added oozie.action.sharelib.for.hive = hive,hcatalog,sqoop but I am still getting the same error. When I remove the Atlas property it works fine, but then other users' jobs that depend on Atlas fail, so I cannot pin down the root cause.
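For reference, a minimal sketch of where the property goes, assuming the action is driven by a standard job.properties file (the file name and path are placeholders):

# Add the share-lib override for the hive action (file path is an assumption).
cat >> job.properties <<'EOF'
oozie.use.system.libpath=true
oozie.action.sharelib.for.hive=hive,hcatalog,sqoop
EOF

# Confirm the sharelib on the Oozie server actually contains the listed libraries
# (assumes OOZIE_URL is exported, otherwise pass -oozie <url>).
oozie admin -shareliblist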
02-25-2016
01:03 PM
@Benjamin Leonhardi: Do you mean remove it from the hive-env.sh file?
02-25-2016
10:44 AM
1 Kudo
Hi Team, my Hive action in Oozie is failing with the error below:
hive.exec.post.hooks Class not found: org.apache.atlas.hive.hook.HiveHook
FAILED: Hive Internal Error: java.lang.ClassNotFoundException(org.apache.atlas.hive.hook.HiveHook)
Please note that this error started only after the upgrade from HDP 2.2 to 2.3. I have already tried the solution below but am still getting the error. Can anyone please help me resolve it?
http://stackoverflow.com/questions/32759933/hive-internal-error-java-lang-classnotfoundexceptionorg-apache-atlas-hive-hook
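A rough sketch of the checks involved: whether the Atlas Hive hook jars exist on the node, whether they ever made it into the Oozie sharelib used by the hive action, and which post-execution hook hive-site.xml points at. The HDP paths below are assumptions for HDP 2.3.

# Atlas Hive hook jars on the local node (path is an assumption).
ls /usr/hdp/current/atlas-server/hook/hive/ | grep -i atlas

# Atlas jars inside the Oozie hive sharelib in HDFS; if they are missing here,
# the hive action cannot load org.apache.atlas.hive.hook.HiveHook.
hdfs dfs -ls "/user/oozie/share/lib/lib_*/hive" | grep -i atlas

# Which post-execution hook the Hive client config is pointing at.
grep -A1 "hive.exec.post.hooks" /etc/hive/conf/hive-site.xml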
Labels:
- Apache Atlas
- Apache Hive
- Apache Oozie
02-24-2016
01:37 PM
@Benjamin Leonhardi: I can do it easily on the local filesystem, but I am looking at the /tmp/hive directory on HDFS. Do we have anything similar for HDFS?
02-24-2016
01:12 PM
1 Kudo
Do we have any script that can be used to clean the /tmp/hive/ directory on HDFS frequently? It is consuming space in the TBs. I have gone through the link below, but I am looking for a shell script. https://github.com/nmilford/clean-hadoop-tmp/blob/master/clean-hadoop-tmp
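A rough sketch of the kind of shell script I mean, assuming the scratch directories under /tmp/hive can be aged out by modification date (the 7-day cutoff, the use of -skipTrash, and GNU date are assumptions; make sure no long-running queries still hold these directories before using anything like this):

#!/bin/bash
# Sketch: delete Hive scratch dirs under /tmp/hive on HDFS older than CUTOFF_DAYS.
CUTOFF_DAYS=7
CUTOFF=$(date -d "-${CUTOFF_DAYS} days" +%Y-%m-%d)   # requires GNU date

hdfs dfs -ls "/tmp/hive/*/" 2>/dev/null | awk -v cutoff="$CUTOFF" '
  $6 < cutoff && $NF ~ /^\/tmp\/hive\// { print $NF }' |
while read -r dir; do
  echo "Deleting $dir"
  hdfs dfs -rm -r -skipTrash "$dir"
done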
Labels:
- Apache Hadoop
- Apache Hive
02-23-2016
11:58 AM
@Prakash Punj: As of now you can only see how many jobs are running at a given point in time, not the currently active users.
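One way to see the currently running jobs from the command line (a sketch; the ResourceManager UI shows the same list):

# List applications currently in the RUNNING state.
yarn application -list -appStates RUNNING

# Rough count of running applications.
yarn application -list -appStates RUNNING 2>/dev/null | grep -c "application_"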
02-22-2016
12:32 PM
1 Kudo
@Neeraj Sabharwal: Today I noticed that it is not working because I don't have sssd running, and it is not configured either. I believe it is mandatory for group mapping with HDFS. I tried to configure sssd with LDAP but did not succeed, so I need your help to configure sssd. Do you have any doc or instructions for that?
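For context, a quick sketch of how to compare what the OS resolves for a user's groups (which is what sssd/LDAP would feed) against what HDFS group mapping sees; the username s0998 is just an example:

# Groups as resolved by the local OS (sssd/LDAP would provide these).
id s0998

# Groups as resolved by the HDFS group mapping on the NameNode.
hdfs groups s0998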
02-21-2016
11:30 AM
@Ancil McBarnett @Benjamin Leonhardi @Jonas Straub: I have successfully disabled the "Kill Application" button by adding the property yarn.resourcemanager.webapp.ui-actions.enabled=false to yarn-site.xml. For more detailed steps you can refer to the link below. http://www.hadoopadmin.co.in/bigdata/how-to-disable-kill-application-button-in-resource-manager-web-ui/
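For reference, a minimal sketch of the change (the config path /etc/hadoop/conf/yarn-site.xml is an assumption; on an Ambari-managed cluster the property should be added through Ambari instead of editing the file by hand):

# The property block that disables the "Kill Application" button in the RM web UI.
cat <<'EOF'
<property>
  <name>yarn.resourcemanager.webapp.ui-actions.enabled</name>
  <value>false</value>
</property>
EOF

# Verify it is present in the active config, then restart the ResourceManager.
grep -A2 "yarn.resourcemanager.webapp.ui-actions.enabled" /etc/hadoop/conf/yarn-site.xml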