Member since: 05-23-2016
Posts: 16
Kudos Received: 1
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1189 | 05-30-2016 07:44 PM |
02-11-2017 06:28 PM
@Jay SenSharma - The jquery.js files are getting blocked. When we inspect with the browser's document inspector, we see 404 errors for the jquery.js and .js.gz files. I don't know how to fix this or where they are getting filtered.
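For anyone hitting the same symptom, one way to narrow it down is to request the page and one of the failing assets directly, outside the browser. The host, port, and asset path below are placeholders, not taken from the original post; copy the exact failing URL from the document inspector.

```bash
# Check the ResourceManager page itself and one of the static JS assets it references.
# rm-host:8088 and the asset path are assumptions; substitute the real URLs
# reported as 404 in the browser's document inspector.
curl -sI http://rm-host:8088/cluster | head -n 1
curl -sI "http://rm-host:8088/static/<failing-jquery-asset>.js" | head -n 1

# If the page returns 200 but the static assets return 404, the JS is most likely
# being filtered by something in front of the ResourceManager (proxy, firewall,
# content filter) rather than missing from the ResourceManager itself.
```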
02-10-2017 12:14 PM
When I open the ResourceManager URL in the browser, it displays partial data and shows "This page will not function without javascript enabled. Please enable javascript on your browser". When I debug further, it says the jQuery.js files are not loaded. What could be the issue? Kindly help. Thanks, Subramanian S.
Labels:
- Cloudera Manager
05-30-2016 07:44 PM
Hi Plevinson, It's an Ambari install. The issue was resolved. The cleanup script was not fully removing the contents under the /etc folder; I found a blog post that explains how to remove them manually. I also had to remove the Postgres data files along with the Postgres installation. The system is now able to come up. Thanks for all your support and the posts in the community forums. Regards, Subramanian S.
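For reference, the kind of manual cleanup described above usually looks roughly like the sketch below. This is an assumption-laden example for a RHEL/CentOS host with default locations, not the exact steps from the blog mentioned; the directory list depends on which HDP components were installed, and everything here is destructive.

```bash
# Remove leftover component config dirs that the cleanup script missed under /etc
# (illustrative subset only; adjust to the components actually installed).
rm -rf /etc/hadoop /etc/hive /etc/hbase /etc/zookeeper /etc/ambari-server /etc/ambari-agent

# Remove the embedded Postgres used by Ambari along with its data directory
# (path assumes the default RHEL/CentOS location).
yum -y remove postgresql postgresql-server
rm -rf /var/lib/pgsql/data
```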
05-30-2016 07:08 PM
Hi, The issue is fixed. When I ran the cleanup script it removed the service users, so the folder ownership was left orphaned (owned by deleted UIDs). I fixed the permissions using chown, and now it is working fine. Thanks @Rahul Pathak and @Kuldeep Kulkarni. Regards, Subramanian S.
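A sketch of that kind of ownership repair is below. The paths and the hdfs:hadoop user/group are assumptions for a typical HDP layout, not details from the original post; `find -nouser` is simply a way to locate directories whose owning account was deleted by the cleanup script.

```bash
# Locate files whose owner or group no longer exists (left orphaned after the
# cleanup script deleted the service accounts).
find /hadoop /var/log/hadoop -nouser -o -nogroup 2>/dev/null

# Re-own the HDFS directories to the recreated service account.
# Paths and the hdfs:hadoop user/group are assumptions for a default HDP layout.
chown -R hdfs:hadoop /hadoop/hdfs /var/log/hadoop/hdfs
```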
05-30-2016 02:21 PM
1 Kudo
Hi, After reinstalling HDP 2.3, I am getting the following error when I try to restart the service:

org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 3, volumes configured: 9, volumes failed: 6, volume failures tolerated: 0
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.<init>(FsDatasetImpl.java:289)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetFactory.newInstance(FsDatasetFactory.java:34)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetFactory.newInstance(FsDatasetFactory.java:30)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1412)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1364)
at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:317)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:224)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:821)
at java.lang.Thread.run(Thread.java:745)

When I dug into the data directories, some of them contain directories left over from the prior installation. How do I fix this issue? Thanks in advance. Regards, Subramanian S.
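A hedged sketch of how one might confirm and clear the stale volumes follows. It assumes the directories listed in dfs.datanode.data.dir sit under /hadoop/hdfs/data*, and that the old data is disposable because the cluster was freshly reinstalled; a clusterID in current/VERSION that does not match the new NameNode's is what makes a volume fail on startup.

```bash
# Inspect the clusterID recorded in each DataNode data directory; entries left
# over from the prior install will not match the new NameNode's clusterID.
# The /hadoop/hdfs/data* glob is an assumption; use the paths from dfs.datanode.data.dir.
for d in /hadoop/hdfs/data*; do
  echo "== $d =="
  grep clusterID "$d/current/VERSION" 2>/dev/null
done

# If the old data is not needed (fresh reinstall), clearing the stale contents
# lets the DataNode re-initialize those volumes on the next restart.
# Destructive: only run this when the previous cluster's blocks are disposable.
# rm -rf /hadoop/hdfs/dataN/current
```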
Labels:
- Apache Hadoop
- HDFS
05-29-2016 05:57 PM
Hi, My initial installation was aborted. When I tried to reinstall, there was a notification to run the cleanup script. I did that and proceeded with the installation again, but I am still unable to proceed further. Can you please guide me in resolving the issue? Regards, Subramanian S.