Member since: 03-07-2016
Posts: 7
Kudos Received: 0
Solutions: 0
03-17-2016
06:44 AM
Yeah... thanks for the reply. A few hours earlier I realized that could be done. Thank you!!! @chennuri_gouris wrote: Add more volumes/disks to dfs.datanode.data.dir in hdfs-site.xml in Cloudera Manager to increase the HDFS capacity.
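For reference, a minimal sketch of what that property looks like in hdfs-site.xml. The mount points below are hypothetical examples and should be replaced with the disks actually available on each DataNode; in Cloudera Manager the same thing is done through the HDFS configuration (DataNode Data Directory) rather than by hand-editing the file.

<!-- hdfs-site.xml: each comma-separated path is a separate disk/volume
     the DataNode uses to store HDFS blocks. Paths here are hypothetical;
     use the mount points that exist on your nodes. -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/data/1/dfs/dn,/data/2/dfs/dn,/data/3/dfs/dn</value>
</property>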
03-15-2016
10:26 PM
Hi, I used Cloudera Manager for my installation and chose the Core Hadoop option to install the Hadoop services. The 5-node cluster is now up and working properly. By default it allocated about 200 GiB of space for HDFS, but my disks have far more space available and I need to use it, because I am doing benchmarking where I need to generate 1 TB of data, then sort it and validate it. Can anybody tell me how I can increase the HDFS space?
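Since the post describes the standard generate/sort/validate workflow, here is a minimal sketch using the TeraGen/TeraSort/TeraValidate examples that ship with Hadoop. The jar path and the HDFS output directories are assumptions and will differ with the CDH install (parcels vs. packages); 10,000,000,000 rows of 100 bytes each gives roughly 1 TB.

# Hypothetical jar location; on a parcel-based CDH install the examples jar
# usually lives under /opt/cloudera/parcels/CDH/. Adjust to your environment.
EXAMPLES_JAR=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar

# Generate ~1 TB of data (10^10 rows x 100 bytes per row).
hadoop jar "$EXAMPLES_JAR" teragen 10000000000 /benchmarks/teragen

# Sort the generated data.
hadoop jar "$EXAMPLES_JAR" terasort /benchmarks/teragen /benchmarks/terasort

# Validate that the sorted output is globally ordered.
hadoop jar "$EXAMPLES_JAR" teravalidate /benchmarks/terasort /benchmarks/teravalidate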
Labels:
- Apache Hadoop
- HDFS
03-07-2016
11:51 PM
Hi there, I am using CentOS 7, and the database for my Cloudera Manager is PostgreSQL. My project is in a secure environment with no internet connection, so I chose Path C (tarball) for my installation. I am stuck at one point: the server is unable to start. Below are screenshots of the cloudera-scm-server log file showing two problems. Can anyone please suggest what can be done to make this work? I am breaking my head over it.
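Without the screenshots the exact errors are not visible, but as a general troubleshooting sketch for a Path C / tarball install, the two usual places to look are the server log and the database connection settings. The paths below are the typical package-install locations and may sit under the tarball directory instead.

# Check the server log for the actual startup errors
# (a tarball install may keep logs under the extracted tarball instead).
tail -n 200 /var/log/cloudera-scm-server/cloudera-scm-server.log

# Verify the database connection settings the server is using
# (PostgreSQL host, port, database name, user, password).
cat /etc/cloudera-scm-server/db.properties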
Labels:
- Cloudera Manager