Posted 12-01-2017 10:39 AM
Change the directories below for the Service Monitor, since the procedure is the same as for the Host Monitor. You can salvage the contents of the Host Monitor by using the LDBStoreTool Java class to repair the corrupted LDB:

1. Make sure the Host Monitor is stopped completely (it should be, since it is unable to open this LDB).
2. Back up the /var/lib/cloudera-host-monitor directory with tar or cp.
3. Run the LDBStoreTool Java class to try to bring the corrupt database back to a consistent state (adjust the directory to the one reported in the exception):

```
java -cp "/usr/share/cmf/lib/*" com.cloudera.cmon.tstore.leveldb.tool.LDBStoreTool repair --directory /var/lib/cloudera-host-monitor/subject_record/subject_ts/partitions/subject_ts_2017-10-30T18:03:04.415Z
```
```
[ main] log                INFO Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
[ main] CMONConfiguration  INFO Config: jar:file:/usr/share/cmf/common_jars/firehose-5.12.1.jar!/cmon.conf
[ main] ConfigUtil         WARN Could not find configuration file cmon-cm-auth.conf
[ main] LDBResourceManager INFO Max file descriptors: 4096
[ main] LDBResourceManager INFO Setting maximum open fds to: 2048
Running repair command
Success
```
Start the Host Monitor; it should now come up. If the LDBStoreTool Java class is unable to repair the corrupt LDB, you will have to purge the /var/lib/cloudera-host-monitor directory, similar to the steps noted above by Michalis.
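The steps above can be sketched as two small shell functions. This is my own wrapper, not something shipped with Cloudera Manager; the default /var/lib/cloudera-host-monitor path and the partition directory are placeholders you must adjust to match the exception in your Host Monitor logs.

```shell
# Back up the monitor directory with a timestamped tarball before any repair.
backup_monitor_dir() {
  dir=${1%/}
  tar -czf "${dir}-backup-$(date +%Y%m%d%H%M%S).tar.gz" \
      -C "$(dirname "$dir")" "$(basename "$dir")"
}

# Run the LevelDB repair tool from the Cloudera Manager jars against one
# corrupt partition directory (the one named in the exception).
repair_ldb_partition() {
  java -cp "/usr/share/cmf/lib/*" \
       com.cloudera.cmon.tstore.leveldb.tool.LDBStoreTool repair \
       --directory "$1"
}

# Usage (after stopping the Host Monitor):
#   backup_monitor_dir /var/lib/cloudera-host-monitor
#   repair_ldb_partition /var/lib/cloudera-host-monitor/<partition-from-exception>
```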
Posted 09-18-2017 04:11 PM
[SOLVED] After posting this request for help, it got me thinking about re-wording my Google search, which led me to a Hortonworks page on copying data between HDP and S3 buckets. It didn't have an exact answer, but it did have the field for specifying an endpoint, which was exactly what I needed to add to my CLI command. Here is the complete command I used to get distcp working from s3-govcloud to HDFS (the opposite direction, HDFS to s3-govcloud, also works):

```
AWS_SECRET=xxxxxxxxx
AWS_KEY_ID=xxxxxxxxx
AWS_BUCKET=xxxxxxxx   # name of your govcloud bucket

hadoop distcp \
  -D fs.s3a.bucket.$AWS_BUCKET.endpoint=s3-us-gov-west-1.amazonaws.com \
  -D fs.s3a.awsAccessKeyId=$AWS_KEY_ID \
  -D fs.s3a.awsSecretAccessKey=$AWS_SECRET \
  s3a://$AWS_BUCKET/path/to/files/ /path/to/hdfs/files/
```

Links I used for reference:
- AWS GovCloud Endpoints: http://docs.aws.amazon.com/govcloud-us/latest/UserGuide/using-govcloud-endpoints.html
- Hortonworks Amazon S3 Bucket Configuration: https://hortonworks.github.io/hdp-aws/s3-copy-data/index.html
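The subtle part of the fix is that the bucket name is embedded inside the property key itself (fs.s3a.bucket.&lt;name&gt;.endpoint). A hypothetical helper (my own, not part of Hadoop) that assembles the command string makes that visible:

```shell
# Build the distcp command string; note how the bucket name becomes part of
# the -D key, so each bucket can point at a different S3 endpoint.
build_distcp_cmd() {
  bucket=$1; endpoint=$2; src=$3; dst=$4
  printf 'hadoop distcp -D fs.s3a.bucket.%s.endpoint=%s %s %s' \
         "$bucket" "$endpoint" "$src" "$dst"
}
```

For example, `build_distcp_cmd mybucket s3-us-gov-west-1.amazonaws.com s3a://mybucket/in/ /hdfs/out/` yields a command whose endpoint override applies only to `mybucket`.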