Member since
08-16-2018
8
Posts
2
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 1681 | 01-30-2019 07:19 AM |
04-08-2019
09:05 AM
Thanks, @gzigldrum, for the explanation!
04-05-2019
01:21 PM
Hi, @gzigldrum, thanks for your message, I'll do that. Would you be able to explain why it's not possible to do?
04-03-2019
02:00 PM
Adding more information: it seems my issue was caused by not stopping the Host Monitor and Service Monitor roles BEFORE moving data between folders. I only did so when it was required by the reconfiguration of firehose.storage.base.directory (the moment Cloudera Manager demands a restart of these roles). That said, let me slightly change my question: is it possible to merge information from two different firehose.storage.base.directory folders? When I try to do that, the charts show only the information from the most recent folder.
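The ordering described above (stop the monitor roles first, then move the data, then reconfigure) can be sketched as a small helper. This is a hypothetical illustration, not anything from Cloudera's documentation; the role stop/start steps themselves happen in the Cloudera Manager UI, and the function name and paths are placeholders.

```shell
# Hypothetical helper for the data-move step only. Per the observation
# in the post above, the Host Monitor and Service Monitor roles must be
# stopped in Cloudera Manager BEFORE this copy runs.
move_monitor_data() {
    old_dir="$1"    # e.g. /var/lib/cloudera-host-monitor
    new_dir="$2"    # e.g. the new firehose.storage.base.directory
    mkdir -p "$new_dir"
    # Copy contents (including dotfiles) preserving attributes.
    cp -a "$old_dir/." "$new_dir/"
}
# After the copy, point firehose.storage.base.directory at "$new_dir"
# in Cloudera Manager and restart the roles when prompted.
```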
04-03-2019
09:25 AM
Hi, I changed the Host Monitor Storage Directory (firehose.storage.base.directory) from /var/lib/cloudera-host-monitor to another location in order to run some tests. Afterwards I switched back to the old location, without ever deleting the data from before those directory changes. Nevertheless, after these changes the data collected earlier is not used to plot the charts: I can only see data collected after all the changes, despite there being much more information in the directory. Is there anything I can do so the old data appears on the charts? My current CDH and Cloudera Manager version is 5.15. Thanks!
Labels:
- Cloudera Manager
01-30-2019
07:19 AM
2 Kudos
A file was missing from the workflow's file list in Hue. Solved by adding the path to that specific file in Hue's Oozie editor.
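For context, what "adding the file in Hue's Oozie editor" generates under the hood is a `<file>` element in the workflow's shell action, which localizes the HDFS file into the action's working directory. A hypothetical fragment (the script and variable names are placeholders, not from the original post):

```xml
<!-- Hypothetical workflow fragment; paths and names are placeholders. -->
<shell xmlns="uri:oozie:shell-action:0.2">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <exec>my_script.sh</exec>
  <!-- Each <file> element ships one HDFS file into the action's
       current working directory (the part after # is the local name). -->
  <file>${appPath}/my_script.sh#my_script.sh</file>
  <file>${appPath}/vars.sh#vars.sh</file>
</shell>
```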
01-25-2019
12:24 PM
Hello, I'm trying to run a bash script located in an HDFS folder via an Oozie workflow. The script has a lot of configuration to be done by the user, which is passed to the script via a file that the script has to read, placed in the same HDFS folder as the script. The script cannot read it, though, and I wonder if I must comply with some specific path configuration for this to work on HDFS. When this code is reached:

if [ ! -e `pwd`/vars.sh ] ; then
    echo "Unable to continue: file vars.sh doesn't exist!"
    exit 1
fi

the script exits. Is the `pwd` a problem? I know it presupposes that the Oozie workflow respects the path my script executes from. Is that a mistake? Thanks!
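A hedged sketch of the check above, restructured as a function: in an Oozie shell action, files listed with `<file>` in the workflow are localized into the action's current working directory, so a plain relative lookup is equivalent to the `pwd` concatenation and the real fix (per the accepted answer) is making sure vars.sh is listed in the workflow at all. The function name is hypothetical; vars.sh is the file name from the question.

```shell
#!/bin/sh
# Hypothetical restructuring of the original check. vars.sh must be
# listed as a <file> in the workflow action so Oozie localizes it
# into the action's working directory before the script runs.
load_vars() {
    vars_file="./vars.sh"
    if [ ! -e "$vars_file" ]; then
        echo "Unable to continue: file vars.sh doesn't exist!" >&2
        return 1
    fi
    # Source the user-supplied configuration into the current shell.
    . "$vars_file"
}
```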
Labels:
- Apache Oozie
- HDFS