Member since
11-18-2014
196
Posts
18
Kudos Received
8
Solutions
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 8663 | 03-16-2016 05:54 AM |
|  | 3997 | 02-05-2016 04:49 AM |
|  | 2848 | 01-08-2016 06:55 AM |
|  | 16300 | 09-29-2015 01:31 AM |
|  | 1728 | 05-06-2015 01:50 AM |
12-15-2015
07:51 AM
Hello, I tried:

export HADOOP_USER_NAME=my_user
load_events=`yarn logs -applicationId $application_id`

I also tried:

export HADOOP_USER_NAME=hdfs
load_events=`yarn logs -applicationId $application_id`

However, I get: "Logs not available at /tmp/logs/hdfs/logs/application_1449728267224_0138. Log aggregation has not completed or is not enabled." This is also the message I get when I run the command with an unauthorised user.
12-15-2015
02:34 AM
Hello, Suddenly one of the 3 Flume agents running on the same machine is no longer starting. All I have in the logs is: DEBUG December 15 2015 10:16 AM Shell
Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:269)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:246)
at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:323)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:317)
at org.apache.flume.sink.hdfs.HDFSEventSink.authenticate(HDFSEventSink.java:557)
at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:272)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:413)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:98)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
The health test result for FLUME_AGENT_SCM_HEALTH has become concerning: This role's process exited while starting. A retry is in process.
The health test result for FLUME_AGENT_SCM_HEALTH has become bad: This role's process is starting. This role is supposed to be started.
However, the files remain in .tmp (they are never rolled anymore). I cannot understand how this agent gets the Hadoop home dir error while the other 2 don't... Thank you! Alina
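The IOException comes from org.apache.hadoop.util.Shell failing to locate a Hadoop install (note it is logged at DEBUG level, so it may not be the actual startup failure). A hedged sketch of one possible fix in the agent's conf/flume-env.sh, assuming a CDH parcel layout — the path below is an assumption, adjust it to your install:

```shell
# Hypothetical conf/flume-env.sh fragment: point the agent at a Hadoop
# install so Shell.checkHadoopHome() stops throwing the IOException.
# The parcel path is an assumption -- adjust it to your layout.
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop

# Equivalent JVM-level alternative:
# export JAVA_OPTS="$JAVA_OPTS -Dhadoop.home.dir=$HADOOP_HOME"
```

Since only one of the three agents fails, comparing the effective environment of the working agents against the failing one would show whether this variable is the real difference.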
Labels:
- Apache Flume
- Apache Hadoop
- HDFS
- Security
12-14-2015
11:48 PM
Hello, I told myself that if I cannot add an attachment, I'll just add a link to my log files from HDFS. So I ran:

yarn application -list -appStates FINISHED |grep 'my_workflow_name' |grep -Po 'application_\d+_\d+' | sed 's/.*application://' | tail -n 1

in order to find the application id ($my_application_id) that I needed. Afterwards I wanted to run:

yarn logs -applicationId $my_application_id

However, this doesn't return any logs unless it is executed by a user that has the rights to read them. So I wanted to change it into:

sudo -u hdfs yarn logs -applicationId $application_id

but then I got the error: "sudo: sorry, you must have a tty to run sudo". Is there a proper way to get the logs without lowering the security level? (http://unix.stackexchange.com/questions/122616/why-do-i-need-a-tty-to-run-sudo-if-i-can-sudo-without-a-password) Thank you!
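The two steps above can be sketched as one script. This is a hypothetical sketch, not a confirmed fix: `my_workflow_name` and `HADOOP_USER_NAME=my_user` are placeholders from the post, log aggregation must be enabled for `yarn logs` to return anything, and the yarn calls are guarded so the id-extraction helper can be exercised on its own:

```shell
#!/bin/sh
# Hypothetical sketch: find the newest FINISHED application id for a
# workflow, then fetch its aggregated logs.

latest_app_id() {
  # Read `yarn application -list` output on stdin, keep the lines that
  # mention the workflow name ($1), and print the last application id.
  grep "$1" | grep -o 'application_[0-9]*_[0-9]*' | tail -n 1
}

# Only run the yarn commands when the CLI is actually on the PATH.
if command -v yarn >/dev/null 2>&1; then
  export HADOOP_USER_NAME=my_user   # placeholder user from the post
  app_id=$(yarn application -list -appStates FINISHED | latest_app_id 'my_workflow_name')
  yarn logs -applicationId "$app_id"
fi
```

Whether reading another user's aggregated logs this way is permitted depends on the cluster's security setup; on a secured cluster `HADOOP_USER_NAME` is ignored.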
12-14-2015
05:28 AM
You were right, this is linked to OOZIE-2160 ("Oozie email action now supports attachments with an <attachment> element"). I am on CDH 5.3. Thank you!
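For reference, a hypothetical workflow fragment showing what the attachment support looks like once a version with OOZIE-2160 is available — this assumes the email-action 0.2 schema, and the recipient, subject, and HDFS path are all placeholders:

```xml
<!-- Hypothetical workflow.xml fragment, assuming the email-action 0.2
     schema (OOZIE-2160); recipient and HDFS path are placeholders. -->
<action name="send-logs">
  <email xmlns="uri:oozie:email-action:0.2">
    <to>someone@example.com</to>
    <subject>Workflow ${wf:id()} logs</subject>
    <body>Logs attached.</body>
    <attachment>/user/alina/logs/${wf:id()}.txt</attachment>
  </email>
  <ok to="end"/>
  <error to="fail"/>
</action>
```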
12-10-2015
11:41 PM
Hello, I saw that there is an attachment element; however, I cannot add it in Hue... Thank you!
11-27-2015
02:32 AM
Thank you for your answer. I will try to write a shell script to gather the logs. However, is there a way to add the logs as an attachment? Thank you! Alina
11-20-2015
07:59 AM
Hello, Is there a way to attach, or add to the content of the email (email action in Oozie), the job logs (the logs from all the actions of the job)? I didn't find any workflow parameter that could help... Thank you!
Labels:
- Apache Oozie
11-16-2015
07:34 AM
Hello, My fault, I had a timestamp problem. Since the hour changed in France (to UTC+1) while our servers are still on UTC+2, I thought it was not changing. Thank you!
11-08-2015
09:32 AM
I upgraded to CDH 5.3.5 and now I can delete projects/add projects/add components to projects.
11-08-2015
07:03 AM
Hello, I am just trying to understand HDFS Federation better. If I get it right: - we should use it in order to split, for example, the real-time space from the batch space; - if we want to split the namespace into N namespaces, then we have to have N NameNodes. Thank you! Alina
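That understanding matches the federated setup: each namespace is served by its own independent NameNode, while all NameNodes share the same pool of DataNodes. A hypothetical hdfs-site.xml fragment for the two-namespace example above — the nameservice ids, hostnames, and ports are placeholders:

```xml
<!-- Hypothetical hdfs-site.xml fragment: two federated namespaces,
     one NameNode each; hostnames and ports are placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>realtime,batch</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.realtime</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.batch</name>
  <value>nn2.example.com:8020</value>
</property>
```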
Labels:
- HDFS