Member since: 06-24-2018
Posts: 59
Kudos Received: 8
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
| 9536 | 01-12-2019 05:48 AM
| 16941 | 08-26-2018 10:41 AM
| 6809 | 08-13-2018 05:39 AM
| 5609 | 08-06-2018 07:45 AM
08-26-2018
09:11 AM
1 Kudo
Restrict your cluster to whitelisted IPs only, using a firewall; that will solve it.
08-26-2018
09:04 AM
1 Kudo
Are you sure that the Resource Manager is on this host: ip-172-31-35-169?
08-26-2018
06:59 AM
1 Kudo
Alright, a few questions; please reply ASAP. Is your cluster exposed to the internet? What kind of server are you using? Do you have any firewall? Can you run this command on the Resource Manager host and tell me what you see: "sudo -u yarn crontab -l"
08-23-2018
06:45 AM
1 Kudo
I solved this; it was not a memory-related issue anyway. Thank you for commenting. But I wonder if you can help me with Oozie? I am trying to create an Oozie workflow for Flume; any useful examples would help.
08-19-2018
09:22 AM
Yeah, this "dr who" is actually the name of an unauthorized user; it happens when your cluster is exposed to the internet. You might also want to check for any running crons and kill them, using sudo -u yarn crontab -l on the Resource Manager host. Thanks
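The cleanup described above might be sketched as follows. This is a hedged sketch, not a confirmed procedure: the "dr.who" filter and the decision to clear the whole crontab are assumptions drawn from this thread, so verify each entry before deleting anything.

```shell
# 1. List any cron jobs installed for the yarn user (run on the
#    Resource Manager host):
sudo -u yarn crontab -l

# 2. If unexpected entries show up, remove the yarn user's crontab
#    entirely (assumption: none of its entries are legitimate):
sudo -u yarn crontab -r

# 3. Kill any YARN applications submitted by the unknown user,
#    filtering the application list for "dr.who":
yarn application -list | awk '/dr.who/ {print $1}' | \
  xargs -r -n1 yarn application -kill
```

The `awk` step pulls the application ID (first column) from any line mentioning the suspicious user, so only those applications are killed.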
08-19-2018
08:24 AM
Well, this is not helping. Can you post the Resource Manager logs? Unexpected exits usually occur because of something going on with the Resource Manager. Besides, check the applications section of the Resource Manager web UI: are there any running apps? If yes, who is the user?
08-19-2018
07:13 AM
He created a resource pool in YARN to work around this issue, but in my case it was totally different. I would be happy to help you if you share logs or something. Who is creating the job? Is it "dr who" or your own user?
08-19-2018
12:25 AM
Go to the YARN Resource Manager; from there you can get the logs.
08-18-2018
10:34 AM
Sure, please post your logs here first, so I can see whether the issue is similar.
08-16-2018
04:49 AM
Hello, I am trying to work with Kafka for data ingestion, but being new to this, I am pretty confused. I have multiple crawlers that extract data for me from web platforms. The issue is that I want to ingest that extracted data into Hadoop using Kafka, without any intermediate scripts or service files. The main complication is that the platforms are disparate in nature: one web platform provides real-time data, the other batch-based. Can I somehow integrate my crawlers with Kafka producers, so they keep running all by themselves? Is it possible? I think it is, but I am not heading in the right direction. Any help would be appreciated. Thanks
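The question above maps naturally onto topic routing, so here is a minimal sketch of how a crawler could hand records to Kafka producers directly. Everything here is an assumption for illustration: the topic names, the `build_record` helper, and the kafka-python producer shown in the comment are not from the original thread.

```python
import json
import time

# Hypothetical helper: decide which Kafka topic a crawler's record
# goes to, based on whether the source platform is real-time or
# batch-based. Topic names are illustrative assumptions.
def build_record(platform, payload):
    topic = "crawl-realtime" if platform == "realtime" else "crawl-batch"
    value = json.dumps({
        "platform": platform,
        "fetched_at": int(time.time()),
        "data": payload,
    }).encode("utf-8")
    return topic, value

# With the kafka-python package installed and a broker reachable,
# each crawler can publish on its own, with no middle scripts:
#
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   topic, value = build_record("realtime", {"url": "https://example.com"})
#   producer.send(topic, value)
#   producer.flush()
```

The real-time crawler would call `send` once per extracted item as it arrives, while the batch platform's crawler can send a burst on each scheduled run; Kafka buffers both the same way, so no separate middle service is needed.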
Labels:
- Apache Kafka
- Apache Spark
- Cloudera Manager