Member since: 05-19-2016
Posts: 216
Kudos Received: 20
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4281 | 05-29-2018 11:56 PM |
| | 7154 | 07-06-2017 02:50 AM |
| | 3847 | 10-09-2016 12:51 AM |
| | 3657 | 05-13-2016 04:17 AM |
03-13-2021
07:29 AM
Hi @EricL, this happens when HiveServer2 is not running or has stopped due to some problem. Restart HiveServer2 with the command below and try again: hive --service hiveserver2 & Regards, Dhirendra
11-15-2019
05:17 AM
Can you elaborate more on "Hostname tied to the actual IP address" and "Use "ifconfig -a" to see a listing of your network interfaces and choose one that has an actual IP address"? How do I know which hostname and IP address to use? While installing a single-node Cloudera cluster, how would you know which hosts to specify? Thank you!
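For anyone else hitting this: a single-node install typically expects the machine's fully qualified hostname to resolve to the real (non-loopback) IP of one of its interfaces, usually via /etc/hosts. A minimal sketch, using a hypothetical hostname `node1.example.com` and IP `192.168.1.10` (both are placeholders, not values from this thread):

```
# /etc/hosts (hypothetical values)
127.0.0.1        localhost
192.168.1.10     node1.example.com   node1
```

That hostname/IP pair is what you would give the installer; `hostname -f` should print the fully qualified name back, and it must not map to 127.0.0.1.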
04-10-2019
08:31 AM
Hi, any update on this? Is this issue resolved? If yes, please let us know the solution; we are also facing the same issue.
10-18-2018
12:00 AM
The link is no longer available. Please advise. Thanks!
10-09-2018
09:04 AM
1 Kudo
Setting up the cron job will make this particular error go away, but eventually you are bound to run into a lot of other issues. Feel free to try it, though. Also, let me know your experience after trying that 🙂
07-18-2018
08:29 AM
Not able to read from or write to HDFS in CM - getting error: block size is 0 and no data is present. When checking the CM UI, all HDFS nodes have green status and show no issues.
07-03-2018
07:29 AM
1 Kudo
It could be a memory issue, and it may not relate to the condition of the server at all. Containers running on YARN have a maximum memory size, which the NodeManager carefully watches over. When a container allocates more memory than that, it gets killed. Also keep in mind that the container size covers the whole JVM, so it is not 100% available for map or reduce data. For example, on Spark you have to calculate how big a container you need if you want to use X MB of memory for cache and Y MB of memory for code.
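To make the sizing arithmetic concrete, here is a small sketch. It assumes Spark's default behavior of reserving max(384 MB, 10% of executor memory) as off-heap overhead (the `spark.executor.memoryOverhead` default), so the YARN container must fit the executor heap plus that overhead; the function name is mine, not part of any API:

```python
def yarn_container_mb(executor_mb, overhead_mb=None):
    """Estimate the YARN container size (MB) for a Spark executor.

    By default the overhead is max(384 MB, 10% of executor memory),
    mirroring Spark's spark.executor.memoryOverhead default.
    """
    if overhead_mb is None:
        overhead_mb = max(384, executor_mb // 10)
    return executor_mb + overhead_mb

# A 4 GB executor heap needs roughly a 4.4 GB container:
print(yarn_container_mb(4096))  # prints 4505
```

If the container YARN grants is smaller than this total, the NodeManager kills the container as soon as the JVM plus overhead crosses the limit, which is the symptom described above.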
06-21-2018
03:06 AM
Basically, the issue is that Hadoop is running an id command against the user without including "--" in front of the username when the username starts with a "-". 1. Per the POSIX standard, a username should not start with a hyphen character ("-"); that is what is causing this problem. 2. The authorization is done over Kerberos, which is working fine. 3. It seems this user ID needs its Unix groups to perform some action on HDFS, which is throwing this error (e.g. writing to a directory using group permissions). The group lookup is not working due to the above-mentioned issue.
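The missing piece can be demonstrated with the id utility itself: per the POSIX utility syntax guidelines, "--" ends option parsing, so an operand starting with "-" (like the problematic username) is no longer misread as option flags. A minimal sketch using the local root user as a stand-in for a hyphen-prefixed name:

```shell
# "--" tells the utility that what follows is an operand, not options.
id -- root >/dev/null && echo "with --: username parsed as an operand"

# Without "--", a leading hyphen is consumed as option flags and fails,
# which is exactly what happens to usernames that begin with "-".
if ! id -root >/dev/null 2>&1; then
    echo "without --: treated as options, lookup fails"
fi
```

This is why the group lookup breaks only for hyphen-prefixed usernames: the id invocation Hadoop issues never inserts the "--" separator.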