Member since: 02-24-2016
Posts: 84
Kudos Received: 19
Solutions: 6
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4563 | 01-09-2017 08:03 AM |
| | 9826 | 11-22-2016 02:34 AM |
| | 3906 | 06-10-2016 04:16 AM |
| | 2691 | 05-18-2016 02:00 AM |
| | 2679 | 05-09-2016 05:58 AM |
09-15-2016
05:40 AM
Spark will not work without Hadoop. Refer to the link below for more details: http://stackoverflow.com/questions/30906412/noclassdeffounderror-com-apache-hadoop-fs-fsdatainputstream-when-execute-spark-s
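The NoClassDefFoundError in that thread usually means a "without Hadoop" Spark build cannot find the Hadoop client jars. A minimal sketch of the commonly suggested fix, assuming the hadoop command is on the PATH:

```bash
# conf/spark-env.sh -- point Spark at the Hadoop client jars so classes
# like org.apache.hadoop.fs.FSDataInputStream resolve at runtime.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```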
09-15-2016
05:17 AM
As per the log: Can't open /var/run/cloudera-scm-agent/process/117-hdfs-NAMENODE/supervisor.conf: Permission denied. Check the directory permissions.
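A quick way to inspect the offending path from a shell on that host (a sketch; compare the output with a healthy node before changing anything):

```bash
# Inspect ownership and mode of the process directory and the config file.
ls -ld /var/run/cloudera-scm-agent/process/117-hdfs-NAMENODE
ls -l  /var/run/cloudera-scm-agent/process/117-hdfs-NAMENODE/supervisor.conf

# If ownership differs from a healthy host, restore it -- the correct
# owner varies by deployment, so this chown is only an assumption:
# chown -R hdfs:hdfs /var/run/cloudera-scm-agent/process/117-hdfs-NAMENODE
```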
07-18-2016
08:09 AM
Thanks @Romainr for the reply. Is there any other way to fix this, apart from changing the DB?
07-18-2016
04:44 AM
How did you solve the issue? I'm facing the same one. Could you please post your solution?
06-26-2016
11:11 PM
2 Kudos
It's a common issue when a node doesn't have enough memory (RAM):
Memory on host xxxxx is overcommitted. The total memory allocation is 107.8 GiB bytes but there are only 62.7 GiB bytes of RAM (12.5 GiB bytes of which are reserved for the system). Visit the Resources tab on the Host page for allocation details. Reconfigure the roles on the host to lower the overall memory allocation. Note: Java maximum heap sizes are multiplied by 1.3 to approximate JVM overhead
First, list the services installed on the particular node; you can find these details in the host's Resources tab, under the Memory section. Then add up the allocated memory. If the total allocation fits within the installed memory, you won't get any "Memory Overcommit Validation Threshold" warnings; if it exceeds the memory you have, you will get warnings of this type.
You can resolve this by either reducing the services' memory allocations or increasing the node's memory (RAM). A rough version of the arithmetic is sketched below.
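A back-of-the-envelope version of the check Cloudera Manager performs (a sketch with made-up role heap sizes; substitute the values from your host's Resources tab):

```bash
# Hypothetical per-role Java heaps in GiB; per the warning text above,
# JVM heaps are multiplied by 1.3 to approximate overhead.
heaps_gib="32 24 16 8"
ram_gib=62.7
reserved_gib=12.5

allocated=$(echo "$heaps_gib" | tr ' ' '\n' | awk '{s += $1 * 1.3} END {print s}')
usable=$(echo "$ram_gib - $reserved_gib" | bc)
echo "allocated: ${allocated} GiB, usable: ${usable} GiB"
# If allocated exceeds usable, expect the overcommit warning.
```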
06-20-2016
09:21 PM
I can't explain it more clearly; please check the steps below.
- Directory permissions: log in to Hue (as admin or hdfs), open the File Browser, select the directory you are having trouble with, go to Actions, and check the permissions and owner.
- User permissions: go to Users in the Hue web UI and check the user's permissions on the Permissions tab.
The same directory check can also be done from the command line, as sketched below.
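For reference, a CLI sketch of the same ownership check (/user/someuser/somedir is a placeholder for the directory in question):

```bash
# Show owner, group, and mode of the problem directory itself.
hdfs dfs -ls -d /user/someuser/somedir

# If the owner or mode is wrong, fix it as an HDFS superuser
# (placeholder names -- adjust user, group, and mode to your setup):
# hdfs dfs -chown someuser:somegroup /user/someuser/somedir
# hdfs dfs -chmod 755 /user/someuser/somedir
```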
06-19-2016
09:33 PM
Most likely it's related to Hue permissions on that directory, or on the user or group. Check all of the permissions.
06-19-2016
09:31 PM
I have faced this type of problem several times. I tried the same thing you did, but the problem wasn't resolved. Then I changed the properties below: mapreduce.map.memory.mb = 0 and mapreduce.reduce.memory.mb = 0. Now it's working fine for me. Please try the above and post the result.
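If you'd rather test the override on a single job before changing it cluster-wide, the same properties can be passed per job (a sketch; the jar path, wordcount, and the input/output paths are placeholders):

```bash
# Per-job override. On CDH, a value of 0 typically means the container
# size is inferred rather than fixed -- an assumption; verify against
# your version's documentation.
hadoop jar /path/to/hadoop-mapreduce-examples.jar wordcount \
  -Dmapreduce.map.memory.mb=0 \
  -Dmapreduce.reduce.memory.mb=0 \
  /input/path /output/path
```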
06-10-2016
04:16 AM
I am using Python 2.7.11; I followed the same document for version 5.7 and it worked. Thanks.
06-07-2016
02:34 AM
Hi, I followed the http://blog.cloudera.com/blog/2014/08/how-to-use-ipython-notebook-with-apache-spark/ post to integrate IPython and Spark. I followed everything, but it's not working; while executing, I'm getting the message below: [admin@hostname~]$ ipython notebook --profile=pyspark
[TerminalIPythonApp] WARNING | Subcommand `ipython notebook` is deprecated and will be removed in future versions.
[TerminalIPythonApp] WARNING | You likely want to use `jupyter notebook`... continue in 5 sec. Press Ctrl-C to quit now.
[W 14:49:14.602 NotebookApp] Unrecognized alias: '--profile=pyspark', it will probably have no effect.
[I 14:49:14.842 NotebookApp] The port 8888 is already in use, trying another random port.
[I 14:49:14.845 NotebookApp] Serving notebooks from local directory: /home/admin
[I 14:49:14.845 NotebookApp] 0 active kernels
[I 14:49:14.845 NotebookApp] The Jupyter Notebook is running at: http://hostname:8889/
[I 14:49:14.845 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[I 14:51:21.271 NotebookApp] 302 GET / (10.x.x.x) 0.60ms
Now the problem is that it does not recognize the profile: Unrecognized alias: '--profile=pyspark', it will probably have no effect. Please help me integrate IPython and Spark.
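For context, newer Jupyter releases dropped IPython profiles, which is why the alias is ignored. The commonly suggested workaround (a sketch, not from the original blog post; assumes Spark's bin directory is on the PATH) is to launch the notebook through pyspark itself:

```bash
# Jupyter >= 4 no longer honors --profile; let pyspark start the notebook
# and inject the SparkContext instead of relying on a pyspark profile.
export PYSPARK_DRIVER_PYTHON=ipython
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
pyspark
```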
Labels:
- Apache Spark