Member since: 07-09-2015
Posts: 70
Kudos Received: 29
Solutions: 12
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 12617 | 11-23-2018 03:38 AM |
| | 2950 | 10-07-2018 11:44 PM |
| | 3639 | 09-24-2018 12:09 AM |
| | 5858 | 09-13-2018 02:27 AM |
| | 3972 | 09-12-2018 02:27 AM |
09-24-2018
05:54 PM
Correct, I'm using CDSW 1.4. The workaround has resolved the issue. Thank you.
04-11-2018
01:33 AM
1 Kudo
Hi, the documentation has an image explaining this: https://www.cloudera.com/documentation/data-science-workbench/latest/topics/cdsw_dist_comp_with_Spark.html The answer is yes: if you start a Python 2 session and create a SparkSession object there, the Spark application runs in client mode and the Spark driver lives inside the CDSW session (the Docker container). This is the primary use case for CDSW. Regards, Peter
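A minimal sketch of what this describes, assuming the CDSW project's Spark configuration already points at the cluster's YARN resource manager; the application name below is hypothetical and only for illustration.

```python
# Starting Spark from inside a CDSW session (client mode).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("cdsw-client-mode-example")  # hypothetical name
    .master("yarn")                       # executors run on the CDH cluster
    .getOrCreate()
)

# The driver lives here, inside the session's Docker container (client mode);
# only the executors are scheduled on the cluster.
print(spark.range(100).count())           # trivial job to confirm it works

spark.stop()
```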
04-04-2018
11:29 AM
It sounds like requests is not installed on your executors. You could manually install these libraries on all executors, or ship them with Spark using the techniques outlined in this blog post: https://blog.cloudera.com/blog/2017/04/use-your-favorite-python-library-on-pyspark-cluster-with-cloudera-data-science-workbench/ Tristan
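A hedged sketch of one way to ship a pure-Python dependency to executors; this is not the exact procedure from the blog post (which packages a full conda environment), and the archive name and URLs below are placeholders.

```python
# Ship a zipped pure-Python package to the executors with addPyFile.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ship-deps-example").getOrCreate()
sc = spark.sparkContext

# deps.zip is assumed to contain the library's package directory at its root,
# e.g. requests/ plus its pure-Python dependencies (urllib3/, certifi/, ...).
sc.addPyFile("deps.zip")

def fetch_status(url):
    # Imported inside the function so each executor resolves it from deps.zip.
    import requests
    return requests.get(url, timeout=10).status_code

urls = ["https://example.com"]  # placeholder URL
print(sc.parallelize(urls).map(fetch_status).collect())

spark.stop()
```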
10-09-2017
11:21 AM
Hello, I can reproduce the "Engine exited with status 2." error when using the v1 engine, but it works with the v2 engine. What version are you using?
- If you are using 1.0.x, I recommend upgrading to 1.1.1.
- If you are already on 1.1.1, switch to the v2 engine: go to the Settings menu on the Project page and select "Base Image v2, docker.repository.cloudera.com/cdsw/engine:2" on the Engines tab.
You can also change the default engine in the Admin menu, but that only applies to new projects; for existing projects you need to select it manually. Regards, Peter
09-14-2017
07:32 AM
Can you please provide more details about which service you restarted? Thanks, MK
07-26-2017
03:08 AM
Indeed, the CDSW host was added to the CDH cluster in Cloudera Manager, but it still needed a Spark gateway role deployed. Thanks
07-20-2017
07:04 AM
Hello, I fixed it. In the Spark2 configuration screen (in Cloudera Manager for the CDH cluster), the Hive Service dependency was set to "none". I set it to Hive and CDSW is now working as expected. Thanks!
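A quick, hedged way to confirm Hive integration from a CDSW session once the Spark2 service's Hive dependency is set in Cloudera Manager; it assumes the Hive metastore is reachable and does not rely on any particular database existing.

```python
# Verify that Spark sessions launched from CDSW can see the Hive metastore.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-integration-check")  # hypothetical name
    .enableHiveSupport()                # needs the Hive Service dependency
    .getOrCreate()
)

# With the Hive dependency wired up, this lists the metastore's databases
# rather than only an in-memory "default" catalog.
spark.sql("SHOW DATABASES").show()

spark.stop()
```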
07-07-2017
01:40 AM
Found it. I interpreted "MASTER" as being the master node of the CDH cluster 😉 Using the right IP fixed the issue. Thanks
07-05-2017
02:29 AM
Hi, the problem when the pod was stuck in the ContainerCreating state, based on the logs generated by the command "kubectl describe pod web-3826671331-5b7wk", was that it was failing to mount masterserverIP:/var/lib/cdsw/projects. The reason was an incorrect master IP in /etc/cdsw/config/cdsw.conf. Regards, NES