05-21-2016 01:30 AM
Knox supports such a use case by means of multiple topology files. However, Ambari manages only a single topology file, for the cluster in which Knox is running as an Ambari service. For your requirements it's best to install Knox stand-alone and configure it manually. It's an easy operation; you can find details here.
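A stand-alone Knox instance picks up each `*.xml` file in its `conf/topologies` directory as a separate topology, so you can expose different services (or the same services with different providers) under different URLs. A minimal sketch of one such topology file; the service URLs and the `sandbox` name are placeholders for your own cluster:

```xml
<!-- conf/topologies/sandbox.xml — exposed at /gateway/sandbox/... -->
<topology>
  <gateway>
    <provider>
      <role>authentication</role>
      <name>ShiroProvider</name>
      <enabled>true</enabled>
      <!-- LDAP parameters would go here -->
    </provider>
  </gateway>
  <service>
    <role>WEBHDFS</role>
    <url>http://namenode.example.com:50070/webhdfs</url>
  </service>
</topology>
```

Adding a second file, say `conf/topologies/prod.xml`, gives you a second, independently configured entry point at `/gateway/prod/...` on the same Knox instance.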
01-14-2016 01:56 AM
This message pops up any time an application requests more resources from the cluster than the cluster can currently provide. Which resources? Spark is only looking for two things: cores and RAM. Cores represent the number of open executor slots that your cluster provides for execution; RAM refers to the amount of free RAM required on any worker running your application. Note that for both of these resources the maximum is not your system's physical maximum; it is the maximum set by your Spark configuration.

1. Check the current state of your cluster (and its free resources) at SparkMasterIP:7080.
2. Make sure you have not started Spark shell in two different terminals. The first Spark shell may consume all the available cores in the cluster, leaving the second shell waiting for resources. Until the first Spark shell is terminated and its resources are released, all other apps will display the above warning.

The short-term solution is to make sure you aren't requesting more resources from your cluster than exist, or to shut down any apps that are unnecessarily holding resources. If you need to run multiple Spark apps simultaneously, you'll need to limit the number of cores each app uses.
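The waiting behavior above can be sketched as a toy model in plain Python (this is an illustration of the standalone master's core accounting, not Spark's actual scheduler code; the names `try_launch` and `free_cores` are made up for the example):

```python
# Toy model: an app that asks for more cores than remain free is not
# launched, it waits -- which is when the driver logs the warning.

def free_cores(total_cores, running_apps):
    """Cores left after every running app's grant is subtracted."""
    return total_cores - sum(running_apps.values())

def try_launch(app_name, requested_cores, total_cores, running_apps):
    """Grant the request only if enough cores are free."""
    if requested_cores <= free_cores(total_cores, running_apps):
        running_apps[app_name] = requested_cores
        return "RUNNING"
    return "WAITING"

cluster_cores = 8
apps = {}

# First shell: with no core cap configured, it asks for every core.
print(try_launch("shell-1", 8, cluster_cores, apps))   # RUNNING
# Second shell now finds zero free cores and waits.
print(try_launch("shell-2", 1, cluster_cores, apps))   # WAITING

# Capping each app at half the cluster leaves room for both.
apps.clear()
print(try_launch("shell-1", 4, cluster_cores, apps))   # RUNNING
print(try_launch("shell-2", 4, cluster_cores, apps))   # RUNNING
```

In a real standalone cluster the cap corresponds to setting `spark.cores.max` (or passing `--total-executor-cores` to `spark-shell`/`spark-submit`), and memory per executor to `spark.executor.memory`.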