06-24-2019 05:26 PM
As a complement to Matt Foley's answer: concerning MLOptimizer, I think they meant one of two things. Either they meant generic optimization algorithms such as gradient descent, which are available in the mllib.optimization package (see https://spark.apache.org/docs/2.3.0/mllib-optimization.html), or they meant hyper-parameter optimization of ML algorithms. Hyper-parameter tuning using, e.g., cross-validation and grid search is available in the Spark ML tuning package (see https://spark.apache.org/docs/2.2.0/ml-tuning.html); a minimal sketch is given below. However, if they meant automatic hyper-parameter optimization using, for example, Bayesian optimization, then I would like to know more about it as well...
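For illustration, here is a minimal sketch of grid search with cross-validation using the Spark ML tuning API (ParamGridBuilder and CrossValidator). The LogisticRegression estimator, the tiny synthetic training DataFrame, and the particular grid values are just assumptions for the example, not anything implied by the original question.

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("cv-grid-search").getOrCreate()
import spark.implicits._

// Tiny synthetic training set (label, features) -- stand-in for real data.
val training = Seq(
  (0.0, Vectors.dense(0.0, 1.1)),
  (1.0, Vectors.dense(2.0, 1.0)),
  (0.0, Vectors.dense(0.2, 1.3)),
  (1.0, Vectors.dense(1.9, 0.8)),
  (0.0, Vectors.dense(0.1, 1.2)),
  (1.0, Vectors.dense(2.1, 0.9)),
  (0.0, Vectors.dense(0.3, 1.0)),
  (1.0, Vectors.dense(1.8, 1.1)),
  (0.0, Vectors.dense(0.2, 0.9)),
  (1.0, Vectors.dense(2.2, 1.2)),
  (0.0, Vectors.dense(0.4, 1.4)),
  (1.0, Vectors.dense(2.0, 0.7))
).toDF("label", "features")

val lr = new LogisticRegression()

// Grid of hyper-parameter values; every combination is evaluated.
val paramGrid = new ParamGridBuilder()
  .addGrid(lr.regParam, Array(0.01, 0.1, 1.0))
  .addGrid(lr.elasticNetParam, Array(0.0, 0.5))
  .build()

// 3-fold cross-validation picks the combination with the best metric
// (BinaryClassificationEvaluator defaults to areaUnderROC).
val cv = new CrossValidator()
  .setEstimator(lr)
  .setEvaluator(new BinaryClassificationEvaluator())
  .setEstimatorParamMaps(paramGrid)
  .setNumFolds(3)

val cvModel = cv.fit(training)
println(cvModel.bestModel.extractParamMap())
```

Note that this is an exhaustive search over the grid; when full k-fold cross-validation is too expensive, TrainValidationSplit in the same tuning package evaluates each combination on a single train/validation split instead.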