
Saving Spark ML model to Another Sub-Cluster of VMs running HDFS

New Contributor

I am a developer using HDP 2.4.2, Spark 1.6.1, and NiFi HDF 1.2.0.0 (NiFi 0.6.1).

We have a cluster of 3 VMs running NiFi, 9 VMs for Spark, and 6 VMs running HDFS. I would like to save a Spark ML model to a different VM, but I cannot seem to specify the HDFS URI, mainly because I cannot find the correct address. I can see the Hadoop configuration in Ambari, as well as in core-site.xml and hdfs-site.xml; I just do not know which address and port to use.


Re: Saving Spark ML model to Another Sub-Cluster of VMs running HDFS

Rising Star

You have to specify the address and the port of the HDFS NameNode.
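For example, a save path built from the remote NameNode's address might look like the sketch below. The host name and port here are assumptions: replace them with the actual NameNode RPC address of the target HDFS cluster (8020 is HDP's default NameNode RPC port, but verify it in your configuration).

```python
# Hypothetical host/port -- substitute the values from the target
# cluster's own configuration.
namenode_host = "namenode.example.com"
namenode_port = 8020  # HDP default NameNode RPC port

# Build a fully-qualified HDFS URI pointing at the remote cluster.
model_path = "hdfs://{}:{}/models/my_spark_ml_model".format(
    namenode_host, namenode_port)

# With a fitted spark.ml model in Spark 1.6.1 this would then be saved as:
#   model.save(model_path)
# or, for an older spark.mllib model:
#   model.save(sc, model_path)
```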

Re: Saving Spark ML model to Another Sub-Cluster of VMs running HDFS

New Contributor

In which field of which config file (or where in Ambari) would I find this value?
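(For anyone finding this thread later: the NameNode address is typically the `fs.defaultFS` property in core-site.xml on the target HDFS cluster, visible in Ambari under HDFS > Configs. The host name below is a placeholder.)

```xml
<!-- core-site.xml on the target HDFS cluster -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.example.com:8020</value>
</property>
```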
