
UnknownHostException in Spark SQL context

  • Client OS is RHEL 6.7.
  • Our cluster is running HDP 2.4.2.
  • The cluster is configured with an HA NameNode.
  • We're using Kerberos for authentication.

scala> val parquetFile = sqlContext.read.parquet("hdfs://clustername/folder/file")

Any idea what the issue is?


java.lang.IllegalArgumentException: java.net.UnknownHostException: clustername


Expert Contributor

The host part of your URI, "clustername", is being treated as a literal hostname, which does not resolve. Instead of hdfs://clustername/folder/file, use hdfs://hostname/folder/file, replacing hostname with your NameNode's actual hostname.

About this answer: this doesn't really answer the question. Pointing at a single NameNode defeats the purpose of HA; that is exactly why we want to address the cluster by its nameservice (cluster name). Is there no other way?
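For an HA nameservice like hdfs://clustername/ to resolve, the HDFS client configuration that Spark picks up (typically the hdfs-site.xml on the driver's classpath, found via HADOOP_CONF_DIR) must define the nameservice and its NameNodes. A minimal sketch, assuming the nameservice is clustername and two hypothetical NameNode hosts nn1.example.com and nn2.example.com:

```xml
<!-- hdfs-site.xml: HA nameservice definition (hostnames are placeholders) -->
<property>
  <name>dfs.nameservices</name>
  <value>clustername</value>
</property>
<property>
  <name>dfs.ha.namenodes.clustername</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.clustername.nn1</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.clustername.nn2</name>
  <value>nn2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.clustername</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these entries present, the HDFS client resolves "clustername" through the failover proxy provider instead of DNS, and the UnknownHostException goes away.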

New Contributor

I used localhost as the address, assuming you are on a standalone configuration; otherwise you need to find out the NameNode's hostname. Also, the address may need the hdfs:// scheme rather than webhdfs://.
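If you cannot edit the client-side hdfs-site.xml, the same HA properties can be passed to Spark directly: Spark forwards any property prefixed with spark.hadoop. into the Hadoop Configuration it uses. A sketch for spark-defaults.conf (or --conf flags on spark-submit), with placeholder hostnames:

```
# spark-defaults.conf: forward HA client settings to the Hadoop conf
# (nameservice and hostnames below are placeholders)
spark.hadoop.dfs.nameservices                              clustername
spark.hadoop.dfs.ha.namenodes.clustername                  nn1,nn2
spark.hadoop.dfs.namenode.rpc-address.clustername.nn1      nn1.example.com:8020
spark.hadoop.dfs.namenode.rpc-address.clustername.nn2      nn2.example.com:8020
spark.hadoop.dfs.client.failover.proxy.provider.clustername org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
```

This keeps the HA nameservice working (including automatic failover) without hard-coding a single NameNode host into the path.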