Support Questions

Find answers, ask questions, and share your expertise

unknown host exception in Spark SQL context

Expert Contributor
  • client OS is RHEL 6.7
  • Our cluster is running HDP 2.4.2
  • Our cluster is configured with an HA namenode,
  • We're using Kerberos for authentication.

scala> val parquetFile = sqlContext.read.parquet("hdfs://clustername/folder/file")

Any idea what the issue is?

Error:

java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster name

3 REPLIES

Super Collaborator

The authority part of your URI, "clustername", is being treated as a hostname, and DNS cannot resolve it. Instead of hdfs://clustername/folder/file, use hdfs://hostname/folder/file, replacing hostname with your namenode's actual hostname.


This isn't really an answer: pointing at a single namenode host defeats the purpose of HA, which is exactly why we want to use the cluster nameservice name. Is there no other way?
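For what it's worth, a logical HA nameservice name does work in the URI, but only if the HDFS client configuration that defines it is on Spark's classpath (typically by pointing HADOOP_CONF_DIR at the cluster's config directory). A minimal sketch of the relevant hdfs-site.xml entries follows; the nameservice "clustername", the namenode IDs "nn1"/"nn2", and the hosts are placeholders for your cluster's actual values:

```xml
<!-- Sketch of HDFS HA client settings (placeholders, not actual values). -->
<property>
  <name>dfs.nameservices</name>
  <value>clustername</value>
</property>
<property>
  <name>dfs.ha.namenodes.clustername</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.clustername.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.clustername.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.clustername</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these settings visible to the Spark driver and executors, hdfs://clustername/folder/file resolves through whichever namenode is currently active; an UnknownHostException on the nameservice name usually means this configuration is missing from the classpath.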


New Contributor

I used localhost as the address, assuming you are running a standalone configuration; otherwise you need to find out the hostname of the namenode. It might also need hdfs:// rather than webhdfs:// as the scheme of the address.