UnknownHostException in Spark SQL context
Labels: Apache Spark
Created ‎12-15-2016 04:56 PM
- Client OS is RHEL 6.7
- Our cluster is running HDP 2.4.2
- Our cluster is configured with an HA NameNode
- We're using Kerberos for authentication
scala> val parquetFile = sqlContext.read.parquet("hdfs://clustername/folder/file")
Any idea what the issue is?
Error:
java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster name
Created ‎12-15-2016 09:02 PM
Your host name is set to "cluster name", which is not resolvable. Instead of "hdfs://clustername/folder/file", use "hdfs://hostname/folder/file", replacing hostname with your NameNode's host name.
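If you do want to keep the HA nameservice name in the URI, it can also be resolved programmatically by setting the HA properties on the SparkContext's Hadoop configuration from inside spark-shell. This is only a sketch: the nameservice name (mycluster), NameNode IDs (nn1, nn2), and host names below are placeholders that must match your cluster's actual hdfs-site.xml values.

```scala
// Run inside spark-shell, where sc and sqlContext already exist.
// All names here are placeholders; mirror your cluster's hdfs-site.xml.
sc.hadoopConfiguration.set("dfs.nameservices", "mycluster")
sc.hadoopConfiguration.set("dfs.ha.namenodes.mycluster", "nn1,nn2")
sc.hadoopConfiguration.set("dfs.namenode.rpc-address.mycluster.nn1", "namenode1.example.com:8020")
sc.hadoopConfiguration.set("dfs.namenode.rpc-address.mycluster.nn2", "namenode2.example.com:8020")
sc.hadoopConfiguration.set("dfs.client.failover.proxy.provider.mycluster",
  "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

// The nameservice name can now be used instead of a single NameNode host
val parquetFile = sqlContext.read.parquet("hdfs://mycluster/folder/file")
```

The usual cause of this exception, though, is that the Spark client's HADOOP_CONF_DIR does not contain the hdfs-site.xml that defines the nameservice, so fixing the client configuration is generally preferable to setting these in code.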
Created ‎03-12-2019 10:38 AM
About this answer: this isn't really an answer. The whole point of HA is to address the cluster by its nameservice name rather than a single NameNode host. Is there no other way?
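For the nameservice name to resolve, the client's hdfs-site.xml must define it. A minimal sketch of the relevant properties, assuming a nameservice called mycluster with two NameNodes (all names and hosts are placeholders to be replaced with your cluster's values):

```xml
<!-- hdfs-site.xml on the Spark client; values are placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these in place on the client (typically under /etc/hadoop/conf, picked up via HADOOP_CONF_DIR), hdfs://mycluster/... URIs resolve and fail over between the two NameNodes without naming a single host.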
Created ‎04-25-2019 12:41 PM
I used localhost as the address, assuming you are running a standalone configuration; otherwise you need to find out the name of the NameNode. Also, the address may need hdfs:// instead of webhdfs:// as its prefix.
