Host unreachable error occurring between 2 Hadoop clusters when submitting a Spark job

Contributor

Hi all,

I'm trying to run a job in the <Source_A> Hadoop cluster that should connect to and populate data in the <Destination_A> Hadoop cluster. The problem we have been facing recently is that the job running in <Source_A> throws an error like IllegalArgumentException - Unknown Host: <Destination_B>.

Is there anywhere else I should look and correct? All of the hosts file entries are correct.

Please help me solve this issue.

The error is:

java.lang.IllegalArgumentException: java.net.UnknownHostException: Destnation_A
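For reference, the job writes to the destination cluster with a fully qualified HDFS URI along the lines of the minimal sketch below; the paths, host name, and port shown are placeholders only, not the real ones from the job.

```scala
import org.apache.spark.sql.SparkSession

object CrossClusterWrite {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cross-cluster-write")
      .getOrCreate()

    // Read from the local (source) cluster -- hypothetical path.
    val df = spark.read.parquet("/data/source_table")

    // Write to the destination cluster using a fully qualified HDFS URI.
    // "destination-nn.example.com" is a placeholder; if the destination uses
    // an HA nameservice, the logical nameservice name goes here instead, and
    // that name must be known to the source cluster's client configuration.
    df.write
      .mode("overwrite")
      .parquet("hdfs://destination-nn.example.com:8020/data/target_table")

    spark.stop()
  }
}
```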

Regards,

MJ

1 REPLY

Super Guru

@Manikandan Jeyabal

Are you able to ping the destination host from the source cluster?
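If ping is not convenient, a quick name-resolution check can also be run on a source-cluster node with a few lines of JVM code. This is a minimal sketch only; the default hostname below is a placeholder standing in for the name shown in the exception.

```scala
import java.net.{InetAddress, UnknownHostException}

object HostCheck {
  def main(args: Array[String]): Unit = {
    // Hostname to test, e.g. the name reported in the UnknownHostException.
    val host = args.headOption.getOrElse("Destination_A")
    try {
      val addr = InetAddress.getByName(host)
      println(s"$host resolves to ${addr.getHostAddress}")
      // isReachable uses ICMP/TCP echo where permitted; "false" can still
      // mean the host is up but blocking probes.
      println(s"reachable within 3s: ${addr.isReachable(3000)}")
    } catch {
      case e: UnknownHostException =>
        println(s"$host does not resolve from this node: $e")
    }
  }
}
```

Note that if the name in the exception is an HA nameservice (a logical name rather than a DNS host), it will never resolve this way or via /etc/hosts; it has to be defined in the client-side HDFS configuration instead.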
