Support Questions


Unable to connect to a remote Hadoop cluster using Spark

Explorer

Hi,
I'm running a reconciliation application using Spark. The application needs to connect to an HBase table on each of two different clusters and compare the data. It is able to connect and read the data from the cluster it runs on, but for the remote cluster I'm getting the error below:
failed on local exception: exception is Failed after attempts=4, exceptions:
2023-04-19T17:11:13.491Z, RpcRetryingCaller{globalStartTime=1681924273451, pause=1000, maxAttempts=4}, java.io.IOException: Call to Host:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
2023-04-19T17:11:14.505Z, RpcRetryingCaller{globalStartTime=1681924273451, pause=1000, maxAttempts=4}, java.io.IOException: Call to host:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
2023-04-19T17:11:16.522Z, RpcRetryingCaller{globalStartTime=1681924273451, pause=1000, maxAttempts=4}, java.io.IOException: Call to host:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
2023-04-19T17:11:19.549Z, RpcRetryingCaller{globalStartTime=1681924273451, pause=1000, maxAttempts=4}, java.io.IOException: Call to host:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
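For context, the connection pattern described above looks roughly like this (a sketch only; the table name, site-file paths, and object names are placeholders, not the actual application code):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.sql.SparkSession

object ReconciliationSketch {
  // Build one HBase Configuration per cluster from that cluster's own site file.
  def hbaseConf(siteXml: String, table: String): Configuration = {
    val conf = HBaseConfiguration.create()
    conf.addResource(new Path(siteXml))
    conf.set(TableInputFormat.INPUT_TABLE, table)
    conf
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbase-reconciliation").getOrCreate()
    val sc = spark.sparkContext

    // Local cluster: its hbase-site.xml is normally already on the classpath.
    val localRdd = sc.newAPIHadoopRDD(
      hbaseConf("/etc/hbase/conf/hbase-site.xml", "my_table"),
      classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])

    // Remote cluster: needs the remote cluster's hbase-site.xml (placeholder path).
    val remoteRdd = sc.newAPIHadoopRDD(
      hbaseConf("/path/to/remote/hbase-site.xml", "my_table"),
      classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])

    // Compare, e.g. by row count (the real reconciliation logic is more involved).
    println(s"local rows=${localRdd.count()}, remote rows=${remoteRdd.count()}")
    spark.stop()
  }
}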

5 REPLIES

Community Manager

@skasireddy Welcome to the Cloudera Community!

To help you get the best possible solution, I have tagged our Spark experts @jagadeesan and @Bharati, who may be able to assist you further.

Please keep us updated on your post, and we hope you find a satisfactory solution to your query.


Regards,

Diana Torres,
Community Moderator



Master Collaborator

@skasireddy Can you please make sure you have copied hbase-site.xml from the remote HBase cluster to /etc/spark/conf/yarn-conf/ or /etc/spark/conf/ on the edge node from which you are submitting your Spark application?
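One way to verify which hbase-site.xml the application actually picks up from the classpath is to print the relevant properties from a freshly created configuration; a small illustrative sketch (the property keys are standard HBase keys):

import org.apache.hadoop.hbase.HBaseConfiguration

// HBaseConfiguration.create() loads hbase-site.xml from the classpath
// (e.g. /etc/spark/conf or /etc/spark/conf/yarn-conf on the edge node).
val conf = HBaseConfiguration.create()
println("hbase.zookeeper.quorum        = " + conf.get("hbase.zookeeper.quorum"))
println("hbase.security.authentication = " + conf.get("hbase.security.authentication"))

If the printed values belong to the local cluster (or are empty), the remote cluster's file was not on the classpath.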

Explorer

Thanks @jagadeesan for the response.
Yes, I'm creating a Configuration object and setting all the resources (hbase-site.xml, core-site.xml and hdfs-site.xml) for each cluster. The files I add to each Configuration object are specific to that cluster's HBase.
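A rough sketch of that approach through the HBase client API, assuming the remote cluster's site files have been copied somewhere readable by the driver (all paths and the table name are placeholders):

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory

// One Configuration per cluster, built from that cluster's own site files.
val remoteConf = HBaseConfiguration.create()
remoteConf.addResource(new Path("/path/to/remote/core-site.xml"))
remoteConf.addResource(new Path("/path/to/remote/hdfs-site.xml"))
remoteConf.addResource(new Path("/path/to/remote/hbase-site.xml"))

// The connection below is where the RpcRetryingCaller error surfaces
// if the authentication settings from the remote files are not picked up.
val remoteConnection = ConnectionFactory.createConnection(remoteConf)
val remoteTable = remoteConnection.getTable(TableName.valueOf("my_table"))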

Expert Contributor

Hi @skasireddy 

Can you please share the spark-submit command used to trigger this application?

Also, can you try specifying --files for the HBase-dependent configuration files (e.g. hbase-site.xml) and share the outcome?
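When a file is shipped with --files (e.g. spark-submit ... --files /path/to/remote/hbase-site.xml), it is localized into the YARN containers' working directories under its original file name and can then be loaded by that bare name; a minimal sketch (file name as shipped is an assumption):

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration

// With --files, the shipped hbase-site.xml is expected in the container's
// working directory under its original file name.
val conf = HBaseConfiguration.create()
conf.addResource(new Path("hbase-site.xml"))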

 

Explorer

Sorry for the late response. I use Oozie to submit the Spark job.