Hi All,
I have already set up an HDP 3.0 cluster with 7 machines in total, 3 of which are worker nodes.
I have also installed Ranger for security, and a user has already been created in Hive. Now I want to access a Hive table using Spark. I have tried both spark-shell and Java code, but both fail with the same error.
I am using spark-llap (the Hive Warehouse Connector) to provide the username/password for Hive, but Spark throws an exception: "No service instances found in registry".
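For reference, this is roughly what I am doing in spark-shell. The table name, database, host, and credentials below are placeholders for my actual values; the session-builder calls are from the spark-llap / Hive Warehouse Connector API:

```scala
// spark-shell was launched with the HWC assembly jar and connector configs, e.g.:
// spark-shell --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar \
//   --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hiveserver2-host>:10000/" \
//   --conf spark.datasource.hive.warehouse.user.name=<user> \
//   --conf spark.datasource.hive.warehouse.password=<password>

import com.hortonworks.hwc.HiveWarehouseSession

// Build a Hive Warehouse Connector session on top of the existing SparkSession
val hive = HiveWarehouseSession.session(spark).build()

// Read a Hive table (placeholder name) into a DataFrame
val df = hive.executeQuery("SELECT * FROM mydb.mytable")

// This is the call that fails with "No service instances found in registry"
df.show()
```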
Log:
scala> df.show
[Stage 0:> (0 + 0) / 1]
18/12/10 17:08:51 WARN TaskSetManager: Stage 0 contains a task of very large size (423 KB). The maximum recommended task size is 100 KB.
[Stage 0:> (0 + 1) / 1]
18/12/10 17:08:55 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, s1.us-east4-c.c.asdf-224606.internal, executor 2): java.lang.RuntimeException: java.io.IOException: No service instances found in registry
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseDataReaderFactory.createDataReader(HiveWarehouseDataReaderFactory.java:66)
If the table has 0 records, it works fine and shows me an empty table, but if the table has more than 0 records it throws the exception above.
Thanks in advance.