Support Questions

Find answers, ask questions, and share your expertise

Doc for spark-llap connector for HDP 2.6.5

Explorer

Hi

The spark-llap connector states it is compatible with HDP 2.6.x (https://github.com/hortonworks-spark/spark-llap#compatibility).

However, I am not able to find any documentation on how to make it work. There are no HiveWarehouseBuilder
or LlapContext Scala classes in the branch 2.3.0.

I have been able to compile the connector, and Spark is configured and running correctly.

Right now I am looking for guidance on how to import the library in Scala and run some queries.

Thanks

4 REPLIES

@natus

Please refer to this article: https://community.hortonworks.com/articles/72454/apache-spark-fine-grain-security-with-llap-test-dr....

HiveWarehouseBuilder is only available from HDP 3.x.

Hope this helps.

Explorer
@Sandeep Nemuri

The article you mention is about the Spark Thrift Server.

I am looking for code to run from Spark itself, something equivalent to this (from https://github.com/hortonworks-spark/spark-llap):

val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()

hive.execute("describe extended web_sales").show(100, false)
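For what it's worth, on the HDP 2.6.x branches the connector is wired in through Spark configuration rather than a builder API, and queries then go through the regular spark.sql entry point. A minimal sketch of the relevant spark-defaults.conf entries, assuming the assembly jar path, LLAP application name, and ZooKeeper hosts are placeholders for your cluster's values (verify the exact property names against your branch's README):

```properties
# Sketch of spark-defaults.conf entries for the HDP 2.6 branch of spark-llap.
# All paths and service names below are placeholders for this example.
spark.jars                                   /path/to/spark-llap-assembly.jar
spark.sql.hive.llap                          true
spark.hadoop.hive.llap.daemon.service.hosts  @llap0
spark.hadoop.hive.zookeeper.quorum           zk1:2181,zk2:2181,zk3:2181
```

With these set, spark.sql("...") in spark-shell is routed through LLAP; there is no HiveWarehouseBuilder to import on that branch.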

Explorer

I have tested the example and I get this error:

java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveSessionCatalog.<init>(Lorg/apache/spark/sql/hive/HiveExternalCatalog;Lorg/apache/spark/sql/catalyst/catalog/GlobalTempViewManager;Lorg/apache/spark/sql/hive/HiveMetastoreCatalog;Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;Lorg/apache/spark/sql/internal/SQLConf;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/catalyst/parser/ParserInterface;Lorg/apache/spark/sql/catalyst/catalog/FunctionResourceLoader;)V
at org.apache.spark.sql.hive.llap.LlapSessionCatalog.<init>(LlapSessionCatalog.scala:49)
at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog$lzycompute(LlapSessionStateBuilder.scala:33)
at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog(LlapSessionStateBuilder.scala:32)
at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog(LlapSessionStateBuilder.scala:26)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:68)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:68)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)