How to run HQL file in Spark

Contributor

I want to read an HQL file in a Spark job. The HQL creates a table by joining 3-4 other tables.

I don't want to write the SQL statement inside the Spark job; instead, I want to pass the HQL file as an argument to the Spark job and then run it.

Is this possible in Spark?

1 ACCEPTED SOLUTION

Contributor

We can use the Scala API below to read the file:

sqlContext.sql(scala.io.Source.fromFile("/vzhome/agaram8/HQLScripts/count.hql").getLines().mkString("\n"))
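
Note that sqlContext.sql executes a single statement at a time, so if the HQL file contains several statements you would need to split it first. A minimal sketch, assuming the file separates statements with ';' and has no ';' inside string literals or comments:

// Read the whole file, then run each non-empty statement in order.
val hql = scala.io.Source.fromFile("/vzhome/agaram8/HQLScripts/count.hql").mkString
hql.split(";")
  .map(_.trim)
  .filter(_.nonEmpty)
  .foreach(stmt => sqlContext.sql(stmt))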


8 REPLIES

@Amit Kumar Agarwal

If you are looking to do it from a program, try something like the approach described here:

http://stackoverflow.com/questions/31313361/sparksql-hql-script-in-file-to-be-loaded-on-python-code

Contributor

This link explains how to execute Hive SQL using the spark-sql shell, but I want to call the file programmatically, not through the shell.

Contributor

I went through the Stack Overflow link, but I don't see any 'open' API in Spark; I'm getting a compiler error.

"open" is not a spark api command, it is a python command. What language are you using? Replace open("file.hql").read() with the equivalent command/code-block in that language.

Contributor

I'm using Scala; I couldn't find an equivalent "open" API in Scala.

Contributor

We can use the Scala API below to read the file:

sqlContext.sql(scala.io.Source.fromFile("/vzhome/agaram8/HQLScripts/count.hql").getLines().mkString("\n"))
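
To avoid hard-coding the path, the HQL file location can also be passed in as a program argument to spark-submit, which is what the original question asks for. A minimal sketch, assuming a Spark 1.x-style HiveContext as used in this thread; RunHql and my-job.jar are hypothetical names for illustration:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object RunHql {
  def main(args: Array[String]): Unit = {
    // args(0) is expected to be the path to the HQL file.
    val sc = new SparkContext(new SparkConf().setAppName("RunHql"))
    val sqlContext = new HiveContext(sc)   // Hive support is needed for HQL
    val hql = scala.io.Source.fromFile(args(0)).mkString
    sqlContext.sql(hql)
    sc.stop()
  }
}

It could then be submitted with something like:

spark-submit --class RunHql my-job.jar /path/to/count.hql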

New Contributor

Hi Amit Kumar Agarwal,

I am looking to run Hive HQL from Spark SQL. Could you please provide guidance on the same?

Thanks,

Deepesh

deepeshnema@gmail.com