Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

How can I run multiple beeline commands in a script without authenticating every time?

Expert Contributor

As part of a script I'm writing, I want to get the HDFS location of each Hive schema in a list that's passed in via a text file. The best way I can think of is to run a beeline command in a loop that performs a `describe schema` and extracts the HDFS location from the output. However, this requires authenticating every time the command runs, which is inefficient. Is there a better way to programmatically get the HDFS locations of a list of Hive schemas?

1 ACCEPTED SOLUTION

Super Guru

@Josh Nicholson,

You can put all your SQL commands in a file and run the file with beeline.

For example, queries.sql has these statements:

describe formatted table1;
describe formatted table2;

You can then run queries.sql like this:

beeline -u "{url}" -f queries.sql
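Applied to the original question, the schema list can be turned into a single SQL file and run in one beeline session, so authentication happens only once. A minimal sketch, assuming the input file is named schemas.txt and using a placeholder JDBC URL:

```shell
# Demo input: in the real script this is the text file passed in,
# one schema name per line ("schemas.txt" is an assumed name).
printf 'db_sales\ndb_marketing\n' > schemas.txt

# Emit one DESCRIBE per schema into a single SQL file.
while read -r schema; do
  printf 'DESCRIBE SCHEMA %s;\n' "$schema"
done < schemas.txt > queries.sql

# A single beeline call then runs everything in one authenticated
# session (the JDBC URL below is a placeholder):
#   beeline -u "jdbc:hive2://host:10000" -f queries.sql
```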


Please "Accept" the answer if this helps.


4 REPLIES


Expert Contributor

Thanks @Aditya Sirna, I think this will get me what I need. What I'm ultimately after is the HDFS location, so I can use it in my script. Is running a describe and then grepping the output the best way to get it?

Super Guru

@Josh Nicholson,

You can grep for the location. I can't think of another solution right now.
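Matching on the hdfs:// prefix is resilient to beeline's table formatting. A sketch, using a stand-in file for captured describe output (the sample row and path are made up):

```shell
# sample_output.txt stands in for beeline's DESCRIBE SCHEMA output;
# the exact column layout varies with the output format, so matching
# on the hdfs:// prefix is the robust part.
cat > sample_output.txt <<'EOF'
| db_sales |  | hdfs://nn1:8020/apps/hive/warehouse/db_sales.db | hive | USER |
EOF

# -o prints only the matched path, stopping at whitespace or the
# table border character.
grep -o 'hdfs://[^ |]*' sample_output.txt
```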


Hey Josh,

You could do:

SHOW CREATE TABLE mytable;

and then look for the keyword LOCATION in the output. When I run that in my SQL client, the HDFS path is on the next line. You can also look for a line that starts with 'hdfs://'.

If you want to use this with the approach @Aditya Sirna provided, you could have a file with multiple statements like:

SHOW CREATE TABLE mytable;
SHOW CREATE TABLE mytable1;
SHOW CREATE TABLE mytable2;

and then filter for lines that start with hdfs. I haven't found a way to get just the LOCATION of a table.
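The filtering step above can be sketched like this, with a stand-in file for captured SHOW CREATE TABLE output (the table definition and path are made up):

```shell
# show_create.txt mimics the shape of SHOW CREATE TABLE output,
# where the quoted path follows the LOCATION keyword.
cat > show_create.txt <<'EOF'
CREATE TABLE `mytable`(
  `id` int)
LOCATION
  'hdfs://nn1:8020/apps/hive/warehouse/mytable'
TBLPROPERTIES (
  'transient_lastDdlTime'='1510000000')
EOF

# Print only the path: match from hdfs:// up to the closing quote.
grep -o "hdfs://[^']*" show_create.txt
```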

Hope that helps.

Thanks!

Regards