I don't believe that Sqoop supports importing from Hive or exporting to Hive directly; it is intended as a bridge between an RDBMS and Hadoop. However, you should be able to do what you want.
From within Hive, run the following command:
INSERT OVERWRITE LOCAL DIRECTORY '/home/carter/staging'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM hugetable;
This command writes the result of the SELECT to the local filesystem. Note that '/home/carter/staging' is a directory, not a single file: Hive may produce several comma-delimited files in it, typically one per reducer.
If you want to do it from outside Hive, say from the Unix command line, you could try this:
hive -e 'select * from your_Table' | sed 's/[\t]/,/g' > /home/yourfile.csv
The first command runs the query in Hive and pipes the output to sed, which converts the tab-delimited lines to comma-delimited ones and saves the result as a CSV file. Push this file to HDFS and you can then expose it to the other Hive database via an external table.
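As a sketch of that last step (table name, columns, and HDFS path are illustrative, not from the original): after pushing the file to HDFS with something like `hadoop fs -put /home/yourfile.csv /user/carter/staging/`, the external table definition might look like:

```sql
-- Hypothetical example: adjust the columns, types, and HDFS path to match your data.
CREATE EXTERNAL TABLE hugetable_copy (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/carter/staging';  -- HDFS directory containing the CSV file(s)
```

Dropping an external table later removes only the metadata, so the CSV files in HDFS stay in place.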
hive -e 'set hive.cli.print.header=true; select * from your_Table' | sed 's/[\t]/,/g' > /home/yourfile.csv
The second command is the same, but also tells Hive to print the column headers, so the first line of the CSV will be a header row.
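To see what the sed step does in isolation, here is a standalone sketch with made-up data. Two caveats: `\t` inside a bracket expression is GNU sed syntax (BSD sed treats it literally), and the substitution will also hit any tabs embedded inside field values.

```shell
# Fake two rows of tab-separated Hive output, then apply the same
# substitution the pipelines above use to swap tabs for commas.
printf '1\talice\tNY\n2\tbob\tLA\n' | sed 's/[\t]/,/g'
# prints:
# 1,alice,NY
# 2,bob,LA
```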