Member since: 09-25-2015
Posts: 356
Kudos Received: 382
Solutions: 62
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2443 | 11-03-2017 09:16 PM |
 | 1921 | 10-17-2017 09:48 PM |
 | 3836 | 09-18-2017 08:33 PM |
 | 4512 | 08-04-2017 04:14 PM |
 | 3464 | 05-19-2017 06:53 AM |
03-21-2017
05:19 PM
How did you install Hive (through Ambari or manually)? Can you check /var/log/hive/hivemetastore.log?
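If it helps, here is a rough way to scan that log for recent errors (a sketch only; it assumes the default log location /var/log/hive and may need adjusting for your setup):
# Scan the tail of the metastore log for errors and exceptions.
tail -n 500 /var/log/hive/hivemetastore.log | grep -iE "error|exception"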
03-17-2017
02:21 AM
For me the command worked:
hive> show partitions tablename;
OK
load_s=2017-02-22 12%3A21%3A39
hive> alter table tablename drop partition (load_s=timestamp '2017-02-22 12:21:39');
Dropped the partition load_s=2017-02-22 12%3A21%3A39
OK
You can potentially use this workaround on the other thread as well.
03-17-2017
12:20 AM
1 Kudo
Yes, the extension on the filename should not matter as long as you are using the correct file path.
03-16-2017
05:26 PM
Thanks, it's useful to know that it's finally been implemented.
03-16-2017
05:19 PM
1 Kudo
There is no "show views" command in Hive. You can use "show tables", which lists views as well. However, there is an open Apache Hive issue, HIVE-1010, to implement INFORMATION_SCHEMA in Hive, which will make it possible to query views exclusively.
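In the meantime, a rough way to tell the two apart (a sketch; "my_view" is a placeholder name and the hive CLI is assumed to be available):
# SHOW TABLES lists views together with regular tables;
# DESCRIBE FORMATTED reports "Table Type: VIRTUAL_VIEW" for a view.
hive -e "SHOW TABLES;"
hive -e "DESCRIBE FORMATTED my_view;"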
03-16-2017
05:04 PM
3 Kudos
You could accomplish this by temporarily changing the partitioning column type to string, see below:
-- Change the column type to string
alter table crhs_fmtrade_break partition column (reporting_date string);
-- Drop the offending partitions
alter table crhs_fmtrade_break drop partition (reporting_date='$%7Bhiveconf%3Areporting_date}');
...
-- Change the column type back to date
alter table crhs_fmtrade_break partition column (reporting_date date);
03-15-2017
11:52 PM
If you ran this using beeline connecting to HS2, can you check the full error stack trace in the hiveserver2.log?
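For reference, one way to pull the relevant stack trace out of the log (a rough sketch assuming the default location /var/log/hive/hiveserver2.log; adjust the path and search term as needed):
# Print recent exceptions from the HiveServer2 log with some trailing context.
grep -A 20 "Exception" /var/log/hive/hiveserver2.log | tail -n 80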
03-15-2017
09:08 PM
1 Kudo
Can you try the following:
ALTER TABLE tablename DROP IF EXISTS PARTITION (load_s=timestamp '2017-02-22 12:21:39');
03-15-2017
09:02 PM
2 Kudos
Hive does not support LIKE semantics for DROP TABLE; see the official DDL documentation. You will need multiple DROP TABLE calls to accomplish this and handle them in an external driver (see the sketch below).
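A minimal sketch of that external-driver approach, assuming the tables to drop share a name pattern such as "tmp_*" (a placeholder) and the hive CLI is on the PATH:
# Loop over SHOW TABLES output and drop each matching table.
# -S keeps the CLI output to just the table names; note this starts
# one JVM per statement, so it is slow for large numbers of tables.
for t in $(hive -S -e "SHOW TABLES 'tmp_*';"); do
  hive -S -e "DROP TABLE IF EXISTS ${t};"
done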
03-15-2017
06:34 AM
1 Kudo
Yes, it's possible to transfer data between Hadoop datastores (HDFS, Hive, HBase) and Oracle. Here is a sample command for exporting data from HDFS to Oracle:
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --table bar --username user --password passwd --export-dir /user/test/data
The above command assumes that the CSV data to be exported into Oracle is in the HDFS folder /user/test/data and that the table "bar" already exists in Oracle. It also assumes that the CSV columns and their order match those of the Oracle table.
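If the files in /user/test/data use a non-default field delimiter, it can be declared explicitly; a variation of the same command (the comma is only an example value):
# Same export, but telling Sqoop how the input files are delimited.
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --table bar --username user --password passwd --export-dir /user/test/data --input-fields-terminated-by ','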