Member since
10-09-2016
8
Posts
0
Kudos Received
0
Solutions
11-09-2020
08:56 AM
To create a table this way, there are two steps:

CREATE TABLE ...
LOAD DATA INPATH ...

The first statement creates the table schema within Hive, and the second tells Hive to move the data from the source HDFS directory into the Hive warehouse directory for the table, e.g.:

/user/joe/sales.csv => /user/hive/warehouse/sales/sales.csv

The move operation is performed as the 'hive' user, so for it to complete, the 'hive' user must have HDFS permissions to read the source file and write to the destination. Ensure that the 'hive' user has the correct permissions to move this file into the final location.

This reference is for Impala, but there is a lot of overlap with Hive: https://docs.cloudera.com/documentation/enterprise/6/latest/topics/impala_load_data.html

Also note that the latest CDH 6 release is 6.3.4, which has many improvements over 6.0: https://docs.cloudera.com/documentation/enterprise/6/release-notes/topics/rg_cdh_63_packaging.html
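The two steps can be sketched as follows; the table name, column list, and source path are hypothetical placeholders for illustration, not taken from any particular setup:

```sql
-- Hypothetical example: first, define the table schema in Hive.
CREATE TABLE sales (
  id        INT,
  amount    DOUBLE,
  sale_date STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Then move the file from its source HDFS directory into the
-- table's warehouse directory (the move runs as the 'hive' user):
LOAD DATA INPATH '/user/joe/sales.csv' INTO TABLE sales;
```

Adding the OVERWRITE keyword (LOAD DATA INPATH ... OVERWRITE INTO TABLE ...) replaces any existing data in the table directory instead of appending to it.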
10-23-2020
10:28 AM
1 Kudo
https://issues.apache.org/jira/browse/IMPALA-8454 is the Apache Impala JIRA tracking this issue.
10-22-2020
07:16 AM
1 Kudo
Here's a relevant community answer: https://community.cloudera.com/t5/Support-Questions/How-to-use-merge-in-sqoop-import/td-p/161847

Mike
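The linked thread covers merging incremental Sqoop imports. A minimal sketch of the usual approach looks like the following; the connection string, credentials path, table, and column names are hypothetical placeholders:

```shell
# Hypothetical example: incremental import in 'lastmodified' mode.
# --merge-key tells Sqoop to collapse updated rows onto existing
# ones by primary key instead of appending duplicates.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username sqoop_user \
  --password-file /user/joe/.sqoop-password \
  --table orders \
  --target-dir /user/joe/orders \
  --incremental lastmodified \
  --check-column last_updated \
  --merge-key order_id
```

With --incremental lastmodified plus --merge-key, Sqoop runs a merge job after the import so the target directory ends up with one row per key rather than old and new copies side by side.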
10-21-2020
01:00 AM
It started working after adding credentials to the Sqoop actions. Thanks for the help.
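For reference, a hedged sketch of what that change typically looks like in an Oozie workflow: a credential is declared once and then referenced from the sqoop action via the cred attribute. The credential name, type, property values, and Sqoop command below are hypothetical and depend on the cluster setup:

```xml
<!-- Hypothetical example: an Oozie workflow whose sqoop action
     references a declared credential via cred="hive2_cred". -->
<workflow-app name="sqoop-wf" xmlns="uri:oozie:workflow:0.5">
  <credentials>
    <credential name="hive2_cred" type="hive2">
      <property>
        <name>hive2.jdbc.url</name>
        <value>jdbc:hive2://hs2.example.com:10000/default</value>
      </property>
      <property>
        <name>hive2.server.principal</name>
        <value>hive/_HOST@EXAMPLE.COM</value>
      </property>
    </credential>
  </credentials>
  <start to="sqoop-node"/>
  <action name="sqoop-node" cred="hive2_cred">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect ${jdbcUrl} --table orders --hive-import</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Sqoop action failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

Without the cred attribute on the action, the Sqoop job runs without the delegation tokens it needs on a secured (Kerberized) cluster, which is a common cause of the failure described above.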