Member since: 05-15-2019
Posts: 177
Kudos Received: 4
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 442 | 10-22-2024 04:07 AM
 | 796 | 10-16-2024 12:56 PM
 | 694 | 06-08-2022 10:49 AM
10-22-2024
04:07 AM
@jayes Unfortunately there is no compression setting for the Hive Export. This feature was introduced when the Hive CLI was used, back in the Hortonworks HDP days. You will need to create your tables with compression enabled, and in your case you will need to do one of the following: either alter the table to add compression to its table properties and then run an INSERT OVERWRITE to rewrite (and compress) the data, or create a new table with compression in its table properties and insert the data from the old table into the new one. I would recommend using Snappy compression.
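As a sketch of both approaches, assuming an ORC table with the hypothetical name `my_table` (`'orc.compress'` is a standard ORC table property; use the property matching your table's storage format):

```sql
-- Option 1: add compression to the existing table's properties,
-- then rewrite the data so it is stored compressed.
-- (my_table is a hypothetical table name.)
ALTER TABLE my_table SET TBLPROPERTIES ('orc.compress' = 'SNAPPY');
INSERT OVERWRITE TABLE my_table SELECT * FROM my_table;

-- Option 2: create a new compressed table and copy the data over.
CREATE TABLE my_table_compressed
  STORED AS ORC
  TBLPROPERTIES ('orc.compress' = 'SNAPPY')
AS SELECT * FROM my_table;
```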
10-18-2024
05:47 AM
Please see the supported compression formats at the link below. https://docs.cloudera.com/cdp-private-cloud-base/7.1.9/managing-clusters/topics/cm-choosing-configuring-data-compression.html When exporting in Hive, the data will be compressed.
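As a minimal sketch, output compression can also be enabled at the session level with standard Hive/Hadoop settings (verify the codec class available on your cluster before relying on it):

```sql
-- Enable compressed output for queries in this session and pick a codec.
SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
```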
10-16-2024
01:44 PM
1 Kudo
@Arathi Can you please open a case on the Cloudera Support Portal? Please attach the application log, the HiveServer2 logs from the time period when the job failed, and the Beeline console output from the failed query.
10-16-2024
01:37 PM
@jayes Unfortunately Hive Import/Export is only supported for HDFS. The only method I know of to get the table and data into S3 is the following. First, create a table mapped onto an S3 bucket and directory:

```sql
CREATE TABLE tests3 (
  id BIGINT,
  time STRING,
  log STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
LOCATION 's3n://bucket/directory/';
```

Then insert the data into the S3 table. When the insert completes, the directory will contain a CSV file:

```sql
INSERT OVERWRITE TABLE tests3
SELECT id, time, log FROM testcsvimport;
```
10-16-2024
01:25 PM
@allen_chu This looks like a YARN resource issue. I would recommend opening a case in the Cloudera Support Portal under the YARN component to get further assistance with this.
10-16-2024
12:56 PM
1 Kudo
Hello @Patriciabqc, It seems this was fixed in CDP 7.1.8 according to the TSB. Please see the Knowledge Base link below. https://my.cloudera.com/knowledge/TSB-2022-600-Renaming-translated-external-partition-table?id=353902
10-03-2024
06:51 AM
In HDP 3 the default for Hive is the 1.2 jars. To take advantage of the newer jars you will need to use Hive LLAP, which uses Hive 3. You may also need to use the Hive Warehouse Connector.
08-22-2024
04:49 AM
1 Kudo
Hello murtaza74, At this time there is no timeline for adding Hive 4 to CDP Private Cloud. We have, however, incorporated many fixes and features that are part of Hive 4. Is there a particular feature you are looking for?
06-08-2022
10:49 AM
1 Kudo
Hello Liwei, Unfortunately the Tez UI has been deprecated in HDP 3.x Hive 3. It has been replaced with DAS (Data Analytics Studio). To use DAS in HDP you will need to reach out to your Account Team at Cloudera. If you are using CDP, DAS should already be included and can be added as a service.