Member since 04-11-2016
535 Posts
148 Kudos Received
77 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7486 | 09-17-2018 06:33 AM |
| | 1825 | 08-29-2018 07:48 AM |
| | 2741 | 08-28-2018 12:38 PM |
| | 2117 | 08-03-2018 05:42 AM |
| | 1984 | 07-27-2018 04:00 PM |
07-05-2018
09:12 AM
@Rajendra Manjunath Can you share more details on where this error is observed?
07-03-2018
08:42 AM
2 Kudos
@Siddarth Wardhan If you are using Tez as the execution engine, then you need to set the properties below:

set hive.merge.tezfiles=true;
set hive.merge.smallfiles.avgsize=128000000;
set hive.merge.size.per.task=128000000;
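As a sketch of how these settings would be used, they are issued in the session before the write that would otherwise produce many small files. The table names in the INSERT below are hypothetical, not part of the original thread:

```sql
-- Enable merging of small output files when Tez is the execution engine.
SET hive.merge.tezfiles=true;
-- Trigger a merge when the average output file size is below this value (bytes, ~128 MB).
SET hive.merge.smallfiles.avgsize=128000000;
-- Target size of each merged file, in bytes.
SET hive.merge.size.per.task=128000000;

-- Hypothetical write whose outputs will be merged by the settings above.
INSERT OVERWRITE TABLE sales_merged
SELECT * FROM sales_staging;
```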
07-02-2018
07:29 AM
@heta desai When a Hive external table is created over Druid, the Hive table points to the existing Druid data source; that Druid data source can then be viewed and queried from Superset.
06-25-2018
06:06 AM
@Nagendra Sharma I verified internally in the code, and the addBatch method is not supported even in the latest version of HDP. Below is the snippet from the code:

public void addBatch() throws SQLException {
// TODO Auto-generated method stub
throw new SQLException("Method not supported");
}
04-06-2018
07:17 AM
@Alexander Schätzle There is no known workaround for YARN-6625. The issue is fixed in HDP 2.6.3 (fixed issue doc). If you are using a lower version, you could consider upgrading to the latest version that contains the fix.
04-04-2018
06:19 PM
@Adrien Mafety The issue might be related to PARQUET-377, where the Parquet file was created with one version of Parquet while Hive uses a different version.
04-04-2018
03:08 PM
@PJ Sqoop uses MapReduce as the execution engine to read data from Netezza, and loading the data into Hive is usually done with a "load data inpath" statement. To specify the queue for the Sqoop job, try passing -Dmapreduce.job.queuename=<queue_name>
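A hedged sketch of such an invocation follows. The connection string, credentials, table, and queue name are placeholders, not values from the original thread; note that generic -D options must come immediately after the tool name, before the tool-specific arguments:

```shell
# -D generic options must precede Sqoop-specific arguments.
sqoop import \
  -Dmapreduce.job.queuename=etl_queue \
  --connect jdbc:netezza://nz-host:5480/salesdb \
  --username nz_user -P \
  --table ORDERS \
  --hive-import --hive-table orders
```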
04-04-2018
06:29 AM
1 Kudo
@Abhinav Joshi HDF and HDP clusters do not currently support heterogeneous component versions. In addition, differing OS versions across nodes in the cluster are not supported either.
04-02-2018
08:47 AM
@avinash nishanth Sqoop with HCatalog does not work with bucketed Hive tables, as this is not yet supported (for either import or export). For export, you might have to load the data from the bucketed table into a non-bucketed table and then run a Sqoop export of the non-bucketed table. Reference KB Link.
03-21-2018
09:36 AM
@SUDHIR KUMAR To list tables, you need to use 'show tables;'. Also, FYI, the link is for Hive QL.