Member since: 02-02-2016
Posts: 583
Kudos Received: 518
Solutions: 98
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3180 | 09-16-2016 11:56 AM |
| | 1355 | 09-13-2016 08:47 PM |
| | 5350 | 09-06-2016 11:00 AM |
| | 3107 | 08-05-2016 11:51 AM |
| | 5178 | 08-03-2016 02:58 PM |
07-28-2016
11:31 AM
1 Kudo
This seems to be a known issue; please set the parameter below in the Hive shell before running the select query: set hive.optimize.ppd = false; https://issues.apache.org/jira/browse/HIVE-11401
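A minimal HiveQL sketch of the workaround above; the table name and WHERE clause are placeholders, not from the thread:

```sql
-- disable predicate pushdown for this session only, then re-run the failing query
-- (my_parquet_table and the predicate are hypothetical examples)
set hive.optimize.ppd=false;
select * from my_parquet_table where id = 1;
```

Note this only affects the current session; HIVE-11401 describes the underlying bug.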
07-28-2016
11:11 AM
Run mkdir /usr/hdp/current/hive-server2/auxlib/ on the HS2 node, copy the required jars into the /usr/hdp/current/hive-server2/auxlib/ location, add the property below to the custom hive-server2 configuration, restart all affected services, and see if that helps. <property>
<name>hive.aux.jars.path</name>
<value>/usr/hdp/current/hive-server2/auxlib/</value>
</property>
07-28-2016
10:19 AM
1 Kudo
Can you please post the output of the command below? parquet-tools schema <parquet file path>
If parquet-tools is not set up, please follow the link below. https://github.com/Parquet/parquet-mr/tree/master/parquet-tools
07-28-2016
10:11 AM
OK, then you can also put the required jars in the directory referenced by hive.aux.jars.path through the Ambari Hive configuration; the jars will get picked up automatically.
07-28-2016
10:05 AM
2 Kudos
Hi @Hocine Bouzelat, can you please check whether you defined the same column name more than once while creating the parquet file schema through Spark?
07-28-2016
10:01 AM
1 Kudo
Hi, can you please try to add those jars separately? add jar /tmp/udfs/esri-geometry-api.jar; add jar /tmp/udfs/spatial-sdk-hadoop.jar;
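Once the jars are added, a temporary function can be registered from them. A hedged sketch: the ST_Point class name is assumed from the ESRI spatial framework for Hadoop and should be verified against the actual jar contents:

```sql
-- add each jar to the session separately, as suggested above
add jar /tmp/udfs/esri-geometry-api.jar;
add jar /tmp/udfs/spatial-sdk-hadoop.jar;
-- class name is an assumption; confirm it against your jar before use
create temporary function ST_Point as 'com.esri.hadoop.hive.ST_Point';
```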
07-25-2016
02:08 PM
1 Kudo
Yes, HDP 2.4 has Hive 1.2.1. Is there any issue with that? https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_HDP_RelNotes/content/ch_relnotes_v240.html
07-25-2016
11:45 AM
Hi @Rajib Mandal Thanks for confirming, please accept the answer to close this thread.
07-23-2016
01:11 PM
1 Kudo
@Ravikumar Kumashi
If you don't want to lose the source data copy while loading, the best way would be to create an external table over the existing HDFS directory. Alternatively, you can make a copy of your source directory and create an external Hive table pointing to the new location. hadoop fs -cp /path/old/hivetable /path/new/hivetable
create external table table_name ( id int, myfields string )
location '/path/new/hivetable';
07-22-2016
08:21 PM
Oh great, maybe my updated answer didn't reflect in time 🙂 Anyway, I have accepted your answer. Thanks for updating.