Member since: 10-01-2015
Posts: 3933
Kudos Received: 1150
Solutions: 374

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3365 | 05-03-2017 05:13 PM |
| | 2796 | 05-02-2017 08:38 AM |
| | 3076 | 05-02-2017 08:13 AM |
| | 3006 | 04-10-2017 10:51 PM |
| | 1517 | 03-28-2017 02:27 AM |
01-08-2016
07:38 PM
@Yuri Chemolosov there's no way to upgrade individual components aside from Spark.
01-08-2016
07:37 PM
Here's the blueprint reference API. @avoma
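For a quick start with that API, here's a rough Java sketch of exporting an existing cluster's blueprint over Ambari's REST interface (the host, port, cluster name, and admin:admin credentials below are placeholders/Ambari defaults, so adjust for your setup):

```java
// Hedged sketch: export a running cluster's blueprint from the Ambari REST API.
// Host, port, cluster name, and credentials are placeholders for illustration.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class BlueprintExport {
  public static void main(String[] args) throws Exception {
    // ?format=blueprint asks Ambari to render the cluster as a blueprint document.
    URL url = new URL("http://ambari-host:8080/api/v1/clusters/mycluster?format=blueprint");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();

    String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes("UTF-8"));
    conn.setRequestProperty("Authorization", "Basic " + auth);
    conn.setRequestProperty("X-Requested-By", "ambari"); // required by Ambari for modifying calls, harmless on a GET

    try (BufferedReader in =
             new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
      String line;
      while ((line = in.readLine()) != null) {
        System.out.println(line); // the blueprint JSON
      }
    }
  }
}
```

The exported JSON can then be registered under /api/v1/blueprints/&lt;name&gt; and reused to stand up new clusters.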
01-08-2016
07:34 PM
+1, you will benefit from a lot of improvements, @Gerd Koenig
01-08-2016
06:57 PM
1 Kudo
I'm afraid the only way to get the latest version and not lose support is to upgrade the whole stack @Gerd Koenig.
01-08-2016
06:40 PM
@Enis @vshukla @Devaraj Das @Ram Sriharsha
01-08-2016
06:38 PM
There's work in progress on the Hortonworks side to make Spark and HBase work efficiently together; we're not publishing anything until we can support it.
... View more
01-08-2016
06:34 PM
@Cui Lin I updated my response above with links to MapReduce examples. You will need to set up a scanner based on your criteria and then run MapReduce to write the data out to files for Pig. Here's an example of reading data from an HBase table (sketched below); then in Pig you just call "STORE data INTO 'location'" using the storage of your choice.
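A minimal map-only sketch of that flow, with placeholder names (the table "my_table", column family "cf", qualifier "col1", and output path below are illustrative only): it scans the HBase table and writes tab-separated text files that Pig can then LOAD and STORE wherever you need.

```java
// Map-only job: scan an HBase table and dump rows as tab-separated text for Pig.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class HBaseToTsv {

  // Receives one HBase row (Result) per call and emits "rowkey<TAB>value".
  static class ExportMapper extends TableMapper<Text, NullWritable> {
    @Override
    protected void map(ImmutableBytesWritable key, Result value, Context context)
        throws IOException, InterruptedException {
      byte[] cell = value.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1"));
      if (cell == null) {
        return; // skip rows that don't have the column we're exporting
      }
      String rowKey = Bytes.toString(value.getRow());
      context.write(new Text(rowKey + "\t" + Bytes.toString(cell)), NullWritable.get());
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "hbase-to-tsv");
    job.setJarByClass(HBaseToTsv.class);

    // The scanner with your criteria: restrict families/columns, add row ranges or filters here.
    Scan scan = new Scan();
    scan.addFamily(Bytes.toBytes("cf"));
    scan.setCaching(500);        // fetch bigger batches per RPC
    scan.setCacheBlocks(false);  // don't pollute the region server block cache

    TableMapReduceUtil.initTableMapperJob(
        "my_table", scan, ExportMapper.class, Text.class, NullWritable.class, job);

    job.setNumReduceTasks(0);    // map-only: just write the rows out
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(NullWritable.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    FileOutputFormat.setOutputPath(job, new Path("/tmp/hbase_export"));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

On the Pig side you'd then LOAD the output directory with PigStorage('\t') and STORE it using the storage of your choice.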
01-08-2016
06:25 PM
You can write a MapReduce program to dump the data to files, you can use Pig, or you can use Python with happybase; you have a lot of different options, @Cui Lin.
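If the table is small and you don't want a full MapReduce job, a plain HBase Java client scan is another option (not one of the three above, just the simplest single-process variant); the table, column family, qualifier, and output file names below are placeholders.

```java
// Single-process sketch: scan an HBase table with the plain Java client and
// dump rows to a local TSV file. Names are placeholders for illustration.
import java.io.PrintWriter;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseDump {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create(); // picks up hbase-site.xml from the classpath
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("my_table"));
         PrintWriter out = new PrintWriter("my_table_dump.tsv", "UTF-8")) {

      Scan scan = new Scan();
      scan.addFamily(Bytes.toBytes("cf")); // limit the scan to what you actually need

      try (ResultScanner scanner = table.getScanner(scan)) {
        for (Result result : scanner) {
          byte[] cell = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1"));
          if (cell == null) {
            continue; // skip rows without the column
          }
          out.println(Bytes.toString(result.getRow()) + "\t" + Bytes.toString(cell));
        }
      }
    }
  }
}
```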
01-08-2016
03:50 AM
@Hani Al-Shater try this and let us know
01-08-2016
03:36 AM
This is a known issue in the HDP 2.3.2 docs, and there is no mention of a fix being available in the 2.3.4 docs. Looks like you need to wait a bit more for a fix.