Member since: 12-16-2015
Posts: 23
Kudos Received: 6
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6244 | 09-15-2016 08:19 AM
05-06-2020
03:24 AM
This could be a permission issue. Check the HiveServer2 log for the exact error; the log is under /var/log/hive on the node you connect to with your Hive client.
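A minimal Java sketch of scanning that log for permission-related lines (the filename hiveserver2.log under /var/log/hive is an assumption; adjust to your install):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class HiveLogScan {
    // Return only the log lines that mention permission problems.
    static List<String> permissionErrors(List<String> logLines) {
        return logLines.stream()
                .filter(l -> l.toLowerCase().contains("permission"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) throws IOException {
        // Assumed log location, per the post above; may differ per install.
        Path log = Paths.get("/var/log/hive/hiveserver2.log");
        if (Files.exists(log)) {
            permissionErrors(Files.readAllLines(log)).forEach(System.out::println);
        } else {
            System.out.println("log not found: " + log);
        }
    }
}
```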
04-03-2017
03:49 AM
mapred.child.java.opts seems to be deprecated. Below are the values from the cluster and the ones used in the driver code.

In code:
config.set("mapreduce.map.java.opts", "-Xmx8192m");
config.set("mapreduce.reduce.java.opts", "-Xmx8192m");

In cluster:
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx26214m</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx13107m</value>
</property>
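To illustrate the precedence at play, a minimal sketch using a plain map rather than the actual Hadoop Configuration class: values set in the driver code via config.set(...) override the cluster defaults from mapred-site.xml.

```java
import java.util.HashMap;
import java.util.Map;

public class OptsPrecedence {
    // Apply driver-code overrides on top of cluster defaults (last write wins),
    // mirroring how config.set(...) supersedes mapred-site.xml values.
    static Map<String, String> effectiveConf() {
        Map<String, String> conf = new HashMap<>();
        // Cluster defaults, as shown in the post above
        conf.put("mapreduce.map.java.opts", "-Xmx13107m");
        conf.put("mapreduce.reduce.java.opts", "-Xmx26214m");
        // Driver-code settings take precedence
        conf.put("mapreduce.map.java.opts", "-Xmx8192m");
        conf.put("mapreduce.reduce.java.opts", "-Xmx8192m");
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = effectiveConf();
        System.out.println(conf.get("mapreduce.map.java.opts"));
        System.out.println(conf.get("mapreduce.reduce.java.opts"));
    }
}
```

So the job actually runs with the 8 GB heaps from the driver code, not the larger cluster defaults.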
09-15-2016
01:49 PM
Yes, I agree, but I am trying to run a Hive query inside the map method. When using HiveContext there, I get an error that the class is not serializable.
07-21-2016
09:51 AM
I'm reading from HBase in Spark on HDP 2.3.4.0-3485 (and I've done the same with earlier HDP releases). Reading HBase through an RDD:

Configuration hbaseConfiguration = HBaseConfiguration.create();
hbaseConfiguration.set(TableInputFormat.INPUT_TABLE, "sometable");
JavaPairRDD<ImmutableBytesWritable, Result> hbaseRdd = sc.newAPIHadoopRDD(hbaseConfiguration, TableInputFormat.class, ImmutableBytesWritable.class, Result.class);
Or am I missing something?
06-22-2016
03:10 AM
Thank you.
02-11-2016
11:05 AM
@pooja khandelwal I have tested this and it works. Accepting this as the best answer.
12-17-2015
05:27 AM
Thank you @Alex Miller. I imported the Knox SSL certificate into cacerts and used the connection call below:

DriverManager.getConnection(
    "jdbc:hive2://knoxserver.net:443/;ssl=true;transportMode=http;httpPath=knox/nn01/hive",
    "username", "pwd");

It finally worked. 🙂
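A small self-contained sketch of assembling that HiveServer2-over-Knox JDBC URL from its parts (the host, port, and httpPath values are taken from the post; substitute your own):

```java
public class KnoxHiveUrl {
    // Build a HiveServer2 JDBC URL routed through Knox over HTTPS:
    // SSL on, HTTP transport mode, and the Knox topology's httpPath.
    static String buildUrl(String host, int port, String httpPath) {
        return "jdbc:hive2://" + host + ":" + port
                + "/;ssl=true;transportMode=http;httpPath=" + httpPath;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("knoxserver.net", 443, "knox/nn01/hive"));
    }
}
```

The resulting string is what gets passed as the first argument to DriverManager.getConnection.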