Member since: 06-18-2015
Posts: 55
Kudos Received: 34
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1349 | 03-04-2016 02:39 AM
 | 1905 | 12-29-2015 09:42 AM
05-29-2017
03:55 AM
If you are scanning from the HBase shell, you can try: `scan '<table>', {CACHE => 1000}`. The CACHE attribute tells the HBase RegionServer to buffer that many rows before returning, which saves a lot of RPC calls.
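A toy sketch (not HBase code) of why row caching helps: with a cache of B rows, scanning N rows takes ceil(N/B) round trips instead of N. The `rpc_calls` helper below is hypothetical, purely to illustrate the arithmetic.

```python
import math

def rpc_calls(total_rows, cache_size=1):
    # Each round trip returns up to cache_size rows;
    # count how many trips are needed to drain the scan.
    return math.ceil(total_rows / cache_size)

# One row per round trip vs. CACHE => 1000:
print(rpc_calls(100_000))        # 100000 round trips
print(rpc_calls(100_000, 1000))  # 100 round trips
```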
03-17-2016
08:52 AM
As mentioned in my comment, I already tried that, but it didn't work.
03-04-2016
02:39 AM
2 Kudos
Resolved it: one of the required JAR files, hbase-hadoop-compat.jar, was missing.
02-19-2016
02:40 AM
1 Kudo
@asinghal Why doesn't it throw an error when I run the same command on the HDP 2.3.2 sandbox? There it works fine, and I don't see the Jackson dependency conflict error.
03-24-2017
01:58 AM
Hope this helps anyone: add or modify the properties below.

Ambari > Spark > Configs > Custom spark-defaults:
spark.eventLog.enabled = true
spark.history.fs.logDirectory = hdfs://<fs.defaultFS>/<directory-to-your-spark-logs>

Ambari > Spark > Configs > Advanced spark-defaults:
spark.history.provider = org.apache.spark.deploy.history.FsHistoryProvider
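For reference, the same settings expressed as a spark-defaults.conf fragment. The HDFS path is a placeholder, and spark.eventLog.dir is an assumption on my part: it is the standard companion property that tells running applications where to write the event logs that the history server then reads from spark.history.fs.logDirectory.

```
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://<fs.defaultFS>/<directory-to-your-spark-logs>
spark.history.fs.logDirectory    hdfs://<fs.defaultFS>/<directory-to-your-spark-logs>
spark.history.provider           org.apache.spark.deploy.history.FsHistoryProvider
```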
02-15-2016
02:37 AM
1 Kudo
Thanks, Ali, for the help.
02-18-2016
08:40 AM
1 Kudo
@Artem Ervits: I had to configure an SSH tunnel, as my cluster was running on EC2. Thanks a lot.
12-29-2015
09:42 AM
Finally resolved the issue: the input data was not in the correct format, so when I used Timestamp/DateType it returned an empty result set.
02-02-2016
03:47 PM
@Divya Gehlot has this been resolved? Can you post your solution or accept best answer?
12-22-2015
01:55 PM
@Divya Gehlot See @Andrew Grande point.