Member since: 11-14-2015
Posts: 268
Kudos Received: 122
Solutions: 29
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2661 | 08-07-2017 08:39 AM
 | 4281 | 07-26-2017 06:06 AM
 | 9837 | 12-30-2016 08:29 AM
 | 7743 | 11-28-2016 08:08 AM
 | 7571 | 11-21-2016 02:16 PM
05-06-2022
02:42 AM
@arunpoy If you are using CDH/CDP, both timeout parameters (hbase.rpc.timeout and hbase.client.scanner.timeout.period) need to be added on both the server side and the client side, in the following places in the HBase configuration:
HBase Service Advanced Configuration Snippet (Safety Valve) for hbase-site.xml
HBase Client Advanced Configuration Snippet (Safety Valve) for hbase-site.xml
The RPC timeout (hbase.rpc.timeout) needs to be set a bit higher than the client scanner timeout (hbase.client.scanner.timeout.period).
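A sketch of what the safety-valve entries could look like; the millisecond values below are illustrative, not mandated:

```xml
<!-- Illustrative values only: the RPC timeout is kept higher than the scanner timeout -->
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>60000</value>
</property>
<property>
  <name>hbase.rpc.timeout</name>
  <value>120000</value>
</property>
```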
07-01-2020
01:23 PM
The Phoenix-Hive storage handler as of v4.14.0 (CDH 5.12) seems buggy. I was able to get the Hive external wrapper table working for simple queries, after tweaking the column mapping around upper/lower-case gotchas. However, it fails when I try the "INSERT OVERWRITE DIRECTORY ... SELECT ..." command to export to a file:
org.apache.phoenix.schema.ColumnNotFoundException: ERROR 504 (42703): Undefined column. columnName=<table name>
This is a known problem that no one is apparently looking at: https://issues.apache.org/jira/browse/PHOENIX-4804
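For reference, the shape of the export that triggers the error is roughly the following; the table and directory names here are hypothetical:

```sql
-- Hypothetical names; exporting the Phoenix-backed Hive wrapper table to a directory
INSERT OVERWRITE DIRECTORY '/tmp/phoenix_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM phoenix_wrapper_table;
```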
11-25-2019
07:20 PM
Hi Vijay, did you solve this issue? I am getting the same exception. Kindly share the solution.
10-22-2019
03:33 AM
Hey @axk, thanks for letting us know. I'm glad it was helpful 🙂
09-24-2019
10:55 PM
Hi @linuslukia or anyone else... can you PLEASE tell us what the solution to this problem is? I tried the latest advice, deleting the WALs and removing /hbase-unsecure/rs, then restarted ZooKeeper and HBase, and it didn't work.
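For anyone reading along, what I tried corresponds roughly to the steps below; the HDP-style paths and ZooKeeper address are assumptions and should be adjusted to your cluster:

```bash
# Assumed HDP layout; move the WAL directories aside rather than deleting them outright
hdfs dfs -mv /apps/hbase/data/WALs /apps/hbase/data/WALs.bak
# Remove the stale region-server znodes (unsecure znode parent assumed)
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server localhost:2181 rmr /hbase-unsecure/rs
# Then restart ZooKeeper and HBase from Ambari
```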
09-02-2019
08:33 PM
Can you please elaborate in detail, including the commands you used to resolve this?
06-21-2018
04:53 AM
1 Kudo
I recommend you take a look at building an uber jar. This will allow you to solve classpath issues where dependencies are not found.
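If you are building with Maven, one way to produce an uber jar is the Shade plugin; a minimal sketch (the plugin version is illustrative, adjust to your build):

```xml
<!-- Bundles the project and its dependencies into a single jar during the package phase -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```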
08-08-2017
12:52 AM
That's an incorrect approach. You don't need to add the XML files to the jars. As I already mentioned, you need to add the directories where those files are located, not the files themselves. That's how the Java classpath works: it accepts only jars and directories. So if you need a resource on the Java classpath, you either have it inside a jar file (like you did) or put its parent directory on the classpath. In SQuirreL this can be done on the Extra classpath tab of the Driver configuration.
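To illustrate the directory-vs-file point outside of SQuirreL, a classpath with the configuration directory on it could look like this; the paths and main class are hypothetical:

```bash
# /etc/hbase/conf is a directory, so hbase-site.xml inside it becomes visible as a classpath resource
java -cp "/etc/hbase/conf:/path/to/phoenix-client.jar:/path/to/app.jar" com.example.Main
```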
08-02-2017
05:37 AM
It depends on what the 50K users are doing. (If your cluster capacity and configuration are right, you can scale horizontally without any problem.) If it is just point lookups (key-value access), then depending on the disks (SSD/HDD) you are using, you should be able to scale without any problem; only some basic configuration tweaks are required, such as increasing the number of handlers for the DataNode and RegionServer, and tuning the block cache / bucket cache. If you are doing heavy scans, then you may need a large cluster that can bear this load; network, CPU and disk will play an important role.
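For a point-lookup-heavy workload, the kinds of tweaks mentioned above could look roughly like the following hbase-site.xml entries; the values are examples, not recommendations, and the DataNode handler count (dfs.datanode.handler.count) is set separately in hdfs-site.xml:

```xml
<!-- Example values only; tune against your own workload and hardware -->
<property>
  <name>hbase.regionserver.handler.count</name>
  <value>100</value>
</property>
<property>
  <name>hfile.block.cache.size</name>
  <value>0.4</value>
</property>
<property>
  <name>hbase.bucketcache.ioengine</name>
  <value>offheap</value>
</property>
<property>
  <name>hbase.bucketcache.size</name>
  <value>8192</value>
</property>
```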
09-13-2017
06:12 PM
@Roni I was facing the same kind of issue. I resolved it with the following steps:
1) In Ambari -> Hive -> Configs -> Advanced -> Custom hive-site -> Add Property..., add the following properties based on your HBase configuration (you can look up the values under Ambari -> HBase -> Configs):
custom hive-site.xml
hbase.zookeeper.quorum=xyz (take this value from HBase)
zookeeper.znode.parent=/hbase-unsecure (take this value from HBase)
phoenix.schema.mapSystemTablesToNamespace=true
phoenix.schema.isNamespaceMappingEnabled=true
2) Copy the following jars to /usr/hdp/current/hive-server2/auxlib:
/usr/hdp/2.5.6.0-40/phoenix/phoenix-4.7.0.2.5.6.0-40-hive.jar
/usr/hdp/2.5.6.0-40/phoenix/phoenix-hive-4.7.0.2.5.6.0-40-sources.jar
If those jars do not work for you, try phoenix-hive-4.7.0.2.5.3.0-37.jar instead and copy it to /usr/hdp/current/hive-server2/auxlib.
3) Add the following property to custom hive-env:
HIVE_AUX_JARS_PATH=/usr/hdp/current/hive-server2/auxlib/
4) Add the following properties to custom hbase-site.xml:
phoenix.schema.mapSystemTablesToNamespace=true
phoenix.schema.isNamespaceMappingEnabled=true
5) Also run the following commands:
jar uf /usr/hdp/current/hive-server2/auxlib/phoenix-4.7.0.2.5.6.0-40-client.jar /etc/hive/conf/hive-site.xml
jar uf /usr/hdp/current/hive-server2/auxlib/phoenix-4.7.0.2.5.6.0-40-client.jar /etc/hbase/conf/hbase-site.xml
And I hope my solution will work for you 🙂
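Once the configuration above is in place, a Hive external table backed by Phoenix can be declared roughly like this; the table name, columns, and connection values are hypothetical and need to match your own Phoenix schema:

```sql
-- Hypothetical schema; maps a Hive external table onto an existing Phoenix table named ORDERS
CREATE EXTERNAL TABLE phoenix_orders (
  id INT,
  name STRING
)
STORED BY 'org.apache.phoenix.hive.PhoenixStorageHandler'
TBLPROPERTIES (
  "phoenix.table.name" = "ORDERS",
  "phoenix.zookeeper.quorum" = "xyz",
  "phoenix.zookeeper.znode.parent" = "/hbase-unsecure",
  "phoenix.rowkeys" = "id"
);
```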