Member since: 10-22-2015
Posts: 241
Kudos Received: 86
Solutions: 20

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2419 | 03-11-2018 12:01 AM
 | 1455 | 01-03-2017 10:48 PM
 | 1854 | 12-20-2016 11:11 PM
 | 3631 | 09-03-2016 01:57 AM
 | 1375 | 09-02-2016 04:55 PM
05-17-2020
08:41 PM
Hi @kettle
As this thread was marked 'Solved' in June of 2016, you would have a better chance of receiving a useful response by starting a new thread. This will also give you the opportunity to include details specific to your use of the PutSQL processor and/or Phoenix that could help others give a more tailored answer to your question.
09-24-2019
10:55 PM
Hi @linuslukia, or anyone else... can you PLEASE tell us what the solution to this problem is? I tried the last advice, deleting the WALs and running rm on /hbase-unsecure/rs, then restarted ZooKeeper and HBase, and it didn't work.
10-16-2017
06:49 PM
@Ivan Majnaric, there is no harm in running sqlline.py again; it is actually a client for querying Phoenix. It will create the SYSTEM tables if they have not already been created. You can check this answer by Josh at the link: https://community.hortonworks.com/questions/64005/phoenix-security-and-initial-system-table-creation.html If this works for you, please mark the answer as accepted so that it will be useful to the community. Thanks, Aditya
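For anyone connecting programmatically rather than via sqlline.py, the thick Phoenix JDBC driver behaves the same way: the first connection creates the SYSTEM tables if they are missing. A minimal sketch, assuming the phoenix-client jar is on the classpath; the ZooKeeper quorum and znode below are placeholders for your cluster:

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class PhoenixFirstConnect {
    public static void main(String[] args) throws Exception {
        // URL format: jdbc:phoenix:<zk quorum>:<zk port>:<hbase znode>
        // Hypothetical hosts; substitute your own ZooKeeper quorum.
        String url = "jdbc:phoenix:zk1,zk2,zk3:2181:/hbase-unsecure";
        // Opening the first connection triggers creation of the SYSTEM
        // tables (SYSTEM.CATALOG etc.) if they do not already exist.
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```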
09-06-2016
06:05 PM
For HDP 2.3 (Apache HBase 1.1.2), ./hbase-common/src/main/java/org/apache/hadoop/hbase/HBaseConfiguration.java calls HeapMemorySizeUtil.checkForClusterFreeMemoryLimit(conf); there is no HBaseConfiguration.checkForClusterFreeMemoryLimit. Can you double-check your classpath to see which HBase-related jars are present, and pastebin those jars? Thanks
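One quick way to check this is to print which jar a class was actually loaded from. A minimal sketch; the class to inspect can be swapped for any suspect class:

```java
import org.apache.hadoop.hbase.HBaseConfiguration;

public class WhichJar {
    public static void main(String[] args) {
        // Prints the jar (or directory) HBaseConfiguration was loaded from.
        // An old hbase-common jar showing up here would explain the
        // missing-method mismatch described above.
        System.out.println(HBaseConfiguration.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}
```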
09-01-2016
07:58 PM
It was DHCP failing to see a response from the DHCP server for periods of time. The d2 Ubuntu (14.04) instances were using Enhanced Networking with the "ixgbevf" driver version 2.11.3-k, which is below the minimum recommended version 2.14.2 and should be upgraded to 2.16.4. We upgraded the driver to the latest version, which seems to have fixed the issue. Reference: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/sriov-networking.html#enhanced-networking-ubuntu
06-14-2016
01:58 PM
The phoenix-sqlline command does not use PQS. You want to use /usr/hdp/current/phoenix-client/bin/sqlline-thin.py to interact with PQS.
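The same distinction applies from code: the thin (Avatica) JDBC driver goes through PQS over HTTP, while the thick driver talks to ZooKeeper/HBase directly. A minimal sketch, assuming the thin-client jar that ships with Phoenix is on the classpath and PQS is on its default port 8765; the hostname is a placeholder:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PqsThinClient {
    public static void main(String[] args) throws Exception {
        // The thin driver speaks HTTP to the Phoenix Query Server instead
        // of connecting to the cluster directly.
        String url = "jdbc:phoenix:thin:url=http://pqs-host:8765;serialization=PROTOBUF";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```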
06-13-2016
10:45 PM
Quite often, HBase users face the following exception when submitting jobs to a secure HBase cluster:

2016-05-27 21:16:06,806 WARN [hconnection-0x6be9ad6c-metaLookup-shared--pool2-t1] org.apache.hadoop.hbase.ipc.AbstractRpcClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2016-05-27 21:16:06,806 FATAL [hconnection-0x6be9ad6c-metaLookup-shared--pool2-t1] org.apache.hadoop.hbase.ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)

The above shows an error setting up the HBase connection. If it happens in tasks, then the task doesn't have a delegation token, or it isn't seeing one. TableMapReduceUtil#initCredentials() should be called before submitting the job.
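A minimal driver sketch of that fix; the class name, job name, and the elided mapper/table wiring are placeholders, and the one essential line is the initCredentials() call before submission:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class SecureHBaseJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "secure-hbase-job");
        job.setJarByClass(SecureHBaseJobDriver.class);
        // ... mapper/reducer/table setup omitted ...

        // Obtains an HBase delegation token for the submitting user and
        // attaches it to the job credentials. Without it, tasks have no
        // Kerberos TGT of their own and fail with the GSSException above.
        TableMapReduceUtil.initCredentials(job);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```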