Member since: 08-05-2016
Posts: 52
Kudos Received: 1
Solutions: 1

My Accepted Solutions

Title | Views | Posted
--- | --- | ---
 | 3766 | 07-21-2017 12:22 PM
09-24-2019 02:11 AM
Hi Suresh,
There is no command for it, but you can easily find the information on the HBase Web UI:
http://host:16010/master-status#baseStats
Best, Helmi KHALIFA
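If you ever need the same numbers without a browser, the master also serves its metrics as JSON through the JMX servlet; a minimal sketch, assuming the default master info port 16010 and a hypothetical host name:
curl 'http://host:16010/jmx?qry=Hadoop:service=HBase,name=Master,sub=Server'   # same stats the UI shows, as JSON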
06-20-2019 09:12 AM
Hi,
Rename the file gateway.jks:
mv /var/lib/knox/data-2.6.4.0-91/security/keystores/gateway.jks /var/lib/knox/data-2.6.4.0-91/security/keystores/gateway.jks.bck
When you start the Knox instance, it will create a new certificate.
Best, Helmi KHALIFA
12-20-2018 01:36 PM
Hi Muji, great job 🙂 You are just missing a ',' after B_df("_c1").cast(StringType).as("S_STORE_ID"):

import org.apache.spark.sql.types.{IntegerType, StringType}

// Assign column names to the store dataframe
val storeDF = B_df.select(
  B_df("_c0").cast(IntegerType).as("S_STORE_SK"),
  B_df("_c1").cast(StringType).as("S_STORE_ID"),
  B_df("_c5").cast(StringType).as("S_STORE_NAME")
)
08-20-2018 09:20 PM
Hi Neeraj, allowing all users read and write access to the Phoenix SYSTEM tables is not really secure. Is there any solution to avoid it? Thanks, Helmi
07-16-2018 09:39 AM
Hi, I am executing an inner join between two tables of 10 million records each and I get an out-of-memory error. The join columns are both indexed. My settings:
phoenix.query.maxServerCacheBytes=2147483648
phoenix.query.maxGlobalMemoryPercentage=35
Error: Encountered exception in sub plan [0] execution. (state=,code=0)
java.sql.SQLException: Encountered exception in sub plan [0] execution.
at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:201)
at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:145)
at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:140)
at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:281)
at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:266)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:265)
at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1444)
at sqlline.Commands.execute(Commands.java:822)
at sqlline.Commands.sql(Commands.java:732)
at sqlline.SqlLine.dispatch(SqlLine.java:808)
at sqlline.SqlLine.begin(SqlLine.java:681)
at sqlline.SqlLine.start(SqlLine.java:398)
at sqlline.SqlLine.main(SqlLine.java:292)
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3236)
at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:135)
at java.io.DataOutputStream.writeInt(DataOutputStream.java:200)
at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:152)
at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:125)
at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:85)
at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:387)
at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:169)
at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:165)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
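The trace shows the OutOfMemoryError is thrown while Phoenix serializes one side of the hash join (HashCacheClient.serialize) to ship it to the region servers, so that whole side has to fit in the client heap. A sketch of one workaround, with hypothetical table and column names: the USE_SORT_MERGE_JOIN hint makes Phoenix stream both sorted sides instead of materializing one in memory.
-- Hypothetical tables/columns; sort-merge join avoids building the hash cache in heap
SELECT /*+ USE_SORT_MERGE_JOIN */ a.ID, b.VAL
FROM TABLE_A a
INNER JOIN TABLE_B b ON a.ID = b.ID;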
Labels:
- Apache HBase
- Apache Phoenix
06-07-2018 10:01 AM
Hi, we need to index some columns on HBase tables. Are there any recommendations or best practices for using secondary indexes, please? Thanks, Helmi KHALIFA
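For context, assuming Phoenix is the indexing layer on top of HBase (names are hypothetical), a covered index is the usual starting point: INCLUDE-ing the queried columns lets reads be served entirely from the index table.
-- Hypothetical names; a covered global secondary index in Phoenix
CREATE INDEX IDX_ORDERS_CUSTOMER ON ORDERS (CUSTOMER_ID) INCLUDE (TOTAL);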
Labels:
- Apache Hadoop
- Apache HBase
01-30-2018 08:27 PM
Thank you @Josh Elser 🙂
01-30-2018 08:10 PM
Hi! I need to store some HBase tables/namespaces on dedicated servers in a large cluster. How can we configure that? For example, how can we dedicate only 2 servers out of a 10-server cluster to certain tables/namespaces? Thanks
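A sketch of the usual approach, region server groups (rsgroups), assuming your HBase version ships the feature and the RSGroup coprocessors are enabled; host and table names are hypothetical:
# In the HBase shell: create a group, pin two servers to it, then move tables into it
add_rsgroup 'dedicated'
move_servers_rsgroup 'dedicated', ['host1.example.com:16020', 'host2.example.com:16020']
move_tables_rsgroup 'dedicated', ['MY_TABLE']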
Labels:
- Apache Hadoop
- Apache HBase
10-19-2017 08:56 AM
Yes, it works! Thank you @Aditya Sirna 🙂