
Inner join Phoenix query out of memory?

Contributor

Hi,

I'm executing an inner join between two tables of 10 million records each, and I'm hitting an out-of-memory error.

The join columns on both tables are indexed.

phoenix.query.maxServerCacheBytes=2147483648

phoenix.query.maxGlobalMemoryPercentage=35
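
For reference, these are Phoenix configuration properties, normally set in the hbase-site.xml on the Phoenix client's classpath; a minimal sketch using the values quoted above:

    <!-- hbase-site.xml on the Phoenix client's classpath -->
    <property>
      <name>phoenix.query.maxServerCacheBytes</name>
      <value>2147483648</value>
    </property>
    <property>
      <name>phoenix.query.maxGlobalMemoryPercentage</name>
      <value>35</value>
    </property>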

Error: Encountered exception in sub plan [0] execution. (state=,code=0)
java.sql.SQLException: Encountered exception in sub plan [0] execution.
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:201)
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:145)
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:140)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:281)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:266)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:265)
    at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1444)
    at sqlline.Commands.execute(Commands.java:822)
    at sqlline.Commands.sql(Commands.java:732)
    at sqlline.SqlLine.dispatch(SqlLine.java:808)
    at sqlline.SqlLine.begin(SqlLine.java:681)
    at sqlline.SqlLine.start(SqlLine.java:398)
    at sqlline.SqlLine.main(SqlLine.java:292)
Caused by: java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3236)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:135)
    at java.io.DataOutputStream.writeInt(DataOutputStream.java:200)
    at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:152)
    at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:125)
    at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:85)
    at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:387)
    at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:169)
    at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:165)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

3 Replies

Increase the maximum Java heap size in your client. The "Caused by" part of the stack trace shows the OutOfMemoryError happening in HashCacheClient.serialize, i.e. while the client is building the hash-join cache, so this is a client-side limit, not a server-side one.
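
As a concrete illustration of that advice: the trace above comes from sqlline, so one option is to launch sqlline with a larger heap directly on the java command line. A minimal sketch, where the jar path, heap size, and ZooKeeper host are placeholders for your own setup:

    # give the sqlline client JVM a 4 GB heap (placeholders: jar path, host)
    java -Xmx4g -cp /path/to/phoenix-client.jar sqlline.SqlLine -u jdbc:phoenix:zookeeper-host:2181

The -Xmx flag bounds the client JVM's heap; it is separate from the Phoenix properties quoted earlier, which only cap how much of that heap Phoenix will use.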

New Contributor

@Helmi KHALIFA I have the same problem, and I don't know where to find this property, "phoenix.query.maxServerCacheBytes".

New Contributor

@Josh Elser Can you please guide me on how and where to set the Java heap size on the client?

My app runs on Windows machines, and the Phoenix queries are triggered from those Windows systems. I see no logs on the server side, so I believe the query is failing on the client side itself.
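
For a standalone Java application, the heap is set on the java command line that launches it, and the same -Xmx flag works on Windows. A minimal sketch with a placeholder jar name and heap size:

    java -Xmx4g -jar my-phoenix-app.jar

If the app runs inside an application server instead, the -Xmx setting belongs in that server's JVM options.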
