Phoenix CSV bulk load fails with large data sets

New Contributor

I'm trying to load a 280 GB dataset using the Phoenix CSV bulk load tool on an HDInsight HBase cluster. The job fails with the following error:

18/02/23 06:09:10 INFO mapreduce.Job: Task Id : attempt_1519326441231_0004_m_000067_0, Status : FAILED
Error: Java heap space
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
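
(For reference, a typical CsvBulkLoadTool invocation looks like the sketch below; the jar path, table name, input path, and memory sizes are placeholders rather than the exact command used here. The -D options are the standard MapReduce properties that control the mapper container size and Java heap, which is what overflows in this error.)

    hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar \
        org.apache.phoenix.mapreduce.CsvBulkLoadTool \
        -Dmapreduce.map.memory.mb=4096 \
        -Dmapreduce.map.java.opts=-Xmx3276m \
        --table MY_TABLE \
        --input /data/input.csv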

Here's my cluster configuration:

Region nodes: 8 cores, 56 GB RAM, 1.5 TB HDD
Master nodes: 4 cores, 28 GB RAM, 1.5 TB HDD

Can anyone please help me troubleshoot this issue?

1 REPLY

Re: Phoenix CSV bulk load fails with large data sets

Guru

@Jignesh Rawal Can you share the application log? Look for exceptions in there if you can.
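
(If it helps, the full application log can usually be pulled from YARN with the command below; the application ID is derived from the failed attempt ID in the error above, and the output file name is just an example.)

    yarn logs -applicationId application_1519326441231_0004 > bulkload_app.log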
