Support Questions


Out of Memory Heap Error When Processing Bulk File (1.26.0)

New Contributor

Hello Team,

My name is Sachin Duggal, and I own a bag-manufacturing business. I am encountering an "Out of memory heap" error while processing a large text file (26k+ rows) received over SFTP. The file contains pipe-delimited (|) values, and I'm using the following processors for data manipulation; however, the flow halts after the SplitJson processor.
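
For context, the per-row transformation I need is roughly equivalent to the sketch below; the column names and file name are placeholders, not my real schema. Each pipe-delimited row should end up as one JSON record:

import csv
import json

# Placeholder column names; the real file has a different schema.
COLUMNS = ["order_id", "style", "quantity", "price"]

def rows_to_json(path):
    """Read the pipe-delimited file one row at a time and yield JSON strings,
    so the full 26k+ row file is never held in memory at once."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter="|")
        for row in reader:
            yield json.dumps(dict(zip(COLUMNS, row)))

if __name__ == "__main__":
    for record in rows_to_json("bulk_export.txt"):  # placeholder file name
        print(record)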

Has anyone else faced a similar issue or has any suggestions on how to optimize the memory usage or address this issue? I would appreciate any insights or alternative approaches that could help resolve this.

Looking forward to your suggestions!

Sachin Duggal 

1 REPLY

Community Manager

Welcome to our community! To help you get the best possible answer, I have tagged our Airflow expert @smdas, who may be able to assist you further.

Please feel free to provide any additional information or details about your query. We hope that you will find a satisfactory solution to your question.



Regards,

Vidya Sargur,
Community Manager

