Out of Memory Heap Error When Processing Bulk File (1.26.0)
Labels: Apache NiFi
Created on 12-17-2024 04:19 AM - last edited on 12-18-2024 09:46 PM by VidyaSargur
Hello Team,
I am Sachin Duggal, and I own a bag-manufacturing business. I am encountering an "Out of memory heap" error while processing a large text file (26k+ rows) fetched over SFTP. The file contains pipe (|)-separated values, and I am using several processors for data manipulation; however, the flow halts after the SplitJson processor.
Has anyone else faced a similar issue, or does anyone have suggestions on how to optimize memory usage or otherwise address this? I would appreciate any insights or alternative approaches that could help resolve it.
Looking forward to your suggestions!
Sachin Duggal
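[Editor's note] Splitting a 26k-row file into one FlowFile per row forces NiFi to hold many small objects in heap at once, which is a common cause of this error; record-oriented processing, which streams rows instead of materializing them all, avoids that. The general streaming principle can be sketched outside NiFi in Python (the column names and sample data below are hypothetical stand-ins for the real file):

```python
import io
import json

def rows_from_pipe_text(stream):
    """Yield one dict per pipe-delimited row; never holds the whole input in memory."""
    header = next(stream).rstrip("\n").split("|")
    for line in stream:
        if not line.strip():
            continue  # skip blank lines
        values = line.rstrip("\n").split("|")
        yield dict(zip(header, values))

# Tiny stand-in for the real 26k-row SFTP file (hypothetical columns).
sample = io.StringIO("id|item|qty\n1|duffel|10\n2|tote|25\n")

for row in rows_from_pipe_text(sample):
    record = json.dumps(row)  # one small record at a time, not 26k at once
```

In NiFi terms, the analogous move is to replace the split-into-individual-FlowFiles pattern with record-based processors (e.g. ConvertRecord or QueryRecord with a CSV reader configured for the `|` delimiter), or, if splitting is unavoidable, to split in two stages (large chunks first, then rows) so the heap never holds every row at once.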
Created 12-17-2024 09:56 PM
Welcome to our community! To help you get the best possible answer, I have tagged our expert @smdas, who may be able to assist you further.
Please feel free to provide any additional information or details about your query. We hope you find a satisfactory solution to your question.
Regards,
Vidya Sargur, Community Manager
