HBase configuration

New Contributor

I want to upload an 8 GB TSV file into my CDH 5.3 QuickStart VM.
I tried a bulk load with ImportTsv, and also piped the file into the shell with cat file.tsv | hbase shell, but both attempts always fail with an OutOfMemory error. I also tried reading the TSV file with Java and writing to the HBase table directly, but that failed as well. I am sure this is an HBase memory issue, so please let me know how I can resolve it.
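For context, my bulk load attempt followed the standard ImportTsv route, roughly like this (the table name, column mapping, and paths below are placeholders rather than my exact values, and the TSV must already be in HDFS):

  # Generate HFiles from the TSV instead of writing through the RegionServers
  hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
    -Dimporttsv.columns=HBASE_ROW_KEY,cf:col1,cf:col2 \
    -Dimporttsv.bulk.output=/tmp/hfiles \
    mytable /path/to/file.tsv

  # Load the generated HFiles into the table
  hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/hfiles mytable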

1 REPLY

Master Guru
Please add more details, or the command output of your error.

What exactly fails with an OutOfMemoryError message - a job map task, your HBase command, your Java app, etc.?

How wide is each line of your TSV file, i.e. how many columns does each line have? Could you share the output of the Linux commands 'file your-file.tsv' and 'wc your-file.tsv'?
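For example (substitute your real filename; the awk one-liner is just one way to count tab-separated columns):

  file your-file.tsv
  wc your-file.tsv
  head -n 1 your-file.tsv | awk -F'\t' '{print NF}'   # column count of the first line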

How much heap are you giving your HBase RegionServers (search for 'regionserver heap' on the Cloudera Manager -> HBase -> Configuration page)?
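If you are editing the configuration by hand rather than through Cloudera Manager, the equivalent setting lives in hbase-env.sh; a minimal sketch, assuming a 4 GB heap is a value your VM's RAM can actually support:

  # hbase-env.sh: raise the RegionServer max heap (4g is only an example)
  export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -Xmx4g"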