New Contributor
Posts: 1
Registered: ‎04-09-2019

Hbase configuration


I want to upload an 8 GB TSV file into my CDH 5.3 QuickStart VM.
I tried a bulk load with ImportTsv, and also 'cat file.tsv | hbase shell', but it always fails with an OutOfMemoryError. I also tried reading the TSV file with Java and writing to the HBase table directly, but that failed as well. I am fairly sure this is an HBase memory issue, so please let me know how I can resolve the problem.
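For context, the bulk-load approach I attempted was along these lines (the table name, column family, and paths below are placeholders for illustration, not my real values):

```shell
# Copy the TSV into HDFS first
hdfs dfs -put file.tsv /user/cloudera/file.tsv

# Run ImportTsv as a MapReduce job, writing HFiles instead of live puts
# (adjust -Dimporttsv.columns to match the actual columns in the file)
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:col1,cf:col2 \
  -Dimporttsv.bulk.output=/user/cloudera/hfiles \
  mytable /user/cloudera/file.tsv

# Then bulk-load the generated HFiles into the table
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles \
  /user/cloudera/hfiles mytable
```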

Posts: 1,903
Kudos: 435
Solutions: 305
Registered: ‎07-31-2013

Re: Hbase configuration

Please add more details, or command outputs of your error.

What exactly fails with an OutOfMemoryError message - a job map task, your HBase command, your java app, etc.?

How wide is each line of your TSV file, i.e. how many columns does it have? Could you share the output of the Linux commands 'file your-file.tsv' and 'wc your-file.tsv'?

How much heap are you providing HBase RegionServers (search 'regionserver heap' in Cloudera Manager -> HBase -> Configuration page)?
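If you are not managing the cluster through Cloudera Manager, the same setting can be checked or raised in hbase-env.sh; a sketch (the path and the 4g value are illustrative, size it to your host's RAM):

```shell
# In hbase-env.sh (commonly /etc/hbase/conf/hbase-env.sh on CDH):
# append an -Xmx option to the RegionServer JVM opts to raise its heap
export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -Xmx4g"
```

After changing this the RegionServer must be restarted for the new heap size to take effect.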