Support Questions


HBase issue with data insert

Hi Team,

I have created an HBase server on Azure. When I try to insert more than 1.5 million rows, it creates a problem. Does HBase have a limitation on data size? We are using the community version of Hortonworks.

Thanks,

Vishal Gupta

8 REPLIES

Super Collaborator

Could you please be more specific about how you are inserting the data?

@Vishal Gupta

Sometimes in HBase, when you try to load too much data too fast, you may experience throttling. There is a very good article here that talks about throttling and its possible impact on latency and throughput.
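If request quotas are configured on the cluster, the throttle can be inspected and relaxed from the hbase shell. A minimal sketch, assuming quotas are actually in use; the table name 'mytable' is a placeholder:

```shell
# Inside the hbase shell: list any quotas currently in force
list_quotas

# Hypothetical example: cap, then remove, a write throttle on one table
set_quota TYPE => THROTTLE, TABLE => 'mytable', LIMIT => '10M/sec'
set_quota TYPE => THROTTLE, TABLE => 'mytable', LIMIT => NONE
```

If list_quotas shows nothing, the slowdown is more likely memstore/compaction backpressure than an explicit quota.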

Let us know if that helps!

@Vishal Gupta

Did the link help?

Expert Contributor

Hi @Vishal Gupta

There is no data limit as such. As you may already know, HBase performance depends on various factors such as the amount of data, cluster sizing, and HBase configuration.

Could you please share how you are inserting data into HBase? Are you performing a bulk upload?

Also, could you share details about the exact issue? Are there any errors at the client end, the region servers, or the HBase Master?

-Shubham

Hi Shubham,

We have installed the latest Hortonworks on Azure. We are trying to bulk upload more than 1.5 million rows. Up to 1 million it works fine.

Thanks in advance.

Vishal Gupta

Expert Contributor

Hi @Vishal Gupta

Do you see any errors at the client end or in the region servers while running the bulkload job?

Are you doing the bulkload via Phoenix or directly in HBase?

-Shubham
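For reference, an HBase-native bulk load typically writes HFiles with a MapReduce job and then hands them to the table, bypassing the RPC write path entirely. A sketch with placeholder table, column-family, and path names:

```shell
# Step 1: generate HFiles from a TSV input (no writes through the region servers)
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:col1 \
  -Dimporttsv.bulk.output=/tmp/hfiles \
  mytable /data/input.tsv

# Step 2: move the generated HFiles into the live table
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/hfiles mytable
```

If the load instead goes through puts (client API or Phoenix UPSERT), 1.5 million rows can stall on flush/compaction pressure that a two-step bulkload avoids.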

Hi Shubham,

It doesn't give any error, but it keeps on running.

Thanks,

Vishal Gupta

Expert Contributor

Hi @Vishal Gupta,

Generate a thread dump of the region servers and check whether you have enough handlers:

su - hbase

jstack -l <region server process id> >> region_server.jstack

You can upload the thread dump here as well.
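Once the dump exists, a quick way to count handler threads is to grep for the RPC handler thread-name prefix and compare against hbase.regionserver.handler.count (default 30). A sketch; the sample dump below is fabricated so the snippet runs standalone, and the exact thread names vary by HBase version:

```shell
# Fabricated two-handler sample; replace with the real region_server.jstack
cat > region_server.jstack <<'EOF'
"RpcServer.FifoWFPBQ.default.handler=0,queue=0,port=16020" #45 daemon
"RpcServer.FifoWFPBQ.default.handler=1,queue=1,port=16020" #46 daemon
"main" #1 prio=5
EOF

# Count threads whose names carry the RPC handler prefix
handlers=$(grep -c 'RpcServer' region_server.jstack)
echo "handler threads: $handlers"
```

If most handlers sit blocked on the same lock or on HDFS calls, that points to where the load is stalling.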

Thanks

Shubham
