Created 01-27-2017 04:13 AM
Let's say I have two Spark jobs that insert data into HBase. Is it okay to have two concurrent jobs inserting into the same table at the same time?
Created 01-27-2017 01:14 PM
That is no problem at all. It is exactly what HBase was designed for: handling many, many concurrent insert (put) and retrieve (get) requests. So it will handle two or more concurrent jobs, even if they write the same keys.
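For reference, here is a minimal sketch of such a Spark write job. The table name "my_table", column family "cf", and column "col" are assumptions for illustration; adjust them to your own schema. Each partition opens its own HBase connection, since connections cannot be serialized from the driver to the executors.

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.sql.SparkSession

object HBaseWriteJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("hbase-write-job").getOrCreate()

    // Example records; in practice these come from your real data source.
    val records = spark.sparkContext.parallelize(Seq(("row1", "v1"), ("row2", "v2")))

    records.foreachPartition { partition =>
      // Create the connection inside the executor task, one per partition.
      val conf = HBaseConfiguration.create()
      val connection = ConnectionFactory.createConnection(conf)
      val table = connection.getTable(TableName.valueOf("my_table"))
      try {
        partition.foreach { case (rowKey, value) =>
          val put = new Put(Bytes.toBytes(rowKey))
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
          table.put(put)
        }
      } finally {
        table.close()
        connection.close()
      }
    }

    spark.stop()
  }
}

Running two instances of a job like this against the same table is safe; HBase serializes conflicting writes to the same cell at the region level.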
Created 01-27-2017 01:16 PM
Does this not slow down inserts to the HBase table?
Created 01-27-2017 03:16 PM
Not really, as long as you have a couple of RegionServers and the key space is distributed evenly across them. Once again, this is the sweet spot of HBase: high-volume random read/write workloads.
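If your row keys are monotonically increasing (timestamps, sequence numbers), consecutive writes all land on the same region and create a hotspot. One common remedy is to salt the key with a bucket prefix. A minimal sketch, assuming 16 buckets (a hypothetical value; roughly match it to your number of regions):

// Simple key salting: prefix each row key with a bucket derived from its hash
// so sequential keys spread across regions instead of hitting one RegionServer.
val numBuckets = 16  // assumption: tune to your region count

def saltedKey(rowKey: String): String = {
  val bucket = (rowKey.hashCode & Integer.MAX_VALUE) % numBuckets
  f"$bucket%02d-$rowKey"
}

// Write Put(Bytes.toBytes(saltedKey(originalKey))) instead of the raw key;
// note that reads must then fan out across all bucket prefixes.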
Created 01-29-2017 08:35 PM
Please mark the question as answered if it has been answered.