Support Questions
Find answers, ask questions, and share your expertise

SHC bulk load capability - write locality of SHC

Does SHC (Spark HBase Connector), provided by Hortonworks, support bulk load? From the source code I could see that it uses saveAsNewAPIHadoopDataset to save the HFiles directly. But when I run a test example, I can see that it writes to the memstore and WAL, and compactions do happen.

The other question is around write locality, assuming it does support bulk load. Does the connector ensure that the Spark executor is launched on the region server where the HBase data is supposed to be written, so that the HFile write is local?
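For comparison, a true HBase bulk load bypasses the memstore and WAL entirely: it writes HFiles with HFileOutputFormat2 (partitioned along region boundaries) and then hands them to the region servers via LoadIncrementalHFiles. The sketch below is not SHC's code; it is a minimal illustration of that path, assuming HBase 1.x/2.x APIs, a hypothetical table name `my_table`, and a hypothetical HFile output directory `/tmp/hfiles`:

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.{HFileOutputFormat2, LoadIncrementalHFiles}
import org.apache.hadoop.mapreduce.Job

// Hypothetical table and paths -- requires a running HBase cluster.
val conf = HBaseConfiguration.create()
val conn = ConnectionFactory.createConnection(conf)
val tableName = TableName.valueOf("my_table")
val table = conn.getTable(tableName)
val regionLocator = conn.getRegionLocator(tableName)

// Configure the job so output HFiles are split along region boundaries.
val job = Job.getInstance(conf)
HFileOutputFormat2.configureIncrementalLoad(job, table, regionLocator)

// rdd: an RDD[(ImmutableBytesWritable, KeyValue)] sorted by row key.
// Writing it with this job config produces HFiles directly -- no WAL,
// no memstore:
// rdd.saveAsNewAPIHadoopDataset(job.getConfiguration)

// Atomically hand the generated HFiles to the owning region servers.
val loader = new LoadIncrementalHFiles(conf)
loader.doBulkLoad(new Path("/tmp/hfiles"), conn.getAdmin, table, regionLocator)
```

Note that saveAsNewAPIHadoopDataset alone does not imply bulk load: its behavior depends on the configured OutputFormat. If the job is configured with TableOutputFormat instead of HFileOutputFormat2, the writes go through the normal client Put path (memstore plus WAL), which would match the behavior you observed.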
