
Custom hbase filter jar not loaded

I am trying to deploy a custom filter to my HBase cluster. According to the Cloudera docs [1], dynamic loading of filter jars should be enabled by default.


My hbase.rootdir is /hbase


To deploy my custom filter jar, I created the directory /hbase/lib in HDFS and put my jar in it. Then I tried to use the custom filter from a Spark-HBase job:
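For clarity, these are the deployment steps as shell commands. The jar name `myfilter.jar` is a placeholder; `/hbase/lib` follows from my hbase.rootdir of /hbase, since the dynamic jars directory defaults to `${hbase.rootdir}/lib`:

```shell
# Create the dynamic-jars directory under hbase.rootdir (here /hbase)
hdfs dfs -mkdir -p /hbase/lib

# Upload the custom filter jar (myfilter.jar is a placeholder name)
hdfs dfs -put myfilter.jar /hbase/lib/

# Verify the jar landed where the DynamicClassLoader would look for it
hdfs dfs -ls /hbase/lib
```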

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}

object MyFilterTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MyFilterTest"))
    val hbaseContext = new HBaseContext(sc, HBaseConfiguration.create())

    // Attach the custom filter to the scan
    val filter = new MyFilter()
    val scan = new Scan()
    scan.setFilter(filter)

    try {
      val rdd = hbaseContext.hbaseRDD(TableName.valueOf("some_table"), scan)
      val rowKeys = => Bytes.toString(tuple._1.get))
    } finally {
      sc.stop()
    }
  }
}
But it failed with a ClassNotFoundException stating that MyFilter could not be found.

To investigate, I set the log level to DEBUG and looked for org.apache.hadoop.hbase.util.DynamicClassLoader entries in the logs, but did not find any at all. Intrigued by this, I tried to set the relevant settings explicitly. In the hbase-site.xml safety valve I put this config:


But still it didn't work, and there were no logs indicating that HBase even tried to load the jars from HDFS.
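For reference, the dynamic-loading settings I would expect to matter look like the fragment below. The property names come from the HBase documentation; the values shown are assumptions for this cluster, where hbase.rootdir is /hbase (the jars directory defaults to `${hbase.rootdir}/lib`):

```xml
<!-- Enable dynamic loading of filter/coprocessor jars (documented default: true) -->
<property>
  <name>hbase.use.dynamic.jars</name>
  <value>true</value>
</property>

<!-- HDFS directory the DynamicClassLoader scans for jars;
     defaults to ${hbase.rootdir}/lib -->
<property>
  <name>hbase.dynamic.jars.dir</name>
  <value>/hbase/lib</value>
</property>
```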


Is there any configuration I forgot?



