
Custom HBase filter jar not loaded

New Contributor



I am trying to deploy a custom filter to my HBase cluster. According to the Cloudera docs [1], dynamic loading of filter jars should be enabled by default.


My hbase.rootdir is /hbase


To deploy my custom filter jar, I created the directory /hbase/lib in HDFS and put my jar in it. Then I tried to use the custom filter from a spark-hbase job:
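For reference, the deployment step above as shell commands (the jar name my-filter.jar is an assumption; the paths follow the post):

```shell
# Create the dynamic-jar directory under hbase.rootdir (/hbase) and upload the filter jar.
hdfs dfs -mkdir -p /hbase/lib
hdfs dfs -put my-filter.jar /hbase/lib/
# Verify the jar landed where HBase expects it:
hdfs dfs -ls /hbase/lib
```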

import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.util.Bytes

object MyFilterTest {
  def main(args: Array[String]): Unit = {
    val filter = new MyFilter()

    val scan = new Scan()
    scan.setFilter(filter) // apply the custom filter to the scan

    try {
      // hbaseContext (an HBaseContext) and sc (the SparkContext) are created earlier; setup omitted here
      val rdd = hbaseContext.hbaseRDD(TableName.valueOf("some_table"), scan)
      val rowKeys = rdd.map(tuple => Bytes.toString(tuple._1.get))
    } finally {
      sc.stop()
    }
  }
}

But it failed with a ClassNotFoundException stating that MyFilter could not be found.

To investigate, I set the log level to DEBUG and looked for org.apache.hadoop.hbase.util.DynamicClassLoader entries in the logs, but did not find any at all. Intrigued by this, I tried to set the relevant settings explicitly. In the hbase-site.xml safety valve I put this config:
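The config I mean is the dynamic-loading settings, roughly like this (property names as documented for CDH HBase; the directory value is an assumption matching my rootdir of /hbase):

```xml
<property>
  <name>hbase.use.dynamic.jars</name>
  <value>true</value>
</property>
<property>
  <name>hbase.dynamic.jars.dir</name>
  <value>/hbase/lib</value>
</property>
```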


But still it didn't work, and there were no logs indicating that HBase even tried to load the jars from HDFS.
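To rule out log configuration as the reason the loader stays silent, its logger can be pinned to DEBUG explicitly with a log4j.properties fragment like this (a sketch, assuming the log4j 1.x setup that HBase 1.x ships with):

```properties
# Force DEBUG output for the dynamic class loader only
log4j.logger.org.apache.hadoop.hbase.util.DynamicClassLoader=DEBUG
```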


Is there any configuration I forgot?





New Contributor

I have the exact same experience. Custom filters are not loaded, either by default or by explicitly setting the config... 😕 Can't figure out why...
