I need to consume GoldenGate messages from over 250 tables (topics), and I'd like to know the best practice for doing that. The ConsumeKafka processor appears to have an upper limit of 99 topics per processor. I tried using one processor to consume from 99 topics, but it became very difficult to debug when an error occurred, and I had to add a separate processor just for the affected topic. Creating 250 ConsumeKafka processors would clutter the UI and doesn't seem like a good approach. How is everyone else handling this scenario?
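For reference, here is a rough sketch of the kind of batching I'm describing: splitting the 250 topics into groups that fit under the 99-topic-per-processor limit, so each ConsumeKafka processor gets one comma-separated topic list. The topic names below are hypothetical placeholders, and the batching itself is just illustrative Python, not anything NiFi-specific.

```python
# Sketch: split a large topic list into batches that fit under the
# assumed ConsumeKafka per-processor limit of 99 topics, so ~250
# GoldenGate topics map onto a handful of processors instead of 250.
# Topic names here are made-up placeholders.

MAX_TOPICS_PER_PROCESSOR = 99

def batch_topics(topics, limit=MAX_TOPICS_PER_PROCESSOR):
    """Return comma-separated topic lists, each with at most `limit` topics."""
    return [
        ",".join(topics[i:i + limit])
        for i in range(0, len(topics), limit)
    ]

# 250 hypothetical GoldenGate topics -> 3 processor configurations
topics = [f"ogg.table_{n:03d}" for n in range(250)]
batches = batch_topics(topics)
print(len(batches))                           # 3
print([len(b.split(",")) for b in batches])   # [99, 99, 52]
```

Even with batching like this, though, a failure on one topic still forces me to pull that topic out into its own processor, which is the part I'm hoping there's a cleaner pattern for.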