We have an HDP cluster installed via Ambari and a Metron cluster installed via Docker.
For learning and testing purposes, we created a simple log file and are trying to ingest its data into Elasticsearch via Kafka and Storm.
We created the Kafka topics, and producer/consumer communication works fine: messages entered at a console producer show up at the console consumer.
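For reference, this is roughly how we created and verified the topic. The ZooKeeper/broker hostnames, the topic name `oursensor`, and the Kafka install path are placeholders for our setup:

```shell
# Create the topic (hostname, port, and topic name are placeholders)
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create \
  --zookeeper zk-host:2181 --replication-factor 1 --partitions 1 --topic oursensor

# Console producer: lines typed here are published to the topic
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh \
  --broker-list kafka-host:6667 --topic oursensor

# Console consumer (in a second terminal): the same lines appear here
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh \
  --bootstrap-server kafka-host:6667 --topic oursensor --from-beginning
```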
We then created a simple Grok parser for our data, and the pattern looks correct according to online Grok testing sites. However, after following the tutorial at https://cwiki.apache.org/confluence/display/METRON/Adding+a+New+Telemetry+Data+Source, we ran into the problem shown in the attached screenshot: the parser topology reports that it cannot access the Grok pattern file in HDFS. So far we have tried restarting/disabling Storm supervisors, changing ports, giving the other nodes SSH access to the master node, changing the grokPath, altering permissions, and copying the pattern file to the same path on the local filesystem, but the error persists.
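Since the topology resolves grokPath against HDFS rather than the local filesystem, here is a sketch of how we staged and verified the pattern file. The file name `oursensor` and the directory `/apps/metron/patterns` are placeholders matching our grokPath; the key point is that the user running the Storm workers must be able to read the file:

```shell
# Upload the Grok pattern file to HDFS (paths are placeholders for our setup)
hdfs dfs -mkdir -p /apps/metron/patterns
hdfs dfs -put -f oursensor /apps/metron/patterns/oursensor

# Verify the file exists and make it readable by the Storm worker user
hdfs dfs -ls /apps/metron/patterns
hdfs dfs -chmod 644 /apps/metron/patterns/oursensor
```

Even after staging the file this way and confirming it with `hdfs dfs -ls`, the topology still reports that it cannot access the file.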