This is a great guide to what gets installed where on HDP: https://community.hortonworks.com/articles/16763/cheat-sheet-and-tips-for-a-custom-install-of-horto....
You will notice that Kafka should be installed within the cluster, ideally on its own dedicated nodes.
As a side note, Hortonworks Data Flow (HDF) is a separate distribution/product provided by Hortonworks. It packages Kafka along with NiFi, Storm, and Ambari, and excels at acquiring, inspecting, routing, transforming, and analyzing data in motion from a diverse range of sources (from sensors to databases), with the results typically landing in Hadoop. Exciting technology and a lot to talk about ... check it out: http://hortonworks.com/products/data-center/hdf/