

By using HDF 3.0 in VMware, how can I feed local .csv files from PC to it and pass it to Kafka Producer/Topic?

New Contributor

Hi Guys,

I just started exploring Kafka using Docker. Today I installed the VMware version of HDF 3.0, and I am wondering whether it is possible to practice with Kafka using some local files I am generating (.csv files stored in my local file system, e.g. c:\temp\) by feeding them to a Kafka producer.

If so, could you provide a beginner's guide on how to achieve this? I'd appreciate it.

Thanks.

1 ACCEPTED SOLUTION

Accepted Solutions

Re: By using HDF 3.0 in VMware, how can I feed local .csv files from PC to it and pass it to Kafka Producer/Topic?

Contributor

You need to set up a shared folder between the sandbox and Windows in VMware.

Then refer to this link for the Kafka producer:

https://community.hortonworks.com/questions/4140/hdp-twitter-demo-send-data-into-kafka-from-csv.html
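The two steps above (shared folder, then a Kafka producer) can be sketched with the stock Kafka console tools. This is only a sketch under assumptions: the shared-folder mount point (/mnt/hgfs/shared), the file name (data.csv), the topic name (csv-test), and the sandbox host/ports (sandbox host, ZooKeeper on 2181, Kafka broker on 6667) are not from the thread; substitute the values from your VMware shared-folder settings and from Ambari on your HDF sandbox.

```shell
# Sketch only -- paths, hostnames, and ports below are assumptions.
# On an HDF sandbox the Kafka CLI scripts typically live under
# /usr/hdf/current/kafka-broker/bin and the broker listens on port 6667.

KAFKA_BIN=/usr/hdf/current/kafka-broker/bin

# 1. Create a topic for the CSV data (one partition is enough for practice).
$KAFKA_BIN/kafka-topics.sh --create \
  --zookeeper sandbox-hdf.hortonworks.com:2181 \
  --replication-factor 1 --partitions 1 --topic csv-test

# 2. Pipe the .csv file from the VMware shared folder into the console
#    producer; each line of the file becomes one Kafka message.
cat /mnt/hgfs/shared/data.csv | \
  $KAFKA_BIN/kafka-console-producer.sh \
    --broker-list sandbox-hdf.hortonworks.com:6667 --topic csv-test

# 3. In another terminal, verify the messages arrived.
$KAFKA_BIN/kafka-console-consumer.sh \
  --bootstrap-server sandbox-hdf.hortonworks.com:6667 \
  --topic csv-test --from-beginning
```

Since these commands run against a live sandbox broker, treat them as a CLI/config fragment rather than a standalone script.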


Re: By using HDF 3.0 in VMware, how can I feed local .csv files from PC to it and pass it to Kafka Producer/Topic?

New Contributor

I will give it a try, thank you.
