
How to set the PutHDFS configuration properties on Windows?

Explorer

I am new to Big Data and NiFi.

I tried to install NiFi in a VM, but because of a heap issue I was unable to access the Web UI,

so I installed it on Windows instead.

When I give /etc/hadoop/core-site.xml and /etc/hadoop/hdfs-site.xml in the Hadoop Configuration Resources property,

it shows an error saying the above files are invalid, do not exist, or are not files.

HDFS is running at hdfs://192.168.1.168:8020,

and the VM's address is 192.168.1.189.

Can anyone please help? It would be very helpful for someone new to NiFi.


Hi @Iyappan Gopalakrishnan,

Regarding the installation in a VM, be sure to have enough resources available. You can find details about errors in <nifi-dir>/logs/nifi-app.log.
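If the heap really is the issue, you can also raise the JVM memory settings in <nifi-dir>/conf/bootstrap.conf before starting NiFi (the property names below are the defaults shipped with NiFi; the 2g values are only an example, size them to what your VM can spare):

# JVM memory settings in conf/bootstrap.conf
java.arg.2=-Xms2g
java.arg.3=-Xmx2g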

When using PutHDFS on a machine that is not part of the HDFS cluster, you need to copy the core-site.xml and hdfs-site.xml files onto the NiFi node and point the processor at these local copies. If you are running NiFi on Windows, you cannot use /etc/.../... paths for the configuration files; you have to copy the files into your Windows environment and provide the Windows paths to them.
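For example, assuming you copied the files into C:\hadoop-conf (any readable local folder works), the Hadoop Configuration Resources property is a comma-separated list of those local copies:

C:\hadoop-conf\core-site.xml,C:\hadoop-conf\hdfs-site.xml

and the copied core-site.xml should point at your NameNode, e.g.:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.1.168:8020</value>
  </property>
</configuration>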

Hope this helps.

Explorer

Thanks @Pierre Villard. If I use PutFile, how do I put the file into the local root folder? I already mentioned my port number. Once again, thanks.

If you use PutFile, it will put your file in the configured local folder (not HDFS) on the machine where NiFi is running.
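For instance, the Directory property of PutFile could point at a local folder on the Windows machine running NiFi (the path below is only an example):

Directory: C:\nifi\output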

Explorer

Please help me if you have any idea!

Explorer

Oh OK. I need a flow like this:

1. Move files from an FTP folder to a local Linux root folder.

2. Check the files; if a file is an Excel file, convert it to .csv or .tsv.

3. After conversion, move the files to HDFS and then load them into a Hive ORC table.

4. Track logs throughout; if anything fails, reconnect and retry to complete the flow.

Please help me achieve this using NiFi on Windows.
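One possible processor chain for this flow (a sketch only; the processor names assume a reasonably recent NiFi release and each step needs its own configuration):

ListFTP -> FetchFTP            pick up the files from the FTP folder
PutFile                        keep a local copy on the machine running NiFi
ConvertExcelToCSVProcessor     turn Excel workbooks into CSV
PutHDFS                        write the CSV files to hdfs://192.168.1.168:8020
PutHiveQL                      load the converted data into the Hive ORC table

Failure relationships from each processor can be routed to LogAttribute (or looped back into the processor for a retry) to cover the logging and reconnect requirement.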