How do I upload files to the Cloudera VirtualBox VM?
I'm taking the free version of SimpliLearn's Big Data Hadoop and Spark Developer Training course, so I don't have access to its hosted labs. Instead, I downloaded the Cloudera VirtualBox VM to practice Spark and Hadoop. However, some exercises require uploading files, such as datasets, using FTP.
I tried using FileZilla to connect to the VM and upload the files, but connecting with the VM's IP address didn't work.
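To show what I mean, this is essentially the transfer I was attempting with FileZilla, written as the equivalent command-line copy. The IP address is just an example, and `cloudera`/`cloudera` is what I believe the QuickStart VM's default login is; I may have those details wrong:

```shell
# Copy the lab files from my host machine into the VM's home directory.
# 192.168.56.101 is a placeholder for the VM's IP address;
# "cloudera" is (I think) the QuickStart VM's default user and password.
scp Hadoop-mapreduce-example.jar wordcount.txt \
    cloudera@192.168.56.101:/home/cloudera/
```

This hangs or is refused for me, the same as FileZilla, which makes me suspect the VM's network mode (NAT vs. host-only) rather than the tool.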
I'm a beginner with these tools; could anyone help me get these files onto the VM? Here is what I need to do:
1. Download the Hadoop-mapreduce-example.jar and wordcount.txt files
2. Log in to the FTP server using the username and password from the lab and upload the files
3. Log in to the Webconsole using the username and password from the lab and create a new directory demo in HDFS using the mkdir command
4. Push the wordcount.txt file into the directory using the put command
5. Execute the command to move the Hadoop-mapreduce-example.jar file to the HDFS directory
6. View the part files in the output folder
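For steps 3 through 6, my understanding is that the commands inside the VM's terminal or web console would look roughly like this. The HDFS paths and the `wordcount` job name are my guesses based on the lab instructions, not something I've been able to verify yet:

```shell
# Step 3: create a "demo" directory in HDFS
# (/user/cloudera is my guess at the lab's home directory in HDFS)
hdfs dfs -mkdir /user/cloudera/demo

# Step 4: push the text file into that directory
hdfs dfs -put wordcount.txt /user/cloudera/demo

# Step 5: run the example jar against the file
# ("wordcount" is my guess at the example's job name)
hadoop jar Hadoop-mapreduce-example.jar wordcount \
    /user/cloudera/demo/wordcount.txt /user/cloudera/demo/output

# Step 6: list the output folder and view a part file
hdfs dfs -ls /user/cloudera/demo/output
hdfs dfs -cat /user/cloudera/demo/output/part-r-00000
```

But none of this helps until I can get the files onto the VM in the first place.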
Without FTP access, I can't complete steps 2 and 4.
Please help me!