
HDFS loads my files correctly but they are empty

Contributor

Hi,

I started creating my first job, and I loaded two files (director.txt and movies.csv) correctly, as indicated in the tutorial. But when I execute the job I receive some errors, and afterwards I notice what is shown in the figure: only the file names are loaded, not their contents. What should I do here?

Thank you

10511-browse-directory.png

1 ACCEPTED SOLUTION

Expert Contributor

@Asma Dhaouadi

This is interesting. If the file is uploaded, that means your action is correct, so why does this error message come up? Could you please double-check whether this message is generated after the fix or before it?


12 REPLIES

Super Guru

@Asma Dhaouadi

Can you share a link to the tutorial you were working on?

Contributor

Expert Contributor
@Asma Dhaouadi

You mentioned that you received errors while the job was running; if there are errors, they might be preventing the files from being written correctly. Could you please share what kind of errors were produced?

Contributor

Here you can find the full execution log; in the picture I have marked the offending line.

I hope to receive a solution for this. I am blocked here.

Thank you.

5.png

10516-1.png

10518-3.png

10520-4.png


2.png

Expert Contributor

Looks to me like you are running Hadoop on Windows and are missing winutils.exe. Can you try this:

  1. Download winutils.exe from http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe.
  2. Set your HADOOP_HOME environment variable at the OS level to the folder that contains the bin directory holding winutils.exe (Hadoop looks for %HADOOP_HOME%\bin\winutils.exe).
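Once the steps above are done, a quick sanity check before re-running the job can save a round trip. This is just a sketch (the function name and messages are mine, not part of Hadoop); it assumes the standard layout where Hadoop resolves winutils as %HADOOP_HOME%\bin\winutils.exe:

```python
import os
from pathlib import Path

def check_winutils(env=os.environ):
    """Return a list of problems with the HADOOP_HOME / winutils.exe setup.

    Hadoop on Windows resolves winutils as %HADOOP_HOME%\\bin\\winutils.exe,
    so HADOOP_HOME must point at the folder *containing* bin, not at bin itself.
    """
    problems = []
    home = env.get("HADOOP_HOME")
    if not home:
        problems.append("HADOOP_HOME is not set")
        return problems
    winutils = Path(home) / "bin" / "winutils.exe"
    if not winutils.is_file():
        # Common mistake: HADOOP_HOME pointing directly at the bin folder.
        problems.append("winutils.exe not found at %s" % winutils)
    return problems
```

If this returns an empty list, the winutils side of the setup should be fine and any remaining errors come from elsewhere.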

Expert Contributor

@Asma Dhaouadi, can you confirm the issue is solved? If yes, could you please accept my answer?

Expert Contributor
@Asma Dhaouadi

Thanks for the screenshots. It looks like your datanode is not fully operational. Can you please check the Namenode UI, or Ambari if you are using it, to make sure that your datanode is up? If you have Ambari, log in and check the HDFS status page. If you are using HDFS without Ambari, connect to the Namenode HTTP port (50070, if you are using a version earlier than Hadoop 3.0), and you can also connect to the Datanode HTTP port at DatanodeIP:50075 (again assuming you are not running Hadoop 3.0).
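Besides eyeballing the UI, the same Namenode HTTP port also serves a /jmx endpoint you can query programmatically. The sketch below is one way to do that; the helper name is mine, but the `NameNodeInfo` bean and its `LiveNodes` attribute (a JSON string nested inside the JMX response) are standard Namenode output, and the host/port in the usage comment are the ones from this thread:

```python
import json
from urllib.request import urlopen

def live_datanodes(jmx_json):
    """Return {hostname: info} for datanodes the namenode reports as live.

    Expects the body of the namenode's /jmx endpoint. The LiveNodes
    attribute is itself a JSON-encoded string, hence the second loads().
    """
    beans = json.loads(jmx_json)["beans"]
    for bean in beans:
        if bean.get("name") == "Hadoop:service=NameNode,name=NameNodeInfo":
            return json.loads(bean.get("LiveNodes", "{}"))
    return {}

# Usage against the address from this thread (adjust host/port to yours):
# with urlopen("http://192.168.43.73:50070/jmx") as resp:
#     nodes = live_datanodes(resp.read())
# An empty result means no live datanodes, which would explain 0-byte files:
# the namenode accepts the file creation, but there is nowhere to put blocks.
```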

Contributor

Thank you for your reply.

I think my Namenode URI is good.

In the first picture you can see my Namenode address as set in the configuration (but with this value I receive an error when I check the service; see picture 2).

For this reason I replaced it with my IP address (192.168.43.73 in my case). In pictures 3-4 the error is corrected.

And according to the "Datanode Information" page, the datanode is in service (figure 5).

So, what do you think?

Thank you

10526-sol5.png

10527-sol6.png

10528-sol1.png

10529-sol2.png

10530-information.png

Contributor

Hi @Frank Lu,

Thank you for your patience. I downloaded winutils.exe, created a HADOOP_HOME environment variable, and added it to the Path.

When I executed my job, my files were loaded, and in full this time, but I received these two error lines.

So, I hope they will not cause any problems later on. How do I proceed to eliminate them?

Thank you

10538-mardi.png