Support Questions
Find answers, ask questions, and share your expertise

HDFS loads my files correctly but they are empty

Explorer

Hi,

I started creating my first job, and I loaded two files (director.txt and movies.csv) as indicated in the tutorial. But when I execute my job, I receive some errors, and afterwards I notice what is shown in the figure: only the file names are loaded, not their contents. What should I do here?

Thank you

10511-browse-directory.png

1 ACCEPTED SOLUTION

Expert Contributor

@Asma Dhaouadi

This is interesting. If the file is uploaded, that means your steps are correct, so why does this error message come out? Could you please double-check whether this message was generated after the fix or before the fix?


12 REPLIES

@Asma Dhaouadi

Can you share a link to the tutorial you were working on?

Explorer

Expert Contributor
@Asma Dhaouadi

You mentioned that you received errors while the job was running; if there are errors, they might be preventing the files from being created correctly. Could you please share what kind of errors were produced?

Explorer

Here you can find the full execution log; in the picture I have marked the failing line.

I hope to find a solution for this, as I am blocked here.

Thank you.

5.png

10516-1.png

10518-3.png

10520-4.png


2.png

Expert Contributor

It looks to me like you are running Hadoop on Windows and are missing winutils.exe. Can you try this:

  1. Download winutils.exe from http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe.
  2. Place it in a bin folder (for example C:\hadoop\bin) and set the HADOOP_HOME environment variable at the OS level to the folder that contains that bin directory (C:\hadoop in this example); Hadoop looks for winutils.exe at %HADOOP_HOME%\bin\winutils.exe.
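The steps above can be sanity-checked with a short script. This is a minimal sketch, not Hortonworks tooling; it only assumes the conventional layout where HADOOP_HOME points at a folder whose bin subfolder contains winutils.exe:

```python
import os

def check_winutils(hadoop_home):
    """Return True if the layout Hadoop expects on Windows is present:
    HADOOP_HOME -> a folder whose bin/ subfolder contains winutils.exe."""
    winutils = os.path.join(hadoop_home, "bin", "winutils.exe")
    return os.path.isfile(winutils)

if __name__ == "__main__":
    # HADOOP_HOME must be set at the OS level before the job runs
    home = os.environ.get("HADOOP_HOME")
    if not home:
        print("HADOOP_HOME is not set")
    elif check_winutils(home):
        print("winutils.exe found under", home)
    else:
        print("winutils.exe missing under", home)
```

Running this in the same environment as the job tells you whether the variable is visible to child processes, which is a common pitfall after using setx without reopening the shell.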

Expert Contributor

@Asma Dhaouadi, can you confirm whether the issue is solved? If so, could you please accept my answer?

Rising Star
@Asma Dhaouadi

Thanks for the screenshots. It looks like your datanode is not fully operational. Please check the Namenode UI or Ambari (if you are using it) to make sure your datanode is up. If you have Ambari, log in and check the HDFS status page. If you are using HDFS without Ambari, connect to the Namenode HTTP port (50070, if you are using a version earlier than Hadoop 3.0), and you can connect to the Datanode HTTP port at DatanodeIP:50075 (again, assuming you are not running Hadoop 3.0).
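If you prefer checking from a script rather than the browser, the Namenode's JMX endpoint reports the live-datanode count. A minimal sketch, assuming the pre-Hadoop-3 default HTTP port 50070 and a namenode host reachable from your machine (adjust both to your setup):

```python
import json
import urllib.request

def parse_live_datanodes(jmx_payload):
    """Extract NumLiveDataNodes from the FSNamesystemState JMX bean."""
    beans = json.loads(jmx_payload)["beans"]
    return beans[0]["NumLiveDataNodes"]

def live_datanodes(namenode_host, port=50070):
    # /jmx?qry=... filters the response down to the FSNamesystemState bean
    url = (f"http://{namenode_host}:{port}/jmx"
           "?qry=Hadoop:service=NameNode,name=FSNamesystemState")
    with urllib.request.urlopen(url) as resp:
        return parse_live_datanodes(resp.read())
```

If this returns 0, HDFS can create the file entry (the name) but has nowhere to write the blocks, which matches the empty files described above.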

Explorer

Thank you for your reply.

I think my Namenode URI is good.

In the first picture you can see my Namenode as set in the configuration (but with this value I receive an error when I check the service (picture 2)).

For this reason I replaced it with my IP address (192.168.43.73 in my case). In pictures 3-4 the error is corrected.

And according to "Datanode Information", it is in service (figure 5).

So, what do you think?

Thank you

10526-sol5.png

10527-sol6.png

10528-sol1.png

10529-sol2.png

10530-information.png

Explorer

Hi @Frank Lu,

Thank you for your patience. I downloaded winutils.exe, created a HADOOP_HOME variable, and added it to Path.

When I executed my job, my files were loaded, in full this time, but I received these two error lines.

I hope they will not cause any problem later on; how do I proceed to eliminate them?

thank you

10538-mardi.png

Expert Contributor

@Asma Dhaouadi

This is interesting. If the file is uploaded, that means your steps are correct, so why does this error message come out? Could you please double-check whether this message was generated after the fix or before the fix?

Explorer

Hi @Frank Lu,

Now I receive these errors when the code is executed (shown in the figures). Can you help me and tell me what I should do?

Thank you very much.

10674-1.png

10675-2.png

10676-3.png

Expert Contributor

I am not sure about this error at the moment, but check https://wiki.apache.org/hadoop/CouldOnlyBeReplicatedTo; it usually means the Namenode could not find enough healthy datanodes with free space to write the block.
