
I get the error "File does not exist" when I run any command in Pig scripts

Hi,

I am new to Pig.

I have set up HDFS and I can run Hive successfully, but when I try to run a Pig script I get an error.

Whatever I write, I get the same error. For example, I write mkdir a ; or drivers = LOAD '/tmp/drivers.csv' USING PigStorage(','); and I get the same error either way.

In addition, I have checked all permissions and all of them are OK, and I also added hadoop.proxyuser.root.groups and hadoop.proxyuser.root.hosts to the HDFS settings. I log in as admin and I created a home directory for admin. Please help me figure out what to do.

File does not exist: /user/admin/pig/jobs/ttt1_11-11-2016-20-07-59/stdout
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(


There is a very similar issue at:

The main points there are:

Which version of HDP are you using? Can you run this successfully from the command line?

Hi,

I looked at and tried all of those links before I posted this.

I checked the HDFS settings and I even created a directory for the user admin. The problem is that it does not matter whether I use this syntax (drivers = load '/temp/drivers.....') or something as simple as mkdir, or even wrong syntax; I get this error every time. I have searched a lot but I cannot find the root cause. Please help me resolve this.
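Since the stack trace above points at /user/admin/pig/jobs/.../stdout, which is where the Ambari Pig View writes each job's status files, it may help to confirm from the Grunt shell that the admin home directory exists in HDFS and is writable. A minimal diagnostic sketch (the exact paths are taken from the stack trace; this assumes you can open a Grunt shell as a user with HDFS access):

```pig
-- Hedged sketch: Grunt-shell fs commands pass straight through to HDFS.
-- Check that /user/admin exists and that the Pig View's job directory
-- can be created under it:
fs -ls /user
fs -mkdir -p /user/admin/pig/jobs
fs -ls /user/admin
```

If fs -mkdir fails with a permission error here, the Pig View would fail the same way when it tries to write the job's stdout file, which would match the error you are seeing.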



Could you provide the following?

  • HDP version
  • HDFS version
  • Pig version
  • how HDP was installed (e.g. through Ambari)

I am actually using IBM BigInsights and IBM Open Platform with Apache Spark and Apache Hadoop, version 4.1.

If I am not mistaken, the versions are:

  • Hadoop: 2.7.1
  • HBase: 1.1.1
  • Pig: 0.15.0

This is the link where I got that information (

Please let me know whatever else you need to know.



It is best if you contact IBM for support on this issue.



I faced the same issue when I was using the Cloudera QuickStart VM. What I did to resolve it is very simple: go to the .csv file's location, open the file's properties, copy the file location, use that same path in LOAD '/tmp/drivers.csv', and check.

Hi,

Thanks for the suggestion, but I tried a lot before I posted.

The problem is that whatever I write gets this error. For example, if I write mkdir a ; it gets this error, and if I write A=load 'root/drivers.csv' .... it gets the same error. I even made sure that the file is available and I fixed the permissions so it is accessible. I am stuck on what is wrong.

In addition, when I run drivers = LOAD '/root/drivers.csv' USING PigStorage(','); manually on Linux, it works fine, but I do not know why it does not work in Ambari.

Please help if you have any idea.
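The behavior described above (works from the Linux shell but fails in Ambari) would be consistent with Pig running in local mode in one case and in MapReduce mode in the other: /root/drivers.csv is a local path, while in the Pig View LOAD paths resolve against HDFS. A minimal sketch, assuming the file should be copied to /user/admin in HDFS (that destination path is an assumption, not something from the thread):

```pig
-- Hedged sketch: in the Ambari Pig View, LOAD paths resolve against HDFS,
-- not the local Linux filesystem, so a local /root/drivers.csv is invisible.
-- Copy the file into HDFS first (Grunt-shell fs commands, or run
-- 'hdfs dfs -put' from Linux), then load it from the HDFS path:
fs -mkdir -p /user/admin
fs -copyFromLocal /root/drivers.csv /user/admin/drivers.csv

drivers = LOAD '/user/admin/drivers.csv' USING PigStorage(',');
DUMP drivers;
```

When the same script is run from the Linux command line with pig -x local, the original /root/drivers.csv path works because local mode reads the local filesystem directly.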