Support Questions
Find answers, ask questions, and share your expertise

First pig script failing .

Contributor

I am running my first Pig script; it is failing with the error below. Any idea?

File does not exist: /user/admin/pig/jobs/explain_p1_28-03-2016-00-58-46/stderr
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:652)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)

11 REPLIES

Re: First pig script failing .

Guru

@Amit Sharma:

If you look at the error message, it states that the file does not exist.

You may need to move your input file to HDFS, or create the pig directory in HDFS:

hadoop fs -mkdir /user/admin/pig
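If the directory is missing, a rough sketch of creating it and staging input data follows (the ownership step and the input file name are assumptions for illustration, not taken from this thread; run as a user with HDFS write access):

```shell
# Create the Pig working directory in HDFS (-p also creates missing parents).
hadoop fs -mkdir -p /user/admin/pig

# Make sure the admin user owns it, so jobs run as admin can write there.
hadoop fs -chown admin /user/admin/pig

# Copy a local input file into HDFS (input.csv is a placeholder name).
hadoop fs -put ./input.csv /user/admin/pig/
```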

Re: First pig script failing .

Contributor

Hi, I tried following the exact same procedure from the link below.

http://hortonworks.com/hadoop-tutorial/hello-world-an-introduction-to-hadoop-hcatalog-hive-and-pig/#...

but it gives me the same error:

File does not exist: /user/admin/pig/jobs/riskfactorpig_28-03-2016-13-44-12/stdout
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:652)

My script location is showing as /user/admin/pig/scripts/riskfactorpig-2016-03-28_01-43.pig,

but I am not sure why (per the error) it is looking in /user/admin/pig/jobs/riskfactorpig_28-03-2016-13-44-12/stdout

Re: First pig script failing .

Mentor

Please post your script and sample data. You need to make sure your user has sufficient access to the input and output directories. I suggest you run through our tutorials on Pig first before trying your own luck. You can find the info at hortonworks.com/hadoop/pig


Re: First pig script failing .

Guru

@Amit Sharma

You can take a look at https://issues.apache.org/jira/browse/AMBARI-12738

I think you are running into this issue. A quick fix is to make sure the settings below are in core-site.xml (they can be added/modified through Ambari -> HDFS configs). This is part of the documentation on configuring the Pig View.

hadoop.proxyuser.hcat.groups=*

hadoop.proxyuser.hcat.hosts=*
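For reference, in core-site.xml those two properties take the standard Hadoop property form (a minimal sketch of the same settings; HDFS typically needs a restart after changing them):

```xml
<!-- Allow the hcat user to impersonate users from any group and host.
     Required for the Ambari Pig View; consider restricting the wildcard
     values in production clusters. -->
<property>
  <name>hadoop.proxyuser.hcat.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hcat.hosts</name>
  <value>*</value>
</property>
```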

Re: First pig script failing .

I did this before I posted; it never worked. When I run it manually on Linux, it works fine.


Re: First pig script failing .

@Amit Sharma What did you do to solve this problem?

I'm facing the same issue right now...


Re: First pig script failing .

@Stefan Schuster Check whether your file exists in HDFS or not. If it is not in HDFS, you might need to run your Pig scripts in local mode, or you can copy the file to HDFS and run the scripts pointing to the HDFS file directory.
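A rough sketch of those checks from the command line (the data and script file names are placeholders, not taken from this thread):

```shell
# Check whether the input file exists in HDFS (exit code 0 if it does).
hadoop fs -test -e /user/admin/pig/data.csv && echo "file exists in HDFS"

# If it is missing, copy it from the local filesystem into HDFS...
hadoop fs -put ./data.csv /user/admin/pig/

# ...or run the script in local mode against the local file instead.
pig -x local riskfactor.pig
```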


Re: First pig script failing .

(screenshot attached: 14800-bildschirmfoto-2017-04-21-um-093417.png)

Thanks for your quick response.

There are some files in HDFS... and the data is also in the table riskfactor, which I created for this reason.

I still get the same error when I run the Pig script...
