Unable to Create Spark Context


Hi,

 

I am using CDH 5.12.2 on a 4-node cluster. I am trying to run a Talend (Talend Studio 6.5.1) job, and I am currently running into an issue that throws an exception. Please find the exception below.

 

This job mainly transfers CSV files from an SFTP server to the Hadoop cluster using Talend jobs.

 

What I observed is that once the job starts, Talend creates a .staging directory under the xyzuser home directory in HDFS and transfers some jar files into it. While doing this, at some point the exception below is thrown and the job fails.
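For reference, this is roughly how I sanity-check that directory from a small Java client, using the standard HDFS FileSystem API. This is only a minimal sketch (the class name is mine and the path just mirrors the one from the error message), not code from the Talend job itself:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StagingDirCheck {
    public static void main(String[] args) throws Exception {
        // fs.defaultFS is normally picked up from core-site.xml on the classpath
        Configuration conf = new Configuration();

        // Same staging path as in the error message (xyzuser is a placeholder)
        Path staging = new Path("/user/xyzuser/.staging");

        try (FileSystem fs = FileSystem.get(conf)) {
            if (!fs.exists(staging)) {
                System.out.println("Staging dir is missing: " + staging);
                return;
            }
            // Print permissions and owner of everything left under .staging
            for (FileStatus st : fs.listStatus(staging)) {
                System.out.println(st.getPermission() + " " + st.getOwner() + " " + st.getPath());
            }
        }
    }
}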

 

I tried a similar data ingestion MapReduce job from Talend, and it runs successfully.

 

PriviledgedActionException as:xyzuser (auth:PROXY) via mapred (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: /user/xyzuser/.staging/job_1524652846082_0089/job_1524652846082_0089.summary
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:66)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:56)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:2041)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:2011)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1924)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:572)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getBlockLocations(AuthorizationProviderProxyClientProtocol.java:89)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2211)
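If it helps, my reading of the first line of the trace ("as:xyzuser (auth:PROXY) via mapred (auth:SIMPLE)") is that the mapred service user is impersonating xyzuser when it touches HDFS. A minimal sketch of that impersonation pattern with the Hadoop UserGroupInformation API (the class name and path are my own placeholders, not code from my job) would look like this:

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class ProxyUserCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // The service user that actually logged in ("mapred" in the trace)
        UserGroupInformation realUser = UserGroupInformation.getLoginUser();

        // Impersonate the end user; this is what "as:xyzuser (auth:PROXY) via mapred" means
        UserGroupInformation proxy = UserGroupInformation.createProxyUser("xyzuser", realUser);

        // Everything inside doAs() hits HDFS as xyzuser; it only succeeds if the
        // cluster's hadoop.proxyuser.* settings allow mapred to impersonate that user
        proxy.doAs((PrivilegedExceptionAction<Void>) () -> {
            try (FileSystem fs = FileSystem.get(conf)) {
                System.out.println(fs.exists(new Path("/user/xyzuser/.staging")));
            }
            return null;
        });
    }
}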

I would really appreciate it if someone could help me understand what the issue is.
