
Issue integrating SAS with Hadoop


Hello,
I am using Cloudera Hadoop (CDH) 5.10, SAS 9.4, TKGrid, the SAS Plug-ins for Hadoop package (SASHDAT), the SAS Embedded Process for Hadoop, and RHEL 6.8.
The Hadoop cluster has 4 nodes.

I am getting this error message (from Hadoop):

[root@cloudera-node3 ~]# cat /tmp/hdfs_fail.txt
17/10/04 19:13:51 INFO hadoop.BaseService: Starting embedded DataNodeService 20141117a
17/10/04 19:13:52 INFO hadoop.BaseService: Starting DataNodeService 20141117a
17/10/04 19:13:52 INFO hadoop.BaseService: Creating configuration
17/10/04 19:13:53 INFO hadoop.BaseService: sudoCommand=sudo
17/10/04 19:13:53 INFO hadoop.BaseService: shortCircuitCommand=/opt/cloudera/parcels/CDH-5.10.1-1.cdh5.10.1.p0.10/lib/hadoop/bin/saslasrfd
17/10/04 19:13:53 INFO hadoop.BaseService: NameNode port=15452, DataNode port=15453
17/10/04 19:13:53 INFO hadoop.BaseService: Service version 6.0
17/10/04 19:13:53 INFO hadoop.BaseService: Data dirs: [/tmp/hadoop-sassrv/dfs/data]
17/10/04 19:13:53 ERROR hadoop.BaseService: Invalid rc (1) from process on host cloudera-node3.example.loc(10.104.89.39) [df, /tmp/hadoop-sassrv/dfs/data]: df: `/tmp/hadoop-sassrv/dfs/data': No such file or directory df: no file systems processed
java.io.IOException: Invalid rc (1) from process on host cloudera-node3.example.loc(10.104.89.39) [df, /tmp/hadoop-sassrv/dfs/data]: df: `/tmp/hadoop-sassrv/dfs/data': No such file or directory df: no file systems processed
at com.sas.lasr.hadoop.BaseService.invokeCommand(BaseService.java:1312)
at com.sas.lasr.hadoop.BaseService.getMountpoint(BaseService.java:1269)
at com.sas.lasr.hadoop.DataNodeService.createTempDirs(DataNodeService.java:215)
at com.sas.lasr.hadoop.DataNodeService.start(DataNodeService.java:162)
at com.sas.lasr.hadoop.DataNodeService.main(DataNodeService.java:98)
17/10/04 19:13:53 INFO hadoop.BaseService: Temp dirs: [null]
17/10/04 19:13:53 INFO hadoop.BaseService: Ready to accept commands
17/10/04 19:13:53 INFO hadoop.BaseService: Processing command 1
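
From the stack trace, the failure seems to happen when DataNodeService runs df on its configured data directory, /tmp/hadoop-sassrv/dfs/data, which does not exist on this node. Since the path lives under /tmp, I suspect it may have been removed by the periodic /tmp cleanup on RHEL. Below is a minimal sketch of what I assume would recreate it on each data node (sassrv is inferred from the hadoop-sassrv path as the service account; the sas group is a guess, adjust to your install):

[root@cloudera-node3 ~]# df /tmp/hadoop-sassrv/dfs/data                    # reproduces the df error from the log
[root@cloudera-node3 ~]# mkdir -p /tmp/hadoop-sassrv/dfs/data              # recreate the expected data directory
[root@cloudera-node3 ~]# chown -R sassrv:sas /tmp/hadoop-sassrv/dfs/data   # give ownership to the service account

I am not sure, though, whether recreating the directory by hand is the proper fix, or whether the plug-in's data directory should instead point to a persistent location outside /tmp.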
How can I resolve this issue?

Thank you.
