Support Questions
Find answers, ask questions, and share your expertise

How to mount hdfs on remote linux system?


New Contributor

 

Hello,

 

I downloaded the latest version of the QuickStart VM.

My goal is to mount the HDFS file system on a remote CentOS Linux system on the same LAN/subnet.

I am a newbie to Hadoop.

 

The steps I have taken so far are as follows:

 

- Created a folder /export on the QuickStart VM

- Ran the command 'hadoop-fuse-dfs dfs://127.0.0.1:8020 /export/'

- After running ls -ltr, I see the HDFS mount point listed as 'drwxr-xr-x 7 hdfs nobody 4096 Jul 31 17:49 export'

- I am able to create directories and files inside /export

- Now I want to mount '/export' on the remote CentOS system.

- I used the following command on the remote system: 'mount -t nfs -o vers=3,proto=tcp,nolock 10.97.99.80:/export /hdfs', where /hdfs is a directory on the CentOS client system.

 

I am getting the error 'mount.nfs: requested NFS version or transport protocol is not supported'.
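The sequence above can be sketched end to end as follows (a sketch only: 10.97.99.80 is the VM's address as given in the question, and 8020 is the default NameNode port; the final mount fails with the reported error because the FUSE mount on the VM is a local mount, not an NFS export):

```shell
# On the QuickStart VM: create a mount point and FUSE-mount HDFS
sudo mkdir -p /export
sudo hadoop-fuse-dfs dfs://127.0.0.1:8020 /export

# Verify the mount; files and directories under /export map to HDFS
ls -ltr /

# On the remote CentOS client: this plain NFS mount fails, because
# nothing on the VM is serving /export over the NFS protocol
sudo mkdir -p /hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock 10.97.99.80:/export /hdfs
```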

 

Can someone help me resolve this issue?

 

Thanks & Regards,

Amey.

2 REPLIES

Re: How to mount hdfs on remote linux system?

Master Guru
You can install CDH on the remote machine and do the regular fuse mount on it.
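A minimal sketch of that first option, FUSE-mounting HDFS directly on the remote machine. This assumes the CDH FUSE package is installed there and that 10.97.99.80 (the VM address from the question) is reachable as the NameNode host:

```shell
# On the remote CentOS machine, after installing the hadoop-hdfs-fuse package:
sudo mkdir -p /export

# Point the FUSE mount at the QuickStart VM's NameNode, not 127.0.0.1
sudo hadoop-fuse-dfs dfs://10.97.99.80:8020 /export

# Optionally make the mount persistent via an /etc/fstab entry, e.g.:
# hadoop-fuse-dfs#dfs://10.97.99.80:8020 /export fuse allow_other,usetrash,rw 2 0
```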

If you require NFS, then you instead need to follow the NFS gateway deployment guide at http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_nfsv3_gateway_co...
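With the NFS route, an HDFS NFS Gateway must be running on a cluster node before the client can mount anything. A rough sketch using the stock Hadoop commands (exact service names and startup steps differ by CDH version, so treat this as an outline of the guide rather than the guide itself):

```shell
# On the cluster node acting as the NFS gateway:
# stop the system portmap/rpcbind, then start Hadoop's portmap and nfs3 services
sudo service rpcbind stop
sudo hdfs portmap &
sudo hdfs nfs3 &

# On the remote CentOS client: mount the gateway's export.
# The gateway exports the HDFS root "/", not an arbitrary path like /export.
sudo mkdir -p /hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock 10.97.99.80:/ /hdfs
```

Note that the original mount options (vers=3, proto=tcp, nolock) are correct for the gateway; the earlier failure came from pointing them at a machine that was not running an NFS server.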

Re: How to mount hdfs on remote linux system?

New Contributor

The link provided by the engineer does not work.

Thanks