Reply
New Contributor
Posts: 4
Registered: ‎06-21-2016
Accepted Solution

How to copy files from remote windows system to HDFS

Currently I am using the spooldir source to copy files from the local file system to HDFS, but now I want to copy files from a remote Windows system.

Can someone suggest which Flume source I can use to copy files from a remote Windows system to HDFS, where I can specify a username and password?

Explorer
Posts: 14
Registered: ‎05-06-2014

Re: How to copy files from remote windows system to HDFS

It may be a bit of a long shot, but you could mount the directories of your remote Windows server on your local server using Samba and then copy the files to HDFS from the command line.
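A minimal sketch of that approach, assuming a hypothetical Windows host "winhost" exposing a share named "data", a Linux host with cifs-utils and the HDFS client installed, and placeholder credentials and paths:

# Mount the Windows (SMB/CIFS) share on the Linux host
mkdir -p /mnt/windata
mount -t cifs //winhost/data /mnt/windata -o username=winuser,password=secret

# Copy the mounted files into HDFS with the standard HDFS client
hdfs dfs -put /mnt/windata/abc.csv /user/rakesh/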

Posts: 177
Topics: 8
Kudos: 28
Solutions: 19
Registered: ‎07-16-2015

Re: How to copy files from remote windows system to HDFS

In the HDFS service there is an "NFS Gateway" role that lets you mount an NFS image of HDFS.

That is one way to do it (you can copy files directly onto the mount), but check the performance.
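As a rough sketch (the gateway hostname and paths below are placeholders), once the NFS Gateway role is running you can mount it and copy files with ordinary file-system tools:

# Mount the HDFS NFS gateway (NFSv3 only) on a local mount point
mkdir -p /mnt/hdfs
mount -t nfs -o vers=3,proto=tcp,nolock nfs-gateway.example.com:/ /mnt/hdfs

# Copy a file into HDFS through the mount
cp /data/abc.csv /mnt/hdfs/user/rakesh/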

Hue (the web UI) also lets you upload files into HDFS (this is a more manual approach).

In our enterprise, for an automated process, we use a custom Java application that uses the HCatWriter API to write into Hive tables.

But you can also use HttpFS or WebHDFS.
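For example, with plain WebHDFS a file upload is a two-step call (the hostnames and file name below are placeholders only): the first request to the NameNode returns an HTTP 307 redirect, and the second request streams the file to the DataNode named in that redirect.

# Step 1: ask the NameNode where to write; the reply is a 307 redirect with a Location header
curl -i -X PUT "http://namenode.example.com:50070/webhdfs/v1/user/rakesh/abc.csv?op=CREATE&user.name=hdfs"

# Step 2: send the file content to the DataNode URL returned in the Location header
curl -i -X PUT -T abc.csv "<URL from the Location header>"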

Explorer
Posts: 11
Registered: ‎04-04-2019

Re: How to copy files from remote windows system to HDFS using HTTPFS

I am using the HttpFS call below to create files in HDFS:

http://httpfs.server.com:14000/webhdfs/v1/user/rakesh/abc.csv?op=CREATE&user.name=hdfs


With that call I am able to create only small files.

How can I increase the buffer size so that I can upload files that are several GB in size?


Champion
Posts: 777
Registered: ‎05-16-2016

Re: How to copy files from remote windows system to HDFS using HTTPFS

I didn't quite understand your requirement.
Could you provide some more information?
Explorer
Posts: 11
Registered: ‎04-04-2019

Re: How to copy files from remote system to HDFS using HTTPFS

HTTPFS:


***** Working with both private IP and public IP, irrespective of file size *****
curl -X PUT -L -b cookie.jar "http://192.168.1.3:14000/webhdfs/v1/user/abc.csv?op=CREATE&data=true&user.name=hdfs" --header "Content-Type:application/octet-stream" --header "Transfer-Encoding:chunked" -T "abc.csv"

The above command is for a non-Kerberized cluster. I have now enabled Kerberos; what parameters should I pass to put a file into HDFS?

Posts: 1,903
Kudos: 435
Solutions: 307
Registered: ‎07-31-2013

Re: How to copy files from remote system to HDFS using HTTPFS

@Reavidence,

HttpFS with Kerberos requires SPNEGO authentication. Per https://www.cloudera.com/documentation/enterprise/latest/topics/cdh_sg_httpfs_security.html, with curl (after running kinit) this can be done by passing the two parameters below:

"""
The '--negotiate' option enables SPNEGO in curl.
The '-u :' option is required but the username is ignored (the principal that has been specified for kinit is used).
"""
Explorer
Posts: 11
Registered: ‎04-04-2019

Re: How to copy files from remote system to HDFS using HTTPFS

(screenshot: httpfs_kerberos_error.png, showing the curl Kerberos error from HttpFS)

I am getting a 404 error when I try to get the file status via HttpFS on httpfs_ip:14000.

But with the WebHDFS port 50070 I do get the result. Below is the successful command; the equivalent against HttpFS on port 14000 is not working.

***** Working *****

WebHDFS (port 50070):

curl -i --negotiate -u : "http://gateway1.rev.com:50070/webhdfs/v1/user/root/ratemp/?op=LISTSTATUS"

***** Not working *****

HttpFS (port 14000):

curl --negotiate -u : -b ~/cookiejar.txt -c ~/cookiejar.txt http://gateway1.rev.com:14000/webhdfs/v1/user/root/ratemp/test.txt?op=LISTSTATUS

I am using Cloudera Manager; is it required to change

Explorer
Posts: 11
Registered: ‎04-04-2019

Re: How to copy files from remote system to HDFS using HTTPFS

[ Edited ]

(screenshot: the configuration values referred to below)

I had added the values shown above, and they were causing HttpFS to shut down. After deleting those values, it started up and is working fine now.

Thanks @Harsh J  for your reply.