Member since 07-28-2016
Posts: 27
Kudos Received: 8
Solutions: 0
11-22-2018
10:52 AM
Using the plain PUT command, you need to submit the curl request twice. With the --negotiate option (and -L to follow the redirect), curl does the same thing in a single submission:
curl --negotiate -u : -L "http://namenode:50070/webhdfs/v1/user/username/余宗阳视频审核稿-1024.docx?op=CREATE&user.name=username" -T 余宗阳视频审核稿-1024.docx
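For reference, a minimal sketch of the two-step flow that -L avoids (the host, user and file name below are placeholders, not the exact ones above): the first PUT asks the NameNode where to write and only returns a redirect; the second PUT uploads the data to the DataNode URL from that redirect.

# Step 1: ask the NameNode where to write; it replies with HTTP 307 and a Location header, no data is sent yet
curl -i --negotiate -u : -X PUT "http://namenode:50070/webhdfs/v1/user/username/file.docx?op=CREATE&user.name=username"
# Step 2: upload the local file to the DataNode URL returned in the Location header of step 1
curl -i --negotiate -u : -X PUT -T file.docx "<Location-header-URL-from-step-1>"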
11-22-2018
10:47 AM
I found the root cause of the issue: the request should go to the NameNode on port 50070. I was using an edge node, hence the failure. Thanks!
11-21-2018
12:50 PM
After encoding, it is working for me. But the first command still throws the error "illegal character found at index 62"; index 62 is where the filename starts in the destination path. I checked $LANG, and it is UTF-8. What was the exact output for you when you executed the first curl without encoding?
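For anyone hitting the same error, a minimal sketch of how the non-ASCII filename can be percent-encoded before it is placed in the WebHDFS URL, assuming python3 is available on the client (host, user and path are placeholders):

# Percent-encode the UTF-8 filename so the URL contains only ASCII characters
FILE='余宗阳视频审核稿-1024.docx'
ENCODED=$(python3 -c "import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1]))" "$FILE")
# Use the encoded name in the destination path; the local file passed with -T keeps its original name
curl --negotiate -u : -L -X PUT -T "$FILE" "http://namenode:50070/webhdfs/v1/user/username/${ENCODED}?op=CREATE&user.name=username"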
11-21-2018
09:54 AM
@Jagadeesan A S That's working, thanks! I am now trying to put the same file to HDFS using curl via WebHDFS and getting the error "HTTP Status 500 - Illegal character in path at index":
curl -i -H 'content-type:application/octet-stream' -H 'charset:UTF-8' -X PUT -T '余宗阳视频审核稿-1024.docx' 'http://hostname:14000/webhdfs/v1/user/username/余宗阳视频审核稿-1024.docx?op=CREATE&data=true&user.name=username&overwrite=true'
Is there any other header that needs to be passed so that the Chinese characters are recognized here?
11-20-2018
09:00 AM
I am trying to put a file into Hadoop with a filename in Chinese characters. File: 余宗阳视频审核稿-1024.docx. But the filename shows up garbled in Hadoop as Óà×ÚÑôÊÓƵÉóºË¸å-1024.docx. Any hints to solve this issue?
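That kind of garbling usually means the filename bytes were written under a non-UTF-8 client locale (e.g. GBK) and are then displayed as single-byte characters. A minimal sketch of checking and fixing the client-side locale before the put, assuming a standard Linux client with the hdfs CLI (the target directory is a placeholder):

# Check the current client locale; a non-UTF-8 locale can produce garbled names in HDFS
locale
# Hypothetical fix: switch the session to a UTF-8 locale before copying the file
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8
hdfs dfs -put 余宗阳视频审核稿-1024.docx /user/username/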
Labels:
- Apache Hadoop
07-17-2018
01:36 PM
Matt, if you have a detailed document on importing data from Salesforce to Hadoop using NiFi, please share it.
04-17-2017
05:04 AM
I have millions of records in each table and hundreds of tables, so the first option might not be optimal for big tables. I will try out the other options. Thank you.
04-13-2017
09:07 AM
Hi, is there a way to compare the whole data of a table in Hive against the same table in Oracle?
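One common approach, sketched below with hypothetical connection strings, credentials and table/column names, is to compare row counts and simple aggregates on both sides rather than pulling all rows across:

# Count and aggregates on the Hive side (beeline; JDBC URL and names are placeholders)
beeline -u "jdbc:hive2://hiveserver:10000/default" --silent=true \
  -e "SELECT COUNT(*), MIN(id), MAX(id), SUM(amount) FROM my_table;"

# The same query on the Oracle side (sqlplus; credentials and service name are placeholders)
echo "SELECT COUNT(*), MIN(id), MAX(id), SUM(amount) FROM my_table;" | \
  sqlplus -s user/password@//oraclehost:1521/service

Matching counts and aggregates do not prove the tables are identical; for a full comparison you would typically bring both datasets into one system (for example import the Oracle table with Sqoop) and join or diff them there.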
Labels:
- Apache Hive
12-02-2016
11:20 AM
Sqoop import is failing with "exception: Java Heap space" when there are no records in the Oracle source table. I used a fetch-size of 75000 in the sqoop import. The import ran successfully when I removed --fetch-size 75000, even though the source table has no records. I am standardizing the Sqoop import job so it can be reused for many other tables, and in production the job should not fail just because a table has no records. How can I avoid this situation, and why does it fail with the bigger fetch-size? Thanks.
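One common explanation is that the Oracle JDBC driver allocates its row-prefetch buffers up front based on the fetch size and the declared column widths, so a very large --fetch-size can exhaust the mapper heap before a single row is read. A hedged sketch of the usual mitigations (all connection details, sizes and paths below are placeholders):

# Option 1: use a more modest fetch size
sqoop import \
  --connect jdbc:oracle:thin:@//oraclehost:1521/service \
  --username scott --password-file /user/scott/.pw \
  --table MY_TABLE \
  --fetch-size 1000 \
  --target-dir /user/scott/my_table

# Option 2: keep the large fetch size but give each mapper more memory
sqoop import \
  -Dmapreduce.map.memory.mb=4096 \
  -Dmapreduce.map.java.opts=-Xmx3686m \
  --connect jdbc:oracle:thin:@//oraclehost:1521/service \
  --username scott --password-file /user/scott/.pw \
  --table MY_TABLE \
  --fetch-size 75000 \
  --target-dir /user/scott/my_table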
Labels:
- Apache Sqoop
11-25-2016
05:26 AM
Yes, that worked. After granting SELECT_CATALOG_ROLE privileges, the --direct option is working.
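For anyone else hitting this, a sketch of the grant that resolved it, run by a DBA (the user name is a placeholder for the account used by the Sqoop --direct import):

# Run as a DBA; SQOOP_USER is a hypothetical account name
echo "GRANT SELECT_CATALOG_ROLE TO SQOOP_USER;" | sqlplus -s / as sysdba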