Member since: 04-04-2019
Posts: 11
Kudos Received: 0
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 8530 | 06-06-2019 11:09 PM
06-06-2019 11:09 PM
I added the above values and they were causing HttpFS to shut down. After deleting those values, it started and is working fine now. Thanks @Harsh J for your reply.
06-06-2019 10:45 PM
Do I need to add the above values for HttpFS? I added the HttpFS service using Cloudera Manager.
06-06-2019 10:31 PM
httpfs_kerberos_curl_error

I am getting a 404 error when I try to get the file status using httpfs_ip:14000, but with the WebHDFS port 50070 I get the result. Below is the successful command; the same request on the HttpFS port 14000 is not working.

Working (WebHDFS, 50070):
curl -i --negotiate -u : "http://gateway1.rev.com:50070/webhdfs/v1/user/root/ratemp/?op=LISTSTATUS"

Not working (HttpFS, 14000):
curl --negotiate -u : -b ~/cookiejar.txt -c ~/cookiejar.txt "http://gateway1.rev.com:14000/webhdfs/v1/user/root/ratemp/test.txt?op=LISTSTATUS"

I am using Cloudera Manager; is it required to change
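For illustration, both services implement the same webhdfs/v1 REST API, so the two requests above should differ only in host and port. A minimal Python sketch (the helper function is hypothetical; hostnames and paths are taken from the commands above):

```python
# Both WebHDFS (NameNode) and HttpFS expose the webhdfs/v1 REST API;
# a request that works against one should map to the other by swapping
# only the host and port.

def webhdfs_url(host, port, hdfs_path, op):
    """Build a webhdfs/v1 REST URL (illustrative helper)."""
    return f"http://{host}:{port}/webhdfs/v1{hdfs_path}?op={op}"

# NameNode WebHDFS endpoint (port 50070), as in the working command
print(webhdfs_url("gateway1.rev.com", 50070, "/user/root/ratemp/", "LISTSTATUS"))

# HttpFS gateway (port 14000); note the failing command targets a file
# (test.txt) rather than the directory used in the working command,
# which is one plausible source of the 404
print(webhdfs_url("gateway1.rev.com", 14000, "/user/root/ratemp/", "LISTSTATUS"))
```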
06-05-2019 04:47 AM
1. WebHDFS: with Kerberos enabled, curl on port 50070 lists the status successfully.
2. HttpFS: with Kerberos enabled, curl on port 14000 returns 404 Not Found.

The file path is the same in both curl commands. Has anyone seen this issue?
Labels:
- Kerberos
06-05-2019 03:07 AM
HttpFS (working with both private and public IP, irrespective of file size):

curl -X PUT -L -b cookie.jar "http://192.168.1.3:14000/webhdfs/v1/user/abc.csv?op=CREATE&data=true&user.name=hdfs" --header "Content-Type:application/octet-stream" --header "Transfer-Encoding:chunked" -T "abc.csv"

The above command is for a non-Kerberized cluster. I have now enabled Kerberos; what parameters should I pass to put a file to HDFS?
06-05-2019 03:01 AM
curl -X PUT -L --anyauth -u : -b cookie.jar "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?op=CREATE&data=true&user.name=hdfs" --header "Content-Type:application/octet-stream" --header "Transfer-Encoding:chunked" -T "file.csv"

Just replace httpfs_ip and file.csv with your HttpFS host and file name.
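As an aside, the query string in the command above can be sketched programmatically. A minimal Python illustration of how the CREATE URL is assembled (httpfs_ip and file.csv remain placeholders, exactly as in the answer):

```python
from urllib.parse import urlencode

# Assemble the HttpFS CREATE query string used by the curl command above.
# httpfs_ip and file.csv are placeholders to be replaced by the reader.
params = {
    "op": "CREATE",       # WebHDFS operation
    "data": "true",       # HttpFS: send the file content in this same request
    "user.name": "hdfs",  # pseudo-auth user; on a Kerberized cluster, SPNEGO
                          # (curl --negotiate -u :) handles authentication
}
url = "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?" + urlencode(params)
print(url)
```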
05-28-2019 03:22 AM
http://httpfs.server.com:14000/webhdfs/v1/user/rakesh/abc.csv?op=CREATE&user.name=hdfs

I am only able to create small files. How do I increase the buffer size for uploading files that are several GB in size?
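One common way to avoid any fixed buffer limit is to stream the file in chunks, which is what the Transfer-Encoding: chunked header in the earlier curl commands achieves with -T. A minimal Python sketch of the chunked-read side (function name and chunk size are illustrative, not from the thread):

```python
def file_chunks(path, chunk_size=1024 * 1024):
    """Yield a file in fixed-size chunks so a multi-GB file never has
    to be held in memory at once (the basis of a chunked upload)."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

# An HTTP client such as requests can stream a generator like this as a
# chunked request body, e.g.:
#   requests.put(url, data=file_chunks("abc.csv"), headers=...)
```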
04-17-2019 02:33 AM
We are creating Kudu tables using Impala, and the Kudu tablet server hard memory limit is 8 GB. After restarting Kudu, each tablet server uses only 1.2 GB of memory. Under our application load we reach 7.2 GB, and after all the queries are done it comes down to 3.6 GB. Even in the idle state it stays at 3.6 GB on all the tablet servers. Can we bring the resident memory further down?
Labels:
- Apache Impala
- Apache Kudu
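As a quick sanity check on the figures quoted in the post (8 GB hard limit, 7.2 GB peak, 3.6 GB idle), the utilization works out as follows; a minimal Python calculation:

```python
# Memory figures from the post above (all in GB).
hard_limit_gb = 8.0  # Kudu tablet server hard memory limit
peak_gb = 7.2        # observed under application load
idle_gb = 3.6        # observed after all queries finish

print(f"peak: {peak_gb / hard_limit_gb:.0%} of hard limit")  # prints "peak: 90% of hard limit"
print(f"idle: {idle_gb / hard_limit_gb:.0%} of hard limit")  # prints "idle: 45% of hard limit"
```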