Member since: 09-20-2017
Posts: 22
Kudos Received: 1
Solutions: 1

My Accepted Solutions

Title | Views | Posted
---|---|---
 | 916 | 10-16-2017 02:00 AM
01-22-2019
07:28 AM
@Jay Kumar SenSharma Thanks for your reply. I deleted some logs to get the NameNode service back to a running state (the service can't start when the log volume is full). My logs grow by over 50GB per year, so I will study the log4j settings further to reduce the space they use.
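For reference, this is the kind of rolling-appender setting I plan to look at in the HDFS log4j config; a minimal sketch, assuming the stock RFA/DRFAAUDIT appender names from the Ambari hdfs-log4j template (adjust to whatever appenders your template actually defines):

```properties
# Cap the size and count of NameNode/DataNode log files (RollingFileAppender)
hadoop.log.maxfilesize=256MB
hadoop.log.maxbackupindex=20
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.MaxFileSize=${hadoop.log.maxfilesize}
log4j.appender.RFA.MaxBackupIndex=${hadoop.log.maxbackupindex}

# hdfs-audit.log rolls daily (DailyRollingFileAppender), but old
# hdfs-audit.log.YYYY-MM-DD files are never deleted automatically,
# so they still need a cleanup job or a size-based appender.
log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.DatePattern=.yyyy-MM-dd
```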
01-21-2019
02:39 AM
Hi, there are many log files in /var/log/hadoop/hdfs, with names like hdfs-audit.log.2018-06-01, and some files named like hadoop-hdfs-namenode-hostname are very large. May I delete them all?
(This host runs the NameNode, Ranger, a DataNode, Flume and a NodeManager.) Also, is there any configuration that rotates or deletes those logs automatically? Thanks.
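The cleanup I have in mind for the already-rotated files only (a sketch; the assumption is that the date-suffixed files such as hdfs-audit.log.2018-06-01 are closed, rotated copies and can be removed, while the active *.log files must be left alone while the services run):

```bash
# See which log files are taking the space
du -sh /var/log/hadoop/hdfs/* | sort -rh | head -20

# List rotated HDFS logs older than 30 days (review first, then add -delete)
find /var/log/hadoop/hdfs -name 'hdfs-audit.log.*' -mtime +30 -print
find /var/log/hadoop/hdfs -name 'hadoop-hdfs-*.log.*' -mtime +30 -print
```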
Tags:
- Hadoop Core
- HDFS
- logs
Labels:
- Apache Hadoop
10-24-2018
06:13 AM
@Jay Kumar SenSharma I tried to rebalance HDFS, but it failed and showed 0 moved / 0 left / 0 being processed. stderr: /var/lib/ambari-agent/data/errors-1216.txt (I cannot find this log), stdout: /var/lib/ambari-agent/data/output-1216.txt (I cannot find this log). It looks like nothing ran. Is any action needed before the rebalance?
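Since the Ambari-triggered run left nothing to inspect, here is how I understand the balancer can be run by hand (a sketch; the 10% threshold is just an example, and the kinit line only applies on a Kerberized cluster):

```bash
# Run the balancer as the hdfs user from any node with the HDFS client installed
sudo -u hdfs hdfs balancer -threshold 10

# Kerberized clusters: authenticate as the hdfs principal first, e.g.
# sudo -u hdfs kinit -kt /etc/security/keytabs/hdfs.headless.keytab <hdfs-principal>
```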
10-22-2018
09:11 AM
@Jay Kumar SenSharma Thanks for the reply, so it only rebalances when run manually, right?
10-22-2018
09:01 AM
Hi, I added 2 new DataNodes to the HDFS cluster, in the same rack. Will data move to the new DataNodes automatically? My old DataNodes have 10TB per node and the new DataNodes have 20TB per node. When the old DataNodes are full, will the new DataNodes only be at 50% usage, or will HDFS put more data on the larger DataNodes? Or do I need to run hdfs balancer manually when the old DataNodes fill up? Thanks.
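To watch how the space actually spreads across the old and new nodes, this is the check I have in mind (a sketch; the grep just trims the report down to the capacity and usage lines):

```bash
# Per-DataNode capacity and usage as reported by the NameNode
sudo -u hdfs hdfs dfsadmin -report | \
  grep -E 'Hostname:|Configured Capacity:|DFS Used%|DFS Remaining%'

# If the old nodes sit far above the new ones, spread the blocks out:
# sudo -u hdfs hdfs balancer -threshold 10
```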
Labels:
- Apache Hadoop
07-25-2018
05:31 AM
@Adi Jabkowsky Yes, I'm using Ambari, and it looks easy; I just have many files, so it may take a long time to replicate the data. Thanks for the reply.
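For reference, the non-Ambari route I understand this maps to (a sketch, assuming hdfs-site.xml already points dfs.hosts.exclude at an excludes file; the file path and hostname below are examples):

```bash
# 1. Add the DataNode's hostname to the HDFS excludes file (dfs.hosts.exclude)
echo "datanode-to-remove.example.com" >> /etc/hadoop/conf/dfs.exclude

# 2. Tell the NameNode to re-read the excludes and start decommissioning
sudo -u hdfs hdfs dfsadmin -refreshNodes

# 3. Wait until the node shows "Decommissioned" (its blocks are re-replicated)
sudo -u hdfs hdfs dfsadmin -report | grep -A 3 'datanode-to-remove'

# The NodeManager has its own excludes file (yarn.resourcemanager.nodes.exclude-path)
# and is refreshed separately with: yarn rmadmin -refreshNodes
```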
07-25-2018
03:41 AM
Hi everyone, I want to decommission a DataNode while keeping the other services on that host (to prevent running out of disk space). The server also runs Hive Metastore, HiveServer2, Flume and NodeManager, and I want to keep those services. How should I do this? Should I also remove the NodeManager? Thanks.
Labels:
- Apache Hadoop
10-23-2017
03:57 AM
@Vishal Gupta In my case, I changed the WEBHDFS service config from {{webhdfs_service_urls}} to http://My_NameNode:50070/webhdfs in the topology XML, and your command URL should use "ui" or "adminui" instead of "default".
10-16-2017
02:00 AM
1 Kudo
Oh! I think I solved this problem. After adding user1 I restarted all the Knox services and started the Demo LDAP, but the Demo LDAP apparently did not restart, so I stopped the Demo LDAP and started it again, and it worked! Thanks @Aditya Sirna
10-16-2017
01:44 AM
@Aditya Sirna
2017-10-16 09:40:15,499 INFO hadoop.gateway (KnoxLdapRealm.java:getUserDn(691)) - Computed userDn: uid=user1,ou=people,dc=hadoop,dc=apache,dc=org using dnTemplate for principal: user1
2017-10-16 09:40:15,509 INFO hadoop.gateway (KnoxLdapRealm.java:doGetAuthenticationInfo(203)) - Could not login: org.apache.shiro.authc.UsernamePasswordToken - user1, rememberMe=false (10.243.91.58)
2017-10-16 09:40:15,509 ERROR hadoop.gateway (KnoxLdapRealm.java:doGetAuthenticationInfo(205)) - Shiro unable to login: javax.naming.AuthenticationException: [LDAP: error code 49 - INVALID_CREDENTIALS: Bind failed: ERR_229 Cannot authenticate user uid=user1,ou=people,dc=hadoop,dc=apache,dc=org]
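Error code 49 means the bind itself was rejected, so the credentials can be checked directly against the Demo LDAP, outside Knox (a sketch; 33389 is assumed to be the Knox Demo LDAP's default port, and the DN and password are the ones used elsewhere in this thread):

```bash
# Test the LDAP bind for user1 directly against the Knox Demo LDAP
ldapsearch -h knoxHost -p 33389 \
  -D "uid=user1,ou=people,dc=hadoop,dc=apache,dc=org" -w Hadoop \
  -b "dc=hadoop,dc=apache,dc=org" "(uid=user1)"
```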
10-13-2017
06:00 AM
Hi All: when I curl via Knox I can only authenticate as admin (-u admin:admin-password); any other account gets this reply:

HTTP/1.1 401 Unauthorized
Date: Fri, 13 Oct 2017 05:45:38 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Thu, 12-Oct-2017 05:45:38 GMT
WWW-Authenticate: BASIC realm="application"
Content-Length: 0
Server: Jetty(9.2.15.v20160210)

My command:
curl -i -k -u user1:Hadoop -X PUT 'https://knoxHost:8443/gateway/default/webhdfs/v1/user1/senfile1?op=CREATE'

Folder permission:
drwxr-xr-x - user1 hdfs 0 2017-10-05 11:08 /user1

Knox users-ldif:
# entry for user1
dn: uid=user1,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: user1
sn: user1
uid: user1
userPassword:Hadoop

Ranger (sync source is Unix) HDFS config: added user1 to the default all-path policy.
Ranger Knox config: added user1 to the default all-topology, service policy.

Is anything wrong in my config?
Labels:
- Apache Knox
10-13-2017
02:34 AM
@Aditya Sirna I can access WebHDFS using curl now, but I have a problem: when I curl via Knox I can only use admin (-u admin:admin-password), and any other account gets HTTP/1.1 401 Unauthorized. I have configured Ranger and granted permission to the other account (user1), and I can create/delete files as user1 in the Ambari Files view via Knox, so user1 should have permission. My command: curl -i -k -u user1:Hadoop -X PUT 'https://knoxHost:8443/gateway/default/webhdfs/v1/user1/senfile1?op=CREATE' Folder permission: drwxr-xr-x - user1 hdfs 0 2017-10-05 11:08 /user1 Do you know what the problem is?
10-12-2017
08:25 AM
@Aditya Sirna I'm not sure; I copied the Location URL into another curl command but it did not succeed. I am checking whether my command is wrong or there is some other error, and I will report my status after testing.
10-11-2017
08:38 AM
Hi @Aditya Sirna, thanks for your reply, but I tried the following (trying to put a test file): curl -i -k -u user1:Hadoop -X PUT 'https://knoxHost:8443/gateway/default/webhdfs/v1/user1/test?op=CREATE' and got 307 Temporary Redirect. Then I typed curl -i -k -u user1:Hadoop -X PUT -T /test 'https://knoxHost:8443/gateway/default/webhdfs/v1/user1/test?op=CREATE' Is that right? I get 307 again.
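For what it's worth, this is the two-step WebHDFS CREATE flow as I understand it (a sketch; the second request has to go to the URL returned in the Location header of the 307, not back to the original URL, and /tmp/test is a placeholder file path):

```bash
# Step 1: ask where to write; expect "307 Temporary Redirect" with a Location header
curl -i -k -u user1:Hadoop -X PUT \
  'https://knoxHost:8443/gateway/default/webhdfs/v1/user1/test?op=CREATE'

# Step 2: send the file body to the Location URL from step 1
curl -i -k -u user1:Hadoop -X PUT -T /tmp/test '<Location-URL-from-step-1>'
```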
10-11-2017
07:42 AM
Hi All, to fix the Knox REST API 404 Not Found error, I modified the WEBHDFS URL from {{webhdfs_service_urls}} to http://My_NameNode:50070/webhdfs, and it worked. Then I used Ranger to control HDFS access permissions and configured the Ranger HDFS and Knox plug-ins. When I run the curl command again it now shows 307 Temporary Redirect! What I found about the 307 message says it can come from NameNode HA, where the other NameNode URL must be added to the Knox WEBHDFS setting, but that does not match my Ambari setup because I have only one NameNode (no HA). Is any setting missing in Ranger or Knox?
Labels:
- Apache Knox
- Apache Ranger
10-02-2017
03:15 AM
After the VM started, you can log in with SSH (host: 127.0.0.1, port: 2222) from your Win7 machine; the default password is root/hadoop. I could not reset the Ambari password successfully through the browser, so I typed ambari-admin-password-reset instead. After resetting the Ambari password you can log in to the Ambari manager with a browser (http://127.0.0.1:8080).
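The steps above as commands (a sketch; 127.0.0.1:2222 is the sandbox port-forward mentioned above, and root/hadoop is only the first-login default):

```bash
# SSH into the sandbox from the host OS (the first login asks you to change root's password)
ssh root@127.0.0.1 -p 2222

# Inside the sandbox: reset the Ambari admin password interactively
ambari-admin-password-reset

# Then open http://127.0.0.1:8080 in a browser and log in as admin
```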
10-02-2017
02:27 AM
@vinodh ramaswamy Are you trying to log in through the VirtualBox manager? Don't do that. After importing the HDP OVA file, just start Docker, then log in with SSH (host: 127.0.0.1, port: 2222) from the local OS, not from VirtualBox, and run ambari-admin-password-reset. After resetting the password you can log in to the Ambari manager with a browser (http://127.0.0.1:8080).
09-29-2017
07:33 AM
I'm not using the sandbox. After installing Knox, Ambari auto-created default.xml including <service> <role>WEBHDFS</role> {{webhdfs_service_urls}} </service>. Because that did not work, I tried changing it to <service> <role>WEBHDFS</role> <url>http://My_NameNode:50070</url> </service>, which still did not work. Now I have changed it to <service> <role>WEBHDFS</role> <url>http://My_NameNode:50070/webhdfs</url> </service> and I get a different error. Anyway, it seems to be past the 404 Not Found now; the problem was a wrong service URL. Many thanks for your information.
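The WEBHDFS entry that finally got past the 404, as a topology snippet (a sketch of my final setting; replace My_NameNode with your NameNode host and 50070 with your dfs.namenode.http-address port):

```xml
<service>
    <role>WEBHDFS</role>
    <url>http://My_NameNode:50070/webhdfs</url>
</service>
```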
09-29-2017
05:51 AM
Hi All, I want to use a curl command to access HDFS via Knox, but it always responds with an HTTP/1.1 404 Not Found error. Command example: curl -u admin:admin-password -i -v -k "https://KnoxHostName:8443/gateway/default/webhdfs/v1/user/hdfs&op=GETFILESTATUS" I know how to use curl to create/modify folders or files when the URL host is the NameNode on port 50070; I just don't know how to do it via Knox. Are there any other configs I need to set? My Knox version is 0.12. Thanks.
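Apart from the service-URL issue, the query string in that example also needs a '?' rather than '&' before the first parameter; a corrected form (a sketch, same host and credentials as above; GETFILESTATUS is a plain GET):

```bash
curl -i -v -k -u admin:admin-password \
  'https://KnoxHostName:8443/gateway/default/webhdfs/v1/user/hdfs?op=GETFILESTATUS'
```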
Labels:
- Apache Knox
09-20-2017
07:36 AM
@Jay SenSharma You're great. I added the new XML with the AMBARI/AMBARIUI URLs pointing to my Ambari manager (installed on a different host), and the Files view via Knox works now. Amazing, many thanks.
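For anyone following along, this is the shape of the entries I added to the Knox topology (a sketch; ambari-host:8080 stands in for my Ambari server, which runs on a different host than Knox):

```xml
<service>
    <role>AMBARI</role>
    <url>http://ambari-host:8080</url>
</service>
<service>
    <role>AMBARIUI</role>
    <url>http://ambari-host:8080</url>
</service>
```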
09-20-2017
07:04 AM
Hi All: I'm using Ambari 2.5.2 and HDP 2.6 installed from the repo (not the sandbox). I created a new lab that only runs HDFS/YARN/MapReduce for a Knox test and added a new host with only Knox installed. I can now access the Ambari manager web UI via Knox (I added the AMBARIUI config in the advanced topology), but when I click the Files view (or any other view) I get a 404 Not Found error, and trying to access an HDFS folder with a curl command also returns 404 Not Found. Is there any other configuration I need to set?
Labels:
- Apache Knox