Member since: 12-09-2015
Posts: 103
Kudos Received: 82
Solutions: 13
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1918 | 04-17-2016 06:56 AM
 | 1148 | 04-13-2016 11:39 AM
 | 1712 | 03-17-2016 10:01 AM
 | 2875 | 02-22-2016 06:50 AM
 | 987 | 02-17-2016 09:36 AM
04-19-2016
03:19 AM
@Sadek M If I understand this correctly, you are trying to use TDE with the hdfs user. This will not work because the hdfs user is blacklisted for TDE operations. Here is a note from the Hortonworks documentation: "For separation of administrative roles, do not use the hdfs user to create encryption zones. Instead, designate another administrative account for creating encryption keys and zones." See Creating an HDFS Admin User for more information.
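Under that separation of roles, the flow looks roughly like the sketch below. The account name `encradmin`, key name `mykey`, and path `/secure` are examples, not values from this thread; the commands themselves are the standard Hadoop KMS/crypto CLI.

```shell
# As a designated encryption admin (not hdfs) - account, key name
# and zone path below are illustrative examples.
sudo -u encradmin hadoop key create mykey

# The zone directory must exist and be empty before the zone is created.
sudo -u hdfs hdfs dfs -mkdir /secure

# Create the encryption zone as the designated HDFS admin account.
sudo -u encradmin hdfs crypto -createZone -keyName mykey -path /secure
```

Note that `-createZone` still requires HDFS superuser privileges, which is why the docs say to set up a dedicated HDFS admin user rather than reusing hdfs.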
04-17-2016
06:56 AM
3 Kudos
@Sadek M I was able to resolve it after restarting the Ranger service. Ambari does not prompt for restarting the Ranger service, only for Ranger KMS. Be sure to edit the repository username from the Ranger UI by logging in as the keyadmin user; changing the user from Ambari does not work. Set the values of the properties below to *:

hadoop.kms.proxyuser.hive.users=*
hadoop.kms.proxyuser.oozie.users=*
hadoop.kms.proxyuser.HTTP.users=*
hadoop.kms.proxyuser.ambari.users=*
hadoop.kms.proxyuser.yarn.users=*
hadoop.kms.proxyuser.hive.hosts=*
hadoop.kms.proxyuser.oozie.hosts=*
hadoop.kms.proxyuser.HTTP.hosts=*
hadoop.kms.proxyuser.ambari.hosts=*
hadoop.kms.proxyuser.yarn.hosts=*
04-15-2016
11:17 AM
Same problem here on HDP 2.3.4 with Ambari 2.2.0. Changing hadoop.kms.authentication.type to simple works fine.
04-13-2016
05:52 PM
Can you help me with a working demo of enabling service-level authorization for YARN? I followed the steps in https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/ServiceLevelAuth.html#Enable_Service_Level_Authorization but it is not working: I can run YARN jobs from any user, irrespective of the ACL settings. I tried this on HDP 2.3.4.0 with Ambari 2.2.0. FYI, the Ranger plugin policies are working fine; I tried this both with and without the Ranger plugin enabled. Service-level authorization does work fine in Apache Hadoop.
04-13-2016
11:39 AM
Click on the Hosts menu in the Ambari web UI, then click on the host where you want to add the client. On the host page, click on the Add button.
04-13-2016
10:44 AM
@Safak Kapci As far as I know, you have to manually add clients using Ambari.
04-05-2016
04:49 AM
@Paul Tamburello I am wondering how difficult it will be to change the hostname of these servers back to what they were before. See this http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/set-hostname.html
04-04-2016
01:26 PM
@Benjamin Leonhardi Can you help me with a working demo of enabling service-level authorization for YARN? I have followed the steps in https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/ServiceLevelAuth.html#Enable_Service_Level_Authorization but it is not working: I can run YARN jobs from any user, irrespective of the ACL settings. I need this on HDP 2.3.4.0 with Ambari 2.2.0.
03-31-2016
02:30 PM
Thanks a lot.
03-29-2016
05:35 PM
If your cluster is Ambari-managed, then use Ambari to install Ranger.
03-28-2016
01:21 PM
@priyanka vijayakumar It depends on how much you have to spare. My laptop has 8 GB of RAM, so I have given 4 GB to the sandbox. For CPU, I have 4 cores, so I give 1 to 2 cores to the sandbox.
03-25-2016
03:43 PM
For the sandbox, NAT should be used. See this: https://community.hortonworks.com/questions/23453/yum-install-doesnt-work-couldnt-resolve-host-mirro.html#answer-23454
03-25-2016
12:10 PM
Are you using sandbox?
03-21-2016
03:45 PM
Could be a memory issue. Can you check /var/log/messages for the kernel OOM killer? Also, please give details of your environment (OS, RAM, number of nodes, etc.).
03-21-2016
03:38 PM
Have you also changed the value of min.user.id? Alternatively, you can change the id of the hive user to something above 1000.
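A minimal sketch of the second option (the new uid 1010 is just an example; pick any free uid that is consistent across your nodes):

```shell
# Raise the hive user's uid above the container-executor
# min.user.id threshold (commonly 1000).
usermod -u 1010 hive

# Files created under the old uid keep that numeric owner, so
# re-own hive's local directories (adjust paths to your layout).
chown -R hive /var/log/hive /var/lib/hive
```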
03-17-2016
10:01 AM
3 Kudos
You need to add a network adapter to the VM using NAT. Your current IP suggests that you are using a host-only network, and a host-only network will not allow internet access. See below:
03-17-2016
09:55 AM
Can you check that the hostname, FQDN, and IP address mapping is correct?
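A quick way to sanity-check this on each node (standard commands; the expected output depends on your own hosts file and DNS):

```shell
# Should print the fully qualified name of this node.
hostname -f

# Should resolve that name to the node's real, routable IP
# (not 127.0.0.1).
getent hosts "$(hostname -f)"

# Verify /etc/hosts does not map the FQDN to a loopback address.
grep -v '^#' /etc/hosts
```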
03-16-2016
04:32 AM
@Prakash Punj
Are you changing dfs.namenode.safemode.threshold-pct from Ambari web UI?
If not then do it from Ambari web UI.
Does hdfs dfsadmin -safemode leave work? Are you running that command as the hdfs user? Does it give any error?
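For reference, the usual way to check and leave safe mode (using `sudo -u hdfs` on the assumption of a typical HDP layout where hdfs is the HDFS superuser):

```shell
# Report whether safe mode is ON or OFF.
sudo -u hdfs hdfs dfsadmin -safemode get

# Force the NameNode out of safe mode.
sudo -u hdfs hdfs dfsadmin -safemode leave
```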
03-15-2016
03:34 PM
Are you able to access the NameNode web UI? If yes, what message do you see on the page? If not, please paste the contents of the latest NameNode service logs; they should be at /var/log/hadoop/hdfs.
02-26-2016
04:34 AM
3 Kudos
My understanding is that on a Kerberos-enabled cluster, the user/principal is required to be present on all the nodes. Refer to this: https://community.hortonworks.com/questions/15160/adding-a-new-user-to-the-cluster.html
02-24-2016
01:54 PM
@Saurabh Kumar I have not used this, but it is worth trying: https://issues.apache.org/jira/browse/HADOOP-8989
02-22-2016
06:50 AM
1 Kudo
In the NameNode UI, check and ensure that there are no missing or corrupt blocks. If that is the case, you can safely remove the failed disk from the DataNode. Refer to this for details.
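The same check can be done from the command line with the standard fsck tool (run as the hdfs superuser):

```shell
# Summarize filesystem health; look for "Missing blocks",
# "Corrupt blocks", and an overall Status of HEALTHY.
sudo -u hdfs hdfs fsck / | grep -E 'Missing|Corrupt|Status'

# List any files with corrupt blocks explicitly.
sudo -u hdfs hdfs fsck / -list-corruptfileblocks
```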
02-19-2016
06:37 AM
3 Kudos
@Rushikesh Deshmukh See this http://stackoverflow.com/questions/11954904/indexing-and-searching-in-hadoop
02-17-2016
09:36 AM
1 Kudo
Something is wrong on your network side. I can successfully run wget https://www.dropbox.com/s/rv43a05czfaqjlj/Tutorials-master-2.3.zip on my CentOS box. Could you please try again? You can also try http instead of https, like below:

[centos@cdh-master1 ~]$ wget http://www.dropbox.com/s/rv43a05czfaqjlj/Tutorials-master-2.3.zip
[centos@cdh-master1 ~]$ wget https://www.dropbox.com/s/rv43a05czfaqjlj/Tutorials-master-2.3.zip
--2016-02-17 14:59:16-- https://www.dropbox.com/s/rv43a05czfaqjlj/Tutorials-master-2.3.zip
Resolving www.dropbox.com... 108.160.172.238, 108.160.172.206
Connecting to www.dropbox.com|108.160.172.238|:443... connected.
HTTP request sent, awaiting response... 302 FOUND
Location: https://dl.dropboxusercontent.com/content_link/bftSrCceMqCQfKKMRuQ1nts3QapPMFe1qQb39ngRSKPDGK5l2Z5wmpGeGB1pk3iO/file [following]
--2016-02-17 14:59:18-- https://dl.dropboxusercontent.com/content_link/bftSrCceMqCQfKKMRuQ1nts3QapPMFe1qQb39ngRSKPDGK5l2Z5wmpGeGB1pk3iO/file
Resolving dl.dropboxusercontent.com... 108.160.173.133
Connecting to dl.dropboxusercontent.com|108.160.173.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 122908046 (117M) [application/zip]
Saving to: "Tutorials-master-2.3.zip"
10% [============> ] 13,040,912 77.8K/s eta 22m 14s
02-17-2016
08:39 AM
2 Kudos
@Kunal Gaikwad Looks like you are reinstalling after a failed attempt. In this case I would suggest doing a clean install rather than correcting/debugging the error. By clean install, I mean resetting your operating system. From my experience, trying to reinstall on top of a failed/partial installation is a cumbersome task.
02-17-2016
07:24 AM
1 Kudo
Which user are you running the job as?
02-15-2016
09:20 AM
3 Kudos
Just do hdfs dfs -copyFromLocal myFolder.tar.gz /hdfs/destination/path
02-15-2016
07:25 AM
2 Kudos
Error: Connection refused to 8020. Possible causes:
1. NameNode is not running. Check the NameNode log to find the error.
2. Firewall blocking connections to port 8020. Check your firewall settings; the docs recommend disabling the firewall.
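Both causes can be checked quickly on the NameNode host (standard commands; the service-management syntax below assumes a CentOS 6-era box, matching the HDP 2.3 vintage of this thread):

```shell
# Is anything listening on the NameNode RPC port (8020 by default)?
netstat -tlnp | grep 8020

# Is iptables active and possibly blocking the port?
service iptables status
```

If nothing is listening on 8020, start with the NameNode logs; if the port is listening locally but refused remotely, the firewall is the likely culprit.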