Member since: 08-08-2013
Posts: 339
Kudos Received: 132
Solutions: 27

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 14906 | 01-18-2018 08:38 AM |
|  | 1594 | 05-11-2017 06:50 PM |
|  | 9243 | 04-28-2017 11:00 AM |
|  | 3459 | 04-12-2017 01:36 AM |
|  | 2859 | 02-14-2017 05:11 AM |
01-12-2016
06:53 PM
Hi, I have an issue when installing a datanode (Ambari 2.1.2, HDP 2.2.4): it fails at the step "Datanode start" with the error:

  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

Obviously the cause of this issue already occurs at the step "Datanode install", because its last log messages are:

...
2016-01-12 19:44:17,233 - Skipping XmlConfig['core-site.xml'] due to only_if
2016-01-12 19:44:17,233 - Can only link configs for HDP-2.3 and higher.

Hmmm, strange, HDP 2.2 should be supported by recent versions of Ambari... Is this a known issue? Are there any workarounds or hints what to do? Thanks, Gerd
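For completeness, this is roughly how I would check on the affected node whether the client symlinks exist at all (just a sketch, the exact versions and output will of course differ per cluster):

# does hdp-select know about the hadoop-client component on this node?
hdp-select status hadoop-client
# do the 'current' symlink and its conf directory actually exist?
ls -ld /usr/hdp/current/hadoop-client
ls -ld /usr/hdp/current/hadoop-client/conf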
Labels:
- Apache Ambari
- Apache Hadoop
01-10-2016
06:50 PM
Hi @Margus Roo , in my previous answer I meant to check the Kerberos ticket for user 'hive', not for your personal user: sudo su - hive
kdestroy
kinit -kt <path-to-keytab> hive/sandbox.hortonworks.com
klist
and then run the beeline command again...
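If klist then shows a valid ticket, a beeline connection along these lines should work (just a sketch; the realm EXAMPLE.COM and the port 10000 are assumptions on my side, adjust them to your sandbox):

# connect to the kerberized HiveServer2 using the hive service principal
beeline -u "jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM"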
01-10-2016
04:18 PM
Hi @Margus Roo , does the hive user on the Hiveserver node have a valid Kerberos ticket as well? Try to re-init one for user 'hive'. I had a similar issue with certain versions, where the ticket for user 'hive' wasn't renewed automatically....
01-08-2016
08:15 PM
2 Kudos
Hi, due to security concerns I need to provide the Ranger WebUI via HTTPS, and I thought accessing it through Knox would be a simple approach. But I can also imagine some weird conflicts, e.g. when policies for Knox itself are configured in Ranger while Ranger is only reachable through Knox, thereby creating some kind of 'deadlock'... What do you think about that approach, is it possible at all, and what would a topology in Knox look like? Thanks for any thoughts and hints!
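Just to make the question a bit more concrete, this is how I currently imagine the service part of such a topology (purely a sketch, not tested; the RANGERUI role name and the default Ranger Admin port 6080 are assumptions on my side, and the gateway/provider section would stay as in the existing default topology):

<!-- hypothetical service entry added to an existing Knox topology -->
<service>
  <role>RANGERUI</role>
  <url>http://ranger-admin-host.example.com:6080</url>
</service>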
Labels:
- Apache Knox
- Apache Ranger
01-08-2016
07:34 PM
Thanks @Artem Ervits for your quick response... that's what I thought. Luckily, upgrading to 2.3 is on the list of to-dos.
01-08-2016
06:49 PM
1 Kudo
Hello, can Ranger 'somehow' be upgraded separately from 0.4 to 0.5 within HDP 2.2.4? Or is that only possible by upgrading the whole HDP stack to 2.3? Thanks, Gerd
Labels:
- Apache Ranger
01-03-2016
10:48 PM
Hi @jb brahma , you just have to click into the VirtualBox window (the black terminal screen) to put the focus on it. Afterwards you can get to the login screen by hitting Alt+F5. Regards, Gerd
12-27-2015
11:02 AM
Hi @Ali Gouta , I cross-checked the post you were referring to. In that post you mention you are (were) running Oracle JDK 1.7, and now you have upgraded just the Ambari node to JDK 1.8. Does this mean all the other cluster nodes are still running Oracle JDK 1.7? Also, your output of the java version and the property entry show two different versions. I'd recommend ensuring that you run the same Oracle JDK on the Ambari node as on the other cluster nodes. Compare e.g. via #>java -version
#>alternatives --display java
#>rpm -qa | grep java
on the Ambari node vs. another cluster node (see the small loop sketched below for checking several nodes at once). I got rid of that error message (I had the same issue some weeks back, but with HDP 2.2) by using Oracle JDK 1.7 throughout all the nodes. And don't forget to run #>ambari-server setup -j <your-java-home>
e.g. if your Java is installed in /usr/jdk64/oraclejdk1.7/bin/java, then #>ambari-server setup -j /usr/jdk64/oraclejdk1.7
And I totally agree with @Artem Ervits : at least fix the version mismatch on the Ambari node itself. So if you want to stick with your 1.8.0_66 version, then #>ambari-server setup -j /opt/jdk1.8.0_66/
HTH, best regards...
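The loop I mentioned above (a sketch; it assumes passwordless SSH and a plain host list, here hypothetically called cluster-hosts.txt):

# print the effective JDK of every node in the list
for h in $(cat cluster-hosts.txt); do
  echo "== $h =="
  ssh "$h" 'java -version 2>&1 | head -1; readlink -f /etc/alternatives/java'
done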
12-26-2015
08:25 PM
Hi @Ali Gouta , @gaurav sharma , which Java is your default, i.e. the one actually being used? Check via alternatives --display java
or ls -al /etc/alternatives/java
Can you ensure that you are using the Oracle JDK? AFAIK it is definitely something related to the Java kit...
12-26-2015
07:51 AM
3 Kudos
In Ambari the HiveServer2 is shown as "green", but there is an alert indicating a problem with the HiveServer2 process. This is due to a known bug in HDP 2.2.3/2.2.4: at Hive startup time no Kerberos ticket is obtained. To get rid of that alert, log in to the HiveServer node, become user hive, and execute "kinit -kt /etc/security/keytabs/hive.service.keytab hive/<hiveserver>@<REALM>" (if your keytab is in that default directory) to ensure the hive user has a valid Kerberos ticket. Afterwards you can restart HiveServer in Ambari and the alert will disappear (thanks to @dprichici for highlighting this). Regards, Gerd
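PS: the same as plain commands (keytab path is the default mentioned above; <hiveserver> and <REALM> are placeholders for your own values):

sudo su - hive
kinit -kt /etc/security/keytabs/hive.service.keytab hive/<hiveserver>@<REALM>
klist   # verify the ticket is valid before restarting HiveServer2 in Ambari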
Labels: