Member since: 02-02-2016
Posts: 583
Kudos Received: 518
Solutions: 98
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1340 | 09-16-2016 11:56 AM
 | 700 | 09-13-2016 08:47 PM
 | 2673 | 09-06-2016 11:00 AM
 | 1444 | 08-05-2016 11:51 AM
 | 2784 | 08-03-2016 02:58 PM
06-01-2017
07:52 PM
Hi Team, while trying to install a standalone Ambari server on one of the existing edge nodes, I got the error below every time I ran the ambari-server setup command:

SELinux status is 'disabled'
Customize user account for ambari-server daemon [y/n] (n)?
ERROR: Unexpected error 'getpwuid(): uid not found: 55025'
ERROR: Exiting with exit code 1.
REASON: Failed to create user. Exiting.

Also, I don't see UID "55025" in /etc/passwd and am wondering where it's picking this UID up from. Thanks
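For context on what that message means: ambari-server setup is a Python script, and "getpwuid(): uid not found" is the wording of the KeyError that Python's pwd.getpwuid() raises when a numeric UID has no passwd entry on the host (the UID may be coming from a directory service such as LDAP/SSSD rather than the local /etc/passwd, which would explain why it isn't visible there). A minimal sketch of that lookup — the helper name is made up for illustration:

```python
import pwd

def lookup_uid(uid):
    """Return the passwd entry for a numeric UID, or None if it doesn't exist."""
    try:
        return pwd.getpwuid(uid)
    except KeyError:
        # This is the condition behind "getpwuid(): uid not found: <uid>"
        return None

# Pick a UID guaranteed not to exist on this machine for the demonstration.
known = {entry.pw_uid for entry in pwd.getpwall()}
missing = max(known) + 1000

assert lookup_uid(missing) is None   # unknown UID -> "uid not found"
assert lookup_uid(0) is not None     # root always has a passwd entry
```

On the failing node, checking `getent passwd 55025` (which also consults LDAP/SSSD) versus grepping /etc/passwd directly can show where the UID is defined.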
05-30-2017
08:19 PM
@Constantin Stanca There were no errors in Hive/HDFS/YARN etc. during that time frame. I even tried restarting all the services, but it didn't help. When I restarted the Ambari server, though, the issue got resolved; I don't know how this exception is related to an Ambari restart 🙂
05-30-2017
05:51 PM
1 Kudo
Hi, I'm getting the error below while running a select query in Hive View. I checked the proxy settings and everything looks fine; this is a custom Hive View on a secured cluster with SSL-enabled Hadoop. Can anyone suggest where to look?

java.lang.NullPointerException
    at org.apache.ambari.view.hive2.resources.jobs.JobService.getOne(JobService.java:139)
    at sun.reflect.GeneratedMethodAccessor561.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
12-18-2016
04:40 PM
Thanks @nyakkanti, I also had a similar issue and used one of the custom ports available in port forwarding.
09-21-2016
11:54 AM
Thanks @Ayub Pathan for the help; it worked with the below syntax, as per your suggestion. RULE:[1:$1](4[0-9]*)s/^4/d/
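For anyone landing here later: a Hadoop auth_to_local rule of the form RULE:[1:$1](regex)s/pattern/replacement/ only fires when the extracted short name matches the filter regex in full, and then applies the sed-style substitution. A rough Python sketch of what RULE:[1:$1](4[0-9]*)s/^4/d/ does to a short name (the example IDs are made up; the real evaluation happens inside Hadoop's KerberosName class, which can be tested on a cluster with `hadoop org.apache.hadoop.security.HadoopKerberosName <principal>`):

```python
import re

def apply_rule(short_name, filter_re, sub_from, sub_to):
    """Emulate RULE:[1:$1](filter_re)s/sub_from/sub_to/ for one short name."""
    # The rule applies only if the whole short name matches the filter regex.
    if re.fullmatch(filter_re, short_name) is None:
        return None  # rule does not fire
    return re.sub(sub_from, sub_to, short_name)

# RULE:[1:$1](4[0-9]*)s/^4/d/ maps names like 41234567 to d1234567.
assert apply_rule("41234567", r"4[0-9]*", r"^4", "d") == "d1234567"
# A name that doesn't start with 4 falls through to other rules.
assert apply_rule("51234567", r"4[0-9]*", r"^4", "d") is None
```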
09-20-2016
11:57 AM
Hi @Ayub Pathan, I'm still not able to convert the HDFS auth rule, i.e. RULE:[1:$1](4[0-9]{7,8})s/^4/d/, into an Atlas auth rule. Ideally the same rule should work.
09-16-2016
08:47 PM
Thanks @Constantin Stanca. But I'm wondering why Ambari 2.2.2 has a new feature (Grafana) even though it was a maintenance release.
09-16-2016
12:06 PM
Hi, I'm trying to implement an auth rule in Atlas by copying a rule from the HDFS auth rules, i.e. RULE:[1:$1](4[0-9]{7,8})s/^4/d/. However, this rule syntax doesn't seem to work for Atlas, though the same rule works for HDFS. Please help.
09-16-2016
11:56 AM
Thanks @Mats Johansson. So, as per the doc (<major>.<minor>.<maintenance>), if I'm using Ambari 2.2.1 and want to upgrade to 2.2.2, that will be a maintenance release upgrade with no new feature additions, right?
09-15-2016
04:04 PM
1 Kudo
Hi, can somebody explain the Ambari version naming standard? I'm looking for something similar to this HDP explanation: https://community.hortonworks.com/questions/41422/question-on-hdp-versioning.html
Labels: Apache Ambari
09-13-2016
08:47 PM
1 Kudo
I found the issue: we had enabled SPNEGO for Atlas, so it was checking the spnego.service.keytab file. Somehow the "atlas" user got removed from the application group and therefore didn't have read permission on spnego.service.keytab. Once we gave it read permission, the issue was resolved. Thanks
09-13-2016
08:12 PM
1 Kudo
After enabling Kerberos we are seeing the below errors in the Atlas application log:

WARN - [3e0a832b-bc1f-43bf-ab70-fe616747cf1a:] ~ Authentication exception: GSSException: Failure unspecified at GSS-API level (Mechanism level: Invalid argument (400) - Cannot find key of appropriate type to decrypt AP REP - RC4 with HMAC) (AuthenticationFilter:586)

Below is the klist output:

klist -kte atlas.service.keytab
Keytab name: FILE:atlas.service.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   4 01/01/70 01:00:00 atlas/xxx (DES cbc mode with CRC-32)
   4 01/01/70 01:00:00 atlas/xxx (DES cbc mode with RSA-MD5)
   4 01/01/70 01:00:00 atlas/xxx (ArcFour with HMAC/md5)
   4 01/01/70 01:00:00 atlas/xxx (AES-256 CTS mode with 96-bit SHA-1 HMAC)
   4 01/01/70 01:00:00 atlas/xxx (AES-128 CTS mode with 96-bit SHA-1 HMAC)

I checked the Kerberos KVNO number for that principal from both the KDC and the keytab file, and they are the same. Please guide.
Labels: Apache Atlas
09-06-2016
12:35 PM
Please accept the answer if the issue got resolved, or let us know if you are still facing the same issue.
09-06-2016
11:23 AM
Can you please try running these two hadoop commands as the hdfs user?

hadoop fs -mkdir /user/root
hadoop fs -chown root:root /user/root
09-06-2016
11:00 AM
1 Kudo
As the hdfs user, run: hadoop fs -chown root:root /user/root — and then please run the hive shell again as the root user.
08-11-2016
05:49 PM
2 Kudos
Hi @Bala Vignesh N V The above approach is pretty good and works very well when you have a small number of files, but what if you have thousands or millions of files in the directories? In that case it's better to use the Hadoop MapReduce framework to do the same job on large files in less time. Below is an example of counting lines using MapReduce: https://sites.google.com/site/hadoopandhive/home/hadoop-how-to-count-number-of-lines-in-a-file-using-map-reduce-framework
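As a rough illustration of the idea (this is not the code from the linked page), a streaming-style line count boils down to a mapper that emits one counter per input line and a reducer that sums the counters; both halves can be exercised locally before running them on a cluster:

```python
def mapper(lines):
    """Emit a ("lines", 1) pair for every input line, as a streaming mapper would."""
    for _ in lines:
        yield ("lines", 1)

def reducer(pairs):
    """Sum the counts per key; here there is only one key, "lines"."""
    totals = {}
    for key, count in pairs:
        totals[key] = totals.get(key, 0) + count
    return totals

# Local check on a small in-memory "file".
sample = ["first line", "second line", "third line"]
assert reducer(mapper(sample)) == {"lines": 3}
```

On a real cluster the two functions would be wired up through Hadoop Streaming with the HDFS file as input; the sketch only shows the map/reduce shape, which is what makes the approach scale to millions of lines.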
08-05-2016
01:48 PM
I would highly suggest upgrading your Ambari server to the latest release. As a workaround, restart Ambari (server and agents) as well as Ambari Metrics and see if that resolves the issue.
08-05-2016
11:51 AM
1 Kudo
Hi @sankar rao This seems to be a known issue with Ambari. Please refer to the bug below for more info and see if the version matches: https://issues.apache.org/jira/browse/AMBARI-12931
08-03-2016
03:43 PM
As mentioned in my previous comment, can you please check inside the /usr/hdp/current/hive-server2/lib/ directory? If it's not there, then try copying all the Ranger plugin jars from /usr/hdp/2.3.4.7/ranger-hive-plugin/lib/ to /usr/hdp/current/hive-server2/lib/ and restart HiveServer2.
08-03-2016
02:58 PM
1 Kudo
Looks like the Ranger plugin jar is not present under the HS2 lib directory. Can you please cross-check whether the below jar exists under /usr/hdp/current/hive-server2/lib/? ranger-hive-plugins-<version>.jar
08-03-2016
10:09 AM
3 Kudos
Hi @Jaime Currently HDP 2.4.2 only supports Spark 1.6, therefore you won't see Spark 2.0 in the repository. We don't recommend installing versions that are not tested with a specific HDP release. https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_HDP_RelNotes/content/ch_relnotes_v242.html However, if you are looking to test Spark 2.0 on HDP, you can download the Spark source code and build it with HDP dependencies.
07-29-2016
10:47 AM
2 Kudos
I'm assuming you are talking about Pig Java UDFs; here are some good tutorials for the same:
https://cwiki.apache.org/confluence/display/PIG/How+to+set+up+Eclipse+environment
http://www.tutorialspoint.com/apache_pig/apache_pig_user_defined_functions.htm
http://www.hadooptpoint.com/how-to-write-pig-udf-example-in-java/
For Pig plugins: https://cwiki.apache.org/confluence/display/PIG/PigTools
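The links above cover Java UDFs; for completeness, Pig can also run Python UDFs through Jython, which are often quicker to prototype. A minimal sketch — the file and function names here are made up, and the registration lines are shown only as comments:

```python
# myudfs.py - a trivial Pig Python UDF that upper-cases a chararray.
# In a Pig script it would be registered (under Jython) roughly as:
#   REGISTER 'myudfs.py' USING jython AS myudfs;
#   B = FOREACH A GENERATE myudfs.to_upper(name);
def to_upper(value):
    """Return the upper-cased string, or None for a null input field."""
    if value is None:
        return None
    return value.upper()

# Because the UDF is plain Python, it can be unit-tested outside Pig:
assert to_upper("hadoop") == "HADOOP"
assert to_upper(None) is None
```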
07-29-2016
10:26 AM
1 Kudo
You can run the below commands to find the HDP version.

hdp-select

or

rpm -qa | grep hadoop
07-28-2016
01:27 PM
A manual change won't work because when you restart the service through Ambari, the configs will get overwritten. I believe the only remaining option is adding it to hive-site.xml through Ambari and restarting the service.
07-28-2016
01:11 PM
Yes, under the same location, just after the below line: # Folder containing extra libraries required for hive compilation/execution can be controlled by:
07-28-2016
12:49 PM
Hi @Hocine Bouzelat, please accept this answer if the suggested workaround worked for you.
07-28-2016
11:59 AM
Looks like it still didn't pick up the jar file; let's try another approach. In the hive-env.sh file through Ambari, add: export HIVE_AUX_JARS_PATH=<jar1 file path>,<jar2 file path> And restart the services.