Member since: 03-21-2016
Posts: 233
Kudos Received: 62
Solutions: 33
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1003 | 12-04-2020 07:46 AM
 | 1283 | 11-01-2019 12:19 PM
 | 1748 | 11-01-2019 09:07 AM
 | 2714 | 10-30-2019 06:10 AM
 | 1372 | 10-28-2019 10:03 AM
02-24-2017
08:02 AM
@Sanaz Janbakhsh Can you please accept the answer and close the thread?
01-28-2017
06:44 PM
After bringing the network interface up: did it work?
03-08-2017
08:00 AM
Only Cloudera SRPMs are available. http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.10/
01-07-2017
03:22 PM
Thank you, rguruvannagari. This solution really worked for me.
01-09-2017
04:43 AM
@Jay SenSharma Yes, SASL is enabled (set to true) in hive-site.xml, but it is still showing the error.
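For readers hitting the same issue: the metastore SASL settings live in hive-site.xml. A minimal fragment might look like the following; the principal and keytab path here are illustrative placeholders, not values from this thread:

```xml
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/security/keytabs/hive.service.keytab</value>
</property>
```

If SASL is enabled but errors persist, it is worth confirming that the principal and keytab actually resolve on the metastore host.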
01-25-2017
06:19 AM
Hey @rguruvannagari, sorry for the late reply and for taking so long on this. Since the Thrift server wasn't required for our project, we decided to stop it in the cluster. Thank you for the suggestion. I now got some free time and verified: YARN was keeping the application in the ACCEPTED state as long as memory wasn't available. Once memory became available, the application moved to the RUNNING state and I could see the Hive prompt.
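For anyone debugging the same symptom, the stuck application and the cluster's free memory can be checked from the command line; a quick sketch (paths and output depend on your cluster, so treat this as illustrative):

```shell
# List applications still waiting for resources
yarn application -list -appStates ACCEPTED

# Show all NodeManagers; compare used vs. total memory per node
yarn node -list -all
```

If every node is at capacity, the application will sit in ACCEPTED until containers free up, which matches the behavior described above.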
01-05-2017
11:07 AM
I solved the problem now, it is working! 😉 I reindexed the index with the postgres user and it works now:

ambari=# REINDEX INDEX pg_type_typname_nsp_index;
REINDEX
ambari=# \di pg_type_typname_nsp_index;
                         List of relations
   Schema   |           Name            | Type  |  Owner   |  Table
------------+---------------------------+-------+----------+---------
 pg_catalog | pg_type_typname_nsp_index | index | postgres | pg_type
(1 row)

Many thanks to all for the great support! Special thanks to Jay SenSharma. BOB
10-20-2017
03:55 PM
Hi all, I am receiving the same error, "Ambari Server Kerberos credentials check failed", when I try to start the Ambari server. My settings are the same as described above. When I try to kinit, it gives the error "kinit: Preauthentication failed while getting initial credentials". Please note that we are using Active Directory for authentication. Please let me know what to do. Thanks.
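"Preauthentication failed" against Active Directory usually points to a wrong password, a stale keytab (key version mismatch after a password reset), or an encryption-type mismatch. A few hedged checks; the principal and keytab path below are placeholders for your environment:

```shell
# Trace the Kerberos exchange to see exactly which step fails
KRB5_TRACE=/dev/stdout kinit ambari-server@EXAMPLE.COM

# If a keytab is used, list its principals, key versions, and enctypes
klist -kte /etc/security/keytabs/ambari.server.keytab

# Then try authenticating with the keytab directly
kinit -kt /etc/security/keytabs/ambari.server.keytab ambari-server@EXAMPLE.COM
```

If the keytab kinit also fails with a preauthentication error, the keytab likely needs to be regenerated from AD.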
02-08-2017
12:20 PM
1 Kudo
I now have a working Livy running; at least sc.version works. After trying everything I could find with Livy 0.2.0 (the version in HDP 2.5.0), I decided to give 0.3.0 a try. I believe the problem is caused by a bug in Spark 1.6.2 when connecting to the metadata store. After compiling Livy 0.3.0 with Hadoop 2.7.3 and Spark 2.0.0, and installing it beside 0.2.0, I had problems creating credentials for the HTTP principal. I solved that by using the Hadoop jars from Livy 0.2.0 instead of those from the build.
08-22-2016
02:18 PM
@bigdata.neophyte The -t option does not seem to be available in Hadoop 2.7.1, which is included with HDP 2.4.2. This option is included in Hadoop 2.7.3, which ships with the HDP 2.5 tech preview. [hdfs@hdp1 ~]$ hadoop version
Hadoop 2.7.3.2.5.0.0-1133
Subversion git@github.com:hortonworks/hadoop.git -r 93bf28063ef319be6833d3d6f117d44e0b6b8fa9
Compiled by jenkins on 2016-08-03T11:38Z
Compiled with protoc 2.5.0
From source with checksum 1aed9e48ca6f7cd4ada3a36b1cd5feb
This command was run using /usr/hdp/2.5.0.0-1133/hadoop/hadoop-common-2.7.3.2.5.0.0-1133.jar
[hdfs@hdp1 ~]$ hdfs dfs -ls -t -help
-ls: Illegal option -help
Usage: hadoop fs [generic options] -ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]
[hdfs@hdp1 ~]$ hdp-select versions
2.5.0.0-1133
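To make the usage concrete: on a 2.7.3-based cluster the sort flags from the usage line above can be combined. The path below is illustrative:

```shell
# Newest files first (sort by modification time)
hdfs dfs -ls -t /tmp

# Oldest first: -r reverses the sort order
hdfs dfs -ls -t -r /tmp
```

On 2.7.1 (HDP 2.4.2), the same commands would reject -t as an illegal option, which is the behavior the original question was about.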