Member since
02-16-2016
34
Posts
19
Kudos Received
2
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1819 | 04-04-2016 12:51 PM
 | 4419 | 04-04-2016 05:54 AM
02-15-2018
06:06 AM
I had the same issue on RHEL 7, with the error below:
Error: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_3_0_235-hdfs' returned 1. Error: Package: hadoop_2_6_3_0_235-hdfs-2.7.3.2.6.3.0-235.x86_64 (HDP-2.6-repo-101)
Requires: libtirpc-devel
You could try using --skip-broken to work around the problem
Solution: Make sure the Red Hat Enterprise Linux Server 7 Optional (RPMs) repository is enabled on all nodes.
Check whether it is enabled or disabled:
# yum repolist all
!rhui-REGION-rhel-server-optional/7Server/x86_64 Red Hat Enterprise Linux Server 7 Optional (RPMs) Disabled
Enable the optional RPMs repository:
# yum-config-manager --enable rhui-REGION-rhel-server-optional
Cross-verify with the first command that the optional RPMs repository is now enabled:
# yum repolist all
!rhui-REGION-rhel-server-optional/7Server/x86_64 Red Hat Enterprise Linux Server 7 Optional (RPMs) enabled: 13,201
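A condensed sketch of the same check/enable/verify sequence, run as root on every node; note that the repo id rhui-REGION-rhel-server-optional is specific to RHUI-based (cloud) RHEL images and may be named differently on your systems:

```
# Check whether the Optional repo is enabled (repo id taken from the output above)
yum repolist all | grep -i optional

# Enable the Optional RPMs repo (yum-config-manager comes from the yum-utils package)
yum-config-manager --enable rhui-REGION-rhel-server-optional

# Re-check: the repo should now report "enabled"
yum repolist all | grep -i optional
```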
01-16-2017
09:05 PM
I hit the same issue and resolved it by selecting "Force Overwrite" during the Hive replication setup.
06-01-2016
12:55 PM
1 Kudo
@Blanca Sanz I agree with @Robert Levas. Let me try to answer your question. As I understand it, you already have an LDAP server on premises that is integrated with your Hadoop platform, and you now want to configure strong authentication in this environment. The answer is "YES", you can. The steps to configure OpenLDAP and Kerberos with the Hadoop platform are:
* The LDAP server must exist and be configured on all nodes.
* Install the MIT Kerberos server and configure it on all nodes.
* During client configuration, provide the information for both servers.
* In "authconfig-tui", under the authentication configuration, select LDAP for user information and add Kerberos as the authentication method; leave the other options as they are, except "Use LDAP Authentication" (for more detail see the attached image: ldap-and-kerberos-conf.jpg).
* All required principals should exist and be placed in the appropriate locations.
Note: The recommendation is that strong authentication is always a good choice once your cluster is stable.
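For reference, a non-interactive equivalent of those authconfig-tui selections might look like the sketch below, assuming "Use LDAP Authentication" is meant to stay disabled; the LDAP server, base DN, KDC host, and realm are placeholders, not values from this thread:

```
# Sketch only: LDAP for user information, Kerberos for authentication,
# LDAP authentication itself left disabled (assumption based on the note above).
# ldap.example.com, dc=example,dc=com, kdc.example.com and EXAMPLE.COM are placeholders.
authconfig \
  --enableldap --disableldapauth \
  --ldapserver=ldap://ldap.example.com \
  --ldapbasedn="dc=example,dc=com" \
  --enablekrb5 \
  --krb5kdc=kdc.example.com \
  --krb5adminserver=kdc.example.com \
  --krb5realm=EXAMPLE.COM \
  --update
```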
05-24-2016
07:29 AM
1 Kudo
Knox was running fine and I was able to check it using a curl command. After I configured NameNode HA, Knox stopped working. Can you please suggest a fix?
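For context, the kind of curl check referred to above typically looks like the sketch below; the gateway host, port, topology name ("default"), and credentials are placeholders:

```
# Sketch of a basic Knox check via WebHDFS; host, port, topology and
# guest credentials are placeholders, not values from this post
curl -iku guest:guest-password \
  'https://knox-host.example.com:8443/gateway/default/webhdfs/v1/?op=LISTSTATUS'
```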
Labels:
- Apache Knox
05-20-2016
07:08 AM
1 Kudo
Can someone please provide a working demo configuration for OpenLDAP? I need it for integrating Hadoop and its ecosystem components.
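While waiting for a full demo configuration, a basic connectivity check against any OpenLDAP server can be sketched as below; the server URL, bind DN, and base DN are placeholders only:

```
# Sketch: confirm the OpenLDAP server answers and the base DN holds POSIX accounts;
# server, bind DN and base DN are placeholders
ldapsearch -x -H ldap://ldap.example.com \
  -D "cn=admin,dc=example,dc=com" -W \
  -b "dc=example,dc=com" "(objectClass=posixAccount)" uid
```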
Labels:
- Apache Ambari
- Apache Hadoop
04-11-2016
12:08 PM
1 Kudo
Hi @vinay kumar,
I think your partitioning is wrong: you are not using "/" for the HDFS directory.
If you want to use the full disk capacity, you can create any directory under "/", for example /data/1, on every DataNode using "# mkdir -p /data/1", and add it to dfs.datanode.data.dir.
Then restart the HDFS service and you should get the desired output.
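A minimal sketch of those steps on one DataNode; the hdfs:hadoop ownership is an assumption and should be adjusted to your installation:

```
# Example directory only; repeat on every DataNode
mkdir -p /data/1
# Assumed HDFS service user/group; adjust to your install
chown -R hdfs:hadoop /data/1
# Then append /data/1 to dfs.datanode.data.dir in hdfs-site.xml
# (or via your management UI) and restart the HDFS service
```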
04-04-2016
12:51 PM
1 Kudo
Resolved: I enabled Kerberos Authentication for HTTP Web-Consoles (HDFS) and regenerated the missing Kerberos credentials. After the changes I got the output below.
-bash-4.1$ hdfs crypto -createZone -keyName mykey1 -path /user/xxxx/zone1
Added encryption zone /user/xxxx/zone1
-bash-4.1$
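For completeness, the new zone can be confirmed afterwards with the listing command below (a follow-up check, not part of the original output):

```
# List all encryption zones; the new zone should appear in the output
hdfs crypto -listZones
```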
04-04-2016
10:49 AM
The cluster has at-rest encryption enabled. I am able to create keys using "# hdfs key create mykey1", but I am not able to create an encryption zone on HDFS directories. Please find the steps below for reference.
-bash-4.1$ hadoop key list
Listing keys for KeyProvider: KMSClientProvider[https://fqdn:port/kms/v1/]
mykey2
mykey1
I get the error below when I try to assign an encryption zone to an empty HDFS directory.
-sh-4.1$ hdfs crypto -createZone -keyName mykey1 -path /user/xxxx/zone1
RemoteException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
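Since the exception points at TLS trust rather than the key itself, one way to inspect which certificate the KMS endpoint presents is sketched below (a diagnostic aid only, not the resolution eventually applied; fqdn:port as in the listing above):

```
# Show subject/issuer/validity of the certificate served by the KMS endpoint
openssl s_client -connect fqdn:port </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -issuer -dates
```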
Labels:
- Cloudera Navigator Encrypt