Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 616 | 06-04-2025 11:36 PM |
| | 1182 | 03-23-2025 05:23 AM |
| | 585 | 03-17-2025 10:18 AM |
| | 2192 | 03-05-2025 01:34 PM |
| | 1376 | 03-03-2025 01:09 PM |
02-18-2019
05:04 PM
Hi @Geoffrey Shelton Okot: do you see anything I could do to troubleshoot this problem?
02-10-2019
10:14 PM
1 Kudo
@Michael Bronson HWX doesn't recommend upgrading an individual HDP component: one never knows what incompatibilities could impact the other components, and selective component upgrades tend to be a nightmare during a version upgrade. The latest HDP Kafka version is 2.1.x, delivered by HDP 3.1, but ASF has its own release schedule and naming convention. HTH
02-05-2019
10:31 AM
@Geoffrey Shelton Okot But Geoffrey, whatever you are suggesting already exists on the respective servers. You are saying to create /datadrv2 on .50 and .51, but as I mentioned earlier, it is already present. Please see this again:

on .50: /datadrv1, /datadrv2, /datadrv3
on .51: /datadrv1, /datadrv2
on .52: /data1
on .53: /datadrv1
on .54: /data
on .55: /data1

These directories are already there on the respective servers, so when data comes into HDFS, it is automatically written to the paths mentioned in the HDFS config: {/datadrv1/hadoop/hdfs/data,/data1/hadoop/hdfs/data}. But if a path does not exist (suppose /datadrv1 is not there on .52), then HDFS creates /datadrv1 under the root filesystem and puts the data there. That is why root space is getting full: the data should be going to the mentioned directories, but it is not. And the same happens on other servers too. Do you see my error now?
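A quick way to confirm where each configured data dir actually lands is to check its backing filesystem (the directory list below is taken from the HDFS config above; `df --output` assumes GNU coreutils):

```shell
# For each configured DataNode data dir, print the filesystem it actually
# lives on; a dir that resolves to "/" is silently filling the root partition.
for d in /datadrv1/hadoop/hdfs/data /data1/hadoop/hdfs/data; do
  if [ -d "$d" ]; then
    echo "$d -> mounted on: $(df --output=target "$d" | tail -1)"
  else
    echo "$d -> MISSING (writes would land on the root filesystem)"
  fi
done
```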
02-10-2019
10:37 PM
@Sampath Kumar Any updates? Did this article help you?
02-05-2019
08:14 PM
@Ruslan Fialkovsky There is a patch attached. Did you update your code?
02-06-2019
01:03 PM
@Chris Jenkins My pleasure, I'm glad I made your day, and welcome to the Big Data space. Having to go through all of this will make you better technically; you've now seen the different facets of resolving a problem. Happy Hadooping!
02-07-2019
03:42 PM
@Shraddha Singh Here 'machine' is the FQDN and {rangerkms_password} is the rangerkms user's password. The FQDN is the output of hostname -f. Re-run the commands below and let me know:

grant all privileges on rangerkms.* to 'rangerkms'@'machine' identified by '{rangerkms_password}';
grant all privileges on rangerkms.* to 'rangerkms'@'machine' with grant option;
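The two grants can be generated with the real FQDN already substituted in; this is only a sketch, and '{rangerkms_password}' remains a placeholder you must replace before piping the output to the mysql client:

```shell
# Sketch: emit the GRANT statements with the host's FQDN filled in.
# '{rangerkms_password}' is a placeholder for your actual password.
FQDN=$(hostname -f)
cat <<EOF
GRANT ALL PRIVILEGES ON rangerkms.* TO 'rangerkms'@'${FQDN}' IDENTIFIED BY '{rangerkms_password}';
GRANT ALL PRIVILEGES ON rangerkms.* TO 'rangerkms'@'${FQDN}' WITH GRANT OPTION;
FLUSH PRIVILEGES;
EOF
```

Afterwards, `SHOW GRANTS FOR 'rangerkms'@'<fqdn>';` in the mysql client is a quick way to verify the grants took effect.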
02-12-2019
01:07 PM
Logs from Infra Solr now constantly show errors like the one below:

2019-02-12T10:20:33,548 [zkCallback-7-thread-4] WARN [c:hadoop_logs s:shard5 r:core_node19 x:hadoop_logs_shard5_replica_n16] org.apache.solr.update.PeerSync (PeerSync.java:489) - PeerSync: core=hadoop_logs_shard5_replica_n16 url=http://ip-10-241-10-96:8886/solr exception talking to http://ip-10-241-10-72:8886/solr/hadoop_logs_shard5_replica_n18/, failed
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://ip-10-241-10-72:8886/solr/hadoop_logs_shard5_replica_n18: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 403 GSSException: Failure unspecified at GSS-API level (Mechanism level: Invalid argument (400) - Cannot find key of appropriate type to decrypt AP REP - RC4 with HMAC)</title>
</head>
<body><h2>HTTP ERROR 403</h2>
<p>Problem accessing /solr/hadoop_logs_shard5_replica_n18/get. Reason:
<pre> GSSException: Failure unspecified at GSS-API level (Mechanism level: Invalid argument (400) - Cannot find key of appropriate type to decrypt AP REP - RC4 with HMAC)</pre></p>
</body>
</html>
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:607) ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:14]
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255) ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:14]
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:14]
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219) ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:14]
at org.apache.solr.handler.component.HttpShardHandler.lambda$submit$0(HttpShardHandler.java:172) ~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:13]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112]
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) ~[metrics-core-3.2.6.jar:3.2.6]
at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209) ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz - 2018-06-18 16:55:14]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
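For what it's worth, the "Cannot find key of appropriate type to decrypt AP REP - RC4 with HMAC" message generally means the service keytab has no key matching a negotiated encryption type. A krb5.conf sketch of the relevant settings (the enctype values here are illustrative only and must match how the keytabs were actually created):

```
# /etc/krb5.conf (fragment) -- illustrative enctype lists, not a recommendation
[libdefaults]
  default_tkt_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 rc4-hmac
  default_tgs_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 rc4-hmac
  permitted_enctypes   = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 rc4-hmac
```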
02-10-2019
10:43 PM
@Dukool SHarma Any updates?
01-29-2019
01:50 AM
Thanks again! I do believe I found my issue. The repos were not complete and accurate on my Ubuntu 18.04 builds, so I just copied the repos from my Xenial 16.04 box, replaced xenial with bionic, and after an update I was able to install the Kerberos client.
Here was my final repo list for Ubuntu 18.04:
deb http://us.archive.ubuntu.com/ubuntu/ bionic main restricted
deb http://us.archive.ubuntu.com/ubuntu/ bionic-updates main restricted
deb http://us.archive.ubuntu.com/ubuntu/ bionic universe
deb http://us.archive.ubuntu.com/ubuntu/ bionic-updates universe
deb http://us.archive.ubuntu.com/ubuntu/ bionic multiverse
deb http://us.archive.ubuntu.com/ubuntu/ bionic-updates multiverse
deb http://us.archive.ubuntu.com/ubuntu/ bionic-backports main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu bionic-security main restricted
deb http://security.ubuntu.com/ubuntu bionic-security universe
deb http://security.ubuntu.com/ubuntu bionic-security multiverse