Member since: 02-18-2020
Posts: 29
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3220 | 07-23-2020 11:38 PM
07-29-2020
06:01 AM
@Shelton I tried this, and I get the following error: Error remove old version
07-28-2020
01:36 AM
Restoring /usr/hdp/3.1.0.0-78/oozie/conf/oozie-client-env.sh into /usr/hdp/3.1.4.0-315/oozie/conf/ is not the solution. It seems to be related to BUG-123169 (https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/release-notes/content/known_issues.html).
07-28-2020
01:34 AM
Thanks for this reply. I want to upgrade to version 3.1.4 because downloading the packages for version 3.1.5 requires a login/password that I don't have. Regarding the red flag on Oozie, I think it's due to BUG-123169, which I will try to work around by upgrading bigtop-tomcat, as sketched below.
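For reference, a minimal sketch of that workaround on Ubuntu, assuming the HDP 3.1.4 apt repository is already configured on the node and that the package is named bigtop-tomcat (both assumptions on my side):

# refresh the package index, then upgrade only bigtop-tomcat
sudo apt-get update
sudo apt-get install --only-upgrade bigtop-tomcat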
07-27-2020
06:26 AM
I have a question about the upgrade of HDP 3.1.0.0-78 to 3.1.4.0-315 on Ubuntu 18, and about removing the old version once the upgrade is finished.
I can see only the new version in the Ambari / Stack and Versions view, but I would like to know whether I have to remove the binaries of the old version on the servers, or do something else.
I read the post https://community.cloudera.com/t5/Support-Questions/How-to-remove-an-old-HDP-version/td-p/116161, which says that there is an API call that removes the old HDP version.
Thanks in advance.
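For anyone who finds this later: the linked post describes the cleanup as a REST DELETE on the repository version resource. A hedged sketch of the calls (AMBARI_HOST, the admin credentials, the stack version 3.1, and the trailing id are placeholders to look up on your own cluster):

# list the registered repository versions and note the id of the old one
curl -u admin -H 'X-Requested-By: ambari' http://AMBARI_HOST:8080/api/v1/stacks/HDP/versions/3.1/repository_versions

# delete the old repository version by id (only once it is no longer the CURRENT version)
curl -u admin -H 'X-Requested-By: ambari' -X DELETE http://AMBARI_HOST:8080/api/v1/stacks/HDP/versions/3.1/repository_versions/1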
07-27-2020
05:11 AM
I'm facing an issue during the upgrade of HDP 3.1.0.0-78 to 3.1.4.0-315 on Ubuntu 18. All services were updated by the upgrade process, and after clicking the Finalize button I can see that the upgrade is finished, but there is no button to properly leave the upgrade wizard; only the Dismiss button is available. I was expecting a dedicated button, other than Dismiss, to cleanly close the upgrade process.
[Screenshot: HDP upgrade process]
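One way to double-check the upgrade state independently of the UI is to query Ambari's upgrade resource over REST. A sketch, assuming the default port and a cluster named CLUSTER_NAME (both placeholders):

# shows the status and direction of each upgrade request for the cluster
curl -u admin -H 'X-Requested-By: ambari' 'http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/upgrades?fields=Upgrade/request_status,Upgrade/direction'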
Labels:
- Hortonworks Data Platform (HDP)
07-24-2020
08:16 AM
1 Kudo
Fixed after restoring /usr/hdp/3.1.0.0-78/oozie/conf/oozie-client-env.sh into /usr/hdp/3.1.4.0-315/oozie/conf/.
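In other words, the fix was to copy the file from the old stack's conf directory into the new one. A minimal sketch of that restore (backing up any existing target first):

# keep a backup of the new-stack file if one exists, then restore the old-stack copy
[ -f /usr/hdp/3.1.4.0-315/oozie/conf/oozie-client-env.sh ] && sudo cp /usr/hdp/3.1.4.0-315/oozie/conf/oozie-client-env.sh /usr/hdp/3.1.4.0-315/oozie/conf/oozie-client-env.sh.bak
sudo cp /usr/hdp/3.1.0.0-78/oozie/conf/oozie-client-env.sh /usr/hdp/3.1.4.0-315/oozie/conf/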
07-24-2020
12:10 AM
I'm facing an issue during the upgrade of HDP 3.1.0.0-78 to 3.1.4.0-315 on Ubuntu 18, right after the upgrade of the Oozie Server: at the 'Oozie Clients' step, I get the following error:
Traceback (most recent call last):
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '/var/lib/ambari-agent/tmp/oozieSmoke2.sh ubuntu /usr/hdp/current/oozie-client /usr/hdp/current/oozie-client/conf /usr/hdp/current/oozie-client/bin http://server1:11000/oozie /usr/hdp/current/oozie-client/doc /usr/hdp/3.1.4.0-315/hadoop/conf /usr/hdp/3.1.4.0-315/hadoop/bin ambari-qa no-op True /etc/security/keytabs/smokeuser.headless.keytab /usr/bin/kinit ambari-qa-dbdne_fe@DOMAIN' returned 1. /usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-dbdne_fe@DOMAIN; source /usr/hdp/current/oozie-client/conf/oozie-env.sh ; /usr/hdp/current/oozie-client/bin/oozie -Doozie.auth.token.cache=false job -oozie http://server1:11000/oozie -config /usr/hdp/current/oozie-client/doc/examples/apps/no-op/job.properties -run
Error: IO_ERROR : java.io.IOException: Error while connecting Oozie server. No of retries = 1. Exception = Error while authenticating with endpoint: http://server1:11000/oozie/versions
The above exception was the cause of the following exception:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/OOZIE/package/scripts/service_check.py", line 140, in <module>
OozieServiceCheck().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/OOZIE/package/scripts/service_check.py", line 53, in service_check
OozieServiceCheckDefault.oozie_smoke_shell_file(smoke_test_file_name, prepare_hdfs_file_name)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/OOZIE/package/scripts/service_check.py", line 125, in oozie_smoke_shell_file
logoutput=True
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/var/lib/ambari-agent/tmp/oozieSmoke2.sh ubuntu /usr/hdp/current/oozie-client /usr/hdp/current/oozie-client/conf /usr/hdp/current/oozie-client/bin http://server1:11000/oozie /usr/hdp/current/oozie-client/doc /usr/hdp/3.1.4.0-315/hadoop/conf /usr/hdp/3.1.4.0-315/hadoop/bin ambari-qa no-op True /etc/security/keytabs/smokeuser.headless.keytab /usr/bin/kinit ambari-qa-dbdne_fe@DOMAIN' returned 1. /usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-dbdne_fe@DOMAIN; source /usr/hdp/current/oozie-client/conf/oozie-env.sh ; /usr/hdp/current/oozie-client/bin/oozie -Doozie.auth.token.cache=false job -oozie http://server1:11000/oozie -config /usr/hdp/current/oozie-client/doc/examples/apps/no-op/job.properties -run
Error: IO_ERROR : java.io.IOException: Error while connecting Oozie server. No of retries = 1. Exception = Error while authenticating with endpoint: http://server1:11000/oozie/versions
I don't understand why this error happens. Thanks in advance for your help.
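To narrow this down outside the Ambari smoke test, one can authenticate as the smoke user and hit the same Oozie endpoint by hand. A sketch reusing the keytab, principal, and URL from the error output above (the curl step assumes a curl build with GSSAPI/SPNEGO support):

/usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-dbdne_fe@DOMAIN
# SPNEGO-authenticated request against the endpoint the smoke test fails on
curl --negotiate -u : http://server1:11000/oozie/versions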
07-23-2020
11:38 PM
I was able to restart the datanode from the Ambari UI after restarting the ambari-agent on the servers where the datanode runs.
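For anyone hitting the same symptom: the agent restart is a one-liner on each affected node (assuming ambari-agent is installed as a system service, as it normally is on HDP nodes):

sudo ambari-agent restart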
07-22-2020
05:33 AM
In fact, I can't restart the datanode from the Ambari UI, but I can restart it by executing the following command directly on the server where the datanode should run:
/var/lib/ambari-agent/ambari-sudo.sh -H -E /usr/hdp/3.1.0.0-78/hadoop/bin/hdfs --config /usr/hdp/3.1.0.0-78/hadoop/conf --daemon start datanode
Therefore I think the operating-system max locked memory limit is set correctly on the server where the datanode should run.
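A quick way to verify that assumption is to read the limit as the user the daemon would run under, rather than in your own shell. A sketch, assuming the hdfs service user (on this secured cluster the ambari-sudo.sh wrapper may also start it as root):

sudo -u hdfs bash -c 'ulimit -l'   # prints the max locked memory limit in kbytes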
07-22-2020
12:18 AM
In fact, I can't restart the datanode after the upgrade of Ambari from 2.7.3.0 to 2.7.4.0 (not during the upgrade of HDP), while the restart worked fine before the upgrade. Below are the logs of the failed restart. The operating-system max locked memory limit is set to 2197152 kbytes, which is more than the value of the parameter dfs.datanode.max.locked.memory (2147483648 bytes):
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 257446
max locked memory (kbytes, -l) 2197152
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
==> /var/log/hadoop/hdfs/hadoop-hdfs-root-datanode-di-dbdne-fe-develophdpwkr-01.log <==
2020-07-22 06:42:20,156 INFO datanode.DataNode (LogAdapter.java:info(51)) - registered UNIX signal handlers for [TERM, HUP, INT]
2020-07-22 06:42:20,422 INFO security.UserGroupInformation (UserGroupInformation.java:loginUserFromKeytab(1009)) - Login successful for user dn/di-dbdne-fe-develophdpwkr-01.node.fe.sd.diod.tech@DIOD.TECH using keytab file /etc/security/keytabs/dn.service.keytab
2020-07-22 06:42:20,574 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/mnt/hdd0/hadoop/hdfs/data
2020-07-22 06:42:20,581 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/mnt/hdd1/hadoop/hdfs/data
2020-07-22 06:42:20,582 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/mnt/hdd2/hadoop/hdfs/data
2020-07-22 06:42:20,582 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/mnt/hdd3/hadoop/hdfs/data
2020-07-22 06:42:20,582 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [RAM_DISK]file:/mnt/dn-tmpfs
2020-07-22 06:42:20,656 INFO impl.MetricsConfig (MetricsConfig.java:loadFirst(118)) - Loaded properties from hadoop-metrics2.properties
2020-07-22 06:42:20,911 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(85)) - Initializing Timeline metrics sink.
2020-07-22 06:42:20,912 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(105)) - Identified hostname = di-dbdne-fe-develophdpwkr-01.node.fe.sd.diod.tech, serviceName = datanode
2020-07-22 06:42:20,943 INFO availability.MetricSinkWriteShardHostnameHashingStrategy (MetricSinkWriteShardHostnameHashingStrategy.java:findCollectorShard(42)) - Calculated collector shard di-dbdne-fe-develophdpadm-01.node.fe.sd.diod.tech based on hostname: di-dbdne-fe-develophdpwkr-01.node.fe.sd.diod.tech
2020-07-22 06:42:20,943 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(135)) - Collector Uri: http://di-dbdne-fe-develophdpadm-01.node.fe.sd.diod.tech:6188/ws/v1/timeline/metrics
2020-07-22 06:42:20,943 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(136)) - Container Metrics Uri: http://di-dbdne-fe-develophdpadm-01.node.fe.sd.diod.tech:6188/ws/v1/timeline/containermetrics
2020-07-22 06:42:20,948 INFO impl.MetricsSinkAdapter (MetricsSinkAdapter.java:start(204)) - Sink timeline started
2020-07-22 06:42:20,988 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(374)) - Scheduled Metric snapshot period at 10 second(s).
2020-07-22 06:42:20,989 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - DataNode metrics system started
2020-07-22 06:42:21,068 INFO common.Util (Util.java:isDiskStatsEnabled(395)) - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2020-07-22 06:42:21,070 INFO datanode.BlockScanner (BlockScanner.java:<init>(184)) - Initialized block scanner with targetBytesPerSec 1048576
2020-07-22 06:42:21,073 INFO datanode.DataNode (DataNode.java:<init>(486)) - File descriptor passing is enabled.
2020-07-22 06:42:21,074 INFO datanode.DataNode (DataNode.java:<init>(499)) - Configured hostname is di-dbdne-fe-develophdpwkr-01.node.fe.sd.diod.tech
2020-07-22 06:42:21,074 INFO common.Util (Util.java:isDiskStatsEnabled(395)) - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2020-07-22 06:42:21,076 ERROR datanode.DataNode (DataNode.java:secureMain(2883)) - Exception in secureMain
java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) of 2147483648 bytes is more than the datanode's available RLIMIT_MEMLOCK ulimit of 16777216 bytes.
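Note the contradiction in the output above: the shell's ulimit reports max locked memory of 2197152 kbytes (about 2.2 GB, indeed above the 2147483648-byte dfs.datanode.max.locked.memory), yet the daemon itself sees an RLIMIT_MEMLOCK of only 16777216 bytes (16 MB). That suggests the limit configured for login shells is not inherited by the process the ambari-agent spawns. One way to confirm is to inspect the effective limits of the running (or briefly running) datanode process; a sketch (the pgrep pattern is an assumption about the JVM command line):

# find datanode JVM(s) and print their effective locked-memory limit
pgrep -f 'org.apache.hadoop.hdfs.server.datanode.DataNode' | while read pid; do
  grep 'locked memory' "/proc/$pid/limits"
done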