Member since: 05-21-2017
Posts: 16 · Kudos Received: 0 · Solutions: 0
05-23-2017
10:41 AM
Grafana cannot start and raised an alert in Ambari:

[root@compute-1-0 ~]# /usr/sbin/ambari-metrics-grafana restart
Stopping Ambari Metrics Grafana ... OK
Starting Ambari Metrics Grafana: .... FAILED

The Metrics Collector process check also fails in Ambari (see attached):

Connection failed: [Errno 111] Connection refused to compute-1-6.local:6188
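Grafana fails like this when it cannot reach the Metrics Collector, so the collector host is worth checking first. A minimal check sketch, assuming the default AMS port and the stock log locations (both are assumptions, not confirmed in the post):

# on the collector host (compute-1-6), check whether anything listens on 6188
netstat -tnlp | grep 6188

# look for the startup failure in the collector and grafana logs
tail -n 50 /var/log/ambari-metrics-collector/ambari-metrics-collector.log
tail -n 50 /var/log/ambari-metrics-grafana/grafana.out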
05-22-2017
03:37 PM
I installed and configured Ambari and HDP on my cluster. All services (HDFS, MapReduce, YARN, ZooKeeper) run without alerts, but Ambari Metrics has some alerts that I cannot figure out how to fix (see picture). What should I do? (I already stopped and started all services, but it made no difference.)
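Since a stop-all/start-all did not help, restarting the AMS components individually on their own hosts sometimes clears these alerts. A hedged sketch, assuming the stock scripts that ship with Ambari Metrics:

# on the Metrics Collector host
/usr/sbin/ambari-metrics-collector restart

# on every node in the cluster
/usr/sbin/ambari-metrics-monitor restart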
05-22-2017
03:01 PM
I fixed the problem by running this command on every node:

yum -y erase hdp-select
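On a Rocks cluster this can be pushed to all compute nodes from the frontend instead of logging in to each one. A sketch, assuming the standard `rocks run host` utility and the default `compute` appliance name:

# run the cleanup on every compute node, then on the frontend itself
rocks run host compute command="yum -y erase hdp-select"
yum -y erase hdp-select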
05-22-2017
11:21 AM
I followed the Hortonworks documentation for setting up a local repository, and now the "Install, Start and Test" step fails. The error is this:

raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

What should I do? Any ideas?
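This error usually means the hdp-select symlinks under /usr/hdp/current were never created, or were wiped. A quick inspection sketch (treating the exact component name as an assumption):

# show which HDP version the component points at; 'None' or no output means the link is missing
hdp-select status hadoop-client

# verify the symlink and its target actually exist
ls -l /usr/hdp/current/hadoop-client
ls /usr/hdp/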
05-22-2017
07:42 AM
Please help me. It's very important for me to resolve this issue.
05-22-2017
07:15 AM
I don't know. It's a Rocks cluster!
05-22-2017
06:51 AM
I did both, but it still fails with an error like this one:

resource_management.core.exceptions.Fail: Execution of 'useradd -m -G hadoop -g hadoop mapred' returned 12. useradd: cannot create directory /home/mapred

Please take a minute to look at this link: https://tutorial.readthedocs.io/en/latest/home_folder.html
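Because /home is not writable on this system, one workaround sketch is to pre-create the account with its home under /export/home, where Rocks actually keeps user homes (see the posts further down this thread; the exact options are an assumption):

# create the user with an explicit home outside the autofs-controlled /home
useradd -m -d /export/home/mapred -G hadoop -g hadoop mapred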
05-22-2017
06:24 AM
The first command's result:

[root@NullCluster home]# mkdir /home/spark
mkdir: cannot create directory `/home/spark': Permission denied

The second command's result:

[root@NullCluster home]# df -i
Filesystem     Inodes   IUsed    IFree  IUse% Mounted on
/dev/sda1     3203072  314305  2888767   10% /
tmpfs          490523       1   490522    1% /dev/shm
/dev/sda2      640848   22304   618544    4% /var
tmpfs          490523     695   489828    1% /var/lib/ganglia/rrds
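Even root being refused on /home is the classic sign of an automounter (autofs) owning that mount point, which Rocks uses for home directories. A hedged way to confirm it (the map file names are the usual autofs defaults, not confirmed here):

# see what is mounted on /home and whether autofs owns it
mount | grep -w /home

# inspect the automounter maps
cat /etc/auto.master
cat /etc/auto.home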
05-22-2017
06:11 AM
The Rocks cluster creates the home directory for every user under the following path:

[root@NullCluster home]# ls -l /export/home/
total 20
drwx------  5 condor condor 4096 May 20 04:08 condor
drwx------ 25 guest  guest  4096 Jan  9 08:17 guest
drwx------ 25 hduser hadoop 4096 May 15 10:51 hduser
drwx------  6 huser  huser  4096 May 13 08:47 huser
drwx------  4 spark  hadoop 4096 May 22 04:14 spark

I think this is what causes the errors, but I don't know how to solve it.
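If that layout is the cause, the Rocks-style fix is to create the missing service accounts with homes under /export/home and then let Rocks regenerate its user and automount maps. A sketch under that assumption (the account name is illustrative):

# create the account with its home on the exported path
useradd -m -d /export/home/mapred -G hadoop -g hadoop mapred

# propagate the new account and autofs maps across the cluster
rocks sync users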
05-22-2017
06:00 AM
The result is:

[root@NullCluster /]# ls -l /home
total 0

I don't know why Rocks doesn't allow home directories to be created for users there.
05-22-2017
05:58 AM
Yes, I ran it as the root user. I don't know why it can't create directories under /home for the users.

[root@NullCluster /]# ls -l /home
total 0
05-22-2017
05:54 AM
On the master node:

[root@NullCluster /]# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1        48G   36G  9.9G  79% /
tmpfs           1.9G     0  1.9G   0% /dev/shm
/dev/sda2       9.5G  3.9G  5.2G  43% /var
tmpfs           936M   72M  864M   8% /var/lib/ganglia/rrds

[root@NullCluster /]# df -P /boot/efi/
Filesystem     1024-blocks     Used  Available Capacity Mounted on
/dev/sda1         50264772 37411032   10293740      79% /

On the compute-1-0 node:

[root@compute-1-0 ~]# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1        16G  8.0G  6.6G  56% /
tmpfs           1.9G     0  1.9G   0% /dev/shm
/dev/sda5       439G   72M  416G   1% /state/partition1
/dev/sda2       3.8G  1.2G  2.5G  32% /var

[root@compute-1-0 ~]# df -P /boot/efi/
Filesystem     1024-blocks     Used  Available Capacity Mounted on
/dev/sda1         15995848  8352440    6824208      56% /
05-22-2017
05:43 AM
The user was created, but its home directory was not. What should I do?
05-22-2017
05:37 AM
I set up the local repository for both Ambari and HDP. The SSH keys are set, NTP is configured, and snappy is at the right version. As I wrote above, I read the documentation. I don't know why it throws this error.
05-22-2017
05:19 AM
Hi, I am very confused by an error; if someone knows how I can resolve this issue, please help me! I want to install and configure Ambari 2.2.2.0 + HDP 2.4 on a Rocks cluster with 7 nodes (one master and the rest slaves). I followed all the steps in the documentation, but at the "Install, Start and Test" step I consistently see this error message:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 44, in setup_users
fetch_nonlocal_groups = params.fetch_nonlocal_groups
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/accounts.py", line 82, in action_create
shell.checked_call(command, sudo=True)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'useradd -m -G hadoop -g hadoop spark' returned 12. useradd: cannot create directory /home/spark
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-995.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-995.json', 'INFO', '/var/lib/ambari-agent/tmp']
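Given where this thread ends up (Rocks keeps homes on /export/home and /home is autofs-controlled), one hedged pre-flight sketch is to create every account the hook will need before re-running "Install, Start and Test", with homes outside /home. The account list below is illustrative, and whether Ambari's hook then leaves the pre-existing users alone is an assumption worth verifying:

# make sure the hadoop group exists, then pre-create the service accounts
getent group hadoop || groupadd hadoop
for u in spark mapred hdfs yarn; do
  useradd -m -d /export/home/$u -G hadoop -g hadoop $u
done

# Rocks: propagate the accounts and automount maps to all nodes
rocks sync users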