
Ambari installation - last step fail



Hello,

After installing with Ambari, I am getting the following errors in the last step:

Check YARN

stderr: /var/lib/ambari-agent/data/errors-248.txt
Command aborted. Reason: 'Server considered task failed and automatically aborted it'
stdout: /var/lib/ambari-agent/data/output-248.txt
2018-12-03 18:24:34,242 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-03 18:24:34,243 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-12-03 18:24:34,270 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-03 18:24:34,282 - HdfsResource['/user/ambari-qa'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0770}
2018-12-03 18:24:34,285 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/user/ambari-qa?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp6EZZl_ 2>/tmp/tmppp6QPm''] {'logoutput': None, 'quiet': False}
2018-12-03 18:24:34,352 - call returned (0, '')
2018-12-03 18:24:34,352 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":16388,"group":"hdfs","length":0,"modificationTime":1543856827257,"owner":"ambari-qa","pathSuffix":"","permission":"770","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
Command aborted. Reason: 'Server considered task failed and automatically aborted it'

Command failed after 1 tries

Check Tez

stderr: /var/lib/ambari-agent/data/errors-247.txt
Python script has been killed due to timeout after waiting 300 secs
stdout: /var/lib/ambari-agent/data/output-247.txt
2018-12-03 18:19:31,859 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-03 18:19:31,865 - File['/var/lib/ambari-agent/tmp/sample-tez-test'] {'content': 'foo\nbar\nfoo\nbar\nfoo', 'mode': 0755}
2018-12-03 18:19:31,866 - Writing File['/var/lib/ambari-agent/tmp/sample-tez-test'] because it doesn't exist
2018-12-03 18:19:31,866 - Changing permission for /var/lib/ambari-agent/tmp/sample-tez-test from 644 to 755
2018-12-03 18:19:31,866 - HdfsResource['/tmp/tezsmokeoutput'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['delete_on_execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-12-03 18:19:31,869 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeoutput?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpFPSNcX 2>/tmp/tmpXvhlIn''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:31,927 - call returned (0, '')
2018-12-03 18:19:31,927 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /tmp/tezsmokeoutput"}}404', u'')
2018-12-03 18:19:31,928 - HdfsResource['/tmp/tezsmokeinput'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-12-03 18:19:31,929 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpnlBZNF 2>/tmp/tmpHxrAfD''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:31,987 - call returned (0, '')
2018-12-03 18:19:31,988 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /tmp/tezsmokeinput"}}404', u'')
2018-12-03 18:19:31,989 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput?op=MKDIRS&user.name=hdfs'"'"' 1>/tmp/tmp94vn_L 2>/tmp/tmpcCXrpw''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:32,120 - call returned (0, '')
2018-12-03 18:19:32,120 - get_user_call_output returned (0, u'{"boolean":true}200', u'')
2018-12-03 18:19:32,122 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput?op=SETOWNER&owner=ambari-qa&group=&user.name=hdfs'"'"' 1>/tmp/tmpaTlUlH 2>/tmp/tmp5SFfZe''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:32,287 - call returned (0, '')
2018-12-03 18:19:32,288 - get_user_call_output returned (0, u'200', u'')
2018-12-03 18:19:32,289 - HdfsResource['/tmp/tezsmokeinput/sample-tez-test'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/var/lib/ambari-agent/tmp/sample-tez-test', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-12-03 18:19:32,290 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput/sample-tez-test?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpMEiKsr 2>/tmp/tmpNIFMIs''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:32,351 - call returned (0, '')
2018-12-03 18:19:32,351 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /tmp/tezsmokeinput/sample-tez-test"}}404', u'')
2018-12-03 18:19:32,352 - Creating new file /tmp/tezsmokeinput/sample-tez-test in DFS
2018-12-03 18:19:32,353 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/var/lib/ambari-agent/tmp/sample-tez-test -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput/sample-tez-test?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmpXR01HX 2>/tmp/tmpIlqsbL''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:35,860 - call returned (0, '')
2018-12-03 18:19:35,860 - get_user_call_output returned (0, u'201', u'')
2018-12-03 18:19:35,861 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/tmp/tezsmokeinput/sample-tez-test?op=SETOWNER&owner=ambari-qa&group=&user.name=hdfs'"'"' 1>/tmp/tmpYaCYxY 2>/tmp/tmp6uhOsu''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:35,984 - call returned (0, '')
2018-12-03 18:19:35,985 - get_user_call_output returned (0, u'200', u'')
2018-12-03 18:19:35,988 - Called copy_to_hdfs tarball: tez
2018-12-03 18:19:35,988 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-12-03 18:19:35,988 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-12-03 18:19:35,988 - Source file: /usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/3.0.1.0-187/tez/tez.tar.gz
2018-12-03 18:19:35,989 - Preparing the Tez tarball...
2018-12-03 18:19:35,989 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-12-03 18:19:35,989 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-12-03 18:19:35,989 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-12-03 18:19:35,989 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-12-03 18:19:35,989 - Extracting /usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz to /var/lib/ambari-agent/tmp/mapreduce-tarball-itymqa
2018-12-03 18:19:35,990 - Execute[('tar', '-xf', u'/usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz', '-C', '/var/lib/ambari-agent/tmp/mapreduce-tarball-itymqa/')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-12-03 18:19:39,343 - Extracting /usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz to /var/lib/ambari-agent/tmp/tez-tarball-J7yxxn
2018-12-03 18:19:39,344 - Execute[('tar', '-xf', u'/usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz', '-C', '/var/lib/ambari-agent/tmp/tez-tarball-J7yxxn/')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-12-03 18:19:41,358 - Execute[('cp', '-a', '/var/lib/ambari-agent/tmp/mapreduce-tarball-itymqa/hadoop/lib/native', '/var/lib/ambari-agent/tmp/tez-tarball-J7yxxn/lib')] {'sudo': True}
2018-12-03 18:19:41,376 - Directory['/var/lib/ambari-agent/tmp/tez-tarball-J7yxxn/lib'] {'recursive_ownership': True, 'mode': 0755, 'cd_access': 'a'}
2018-12-03 18:19:41,377 - Creating a new Tez tarball at /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz
2018-12-03 18:19:41,377 - Execute[('tar', '-zchf', '/tmp/tmpWVS0xt', '-C', '/var/lib/ambari-agent/tmp/tez-tarball-J7yxxn', '.')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-12-03 18:19:52,461 - Execute[('mv', '/tmp/tmpWVS0xt', '/var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz')] {}
2018-12-03 18:19:53,699 - HdfsResource['/hdp/apps/3.0.1.0-187/tez'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-12-03 18:19:53,701 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/tez?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpk2aUBU 2>/tmp/tmpnWK_1G''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:53,794 - call returned (0, '')
2018-12-03 18:19:53,794 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":16441,"group":"hdfs","length":0,"modificationTime":1543856893870,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-12-03 18:19:53,795 - HdfsResource['/hdp/apps/3.0.1.0-187/tez/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-12-03 18:19:53,797 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://horton00.local:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/tez/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpFBYz0R 2>/tmp/tmpcxdcmv''] {'logoutput': None, 'quiet': False}
2018-12-03 18:19:53,875 - call returned (0, '')
2018-12-03 18:19:53,875 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":1543856893870,"blockSize":134217728,"childrenNum":0,"fileId":16442,"group":"hadoop","length":254014306,"modificationTime":1543856915642,"owner":"hdfs","pathSuffix":"","permission":"444","replication":3,"storagePolicy":0,"type":"FILE"}}200', u'')
2018-12-03 18:19:53,876 - Not replacing existing DFS file /hdp/apps/3.0.1.0-187/tez/tez.tar.gz which is different from /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz, due to replace_existing_files=False
2018-12-03 18:19:53,876 - Will attempt to copy tez tarball from /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz to DFS at /hdp/apps/3.0.1.0-187/tez/tez.tar.gz.
2018-12-03 18:19:53,876 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://horton00.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-12-03 18:19:53,877 - ExecuteHadoop['jar /usr/hdp/current/tez-client/tez-examples*.jar orderedwordcount /tmp/tezsmokeinput/sample-tez-test /tmp/tezsmokeoutput/'] {'try_sleep': 5, 'tries': 3, 'bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'user': 'ambari-qa', 'conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf'}
2018-12-03 18:19:53,877 - Execute['hadoop --config /usr/hdp/3.0.1.0-187/hadoop/conf jar /usr/hdp/current/tez-client/tez-examples*.jar orderedwordcount /tmp/tezsmokeinput/sample-tez-test /tmp/tezsmokeoutput/'] {'logoutput': None, 'try_sleep': 5, 'environment': {}, 'tries': 3, 'user': 'ambari-qa', 'path': ['/usr/hdp/3.0.1.0-187/hadoop/bin']}
Command failed after 1 tries

The rest of the services (Atlas, Hive, MapReduce2) are in a warning state, with no further information.

What am I doing wrong? Why won't the services start on the master server?
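In case it helps to reproduce this outside of Ambari: the Tez check hangs on the `orderedwordcount` job shown at the end of the output log, so a rough sketch of rerunning that same smoke test by hand (all paths and the config dir are taken from the log above; the application ID is a placeholder you would substitute from `yarn application -list`):

```shell
# Rerun the Tez smoke test manually as the ambari-qa user,
# using the same jar, config dir, and HDFS paths as the service check
sudo su - ambari-qa -c "hadoop --config /usr/hdp/3.0.1.0-187/hadoop/conf \
  jar /usr/hdp/current/tez-client/tez-examples*.jar orderedwordcount \
  /tmp/tezsmokeinput/sample-tez-test /tmp/tezsmokeoutput/"

# Check whether YARN ever accepted/ran the application...
yarn application -list -appStates ALL

# ...and pull its container logs (replace with the real application ID)
yarn logs -applicationId application_XXXXXXXXXXXXX_0001
```

If the job sits in ACCEPTED state forever, that usually points at YARN itself (NodeManagers down, or no capacity in the queue) rather than at Tez.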

2 Replies

Re: Ambari installation - last step fail

Hi @Robert Holeksa,

I can see from the attached operation output logs that Tez is not starting even after 300 seconds, which is causing the other operations to be aborted.

It should be fine if you run a service check on each of the services one by one.

Also, could you attach a screenshot of your Ambari dashboard as it looks now?
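Service checks can be run from the Ambari UI (service > Actions > Run Service Check), or triggered through the Ambari REST API. A minimal sketch, assuming the Ambari server runs on `horton00.local:8080`, the cluster is named `mycluster`, and default `admin:admin` credentials (all three are placeholders to replace with your own):

```shell
# Trigger a YARN service check via the Ambari REST API.
# The X-Requested-By header is required by Ambari for POST requests.
curl -u admin:admin -H 'X-Requested-By: ambari' \
  -X POST 'http://horton00.local:8080/api/v1/clusters/mycluster/requests' \
  -d '{
    "RequestInfo": {
      "context": "YARN Service Check",
      "command": "YARN_SERVICE_CHECK"
    },
    "Requests/resource_filters": [{"service_name": "YARN"}]
  }'
```

The same pattern works for the other services by swapping the command and service name (for example `TEZ_SERVICE_CHECK` / `TEZ`); the response includes a request ID you can poll, or you can watch the operation in the Ambari UI.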

Re: Ambari installation - last step fail


Hello,

Thank you for the answer. In Ambari I am still on the "Install, Start and Test" page; I did not proceed because I want to be sure every test passes before I finish the installation.

How can I check those services? I do not see them in systemd/journald. How are they started and stopped by Ambari, and where are their logs on the server?
