Member since: 02-12-2016
Posts: 13
Kudos Received: 12
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6593 | 02-22-2016 07:34 PM
02-22-2016 07:34 PM
1 Kudo
I finally found the solution: a few failed MR jobs had eaten up the cluster's resources. After they were killed, subsequent jobs ran smoothly.
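
For anyone who hits the same symptom, here is a minimal sketch of how the stuck jobs can be found and killed from the command line (the application ID below is the one from my earlier log; substitute the IDs your own ResourceManager reports):

# List applications that are still holding cluster resources
yarn application -list -appStates RUNNING,ACCEPTED

# Kill a hung application by ID
yarn application -kill application_1455503320604_0004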
02-20-2016 04:31 PM
2 Kudos
I accepted your answer.
02-16-2016 03:59 AM
1 Kudo
I tried increasing the YARN container memory from 1 GB to 1.5 GB, but it didn't help. It seems the problem was Kerberos-related: the job completed without issue when Kerberos was disabled.
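
A quick sanity check when Kerberos is suspected: confirm the submitting user actually holds a valid ticket before the job launches (the keytab and principal below are the smoke-test ones from my log; adjust for your cluster):

# Obtain a TGT from the keytab, then verify a non-expired ticket is in the cache
kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-HDPCA@EXAMPLE.COM
klist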
02-16-2016 01:14 AM
1 Kudo
It's my test environment, so I don't have support access. mapreduce.task.timeout=300000. Thanks!
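
For reference, the timeout can also be raised per job on the command line, assuming the driver honors generic options as the bundled examples do (paths below mirror the smoke test; 600000 ms is just an illustrative figure):

# Override mapreduce.task.timeout for this submission only (10 minutes)
hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.*.jar wordcount \
  -Dmapreduce.task.timeout=600000 \
  /user/ambari-qa/mapredsmokeinput /user/ambari-qa/mapredsmokeoutput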
02-15-2016 08:06 PM
I re-ran the job this morning; it failed and terminated by itself. Here is the full log: appattempt-1455551404320-0001-000001.txt. Thank you! wei
02-15-2016 07:37 PM
1 Kudo
No, I was exporting data from secured HDFS to MySQL, and the cluster has NameNode HA enabled. I killed the job myself. Please check the attached log: application-1455503320604-0004.txt
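
A minimal sketch of this kind of HDFS-to-MySQL export (assuming Sqoop, the usual tool for this; the JDBC URL, credentials, table name, and export directory below are all placeholders):

sqoop export \
  --connect jdbc:mysql://dbhost:3306/testdb \
  --username dbuser -P \
  --table mytable \
  --export-dir /user/wei/export_data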
02-15-2016 03:48 AM
2016-02-14 22:40:05,909 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-02-14 22:40:06,618 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-02-14 22:40:06,760 - HdfsResource['/user/ambari-qa/mapredsmokeoutput'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'default_fs': 'hdfs://HDPCA', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-HDPCA@EXAMPLE.COM', 'user': 'hdfs', 'action': ['delete_on_execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory'}
2016-02-14 22:40:06,860 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-HDPCA@EXAMPLE.COM'] {'user': 'hdfs'}
2016-02-14 22:40:11,788 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx0.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpZmSrzL 2>/tmp/tmpxqBP9F''] {'quiet': False}
2016-02-14 22:40:15,601 - call returned (0, '')
2016-02-14 22:40:15,603 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx1.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpWcmtSE 2>/tmp/tmp8ueZHF''] {'quiet': False}
2016-02-14 22:40:19,015 - call returned (0, '')
2016-02-14 22:40:19,017 - NameNode HA states: active_namenodes = [(u'nn1', 'lnx0.localdomain.com:50070')], standby_namenodes = [(u'nn2', 'lnx1.localdomain.com:50070')], unknown_namenodes = []
2016-02-14 22:40:19,018 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx0.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpEDIPa1 2>/tmp/tmp1Xt3Yx''] {'quiet': False}
2016-02-14 22:40:22,856 - call returned (0, '')
2016-02-14 22:40:22,858 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx1.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpKQmmZw 2>/tmp/tmpJ6YFEL''] {'quiet': False}
2016-02-14 22:40:26,162 - call returned (0, '')
2016-02-14 22:40:26,164 - NameNode HA states: active_namenodes = [(u'nn1', 'lnx0.localdomain.com:50070')], standby_namenodes = [(u'nn2', 'lnx1.localdomain.com:50070')], unknown_namenodes = []
2016-02-14 22:40:26,167 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://lnx0.localdomain.com:50070/webhdfs/v1/user/ambari-qa/mapredsmokeoutput?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpC3nS44 2>/tmp/tmpYhq8nH''] {'logoutput': None, 'quiet': False}
2016-02-14 22:40:29,885 - call returned (0, '')
2016-02-14 22:40:30,159 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X DELETE --negotiate -u : '"'"'http://lnx0.localdomain.com:50070/webhdfs/v1/user/ambari-qa/mapredsmokeoutput?op=DELETE&user.name=hdfs&recursive=True'"'"' 1>/tmp/tmpKxdOxa 2>/tmp/tmpNflsmo''] {'logoutput': None, 'quiet': False}
2016-02-14 22:40:34,695 - call returned (0, '')
2016-02-14 22:40:34,697 - HdfsResource['/user/ambari-qa/mapredsmokeinput'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/etc/passwd', 'default_fs': 'hdfs://HDPCA', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-HDPCA@EXAMPLE.COM', 'user': 'hdfs', 'action': ['create_on_execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file'}
2016-02-14 22:40:34,699 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-HDPCA@EXAMPLE.COM'] {'user': 'hdfs'}
2016-02-14 22:40:36,030 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx0.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpWi075d 2>/tmp/tmpfGORFu''] {'quiet': False}
2016-02-14 22:40:39,946 - call returned (0, '')
2016-02-14 22:40:39,948 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx1.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpVeHNPf 2>/tmp/tmpXbZTBA''] {'quiet': False}
2016-02-14 22:40:41,300 - call returned (0, '')
2016-02-14 22:40:41,302 - NameNode HA states: active_namenodes = [(u'nn1', 'lnx0.localdomain.com:50070')], standby_namenodes = [(u'nn2', 'lnx1.localdomain.com:50070')], unknown_namenodes = []
2016-02-14 22:40:41,303 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx0.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmp_2srqJ 2>/tmp/tmpaBMZT6''] {'quiet': False}
2016-02-14 22:40:41,588 - call returned (0, '')
2016-02-14 22:40:41,591 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://lnx1.localdomain.com:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpzAgUPP 2>/tmp/tmp_JVFtg''] {'quiet': False}
2016-02-14 22:40:41,859 - call returned (0, '')
2016-02-14 22:40:41,861 - NameNode HA states: active_namenodes = [(u'nn1', 'lnx0.localdomain.com:50070')], standby_namenodes = [(u'nn2', 'lnx1.localdomain.com:50070')], unknown_namenodes = []
2016-02-14 22:40:41,867 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://lnx0.localdomain.com:50070/webhdfs/v1/user/ambari-qa/mapredsmokeinput?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp1NN7xj 2>/tmp/tmpdtdIhc''] {'logoutput': None, 'quiet': False}
2016-02-14 22:40:42,493 - call returned (0, '')
2016-02-14 22:40:42,494 - DFS file /user/ambari-qa/mapredsmokeinput is identical to /etc/passwd, skipping the copying
2016-02-14 22:40:42,495 - HdfsResource[None] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'default_fs': 'hdfs://HDPCA', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-HDPCA@EXAMPLE.COM', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf'}
2016-02-14 22:40:42,495 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-HDPCA@EXAMPLE.COM;'] {'user': 'ambari-qa'}
2016-02-14 22:40:42,778 - ExecuteHadoop['jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.*.jar wordcount /user/ambari-qa/mapredsmokeinput /user/ambari-qa/mapredsmokeoutput'] {'bin_dir': '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hadoop-client/bin:/usr/hdp/current/hadoop-yarn-client/bin', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'logoutput': True, 'try_sleep': 5, 'tries': 1, 'user': 'ambari-qa'}
2016-02-14 22:40:42,881 - Execute['hadoop --config /usr/hdp/current/hadoop-client/conf jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples-2.*.jar wordcount /user/ambari-qa/mapredsmokeinput /user/ambari-qa/mapredsmokeoutput'] {'logoutput': True, 'try_sleep': 5, 'environment': {}, 'tries': 1, 'user': 'ambari-qa', 'path': ['/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hadoop-client/bin:/usr/hdp/current/hadoop-yarn-client/bin']}
WARNING: Use "yarn jar" to launch YARN applications.
16/02/14 22:42:50 INFO impl.TimelineClientImpl: Timeline service address: http://lnx1.localdomain.com:8188/ws/v1/timeline/
16/02/14 22:42:51 INFO client.RMProxy: Connecting to ResourceManager at Lnx1.localdomain.com/192.168.122.40:8050
16/02/14 22:42:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 599 for ambari-qa on ha-hdfs:HDPCA
16/02/14 22:42:54 INFO security.TokenCache: Got dt for hdfs://HDPCA; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:HDPCA, Ident: (HDFS_DELEGATION_TOKEN token 599 for ambari-qa)
16/02/14 22:42:54 WARN token.Token: Cannot find class for token kind kms-dt
16/02/14 22:42:54 INFO security.TokenCache: Got dt for hdfs://HDPCA; Kind: kms-dt, Service: 192.168.0.102:9292, Ident: 00 0f 61 6d 62 61 72 69 2d 71 61 2d 48 44 50 43 41 02 72 6d 00 8a 01 52 e3 06 19 54 8a 01 53 07 12 9d 54 04 02
16/02/14 22:43:03 INFO input.FileInputFormat: Total input paths to process : 1
16/02/14 22:43:06 INFO mapreduce.JobSubmitter: number of splits:1
16/02/14 22:43:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1455503320604_0004
16/02/14 22:43:10 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:HDPCA, Ident: (HDFS_DELEGATION_TOKEN token 599 for ambari-qa)
16/02/14 22:43:10 WARN token.Token: Cannot find class for token kind kms-dt
16/02/14 22:43:10 WARN token.Token: Cannot find class for token kind kms-dt
Kind: kms-dt, Service: 192.168.0.102:9292, Ident: 00 0f 61 6d 62 61 72 69 2d 71 61 2d 48 44 50 43 41 02 72 6d 00 8a 01 52 e3 06 19 54 8a 01 53 07 12 9d 54 04 02
16/02/14 22:43:19 INFO impl.YarnClientImpl: Application submission is not finished, submitted application application_1455503320604_0004 is still in NEW
16/02/14 22:43:20 INFO impl.YarnClientImpl: Submitted application application_1455503320604_0004
16/02/14 22:43:21 INFO mapreduce.Job: The url to track the job: http://Lnx1.localdomain.com:8088/proxy/application_1455503320604_0004/
16/02/14 22:43:21 INFO mapreduce.Job: Running job: job_1455503320604_0004
Labels:
- Apache Hadoop
02-13-2016 08:01 PM
I'm using Spark 1.4.1, and the command peopleSchemaRDD.write.format("orc").save("people.orc") works! Thank you very much!
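
For anyone following the same tutorial on Spark 1.4.x, a minimal end-to-end sketch run in spark-shell (people.txt and the name/age columns follow the tutorial's example data; ORC support comes from HiveContext):

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> import hiveContext.implicits._
scala> // Build a DataFrame from comma-separated "name,age" lines
scala> val people = sc.textFile("people.txt").map(_.split(",")).map(p => (p(0), p(1).trim.toInt)).toDF("name", "age")
scala> // Write to ORC, then read it back to verify
scala> people.write.format("orc").save("people.orc")
scala> hiveContext.read.format("orc").load("people.orc").show()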
02-12-2016 06:02 PM
2 Kudos
I moved public-yum-ol7.repo and nux-dextop.repo into /etc/yum.repos.d/bk and tried the upgrade again; it works! Thank you very much for the help!
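
For anyone else hitting this, the steps were simply the following (bk is just a backup directory I created; clearing the yum metadata cache afterwards is an extra precaution I'd suggest):

# Park the conflicting OS repos so yum stops resolving el7 packages over HDP-UTILS
mkdir -p /etc/yum.repos.d/bk
mv /etc/yum.repos.d/public-yum-ol7.repo /etc/yum.repos.d/nux-dextop.repo /etc/yum.repos.d/bk/
# Drop cached repo metadata before retrying the upgrade
yum clean all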
02-12-2016 05:28 PM
1 Kudo
When I tried the tutorial "A Lap around Apache Spark 1.3.1 with HDP 2.3" from the sandbox, I encountered this problem:

scala> peopleSchemaRDD.saveAsOrcFile("people.orc")
<console>:41: error: value saveAsOrcFile is not a member of org.apache.spark.sql.DataFrame
       peopleSchemaRDD.saveAsOrcFile("people.orc")
                       ^
Labels:
- Apache Oozie
- Apache Spark
02-12-2016 03:59 PM
1 Kudo
2016-02-11 21:20:20,569 - Package Manager failed to install packages. Error: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 snappy' returned 1. Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
Requires: snappy(x86-64) = 1.0.5-1.el6
Removing: snappy-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
snappy(x86-64) = 1.0.5-1.el6
Updated By: snappy-1.1.0-3.el7.x86_64 (ol7_latest)
snappy(x86-64) = 1.1.0-3.el7
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 357, in install_packages
skip_repos=[self.REPO_FILE_NAME_PREFIX + "*"] if OSCheck.is_redhat_family() else [])
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 snappy' returned 1. Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
Requires: snappy(x86-64) = 1.0.5-1.el6
Removing: snappy-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
snappy(x86-64) = 1.0.5-1.el6
Updated By: snappy-1.1.0-3.el7.x86_64 (ol7_latest)
snappy(x86-64) = 1.1.0-3.el7
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 478, in <module>
InstallPackages().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 162, in actionexecute
raise Fail("Failed to distribute repositories/install packages")
resource_management.core.exceptions.Fail: Failed to distribute repositories/install packages
stdout: /var/lib/ambari-agent/data/output-11887.txt
2016-02-11 21:20:11,346 - Will install packages for repository version 2.3.4.0-3485
2016-02-11 21:20:11,347 - Repository['HDP-2.3.4.0-3485'] {'append_to_file': False, 'base_url': 'http://Lnx0.localdomain.com/hdp/HDP/centos7/2.x/updates/2.3.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-2.3.4.0-3485', 'mirror_list': None}
2016-02-11 21:20:11,357 - File['/etc/yum.repos.d/HDP-2.3.4.0-3485.repo'] {'content': '[HDP-2.3.4.0-3485]\nname=HDP-2.3.4.0-3485\nbaseurl=http://Lnx0.localdomain.com/hdp/HDP/centos7/2.x/updates/2.3.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-02-11 21:20:11,358 - Writing File['/etc/yum.repos.d/HDP-2.3.4.0-3485.repo'] because contents don't match
2016-02-11 21:20:11,358 - Repository['HDP-UTILS-2.3.4.0-3485'] {'append_to_file': True, 'base_url': 'http://Lnx0.localdomain.com/hdp/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-2.3.4.0-3485', 'mirror_list': None}
2016-02-11 21:20:11,363 - File['/etc/yum.repos.d/HDP-2.3.4.0-3485.repo'] {'content': '[HDP-2.3.4.0-3485]\nname=HDP-2.3.4.0-3485\nbaseurl=http://Lnx0.localdomain.com/hdp/HDP/centos7/2.x/updates/2.3.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-2.3.4.0-3485]\nname=HDP-UTILS-2.3.4.0-3485\nbaseurl=http://Lnx0.localdomain.com/hdp/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-02-11 21:20:11,364 - Writing File['/etc/yum.repos.d/HDP-2.3.4.0-3485.repo'] because contents don't match
2016-02-11 21:20:11,404 - Package['fuse'] {}
2016-02-11 21:20:11,558 - Skipping installation of existing package fuse
2016-02-11 21:20:11,558 - Package['fuse-libs'] {}
2016-02-11 21:20:11,587 - Skipping installation of existing package fuse-libs
2016-02-11 21:20:11,602 - Package['rpcbind'] {'use_repos': ['HDP-2.3.4.0-3485', 'HDP-UTILS-2.3.4.0-3485'], 'skip_repos': ['HDP-*']}
2016-02-11 21:20:11,603 - Installing package rpcbind ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 rpcbind')
2016-02-11 21:20:12,250 - Package['hadoop_2_3_*'] {'use_repos': ['HDP-2.3.4.0-3485', 'HDP-UTILS-2.3.4.0-3485'], 'skip_repos': ['HDP-*']}
2016-02-11 21:20:12,250 - Installing package hadoop_2_3_* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 'hadoop_2_3_*'')
2016-02-11 21:20:12,932 - Package['snappy'] {'use_repos': ['HDP-2.3.4.0-3485', 'HDP-UTILS-2.3.4.0-3485'], 'skip_repos': ['HDP-*']}
2016-02-11 21:20:12,932 - Installing package snappy ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 snappy')
2016-02-11 21:20:20,569 - Package Manager failed to install packages. Error: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 snappy' returned 1. Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
Requires: snappy(x86-64) = 1.0.5-1.el6
Removing: snappy-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
snappy(x86-64) = 1.0.5-1.el6
Updated By: snappy-1.1.0-3.el7.x86_64 (ol7_latest)
snappy(x86-64) = 1.1.0-3.el7
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 357, in install_packages
skip_repos=[self.REPO_FILE_NAME_PREFIX + "*"] if OSCheck.is_redhat_family() else [])
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0-3485,HDP-UTILS-2.3.4.0-3485 snappy' returned 1. Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
Requires: snappy(x86-64) = 1.0.5-1.el6
Removing: snappy-1.0.5-1.el6.x86_64 (@HDP-UTILS-1.1.0.20)
snappy(x86-64) = 1.0.5-1.el6
Updated By: snappy-1.1.0-3.el7.x86_64 (ol7_latest)
snappy(x86-64) = 1.1.0-3.el7
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2016-02-11 21:20:20,570 - Installation of packages failed. Checking if installation was partially complete
2016-02-11 21:20:20,570 - Old versions: ['2.3.2.0-2950', '2.3.4.0-3485']
2016-02-11 21:20:20,614 - New versions: ['2.3.2.0-2950', '2.3.4.0-3485']
2016-02-11 21:20:20,615 - Deltas: set([])

But snappy.x86_64 1.0.5-1.el6 has already been installed; please check below:

[root@Lnx0 ambari]# yum list snappy
Loaded plugins: langpacks, ulninfo
Repository HDP-2.3.4.0 is listed more than once in the configuration
Repository HDP-UTILS-1.1.0.20 is listed more than once in the configuration
Installed Packages
snappy.x86_64    1.0.5-1.el6    @HDP-UTILS-1.1.0.20
Available Packages
snappy.i686      1.1.0-3.el7    ol7_latest
snappy.x86_64    1.1.0-3.el7    ol7_latest
[root@Lnx0 ambari]#
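
A quick way to see the conflict on the affected node (a diagnostic sketch; both commands only read state, changing nothing):

# Show every snappy build yum can see, with its source repo
yum --showduplicates list snappy
# Show which installed packages pin the old el6 build
rpm -q --whatrequires 'snappy(x86-64)'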
Tags:
- Hadoop Core
- upgrade