introspection (prepare images) failing in 3rd party ovb jobs fs001/35

Bug #1817598 reported by wes hayutin

This bug affects 1 person

Affects: tripleo
Status: Fix Released
Importance: Critical
Assigned to: Unassigned
Milestone: (none)

Bug Description

https://logs.rdoproject.org/75/638775/4/openstack-check/tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset001/18390e5/job-output.txt.gz#_2019-02-25_12_00_14_293447

TASK [overcloud-prep-images : Prepare the overcloud images for deploy] *********
2019-02-25 12:00:14.293447 | primary | Monday 25 February 2019 12:00:14 +0000 (0:00:03.376) 0:00:27.416 *******
2019-02-25 12:05:52.735322 | primary | fatal: [undercloud]: FAILED! => {
2019-02-25 12:05:52.736915 | primary | "changed": true,
2019-02-25 12:05:52.737221 | primary | "cmd": "set -o pipefail && /home/zuul/overcloud-prep-images.sh 2>&1 | awk '{ print strftime(\"%Y-%m-%d %H:%M:%S |\"), $0; fflush(); }' > /home/zuul/overcloud_prep_images.log",
2019-02-25 12:05:52.737900 | primary | "delta": "0:05:37.729674",
2019-02-25 12:05:52.737988 | primary | "end": "2019-02-25 12:05:52.678768",
2019-02-25 12:05:52.738042 | primary | "rc": 1,
2019-02-25 12:05:52.738142 | primary | "start": "2019-02-25 12:00:14.949094"
2019-02-25 12:05:52.738168 | primary | }
2019-02-25 12:05:52.738187 | primary |

https://logs.rdoproject.org/75/638775/4/openstack-check/tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset001/18390e5/logs/undercloud/home/zuul/overcloud_prep_images.log.txt.gz#_2019-02-25_12_05_52

2019-02-25 12:05:52 | RegisterOrUpdateError: Exception registering nodes: {u'status': u'FAILED', u'message': [{u'result': u'Node cbc2a92b-4a44-46d6-82ca-373ec01726f9 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node cbc2a92b-4a44-46d6-82ca-373ec01726f9. Error: IPMI call failed: power status.'}, {u'result': u'Node 461d1cf3-0d3c-46bd-852a-c2437d6d9673 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 461d1cf3-0d3c-46bd-852a-c2437d6d9673. Error: IPMI call failed: power status.'}, {u'result': u'Node 6294311b-ffaa-42cd-b2bd-05fa376930c8 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 6294311b-ffaa-42cd-b2bd-05fa376930c8. Error: IPMI call failed: power status.'}, {u'result': u'Node c6d05de8-c871-46b5-a383-90a8d3f12b63 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node c6d05de8-c871-46b5-a383-90a8d3f12b63. Error: IPMI call failed: power status.'}], u'result': u'Failure caused by error in tasks: send_message\n\n send_message [task_ex_id=b9cf1c32-a19b-49fe-aba8-9f02329df172] -> Workflow failed due to message status. Status:FAILED Message:({u\'result\': u\'Node cbc2a92b-4a44-46d6-82ca-373ec01726f9 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node cbc2a92b-4a44-46d6-82ca-373ec01726f9. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node 461d1cf3-0d3c-46bd-852a-c2437d6d9673 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 461d1cf3-0d3c-46bd-852a-c2437d6d9673. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node 6294311b-ffaa-42cd-b2bd-05fa376930c8 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 6294311b-ffaa-42cd-b2bd-05fa376930c8. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node c6d05de8-c871-46b5-a383-90a8d3f12b63 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node c6d05de8-c871-46b5-a383-90a8d3f12b63. Error: IPMI call failed: power status.\'})\n [wf_ex_id=f41c1195-91a2-4b8c-83ae-c7a85ef5651a, idx=0]: Workflow failed due to message status. Status:FAILED Message:({u\'result\': u\'Node cbc2a92b-4a44-46d6-82ca-373ec01726f9 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node cbc2a92b-4a44-46d6-82ca-373ec01726f9. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node 461d1cf3-0d3c-46bd-852a-c2437d6d9673 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 461d1cf3-0d3c-46bd-852a-c2437d6d9673. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node 6294311b-ffaa-42cd-b2bd-05fa376930c8 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node 6294311b-ffaa-42cd-b2bd-05fa376930c8. Error: IPMI call failed: power status.\'}, {u\'result\': u\'Node c6d05de8-c871-46b5-a383-90a8d3f12b63 did not reach state "manageable", the state is "enroll", error: Failed to get power state for node c6d05de8-c871-46b5-a383-90a8d3f12b63. Error: IPMI call failed: power status.\'})\n'}
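For triage, one way to confirm which nodes are stuck in "enroll" is to list the provision and power state of every Ironic node on the undercloud. A minimal sketch using openstacksdk; the cloud name "undercloud" is an assumption for illustration, not taken from the job:

    # Hypothetical triage helper: list Ironic node states on the undercloud.
    # Assumes a clouds.yaml entry named "undercloud"; adjust for your environment.
    import openstack

    conn = openstack.connect(cloud='undercloud')

    for node in conn.baremetal.nodes(details=True):
        # Nodes stuck in "enroll" with a power-state error match the failure above.
        print("{} provision={} power={} last_error={}".format(
            node.id, node.provision_state, node.power_state, node.last_error))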

https://logs.rdoproject.org/75/638775/4/openstack-check/tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset001/18390e5/logs/bmc-76484-console.log

The root cause is that the overcloud nodes are not able to get an IP address from rdo-cloud, which leaves openstackbmc unable to bind its IPMI listener address:

[ 183.791620] openstackbmc[2796]: File "/usr/lib/python2.7/site-packages/pyghmi/ipmi/private/session.py", line 373, in _assignsocket
[ 183.792667] openstackbmc[2796]: tmpsocket.bind(server[4])
[ 183.794226] openstackbmc[2796]: File "/usr/lib64/python2.7/socket.py", line 224, in meth
[ 183.794919] openstackbmc[2796]: return getattr(self._sock,name)(*args)
[ 183.795631] openstackbmc[2796]: socket.error: [Errno 99] Cannot assign requested address
[ 183.839068] openstackbmc[2798]: Traceback (most recent call last):
[ 183.839375] openstackbmc[2798]: File "/usr/local/bin/openstackbmc", line 322, in <module>
[ 183.840111] openstackbmc[2798]: main()
[ 183.840757] openstackbmc[2798]: File "/usr/local/bin/openstackbmc", line 317, in main
[ 183.841440] openstackbmc[2798]: os_cloud=args.os_cloud)
[ 183.842488] openstackbmc[2798]: File "/usr/local/bin/openstackbmc", line 52, in __init__
[ 183.843571] openstackbmc[2798]: address=address)
[ 183.844628] openstackbmc[2798]: File "/usr/lib/python2.7/site-packages/pyghmi/ipmi/private/serversession.py", line 271, in __init__
[ 183.845716] openstackbmc[2798]: self.serversocket = ipmisession.Session._assignsocket(addrinfo)
[ 183.846652] openstackbmc[2798]: File "/usr/lib/python2.7/site-packages/pyghmi/ipmi/private/session.py", line 373, in _assignsocket
[ 183.847945] openstackbmc[2798]: tmpsocket.bind(server[4])
[ 183.848816] openstackbmc[2798]: File "/usr/lib64/python2.7/socket.py", line 224, in meth
[ 183.850143] openstackbmc[2798]: return getattr(self._sock,name)(*args)
[ 183.850966] openstackbmc[2798]: socket.error: [Errno 99] Cannot assign requested address
[ 184.201130] os-net-config[2831]: /usr/lib/python2.7/site-packages/requests/__init__.py:91: RequestsDependencyWarning: urllib3 (1.16) or chardet (2.2.1) doesn't match a supported version!
[ 184.206968] os-net-config[2831]: RequestsDependencyWarning)
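The bind failure in the traceback above can be reproduced with a few lines of Python: binding a UDP socket to an address that is not configured on any local interface raises exactly this error, which is what happens when the BMC instance never receives its IP from rdo-cloud. A small sketch; 192.0.2.10 is an arbitrary documentation-range address assumed not to be configured locally:

    # Minimal reproduction of the "Errno 99" bind failure seen in openstackbmc.
    # Assumption: 192.0.2.10 is NOT assigned to any local interface.
    import errno
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.bind(('192.0.2.10', 623))  # 623/udp is the standard IPMI port
    except socket.error as exc:
        # pyghmi hits the same error in Session._assignsocket -> tmpsocket.bind()
        assert exc.errno == errno.EADDRNOTAVAIL  # Errno 99
        print("Cannot assign requested address:", exc)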

This is most likely due to the rdo-cloud outage on 2/23. The outage is still ongoing; please contact support.

Revision history for this message
Rafael Folco (rafaelfolco) wrote :

fs001 passed; fs035 is failing for a different error. Closing.

Changed in tripleo:
status: Triaged → Fix Released