from (pid=14317) _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py:419
2018-02-12 14:04:08.975 DEBUG cinderclient.v3.client [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] GET call to volumev3 for http://10.169.41.188/volume/v3/279b987f9ea2449b9f98fd94fb700fd8/volumes/67618d54-dd55-4f7e-91b3-39ffb3ba7f5f used request id req-0554e55f-d450-4572-ae57-7716bbe1ec2c from (pid=14317) request /usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py:722
2018-02-12 14:04:08.996 DEBUG oslo_concurrency.lockutils [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Lock "(u'zun-aio', ComputeNode(architecture='x86_64',cpu_used=2.0,cpus=2,created_at=2018-01-31T07:35:09Z,hostname='zun-aio',kernel_version='4.4.0-112-generic',labels={},mem_available=2698,mem_free=5935,mem_total=7983,mem_used=2048,numa_topology=NUMATopology,os='Ubuntu 16.04 LTS',os_type='linux',paused_containers=0,pci_device_pools=PciDevicePoolList,running_containers=3,stopped_containers=3,total_containers=3,updated_at=2018-02-12T06:04:00Z,uuid=76d633b6-76d9-4b05-b1a2-5fe05c52f868))" acquired by "zun.scheduler.host_state._locked_update" :: waited 0.000s from (pid=14317) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2018-02-12 14:04:08.997 DEBUG zun.scheduler.host_state [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Update host state from compute node: ComputeNode(architecture='x86_64',cpu_used=2.0,cpus=2,created_at=2018-01-31T07:35:09Z,hostname='zun-aio',kernel_version='4.4.0-112-generic',labels={},mem_available=2698,mem_free=5935,mem_total=7983,mem_used=2048,numa_topology=NUMATopology,os='Ubuntu 16.04 LTS',os_type='linux',paused_containers=0,pci_device_pools=PciDevicePoolList,running_containers=3,stopped_containers=3,total_containers=3,updated_at=2018-02-12T06:04:00Z,uuid=76d633b6-76d9-4b05-b1a2-5fe05c52f868) from (pid=14317) _locked_update /opt/stack/zun/zun/scheduler/host_state.py:50
2018-02-12 14:04:08.998 DEBUG zun.scheduler.host_state [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Update host state with service: ZunService(binary='zun-compute',created_at=2018-01-31T07:35:09Z,disabled=False,disabled_reason=None,forced_down=False,host='zun-aio',id=1,last_seen_up=2018-02-12T06:03:16Z,report_count=16899,updated_at=2018-02-12T06:03:16Z) from (pid=14317) _locked_update /opt/stack/zun/zun/scheduler/host_state.py:53
2018-02-12 14:04:08.999 DEBUG oslo_concurrency.lockutils [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Lock "(u'zun-aio', ComputeNode(architecture='x86_64',cpu_used=2.0,cpus=2,created_at=2018-01-31T07:35:09Z,hostname='zun-aio',kernel_version='4.4.0-112-generic',labels={},mem_available=2698,mem_free=5935,mem_total=7983,mem_used=2048,numa_topology=NUMATopology,os='Ubuntu 16.04 LTS',os_type='linux',paused_containers=0,pci_device_pools=PciDevicePoolList,running_containers=3,stopped_containers=3,total_containers=3,updated_at=2018-02-12T06:04:00Z,uuid=76d633b6-76d9-4b05-b1a2-5fe05c52f868))" released by "zun.scheduler.host_state._locked_update" :: held 0.003s from (pid=14317) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2018-02-12 14:04:09.000 DEBUG zun.scheduler.base_filters [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Starting with 1 host(s) from (pid=14317) get_filtered_objects /opt/stack/zun/zun/scheduler/base_filters.py:67
2018-02-12 14:04:09.001 DEBUG zun.scheduler.filters.cpu_filter [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] <zun.scheduler.host_state.HostState object at 0x7f60bc043ed0> does not have 2.00 usable vcpus, it only has 0.00 usable vcpus from (pid=14317) host_passes /opt/stack/zun/zun/scheduler/filters/cpu_filter.py:39
2018-02-12 14:04:09.001 INFO zun.scheduler.base_filters [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Filter CPUFilter returned 0 hosts
2018-02-12 14:04:09.002 DEBUG zun.scheduler.base_filters [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Filtering removed all hosts for the request with container ID '228452dc-8cf0-46fd-ac26-2d23851b0b31'. Filter results: [('CPUFilter', None)] from (pid=14317) get_filtered_objects /opt/stack/zun/zun/scheduler/base_filters.py:109
2018-02-12 14:04:09.002 INFO zun.scheduler.base_filters [req-68a556fb-62a9-4b2f-af44-f96cdec1f350 admin admin] Filtering removed all hosts for the request with container ID '228452dc-8cf0-46fd-ac26-2d23851b0b31'. Filter results: ['CPUFilter: (start: 1, end: 0)']
[pid: 14317|app: 0|req: 3/6] 10.169.41.188 () {62 vars in 1233 bytes} [Mon Feb 12 14:04:08 2018] POST /container/experimental/capsules/ => generated 791 bytes in 877 msecs (HTTP/1.1 202) 9 headers in 404 bytes (1 switches on core 0)
The error needs to be reported when the scheduler fails; as the log above shows, the filtering failure is currently only logged, not surfaced to the user.
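The CPUFilter rejection in the log above ("does not have 2.00 usable vcpus, it only has 0.00 usable vcpus") can be sketched roughly as follows. This is a minimal illustration of the check, not zun's actual code; the `HostState` fields and the `cpu_allocation_ratio` parameter are assumptions based on the log output.

```python
# Hypothetical sketch of a CPU filter check like the one in the log above.
# Field and parameter names are assumptions, not zun's exact API.
class HostState:
    def __init__(self, cpus, cpu_used):
        self.cpus = cpus          # total vcpus on the compute node
        self.cpu_used = cpu_used  # vcpus already claimed by containers


def host_passes(host, requested_vcpus, cpu_allocation_ratio=1.0):
    """Return True if the host has enough usable vcpus for the request."""
    usable = host.cpus * cpu_allocation_ratio - host.cpu_used
    if requested_vcpus > usable:
        # Matches the shape of the DEBUG message in the log above.
        print("host does not have %.2f usable vcpus, it only has %.2f "
              "usable vcpus" % (requested_vcpus, usable))
        return False
    return True


# The host in the log: cpus=2, cpu_used=2.0 -> 0.00 usable vcpus,
# so a request for 2.00 vcpus is rejected and the filter returns 0 hosts.
host = HostState(cpus=2, cpu_used=2.0)
```

With these numbers, `host_passes(host, 2.00)` returns `False`, which is why the filter chain ends with `CPUFilter: (start: 1, end: 0)` and the request fails.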
Reviewed: https://review.openstack.org/633371
Committed: https://git.openstack.org/cgit/openstack/zun/commit/?id=d0a7940981bb123bfc12d4ed2d24a4bf50e513e4
Submitter: Zuul
Branch: master
commit d0a7940981bb123bfc12d4ed2d24a4bf50e513e4
Author: Hongbin Lu <email address hidden>
Date: Sat Jan 26 23:50:22 2019 +0000
Consolidate Container and Capsule in compute
Previously, capsule create/delete had its own RPC API and compute
manager implementation. The code was largely duplicated from
the Container equivalent. In fact, the code duplication led
to bugs and missing features, and it was hard to maintain.
This commit refactors the compute node implementation for capsules.
First, the capsule RPC API is removed and the controller
uses the container RPC to create/delete capsules on the compute node.
Second, the capsule implementation is removed from the compute manager.
Instead, we reuse the container implementation for capsules.
Third, we introduce capsule operations in the container driver.
The capsule-specific logic will be implemented by different drivers.
After this patch, all existing container features (i.e. resource
tracking and claiming, asynchronous delete, etc.) are available
for capsules immediately. In the long term, the common implementation
for capsules and containers will be easier to maintain.
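The structure the commit message describes (capsule operations moved into the container driver so one compute-manager path serves both) can be sketched as below. All class and method names here are illustrative assumptions, not zun's actual classes.

```python
# Hypothetical sketch of the refactoring described above: capsule-specific
# hooks live on the container driver, so the compute manager has a single
# code path for both containers and capsules. Names are assumptions.
class ContainerDriver:
    """Base driver with common container operations."""

    def create(self, context, container):
        raise NotImplementedError

    def create_capsule(self, context, capsule):
        # Default: treat a capsule like an ordinary container.
        # Drivers may override this with capsule-specific logic.
        return self.create(context, capsule)


class DockerDriver(ContainerDriver):
    def create(self, context, container):
        return "created %s" % container.name


class Container:
    def __init__(self, name, is_capsule=False):
        self.name = name
        self.is_capsule = is_capsule


class ComputeManager:
    """One manager path handles both containers and capsules, so shared
    features (resource tracking, claiming, async delete) apply to both."""

    def __init__(self, driver):
        self.driver = driver

    def container_create(self, context, container):
        # Resource tracking and claiming would run here for both kinds.
        if container.is_capsule:
            return self.driver.create_capsule(context, container)
        return self.driver.create(context, container)
```

The key design point, per the commit message, is that the manager no longer branches into a duplicated capsule implementation; any divergence is pushed down into the driver layer.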
Closes-Bug: #1801649
Closes-Bug: #1777273
Partial-Bug: #1762902
Partial-Bug: #1751193
Partial-Bug: #1748825
Change-Id: Ie1d806738fcd945a4f370bdfc7fa8fb5fb815e8d