15:00:16 #startmeeting qa
15:00:16 Meeting started Tue Jan 10 15:00:16 2023 UTC and is due to finish in 60 minutes. The chair is kopecmartin. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:16 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:16 The meeting name has been set to 'qa'
15:00:24 #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting#Agenda_for_next_Office_hours
15:00:27 agenda ^
15:01:22 o/
15:01:28 Hi o/
15:01:35 \o
15:03:36 let's get started
15:03:39 #topic Announcement and Action Item (Optional)
15:03:56 this is the first qa office hour this year \o/
15:03:56 happy new year everyone
15:04:08 Happy new year! \o/
15:04:24 also, planning for the next PTG has started, but more about that in a second
15:04:54 #topic Antelope Priority Items progress
15:04:55 #link https://etherpad.opendev.org/p/qa-antelope-priority
15:05:10 * kopecmartin checking the status as it's been a while
15:06:40 lib/neutron cleanup seems finished
15:06:59 yeap, all is done AFAICT
15:07:55 great news, yup, it looks done
15:08:17 btw, slaweq what about this "Decide which job variant should become the new tempest default"?
15:08:22 I see the review merged
15:08:28 is there anything else that needs to be done?
15:09:52 kopecmartin IIRC the conclusion was that we should move grenade jobs to run the ovn backend by default
15:09:53 and keep it with ovn like it is currently
15:09:53 that's done, let me find the patches
15:10:14 https://review.opendev.org/c/openstack/grenade/+/862475
15:10:29 this is the patch which changed it
15:10:29 yup, this one's there and merged
15:11:00 I also wonder if we should add ceph@jammy to that list for tracking, otherwise I fear we'll have a release without proper ceph testing
15:11:35 kopecmartin: What about the "unit test coverage"? Do we want it? I think it should be a simple patch.
15:12:23 frickler: to the list within "Decide which job variant should become the new tempest default"?
15:12:48 no, as a dedicated topic for A
15:12:57 lpiwowar: we do want it, but that task is great for a new contributor and we can wait a bit with that
15:13:08 kopecmartin: ack
15:13:09 frickler: right, i see, yes, we can do that
15:15:05 o.k., added, will fill in some details later
15:15:17 frickler: perfect, thank you
15:15:46 that leaves us with one last effort for the cycle - "Use admin clients *only if* admin access needed"
15:16:09 arxcruz: anything new ^?
15:16:23 just wondering, i know it's quite early in the year
15:16:23 kopecmartin: nope
15:16:37 ack
15:17:07 moving on then
15:17:08 #topic OpenStack Events Updates and Planning
15:17:25 the name of the next release is Bobcat
15:17:37 we can already register for the next virtual PTG
15:17:48 I've signed the QA team up already
15:18:04 i'll create an etherpad for gathering topics to discuss soon
15:18:35 the date of the PTG is March 27-31, 2023
15:18:40 #link https://openinfra.dev/ptg/
15:20:45 * kopecmartin updated the office hour wiki page
15:20:47 #topic Gate Status Checks
15:20:54 #link https://review.opendev.org/q/label:Review-Priority%253D%252B2+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:21:04 nothing there, anything urgent to review?
15:23:01 #topic Bare rechecks
15:23:06 #link https://etherpad.opendev.org/p/recheck-weekly-summary
15:23:48 QA is at ~18% - 28 bare rechecks out of 158
15:24:40 #topic Periodic jobs Status Checks
15:24:46 stable
15:24:47 #link https://zuul.openstack.org/builds?job_name=tempest-full-yoga&job_name=tempest-full-xena&job_name=tempest-full-wallaby-py3&job_name=tempest-full-victoria-py3&job_name=tempest-full-ussuri-py3&job_name=tempest-full-zed&pipeline=periodic-stable
15:24:54 master
15:24:56 #link https://zuul.openstack.org/builds?project=openstack%2Ftempest&project=openstack%2Fdevstack&pipeline=periodic
15:25:18 looks good
15:25:41 #topic Distros check
15:25:48 cs-9
15:25:49 #link https://zuul.openstack.org/builds?job_name=tempest-full-centos-9-stream&job_name=devstack-platform-centos-9-stream&skip=0
15:25:54 fedora
15:25:55 #link https://zuul.openstack.org/builds?job_name=devstack-platform-fedora-latest&skip=0
15:26:01 debian
15:26:02 #link https://zuul.openstack.org/builds?job_name=devstack-platform-debian-bullseye&skip=0
15:26:07 focal
15:26:07 #link https://zuul.opendev.org/t/openstack/builds?job_name=devstack-platform-ubuntu-focal&skip=0
15:26:13 rocky
15:26:13 #link https://zuul.openstack.org/builds?job_name=devstack-platform-rocky-blue-onyx
15:26:58 wow, it looks better than ever \o/
15:27:21 pretty green indeed
15:27:32 a great start to the year
15:28:49 #topic Sub Teams highlights
15:28:54 Changes with Review-Priority == +1
15:28:58 #link https://review.opendev.org/q/label:Review-Priority%253D%252B1+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:29:19 2 reviews
15:29:32 one is in the gate queue already
15:29:38 the second
15:29:41 #link https://review.opendev.org/c/openstack/tempest/+/869440
15:29:50 gmann: when you have a sec, can you have a look ^? thanks
15:30:10 #topic Open Discussion
15:30:14 anything for the open discussion?
15:30:45 we should at least mention the tox 4 fun once
15:30:55 seems to be mostly under control by now
15:31:03 yes, i was about to
15:31:12 thanks to the workarounds it looks ok, true
15:31:26 although it was pointed out on the ML that devstack unnecessarily installs tox several times
15:31:34 and that hasn't been addressed yet
15:31:48 it kinda relates to the issues we've seen with tox4
15:32:22 I'm working on a fix for the tempest-multinode-full-py3 job. I'm still unsuccessful. If someone has some input I would be really grateful :)
15:32:44 ^^ https://review.opendev.org/c/openstack/tempest/+/866692
15:34:33 quite important to mention that it's probably not a tempest bug, it somehow fails because of the cirros image
15:34:35 #link https://bugs.launchpad.net/tempest/+bug/1998916
15:35:12 lpiwowar: let's try to raise the cirros issue on the ML, maybe someone will have more insight
15:35:26 cirros seems to have some issues in the CI in general, yes
15:35:39 didn't get to investigate further yet
15:35:53 some might be due to network initialization
15:36:00 yeah :/ we tried to bump the version but that didn't help
15:36:05 kopecmartin: Ok, I will do that :)
15:36:23 https://github.com/cirros-dev/cirros/issues/97
15:37:09 is there something you need the updated cirros for?
15:37:22 that could be related, it would explain the timeouts
15:37:43 frickler: no, we just wanted to verify whether the error can be reproduced with a newer version of cirros or whether it's been fixed
15:38:59 o.k., it seems neutron also has some issues with updating cirros, so I'd suggest sticking to 0.5.x for now
15:39:14 I think it might also be possible that it is not caused by the cirros image. Today I encountered an error: "Instance f4e582fe-52e6-40e3-b366-2b11ed589089 could not be found."
15:39:24 as sad as that makes me with my cirros hat on
15:40:02 I will investigate a bit more, write the email to the ML, and give an update next week.
15:40:25 ok
15:40:39 thanks
15:41:06 btw regarding the tox4, maybe that could be another item in the priority etherpad - to get rid of multiple tox installations
15:41:14 in devstack
15:43:02 is that for the tempest part? or in some larger context?
15:43:44 ovn stuff installs it, which conflicts with the efforts gmann made to pin tox
15:44:04 it sounds like it is installed via ansible somewhere for devstack and also at least by the ovn tooling and maybe also by tempest stuff
15:44:33 we probably want a central "install_tox" function that checks if it is already installed and, if not, installs the desired version; then everything can call that
15:45:20 separately, tempest is installed with tox many redundant times too
15:45:44 ah, ok, install_tox() sounds like a good idea
15:48:06 sorry, i had to go to another meeting
15:48:12 anything else for the open discussion?
15:48:58 #topic Bug Triage
15:48:59 #link https://etherpad.openstack.org/p/qa-bug-triage-antelope
15:49:04 numbers recorded
15:49:15 that's unfortunately all i had time for
15:49:45 if there isn't anything else, let's close this for today
15:49:49 thanks everyone :)
15:49:55 #endmeeting
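
[Editor's sketch] The central "install_tox" function discussed in the open discussion could look roughly like the bash sketch below. This is a minimal illustration, not actual devstack code: the function names, the `TOX_DESIRED_VERSION` variable, and the pip invocation are all hypothetical; a real patch would follow devstack's existing helper conventions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a single install_tox entry point, so that
# devstack, the OVN tooling, and tempest setup all call one function
# instead of each installing tox on their own.

TOX_DESIRED_VERSION=${TOX_DESIRED_VERSION:-3.28.0}  # assumed pin, not the real one

# Pure decision helper: prints "install" when tox is missing or the
# found version differs from the desired one, otherwise "skip".
function tox_install_action {
    local current="$1" desired="$2"
    if [ -z "$current" ] || [ "$current" != "$desired" ]; then
        echo "install"
    else
        echo "skip"
    fi
}

# Entry point that all callers would use.
function install_tox {
    local current=""
    if command -v tox >/dev/null 2>&1; then
        # tox<4 prints e.g. "3.28.0 imported from ..."; take the version field
        current=$(tox --version 2>/dev/null | awk '{print $1}')
    fi
    if [ "$(tox_install_action "$current" "$TOX_DESIRED_VERSION")" = "install" ]; then
        pip install "tox==$TOX_DESIRED_VERSION"
    fi
}
```

Keeping the version comparison in a separate helper makes the skip/install decision testable without actually invoking pip, and repeated calls from different tooling become cheap no-ops once the pinned version is present.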