15:00:10 #startmeeting qa
15:00:10 Meeting started Tue May 30 15:00:10 2023 UTC and is due to finish in 60 minutes. The chair is kopecmartin. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:10 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:10 The meeting name has been set to 'qa'
15:00:16 #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting#Agenda_for_next_Office_hours
15:00:18 agenda ^^
15:05:54 o/ sorry, got distracted
15:06:21 np, me too
15:06:41 let's start
15:07:46 #topic Announcement and Action Item (Optional)
15:08:04 there's gonna be an in-person PTG in 2 weeks during the summit in Vancouver
15:09:21 #topic Bobcat Priority Items progress
15:09:33 #link https://etherpad.opendev.org/p/qa-bobcat-priority
15:10:15 the cirros update was done, I think?
15:10:30 https://review.opendev.org/c/openstack/devstack/+/881437
15:10:58 correct
15:11:00 it was
15:11:13 for the venv/bookworm patch I pushed an update earlier today and it has some newish failures, need to look into those
15:11:57 #link https://review.opendev.org/c/openstack/devstack/+/558930
15:12:36 ack, thanks
15:12:45 for cirros there are new versions pending to be released with updated kernels
15:13:00 but switching to those should hopefully go much more smoothly
15:13:29 #link https://github.com/cirros-dev/cirros/issues/102
15:14:00 \o/ in the end it wasn't that bad with the dhcp client change, it just took me forever to find time to propose the changes :/
15:15:09 for the venv patch maybe someone with more rocky/centos experience could look into the failures on those distros
15:19:01 seems there are no volunteers around, so we can go on ;)
15:19:06 i'll try, or at least i'll try to ping someone who can :)
15:19:15 yay
15:19:27 yeah, sorry, i had to step out for a quick call ..
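[editor's note: the cirros bump discussed above is normally a small devstack setting change; a hedged sketch of what a local.conf override could look like. The `CIRROS_VERSION` and `DEFAULT_IMAGE_NAME` variable names follow devstack's stackrc, but the exact version number is an assumption, since the new cirros releases were still pending at meeting time (cirros issue #102):]

```ini
[[local|localrc]]
# Hypothetical pin to a newer cirros release; the real version
# depends on what cirros-dev actually ships.
CIRROS_VERSION=0.6.2
DEFAULT_IMAGE_NAME=cirros-0.6.2-x86_64-disk
```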
15:19:29 let's move on
15:19:48 #topic OpenStack Events Updates and Planning
15:20:13 the etherpads for the upcoming in-person PTG in Vancouver were autogenerated
15:20:14 #link https://ptg.opendev.org/etherpads.html
15:20:26 here is ours
15:20:27 #link https://etherpad.opendev.org/p/vancouver-june2023-qa
15:20:46 if anyone has anything to discuss, feel free to add it there in advance
15:21:10 AI for me to promote this on the ML and add a structure to the etherpad
15:21:34 #topic Gate Status Checks
15:21:40 #link https://review.opendev.org/q/label:Review-Priority%253D%252B2+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:21:56 2 reviews there
15:21:57 two config error fixes, please approve
15:23:17 * kopecmartin looking
15:23:40 also CI for those branches seems pretty broken
15:24:09 but I guess we cannot simply retire them while others still want to run jobs
15:24:24 uff, yeah, i see we need a lot of non-voting jobs
15:24:33 yeah
15:25:00 approved
15:25:47 #topic Bare rechecks
15:25:51 #link https://etherpad.opendev.org/p/recheck-weekly-summary
15:25:56 we're doing well
15:26:08 QA has a bare recheck rate below 13% over the last 90 days
15:26:25 #topic Periodic jobs Status Checks
15:26:25 periodic stable full
15:26:25 #link https://zuul.openstack.org/builds?pipeline=periodic-stable&job_name=tempest-full-yoga&job_name=tempest-full-xena&job_name=tempest-full-zed&job_name=tempest-full-2023-1
15:26:27 periodic stable slow
15:26:29 #link https://zuul.openstack.org/builds?job_name=tempest-slow-2023-1&job_name=tempest-slow-zed&job_name=tempest-slow-yoga&job_name=tempest-slow-xena
15:26:31 periodic extra tests
15:26:33 #link https://zuul.openstack.org/builds?job_name=tempest-full-2023-1-extra-tests&job_name=tempest-full-zed-extra-tests&job_name=tempest-full-yoga-extra-tests&job_name=tempest-full-xena-extra-tests
15:26:35 periodic master
15:26:37 #link
https://zuul.openstack.org/builds?project=openstack%2Ftempest&project=openstack%2Fdevstack&pipeline=periodic
15:27:24 the centos 9 stream fips job got fixed
15:27:42 by this
15:27:45 #link https://review.opendev.org/c/openstack/devstack/+/884277
15:28:06 slow-2023-1 looks unstable, and some errors on master today
15:28:34 yeah, one post failure, one timeout, let's monitor that one
15:29:12 tempest-full-test-account-no-admin-py3 and one other got broken by enabling enforce scope in glance and nova by default in devstack
15:29:14 #link https://bugs.launchpad.net/tempest/+bug/2020859
15:29:22 #link https://bugs.launchpad.net/tempest/+bug/2020860
15:30:18 the other jobs (devstack-no-tls-proxy and tempest-slow-parallel) failed due to a timeout in some requests in a few tests
15:30:24 :/
15:31:33 let's see if it's gonna repeat
15:31:50 #topic Distros check
15:31:51 cs-9
15:31:52 #link https://zuul.openstack.org/builds?job_name=tempest-full-centos-9-stream&job_name=devstack-platform-centos-9-stream&skip=0
15:31:54 fedora
15:32:01 #link https://zuul.openstack.org/builds?job_name=devstack-platform-fedora-latest&skip=0
15:32:01 debian
15:32:01 #link https://zuul.openstack.org/builds?job_name=devstack-platform-debian-bullseye&skip=0
15:32:02 focal
15:32:04 #link https://zuul.opendev.org/t/openstack/builds?job_name=devstack-platform-ubuntu-focal&skip=0
15:32:06 rocky
15:32:08 #link https://zuul.openstack.org/builds?job_name=devstack-platform-rocky-blue-onyx
15:32:10 openEuler
15:32:12 #link https://zuul.openstack.org/builds?job_name=devstack-platform-openEuler-22.03-ovn-source&job_name=devstack-platform-openEuler-22.03-ovs&skip=0
15:32:40 all looks quite good, i expected worse :D
15:33:39 #topic Sub Teams highlights
15:33:39 Changes with Review-Priority == +1
15:33:41 o/
15:33:46 #link https://review.opendev.org/q/label:Review-Priority%253D%252B1+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:33:58 nothing there
15:34:00 #topic Open Discussion
15:34:04 anything for the open discussion?
15:34:28 yes
15:34:47 I spoke with elodilles briefly about the pending stable/newton cleanups
15:35:12 for grenade also?
15:35:23 grenade and devstack
15:35:26 cool
15:35:36 not sure yet how to proceed
15:36:05 those eols happened on the edge of transitioning from manual branch deletion to automated
15:36:35 so either a gerrit admin can clean up manually or we can try to run the automation for those still
15:37:16 if it needs more work on the automation scripts, maybe asking a gerrit admin to remove them manually will be easier
15:38:25 I'd ask for that myself, but I'd still like to have the approval from the release team first
15:38:33 i'd also be for the option that requires less input from our side
15:38:48 ++
15:39:55 you can see the history quite nicely here https://review.opendev.org/admin/repos/openstack/devstack,tags
15:41:44 anyway, waiting for feedback for now
15:41:56 and that's it from me
15:42:09 frickler: thanks for handling it
15:42:27 I'd like to mention the failure in the test-account job that started to appear when we enabled enforce scope in devstack for nova. I want to work on fixing it. Currently there seems to be an issue with getting users from the same project and distinguishing users with member and reader roles, for pre-provisioned credentials. I do not know whether someone has an idea of how I should proceed or what I should be aware of. I have a rough idea but someone might know more. -- https://review.opendev.org/c/openstack/tempest/+/884509
15:43:08 yeah, pre-provisioned accounts are not yet ready for the new RBAC, it is not a small change
15:43:28 I started looking into those and found the issues, but will work this week to propose something
15:43:40 so disable scope in that job for now?
15:43:57 frickler: yeah, that is one option.
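[editor's note: a minimal sketch of the temporary workaround being discussed — turning the new scope enforcement back off for just the affected job via devstack job variables. The `NOVA_ENFORCE_SCOPE`/`GLANCE_ENFORCE_SCOPE` names mirror devstack's scope toggles and the parent job name is an assumption; treat the whole fragment as illustrative, not as the actual patch:]

```yaml
# Hypothetical Zuul job override in tempest's zuul.d/
- job:
    name: tempest-full-test-account-no-admin-py3
    parent: tempest-full-py3   # assumed parent
    vars:
      devstack_localrc:
        # Re-enable these once pre-provisioned credentials
        # support the new RBAC defaults.
        NOVA_ENFORCE_SCOPE: false
        GLANCE_ENFORCE_SCOPE: false
```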
15:44:20 until we get the pre-provisioned account setup ready
15:44:38 I can propose the disable today so that the job continues running
15:44:56 gmann: ack, sounds good to me :)
15:45:21 There seems to be a lot of work with the pre-provisioned credentials.
15:45:29 and once this gets fixed we need to make these jobs voting, otherwise they get broken many times
15:45:40 yeah, and the no-admin one also
15:45:45 gmann: +1
15:46:30 I cleaned up some object_storage tests that were causing failures in this job. But once I finished, this new error started appearing.
15:46:50 #link https://bugs.launchpad.net/tempest/+bug/1996624
15:46:56 #link https://review.opendev.org/c/openstack/tempest/+/881575
15:47:16 the new errors are unrelated to the fix and the LP
15:47:33 k, will check those today or tomorrow
15:47:34 lpiwowar fixed the bug, thank you; once the gate is stable/fixed, we can proceed
15:47:50 kopecmartin: ack
15:47:56 there is one more interesting bug
15:47:57 ++ thanks lpiwowar
15:47:59 #link https://bugs.launchpad.net/tempest/+bug/2020659
15:48:18 which was caused, somehow, by this
15:48:20 #link https://review.opendev.org/c/openstack/tempest/+/881675
15:48:28 that patch uncovered something broken in the logic
15:48:40 there is a fix in progress
15:48:42 #link https://review.opendev.org/c/openstack/tempest/+/884584
15:49:00 it did not fail in that change?
15:49:08 no
15:49:17 it failed in other jobs in a totally different project
15:49:31 in order to fail you had to set this
15:49:31 [network].floating_network_name = public
15:49:41 humm, some different config
15:49:43 I commented on this today. Is the verify_ssh() function needed altogether?
15:49:54 yep, and that led to a different execution path ...
15:49:58 it's described in the LP
15:50:15 yeah, there is a discussion in 884584
15:50:32 not sure how to proceed, seems like the whole logic (if/else) is quite outdated
15:50:47 We wait for the server to be sshable.
So create_server() should do the checking instead of verify_ssh()?
15:50:47 anyway, we can continue discussing there
15:50:49 ack, let me check. we can make it more consistent with other tests' SSH verification
15:51:03 lpiwowar: yeah
15:52:06 added it to the list, will check this today
15:52:13 thanks gmann
15:52:16 gmann: ack :)
15:52:27 i also commented on your comment gmann here
15:52:29 #link https://review.opendev.org/c/openstack/tempest/+/879923
15:52:33 regarding the cleanup
15:52:39 and resource naming
15:53:07 kopecmartin: ack
15:53:33 #topic Bug Triage
15:53:37 #link https://etherpad.openstack.org/p/qa-bug-triage-bobcat
15:53:41 all recorded there ^^
15:53:49 and as we've already covered the bugs
15:53:53 this is all from my side
15:54:18 if there isn't anything else ...
15:54:32 thank you everyone, see you online
15:54:33 #endmeeting