18:00:04 <knikolla> #startmeeting tc
18:00:04 <opendevmeet> Meeting started Tue Aug  8 18:00:04 2023 UTC and is due to finish in 60 minutes.  The chair is knikolla. Information about MeetBot at http://wiki.debian.org/MeetBot.
18:00:04 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
18:00:04 <opendevmeet> The meeting name has been set to 'tc'
18:00:11 <knikolla> #topic Roll Call
18:00:13 <slaweq> o/
18:00:16 <knikolla> Hi all, welcome to the weekly meeting of the OpenStack Technical Committee
18:00:19 <knikolla> A reminder that this meeting is held under the OpenInfra Code of Conduct available at https://openinfra.dev/legal/code-of-conduct
18:00:23 <knikolla> Today's meeting agenda can be found at https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee
18:00:25 <knikolla> o/
18:00:39 <dansmith> o/ (still finishing prior meeting)
18:00:42 <rosmaita> o/
18:00:43 <gmann> o/
18:01:05 <noonedeadpunk> o/
18:01:16 <knikolla> we have one noted absence from Amy.
18:01:22 <JayF> o/
18:02:49 <knikolla> #topic Follow up on past action items
18:02:54 <knikolla> We have one action item from the last meeting
18:02:58 <knikolla> rosmaita to review guidelines patch and poke at automating it
18:03:13 <rosmaita> did not have time to do any automation this week
18:03:14 <knikolla> (I imagine I meant writing SLURP release notes guidelines)
18:03:41 <knikolla> no worries, adding it back to the action list then :)
18:03:41 <rosmaita> i think i did update the patch to fix dansmith's issues
18:04:19 <knikolla> awesome, thanks.
18:04:29 <knikolla> #action rosmaita to poke at automating SLURP release guidelines.
18:04:58 <knikolla> dansmith: as the next topic item is gate check, we can wait a minute or two until you're back from the other meeting.
18:05:14 <dansmith> well,
18:05:29 <dansmith> I'd say progress has been made and things are slightly better now
18:05:31 <gmann> things are more stable now
18:05:43 <gmann> yeah, many fixes have been merged in the past couple of weeks
18:05:46 <knikolla> #topic Gate health check
18:05:50 <dansmith> but I'm still rechecking the nova patch that I was rechecking when we had this conversation last week
18:06:00 <dansmith> some of that is unrelated to other projects,
18:06:22 <slaweq> I saw a nova patch today which was rechecked 20 times before it merged
18:06:26 <dansmith> but we did effectively merge a gate break that took us a couple days to resolve
18:06:33 <dansmith> slaweq: yeah
18:06:37 <noonedeadpunk> we had some issues with ubuntu images related to libgcc1
18:06:47 <noonedeadpunk> But it was fixed quickly after being reported
18:07:08 <slaweq> I also started looking at the SQL queries which neutron is doing but I don't have anything important yet
18:07:09 <gmann> slaweq: it should be better now, we had many patches waiting on multiple patches to merge
18:07:38 <dansmith> I've seen no OOMs since we merged the patch to bring concurrency back down
18:07:46 <dansmith> as in, opensearch shows none
18:07:52 <noonedeadpunk> Though, we have spotted a couple of nasty regressions in keystone while upgrading to 2023.1, but that's a different topic...
18:07:54 <rosmaita> just caught 2 OOMs on a single run
18:07:57 <dansmith> but there were 12 in the one job I monitor the day before
18:08:02 <dansmith> rosmaita: which job?
18:08:07 <dansmith> we didn't back it down for all jobs
18:08:10 <gmann> rosmaita: and today?
18:08:56 <gmann> tempest-full-py3 and multinode jobs are now on 4 concurrency, so any job inheriting from those runs with 4
18:08:58 <rosmaita> cinder-tempest-plugin-lvm-lio-barbican hit two OOM errors:
18:08:58 <rosmaita> Aug 08 16:19:05 np0034894611 kernel: Out of memory: Killed process 49892 (mysqld)
18:08:58 <rosmaita> and not surprisingly, after that there are a bunch of 500s from identity service. System eventually recovers, and then:
18:08:58 <rosmaita> Aug 08 16:39:26 np0034894611 kernel: Out of memory: Killed process 126697 (qemu-system-x86)
18:08:58 <rosmaita> and a bunch of timeouts during tests
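A minimal sketch of how OOM kills like the ones in that excerpt could be tallied from a downloaded job syslog; the script name, the file path, and the assumption that the log keeps the "Out of memory: Killed process NNN (name)" format throughout are illustrative only, not a description of the opensearch-based checks mentioned above.

    # count_oom_kills.py - tally kernel OOM kills per process name in a job's syslog
    # (illustrative only; pass the path to a downloaded syslog/journal text file)
    import re
    import sys
    from collections import Counter

    OOM_RE = re.compile(r"Out of memory: Killed process \d+ \((?P<name>[^)]+)\)")

    def count_oom_kills(path):
        counts = Counter()
        with open(path, errors="replace") as f:
            for line in f:
                match = OOM_RE.search(line)
                if match:
                    counts[match.group("name")] += 1
        return counts

    if __name__ == "__main__":
        for name, n in count_oom_kills(sys.argv[1]).most_common():
            print(f"{name}: {n}")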
18:09:14 <rosmaita> so we should do whatever you did for those other jobs
18:09:37 <dansmith> what's the inheritance for that job?
18:10:04 <rosmaita> https://zuul.opendev.org/t/openstack/job/cinder-tempest-plugin-lvm-lio-barbican
18:10:07 <gmann> it is at 6 concurrency as it inherits from devstack-tempest
18:10:13 <dansmith> yeah, doesn't inherit from the one we lowered
18:10:31 <rosmaita> yeah, it inherits from devstack-tempest
18:10:34 <gmann> but let's keep monitoring whether there are more OOMs there and we can lower it accordingly
18:10:37 <rosmaita> what's the one you fixed?
18:10:43 <dansmith> that's good data: we've seen zero on the integrated-storage job but the other jobs are still hitting it
18:10:55 <gmann> rosmaita: https://review.opendev.org/c/openstack/tempest/+/890689
18:11:00 <rosmaita> thanks
18:11:13 <gmann> it lowers the concurrency to 4
18:11:15 <dansmith> lemme opensearch that job
18:11:24 <dansmith> 2 today so far, 4 yesterday
18:11:38 <dansmith> it runs a lot less than the other jobs so the numbers are definitely lower overall
18:11:52 <dansmith> highest in the last two weeks is 8 for scale
18:13:02 <dansmith> other than that, it's been just small gains to make things better
18:13:18 <gmann> let's keep monitoring whether it occurs there or in other jobs too
18:13:24 <dansmith> things are definitely better, but skipping a 100% fail regression took like four rechecks to land in tempest yesterday
18:13:30 <gmann> it is an easy fix to set it back to 4
18:13:38 <dansmith> yup
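For context, the tempest change linked above caps the number of parallel tempest workers via a job variable. A minimal sketch of what a similar override could look like for a child job, assuming the parent exposes a tempest_concurrency variable as devstack-tempest does; the job name below is hypothetical and this is not the content of the actual patch (890689).

    # hypothetical .zuul.yaml fragment: cap tempest workers for an OOM-prone job
    - job:
        name: example-lvm-lio-barbican-lower-concurrency
        parent: devstack-tempest
        vars:
          # run tempest with 4 workers instead of the parent's default of 6
          tempest_concurrency: 4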
18:14:21 <fungi> if you have a regression fix you need prioritized in the gate, don't hesitate to give me a heads-up
18:14:22 <gmann> yesterday I saw the highest number of patches merged in tempest in the last 5-6 months :) I checked this morning and many were merged.
18:14:45 <dansmith> fungi: yeah I keep thinking about doing that, but.. just never do
18:14:46 <gmann> otherwise it was always an average of 3-4 days to merge anything
18:14:56 <fungi> even if it means forcing it back into the gate pipeline after a failure
18:15:09 <dansmith> gmann: I'm still rechecking this to try to clean up our gate: https://review.opendev.org/c/openstack/nova/+/889992
18:15:15 <dansmith> and it keeps failing :)
18:15:47 <gmann> it will also go in today :)
18:15:55 <dansmith> promise? :)
18:16:02 <gmann> heh
18:16:13 <dansmith> anyway, that's enough from me
18:16:20 <gmann> nova has a more varied set of configuration jobs so we never know
18:16:54 <gmann> nothing else from me either on the gate
18:17:39 <knikolla> noonedeadpunk: you mentioned some keystone regressions. do you have some links I can look at and bring up to tomorrow's keystone meeting?
18:18:41 <noonedeadpunk> knikolla: one was https://bugs.launchpad.net/openstack-ansible/+bug/2028809
18:19:19 <noonedeadpunk> another wasn't filed as a bug, but all tokens are invalidated right after the upgrade, which requires the cache to be wiped (I assume)
18:20:12 <noonedeadpunk> the second was caused by https://opendev.org/openstack/keystone/commit/f6a0cce4409232d8ade69b7773dbabcf4c53ec0f
18:21:23 <noonedeadpunk> the first one is the nastiest, as you never know who had a password >54 symbols, so seemingly random ones are invalidated
18:21:29 <knikolla> the added field in the token? https://bugs.launchpad.net/keystone/+bug/2029134
18:21:43 <noonedeadpunk> ah, yes
18:22:21 <knikolla> thanks for bringing them up.
18:22:25 <knikolla> moving on :)
18:22:37 <knikolla> #topic Unmaintained status replaces Extended Maintenance
18:22:51 <knikolla> #link https://review.opendev.org/c/openstack/governance/+/888771
18:22:52 <knikolla> I did a manor update to bring the proposal inline, please take some time this week to review.
18:22:53 <knikolla> Hopefully this is the last iteration :)
18:22:56 <knikolla> minor*
18:23:17 <rosmaita> "manor" sounds more elegant
18:23:45 <gmann> ok, I have not gotten a chance to look into that as my Tuesday is meetings till noon
18:23:52 <gmann> I will check after meeting
18:23:55 <knikolla> that would make it unaffordable :)
18:24:17 <slaweq> 😀
18:24:20 <knikolla> the entirety of the update is the addition of the line "The PTL or Unmaintained branch liaison are allowed to delete an Unmaintained branch early, before its scheduled branch deletion."
18:25:21 <dansmith> gmann: you too? idk what it is about tuesdays but people love to schedule meetings on it
18:25:53 <dansmith> gmann: that nova n-v patch just failed again.. I dunno which timezone "today" was in for you :D
18:26:06 <gmann> yeah, my total meetings (upstream + downstream) go up to 8-9, half of them in the morning and half in the evening :)
18:26:31 <gmann> dansmith: :),
18:26:53 <gmann> I take my word back
18:27:00 <dansmith> :P
18:27:26 <JayF> I know I was a holdout on the old proposal, I've already +1'd this one
18:27:39 <JayF> so it's extremely likely we've hit consensus; if folks can make time to vote soon we can get cinder off the hook
18:28:03 <knikolla> thanks slaweq and JayF
18:28:05 <gmann> yeah, we are already late on this
18:28:15 <gmann> will check right after meeting
18:28:49 <knikolla> we can move on to the next topic then
18:28:52 <knikolla> #topic User survey question updates by Aug 18
18:28:57 <knikolla> #link https://lists.openstack.org/pipermail/openstack-discuss/2023-July/034589.html
18:29:06 <knikolla> If we want to do any changes to the user survey for 2024, the deadline for proposing question changes is Friday, August 18.
18:29:10 <knikolla> Please see the above linked email for more information.
18:29:22 <gmann> you mean TC questions only, right?
18:29:32 <noonedeadpunk> knikolla: wanna add there whether ppl are aware of the 54-symbol password limit? :)
18:29:35 <knikolla> That applies for both questions from the TC and from project teams
18:29:42 <dansmith> noonedeadpunk: hah
18:30:04 <JayF> Ironic is working on user survey questions in an etherpad here, suggestions/comments welcome if anyone wants to take a look: https://etherpad.opendev.org/p/ironic-user-survey-questions-2023
18:30:23 <JayF> I'm happy to take a similar look at TC questions; I'm not sure I have any to propose, but I will think about it.
18:30:30 <fungi> yeah, i mean, if you see things that are project-specific but need updating (projects missing from some lists, for example) please call those out
18:30:31 <gmann> we updated the TC questions last time after doing the survey analysis, based on what could be useful for our analysis
18:30:52 <rosmaita> iirc, Helena or Allison sent a link to a list of the questions, anyone have that link?
18:31:00 <fungi> for example, i noticed the list of "what openstack services do you install" has a few missing
18:31:06 <gmann> but did we do an analysis for the last survey after we changed the questions?
18:31:43 <gmann> last analysis I find is 2021 #link https://governance.openstack.org/tc/user_survey/analysis-04-2022.html
18:31:52 <gmann> which I think is before we updated the questions
18:32:12 <fungi> #link https://lists.openstack.org/pipermail/openstack-discuss/2023-August/034596.html
18:32:23 <fungi> that has the link to the spreadsheet to take notes/feedback in
18:32:31 <rosmaita> thanks!
18:32:58 <fungi> but also you can reply on the ml if you prefer
18:33:16 <fungi> which may be more useful if there's something you want to discuss about it rather than specific feedback you have
18:33:46 <gmann> JayF: this is what we updated for TC question for 2023 survey #link https://etherpad.opendev.org/p/tc-2023-1-ptg#L104
18:34:45 <JayF> ack, ty
18:34:47 <gmann> but my point is that we should do the survey analysis, otherwise we are not looking into the responses/feedback at all
18:35:44 <knikolla> That's a fair point.
18:35:59 <slaweq> @gmann you mean 2023 survey analysis?
18:36:08 <gmann> updating a few more questions if we need to is good. also, one thing to note is that adding new questions has some cap; we can remove/update the old ones first if needed
18:36:27 <gmann> slaweq: the last one we did was for 2021 only. I think the 2022 survey feedback came out after that
18:36:34 <slaweq> Ok
18:36:53 <gmann> slaweq: the 2023 feedback is not out yet; it is due at the end of Aug
18:37:19 <slaweq> Ahh, ok
18:37:27 <gmann> Aug 23rd is the deadline to fill out the 2023 survey, so it will be available after that
18:37:38 <gmann> that has our latest updated questions we discussed in PT
18:37:40 <gmann> PTG
18:37:47 <slaweq> So I meant 2022, the last one which just finished recently, if I'm not mistaken
18:37:56 <gmann> yeah
18:38:04 <slaweq> And I agree that we should do analysis of it
18:38:41 <gmann> As per the past analyses, there was some important information from the TC perspective
18:38:43 <slaweq> I can help with that 😃
18:39:21 <knikolla> slaweq: I can forward you the excel file with 2022 responses.
18:39:33 <slaweq> @knikolla thx
18:39:34 <gmann> slaweq: thanks
18:39:42 <knikolla> as I think I may have it.
18:39:55 <fungi> is there an nda associated with that data?
18:40:36 <knikolla> https://lists.openstack.org/pipermail/openstack-discuss/2022-October/030843.html
18:41:12 <slaweq> Thx
18:41:12 <fungi> okay, so that's the nda-free version. cool
18:41:24 <gmann> slaweq: this can be helpful too https://www.openstack.org/analytics
18:41:59 <slaweq> @gmann thx, I know that page already
18:42:06 <gmann> cool
18:43:11 <knikolla> We can touch base next week again and see if we have any ideas or insights that we can shape up to be updates to questions.
18:43:20 <knikolla> anything else on the topic?
18:44:02 <gmann> slaweq: I found the email from aprice including the xls, forwarded the email to you
18:44:23 <slaweq> Thx @gmann
18:45:22 <gmann> slaweq: but that is super unreadable :) and jungleboyj knows the trick to read/convert it into a better format
18:45:54 <gmann> he is the one who did the two user survey analyses and posted them on the TC page
18:46:01 <slaweq> Ok. I will ping him if I need any help there 😃
18:46:09 <gmann> ++
18:46:25 <fungi> if the xls version is not the redacted/anonymized one, you may also need to agree to an nda
18:46:57 <knikolla> the xls only has aggregate project usage data
18:47:15 <fungi> oh, so same as the csv version you linked to above
18:48:09 <gmann> humm, it should have TC questions also
18:50:29 <gmann> it's there with TC* columns, for example 'TCOpenStackVersion', 'TCStableBranches'....
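A minimal sketch, assuming the export is a single-sheet .xlsx with one header row (which may not match the actual spreadsheet layout), of pulling just the TC* columns out into something more readable; the filenames are placeholders.

    # tc_survey_columns.py - extract the TC* question columns from the survey export
    # (illustrative; needs pandas + openpyxl, and the filenames below are placeholders)
    import pandas as pd

    df = pd.read_excel("user_survey_export.xlsx")
    tc_cols = [c for c in df.columns if str(c).startswith("TC")]
    # e.g. TCOpenStackVersion, TCStableBranches, ...
    tc_df = df[tc_cols]
    print(tc_df.head())
    tc_df.to_csv("tc_questions.csv", index=False)  # write a more readable subset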
18:51:45 <knikolla> anything else on the topic?
18:52:36 <knikolla> #topic Reviews and Open Discussion
18:52:40 <fungi> don't forget, quarterly informal call with the openstack community and openinfra board tomorrow at 18:00 utc (23 hours from now), put your conversation starters in the pad if you have any:
18:52:43 <fungi> #link https://etherpad.opendev.org/p/2023-08-board-openstack-sync August 2023 OpenInfra Board Sync with OpenStack
18:52:43 <knikolla> #link https://review.opendev.org/q/project:openstack/governance+status:open
18:52:45 <fungi> call details are in that pad too
18:53:06 <knikolla> thanks fungi
18:53:09 <knikolla> tomorrow, 18:00 UTC
18:53:22 <fungi> yes
18:53:33 <knikolla> Sync up call with OpenInfra Board.
18:53:58 <knikolla> tc-members please try to make it if you can
18:55:14 <knikolla> alright, thanks all!
18:55:19 <knikolla> #endmeeting