16:00:09 #startmeeting tc
16:00:09 Meeting started Wed Jan 18 16:00:09 2023 UTC and is due to finish in 60 minutes. The chair is gmann. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:00:09 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:00:09 The meeting name has been set to 'tc'
16:00:11 #topic Roll call
16:00:15 o/
16:00:19 o/
16:00:23 o/
16:00:24 o/
16:00:33 o/
16:00:56 o/
16:00:59 o/
16:01:41 let's start
16:01:43 #link https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee#Next_Meeting
16:01:50 agenda for today ^^
16:01:59 #topic Follow up on past action items
16:02:04 gmann to send email to PTLs on openstack-discuss about checking PyPi maintainers list for their projects
16:02:34 not done yet as I am waiting for the automation here. but we will discuss it in separate topics
16:03:05 #action gmann to send email to PTLs on openstack-discuss about checking PyPi maintainers list for their projects
16:03:06 o/
16:03:07 continuing this
16:03:15 #topic Gate health check
16:03:20 any news on gate?
16:03:26 yeah,
16:03:34 things are not good, at least in nova land
16:03:41 we've been struggling to merge patches for the last week
16:04:06 I've seen UCA being out of sync, but no idea if that's used by devstack
16:04:15 biggest problem that affects others is the ceph-multistore job (and sometimes others I think) goes off the rails and one tempest worker swells to 1G ram before being OOMed
16:04:50 nova also has some functional test instabilities that have sprung up, but shouldn't be affecting other projects
16:05:27 it was pointed out this morning that fedora 35 went eol, which has caused some disruption. also dnspython apparently released a new version which doesn't work with eventlet and its maintainers have no interest in addressing that problem (thankfully upper constraints is shielding most projects from this, but it was noticed yesterday in swift's docs jobs due to how requirements were being
16:05:28 it's kind of similar in neutron land - we have many random failures recently
16:05:29 installed)
16:05:31 yeah, same for many other projects too. I saw some oslo.db wsrep_sync_wait issue also #link https://review.opendev.org/c/openstack/oslo.db/+/870723
16:06:11 I kind of wonder what has changed to consume ram. As py310 by itself has dropped ram usage a bit
16:06:26 there was also the oslo messaging rpc warning that flooded the logs, and might be contributing to instability, but we've been unable to merge that patch because of all the other problems :/
16:06:29 for a week.
16:06:35 and I saw a bunch of failures due to an "identity endpoint not found" error in devstack, it was all on last Friday (13.01)
16:06:43 but this is fine now
16:06:58 This is somewhat (pun not intended) ironic, because we're at a decent spot in the Ironic gate for probably the first time in 2023.
16:07:25 slaweq: the "identity endpoint not found" was an outage in ovh's swift service, we took them out of the log upload list temporarily once it was identified, but they eventually resolved it
16:07:39 k
16:07:42 fungi ok, thx for info
16:07:44 slaweq: I've seen that as well
16:08:41 so yeah, sounds like the gate health is not good, but it also sounds like it's known and being worked, so that's good
16:08:45 though maybe there was a different devstack failure that had a similar error message. now that i think about it you wouldn't have seen that message in the normal job logs because they weren't uploaded
16:09:05 yeah, let's see how it progresses in this and the coming week.
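
A quick aside on the "upper constraints is shielding most projects" remark above: the openstack/requirements repository pins every dependency to an exact version in upper-constraints.txt, so a newly released dnspython is not installed in constrained jobs until that pin is deliberately bumped. A minimal illustrative excerpt follows; the pinned version shown is an assumption, not the actual entry from that file.

    # upper-constraints.txt (illustrative excerpt): exact pins applied to
    # devstack and tox jobs through pip's -c/--constraint option
    dnspython===2.2.1
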
16:09:06 fungi: the one I was seeing was devstack and our own identity endpoint
16:09:14 way before log upload
16:09:21 ah, okay, so separate issue
16:09:57 https://zuul.opendev.org/t/openstack/build/d35164323178458886fc0fcb511ede20
16:10:08 this is an example of what I was talking about
16:10:17 so it's, as dansmith said
16:10:31 slaweq: yep, that's the one
16:11:40 ok, anyways let's move on and keep monitoring the issues and fixes. the situation is not good as we are getting close to m-3
16:11:54 seems to be getting a 404 on querying keystone
16:11:55 Ok/
16:12:29 #topic TC 2023.1 tracker status checks
16:12:30 Bah o/
16:12:37 #link https://etherpad.opendev.org/p/tc-2023.1-tracker
16:13:19 any updates on any items ?
16:14:22 seems like 9 of the items are pending/not started and we do not have much time left in this cycle
16:14:43 request everyone to check their assigned items. thanks
16:15:05 will-do
16:15:18 rosmaita: I think no progress on SIG i18n things?
16:15:19 thanks
16:15:37 rosmaita: FYI, I have added the followup in the Feb sync up call with the Board #link https://etherpad.opendev.org/p/2023-02-board-openstack-sync
16:15:39 I will get to it finally I hope
16:15:46 thanks
16:16:10 gmann: nothing new to report; the board did agree to purchase the weblate hosting, but there was some kind of problem processing the payment
16:16:28 k
16:16:42 ooh, embarrassing.. card rejected after you already ate dinner
16:16:44 more than the payment I am more concerned about finding the maintainer
16:17:03 agree, that is a big issue
16:17:19 but i think once there is a "there" there, it may be easier to find a volunteer
16:17:26 well, zuul errors decreased a bit from last time I checked and are mostly related to EM branches
16:17:29 ian and seongsoo are optinistic
16:17:35 *optimistic
16:17:48 cool, that will be great
16:18:04 noonedeadpunk: nice.
16:18:14 last word i saw was that payment got initiated on 2022-12-29, i hadn't heard about payment processing issues. is the foundation staff/wes aware?
16:18:14 dansmith: dinner was not yet served
16:18:22 anyways let's discuss i18n things in the syncup call
16:18:48 fungi: i was copied on something to wes and jonathan from weblate
16:19:03 got it, thanks. i'll try to check in on that as well
16:19:04 rosmaita: okay I will revise my joke
16:19:04 i will follow up in the email thread with everyone
16:19:15 dansmith: :)
16:19:18 thanks
16:19:29 #topic Cleanup of PyPI maintainer list for OpenStack Projects
16:19:39 there is an update on 'xstatic-font-awesome'
16:20:26 horizon team discussed it in the last two meetings, one just before our meeting, and agreed to hand over the maintenance of this repo to external maintainers
16:20:30 #link https://meetings.opendev.org/meetings/horizon/2023/horizon.2023-01-18-15.00.log.html#l-102
16:20:58 in that case i guess the next step is to start the retirement process for the fork of it in opendev
16:21:13 PyPI external maintainers find the opendev/openstack process to be too heavy and slow for them
16:21:43 as a next step the Horizon PTL will start the retirement process for this repo and also do the audit for the other xstatic repos
16:22:09 and horizon can use it as any other deps maintained outside of openstack
16:22:11 gmann: not surprising since they didn't even publish sources for their latest package...
16:22:43 yeah, they should fix the repository link on the PyPI page also
16:23:11 or rename the package 'xstatic-font-not-so-awesome'
16:23:16 clarkb: i see an sdist for it
16:23:26 rosmaita: :)
16:23:27 or did you mean something else by publish sources
16:23:28 fungi: ya but that isn't the xstatic source aiui
16:23:40 fungi: there is xstatic tooling and stuff that produces that sdist
16:23:44 anyways that is the way forward for xstatic-font-awesome
16:24:09 now on the other repo cleanup and audit process, which we discussed in the previous meeting also
16:24:20 I'm almost done automating some python code that consults the list of all projects/deliverables from governance and checks the maintainers on pypi
16:24:34 knikolla[m]: perfect
16:25:31 if you can hold off on sending the email until tomorrow, I can give you a list of projects that need cleanup
16:26:08 knikolla[m]: sure, no hurry on the email.
16:26:55 Plan is the same as discussed last week: 1. once knikolla[m] prepares the list of projects that need cleanup, ask PTLs to audit whether there is any external active maintainer in such repos 2. at the PTG check the audit status 3. decide on cleanup timing
16:27:01 thanks knikolla[m] for the updates and automation
16:27:14 we will continue this topic in the next meeting too.
16:27:27 anything else to discuss on this today?
16:28:27 #topic Less Active projects status:
16:28:30 first is Zaqar
16:28:37 Gate is broken due to MongoDB not being present in ubuntu 22.04
16:28:54 that's because mongodb changed their licensing and is no longer open source
16:29:10 release team is concerned about the Zaqar gate status and so about considering it for the 2023.1 cycle release
16:29:16 did zaqar not ever switch to a different database?
16:29:53 I reached out to the PTL about this issue and there is some progress on a gate fix by the PTL, skipping the MongoDB things for now
16:30:16 #link https://review.opendev.org/c/openstack/zaqar/+/857924/
16:30:34 ah, okay so it can be used with open source databases too
16:31:26 clarkb: zaqar team mentioned the fix is merged in mongodb but the package is not included in Ubuntu jammy #link https://review.opendev.org/c/openstack/zaqar/+/857924/comments/a0d5d45e_3008683c
16:31:33 yeah hope so
16:32:19 but not sure when they will be able to do that, as it seems the PTL is the only active maintainer
16:32:28 gmann: that's via mongodb repos. It won't add mongodb to ubuntu and mongodb is no longer open source
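
Below is a minimal sketch of the kind of audit script knikolla[m] describes at 16:24:20 above: pull the deliverable list from the governance repository and look each name up on PyPI. The governance projects.yaml URL and layout are taken from openstack/governance, but the assumption that deliverable names match PyPI package names, and the library choices, are illustrative only; the real script may differ. Note that PyPI's JSON API does not expose the list of maintainer accounts, so that part of the audit still needs each project's web page or other tooling.

    # Sketch only: cross-check OpenStack governance deliverables against PyPI.
    # Assumes deliverable names equal PyPI package names, which is usually but
    # not always true.
    import requests
    import yaml

    GOVERNANCE_URL = ("https://opendev.org/openstack/governance/raw/branch/"
                      "master/reference/projects.yaml")

    def governance_deliverables():
        """Return all deliverable names defined in reference/projects.yaml."""
        teams = yaml.safe_load(requests.get(GOVERNANCE_URL, timeout=30).text)
        names = set()
        for team in teams.values():
            names.update(team.get("deliverables", {}))
        return sorted(names)

    def pypi_info(name):
        """Return PyPI JSON metadata for a package, or None if not published."""
        resp = requests.get("https://pypi.org/pypi/%s/json" % name, timeout=30)
        return resp.json()["info"] if resp.status_code == 200 else None

    if __name__ == "__main__":
        for name in governance_deliverables():
            info = pypi_info(name)
            if info is None:
                continue  # not on PyPI, nothing to audit
            # The JSON API has no field listing the accounts with upload
            # rights, so a real audit still has to check each project page
            # to confirm only the expected CI account is a maintainer.
            print("%s %s" % (name, info["version"]))
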
16:32:48 also there is a proposal to mark zaqar as inactive #link https://review.opendev.org/c/openstack/governance/+/870098
16:32:54 clarkb: yeah
16:34:02 as we are at a critical time where the release team needs to decide the final list of deliverables they need to handle for this cycle, we should define some deadline for the zaqar team, otherwise we can mark it as Inactive and the release team can skip the release
16:34:26 though we already passed m-2 which is the deadline to decide on such projects
16:34:29 technically the decision was supposed to have been made by january 6
16:34:33 yes
16:34:45 eventually, I've just spotted that barbican-ui hasn't had any patch merged for the last 2 years
16:35:03 it's release-independent though
16:35:19 yes, and patches are still merging to barbican more generally
16:35:25 ok, we might have many repos in that category but as long as they are up and the gate is green it should be fine
16:35:37 and yes it depends on the release model they are in
16:36:28 well, gate can't be green due to https://review.opendev.org/c/openstack/barbican-ui/+/845523 not being merged
16:36:29 for the zaqar decision, I will propose 25 Jan, which is the next TC meeting, to check and decide.
16:36:52 noonedeadpunk: ohk
16:37:10 Does that put hardship on the releases team? We're already past the deadline that we pushed backward on their request?
16:38:05 the only thing I am considering here is that the PTL is trying to fix it now, so let's wait for a week. it is kind of an exception, but we also did not track it early and mark it inactive before m-2
16:38:33 but I agree for the future we should make some process/audit for such projects well in advance and decide by m-2 of the release
16:39:27 I will talk to the release team also about whether 25th Jan is ok for them as an exception for this project.
16:39:38 ack
16:40:11 #action gmann to reach out to release team to check about Zaqar gate fix by 25th Jan as deadline to consider it for this cycle release
16:40:26 I think the bare minimum would be to pass CI for Zaqar, right?
16:40:26 cool, moving to the next project
16:40:33 noonedeadpunk: yeah
16:40:52 I will try to check on where they are and help out if possible
16:41:04 noonedeadpunk: that will be great, thanks
16:41:49 noonedeadpunk: this is the patch where discussion and fixes are being tried #link https://review.opendev.org/c/openstack/zaqar/+/857924
16:42:13 Mistral
16:42:25 all other repos of mistral are in good shape except python-mistralclient
16:42:33 team is working on that
16:42:52 beta release for the Mistral repos (other than the client) is good too
16:43:26 * slaweq needs to drop now, rechecks stats are updated, there is nothing urgent in that topic to discuss, see You o/
16:43:28 same thing, I will check with the release team about the python-mistralclient deadline but at least the mistral deliverables will be good to go
16:43:31 aha, thanks for the link
16:43:34 slaweq: ack, thanks
16:43:51 Adjutant
16:44:35 all good here, gate fixed, beta version released, it is removed from the 'inactive project' list too #link https://review.opendev.org/c/openstack/governance/+/869665
16:44:51 i will remove it from the agenda of this meeting
16:45:33 that is all from less active projects, if anyone encounters more feel free to add them to the meeting agenda
16:45:53 #topic Recurring tasks check
16:45:58 #link https://etherpad.opendev.org/p/recheck-weekly-summary
16:46:05 slaweq updated this week's data in the etherpad
16:46:42 kuryr and cinder seem to have more bare rechecks
16:47:32 rest all good. this week the gate has been unstable so let's continue checking it next week
16:47:57 #topic Open Reviews
16:48:00 #link https://review.opendev.org/q/projects:openstack/governance+is:open
16:48:25 open reviews are all ok, the zaqar one we discussed, the rest are waiting for project-config dependencies.
16:48:40 that is all from today's agenda. we have ~12 min, anything else to discuss?
16:49:44 if nothing else let's close it. thanks everyone for joining.
16:49:48 #endmeeting