Wednesday, 2024-04-03

fungiand it merged00:01
fungiso we should be all set00:01
ttxHappy release day!06:29
ttxI have a doctor's appointment this morning but should be back around 10:00 UTC06:29
fricklerso, what are we going to do with docs? I've approved https://review.opendev.org/c/openstack/openstack-manuals/+/914602 now but the actual release patch isn't mergeable in its current state. sadly next to no feedback from tc-members either06:38
hberaudo/07:07
elodilles~o~07:51
fricklerso I updated https://review.opendev.org/c/openstack/openstack-manuals/+/914603 now to add dalmatian07:52
elodillesfrickler: thanks. let's see how it renders and then i'm OK with it07:57
elodillesthe semaphore patch has merged, thanks o/07:58
hberaudeverything looks operational on the python infra https://status.python.org/07:59
elodillesrelease-team: don't forget to not approve any stable release patch today. we have to avoid any interference 08:00
hberaudack08:00
opendevreviewElod Illes proposed openstack/releases master: Add release note links for 2024.1 Caracal #2  https://review.opendev.org/c/openstack/releases/+/91494208:15
elodillesplease review & merge this when the jobs have finished ^^^08:17
elodilleshberaud frickler : jobs have finished, please review: https://review.opendev.org/c/openstack/releases/+/91494208:47
hberaudack08:47
frickler+409:06
elodilles\o/09:06
elodillesthanks09:06
elodillesi'll disappear now to grab some food, but be back before 10:00 UTC09:07
opendevreviewMerged openstack/releases master: Add release note links for 2024.1 Caracal #2  https://review.opendev.org/c/openstack/releases/+/91494209:15
fricklerhmm, seems the foundation has already published the release, we are done? see https://www.openstack.org/software/openstack-caracal and also the landing page09:30
fricklermaybe someone should update the map to drop things that no longer exist or aren't part of the release https://object-storage-ca-ymq-1.vexxhost.net/swift/v1/6e4619c416ff4bd19e1c087f27a43eea/www-assets-prod/openstack-map-v20210201-01-1.pdf09:35
ttxhmm09:36
ttxThat page should not have been linked to until release time09:36
ttxIf you click on latest release on https://www.openstack.org/software/ it still (correctly) links to Bobcat09:37
ttxBut the openstack.org main page seems to have jumped the gun09:37
ttxI'll try to get it fixed09:39
ttxfrickler: the map is driven from changes proposed to https://opendev.org/openinfra/openstack-map09:40
ttx2023.05.01 version is the current release09:41
ttxVersion linked from https://www.openstack.org/software/ is current09:41
ttxfrickler: Where did you get the link to that old version?09:42
fricklerttx: directly on openstack.org, very big with video link and all09:43
frickleroh, you mean the map?09:44
elodillesyeah, on the main page it says: Latest Release: OpenStack Caracal --> https://www.openstack.org/ if you scroll down a bit09:44
fricklerttx: that's on the caracal page09:44
ttxOK I found it09:44
ttxShould be fixed to link to latest too09:45
fricklerand the 2023.05 map is still outdated: ec2-api, tripleo and chef are all gone, and some more projects are inactive or not released09:45
ttxfrickler: yeah nobody proposed those updates to openstack-map apparently09:46
ttxmaybe we should have a release process step to check for that after the milestone-2 membership freeze09:47
ttxIf someone can propose changes I can review them and get staff to update the PDF accordingly09:49
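A minimal sketch of the milestone-2 check ttx suggests: compare what governance lists against what is on the map, so dead entries like ec2-api get flagged for an openstack-map change. The governance URL follows the usual opendev raw-file pattern; the map file name and its schema are assumptions.

```python
# Sketch: flag openstack-map entries whose project no longer exists in
# governance. The map file name and schema are assumptions; adjust to the
# real layout of the openinfra/openstack-map repo.
import requests
import yaml

GOVERNANCE = ("https://opendev.org/openstack/governance/raw/branch/master/"
              "reference/projects.yaml")
MAP_FILE = "openstack-map.yaml"  # hypothetical path in a local checkout

governed = {name.lower() for name in yaml.safe_load(
    requests.get(GOVERNANCE, timeout=30).text)}

with open(MAP_FILE) as f:
    mapped = {entry["name"].lower() for entry in yaml.safe_load(f)}

for name in sorted(mapped - governed):
    print(f"on the map but not in governance (candidate for removal): {name}")
```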
* hberaud school run, bbiab09:51
frickleryeah, I'll look at the map repo09:55
fricklerseems some things are already in there, like tripleo removal09:56
fungifrickler: https://www.openstack.org/software/openstack-caracal was published early for double-checking (like every year), just not linked from other pages yet09:59
ttxAlright, openstack.org is back to Bobcat10:00
fungii suppose it's a questionable workflow choice, but up to the folks making that content i suppose10:00
ttxlooks like there was an error staging the page earlier10:00
ttxfrickler: thanks for the flag!10:00
elodillesmeanwhile, i think that things look fine to proceed with the Final release patch. https://status.python.org/ looks green, zuul is OK too, right fungi?10:01
fungioh, i see, the problem is the main page of the site in that case10:01
ttxyeah, that specific bit should definitely not have been enabled10:01
fungielodilles: checking10:02
elodilles~o~10:02
ttxyeah map 2023.05.01 already has tripleo removed normally10:03
ttxAlso feels like Skyline could be added? Not sure what status it has these days10:04
fungizuul status graphs look okay10:05
elodillesfungi: ACK, thanks o/10:05
fricklerttx: skyline is still listed as "emerging". but I've been thinking about that too, to fill the space emptied by EC2API. not sure how to lay that out, though10:05
fungittx: skyline is still an "emerging technology" according to the tc10:05
elodillesrelease-team: let's review the release patch: https://review.opendev.org/c/openstack/releases/+/91476410:05
fricklerdoesn't mean it may not be listed, though, IMO10:05
ttxfrickler: our designer will adjust width if needed10:06
fungi"emerging" status was a reason not to merge sunbeam's release highlights, but i suppose that was more because they're not part of the release officially?10:07
ttxyeah... skyline deliverables are part of release so I think it would make sense to add them10:08
fungioh, nevermind, sunbeam isn't emerging, just not released with the other deliverables10:09
fricklerhttps://review.opendev.org/c/openinfra/openstack-map/+/914948 not sure if it will work that way though, maybe better just mark inactive projects as "don't publish on map"?10:13
frickleralso we should probably add that update to the TC inactive workflow10:13
frickleralso interesting question whether sunbeam should still be on that map then @jamespage?10:15
elodillesgentle reminder that we are 15 minutes behind schedule, this should be reviewed as soon as possible: https://review.opendev.org/c/openstack/releases/+/91476410:15
fungii guess the map is official openstack projects grouped by function, not necessarily limited to release-managed projects10:15
fricklerare the navigator pages also fed from that repo? https://www.openstack.org/software/project-navigator/deployment-tools10:16
fricklerelodilles: ok, if nobody else will go ahead, I'll just approve it, then?10:17
elodillesfrickler: +110:18
jamespagefrickler: sunbeam should be in that map - we're still emerging and likely to be somewhat release independent in terms of what and when we release - which made it a bit awkward to fit into the cycle highlights10:18
jamespagebut we'll figure that out10:18
frickleroh, the CI for the map doesn't actually build one, /me sad10:21
elodilles:/10:21
elodilles(btw, now that we are talking about https://www.openstack.org/ , there is a teeny-tiny issue i'd like to mention: under the 'OpenInfra Foundation Member Spotlight', when Ericsson is randomly shown, its link is broken (404) as it points to 'https://www.openstack.org/www.ericsson.com'. does anyone know where this can be fixed?)10:30
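The broken Ericsson link looks like a classic relative-URL bug: a bare domain stored without a scheme gets resolved against openstack.org. A small illustration; the diagnosis is a guess, only the observed 404 URL comes from the log.

```python
# Sketch: how the 404 likely happens -- a bare domain without a scheme is
# resolved as a path relative to openstack.org (the fix is to store the
# full https:// URL). Only the broken URL itself comes from the log.
from urllib.parse import urljoin

print(urljoin("https://www.openstack.org/", "www.ericsson.com"))
# -> https://www.openstack.org/www.ericsson.com  (the 404 seen on the page)
print(urljoin("https://www.openstack.org/", "https://www.ericsson.com"))
# -> https://www.ericsson.com
```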
fungielodilles: i can bring it up with the folks who manage the site in a few hours once they're awake10:32
elodillesfungi: ACK, thanks in advance!10:33
fungiand yeah, the map isn't auto-generated because it requires some manual layout work to produce, from what i understand10:33
fricklergate almost done10:49
hberaud\o/10:49
opendevreviewMerged openstack/releases master: Caracal Final  https://review.opendev.org/c/openstack/releases/+/91476410:49
elodillesthere it is ^^^ \o/10:49
elodillesnow comes the post-release jobs10:50
hberaudIIRC we should now wait for the end of the post-release job before launching the check of missing tarballs10:50
hberauds/jobs/10:51
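The missing-tarballs check hberaud refers to is normally driven by the tooling in openstack/releases; a stripped-down sketch of the same idea, probing tarballs.openstack.org for expected sdists (the deliverable/version pairs below are illustrative only).

```python
# Sketch: check that the sdist for each released deliverable reached
# tarballs.openstack.org. Deliverable names and versions are examples only.
import requests

EXPECTED = {
    "nova": "29.0.0",       # from the Caracal final patch
    "glance": "28.0.0",     # illustrative
}

for name, version in EXPECTED.items():
    url = f"https://tarballs.openstack.org/{name}/{name}-{version}.tar.gz"
    resp = requests.head(url, timeout=30, allow_redirects=True)
    status = "ok" if resp.status_code == 200 else f"MISSING ({resp.status_code})"
    print(f"{url}: {status}")
```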
fricklereverybody says post-release, why is the pipeline named release-post? that got me confused multiple times already10:51
hberaudI don't have the historical naming decision 10:52
fungiit's the release-specific version of the "post" pipeline (so its priority can be raised above other pipelines)10:53
* hberaud note this definition10:53
fungiit runs after the gate pipeline (so post-gate), not really after a release (which post-release would imply)10:54
fungibut the rationalization there is weak, pipeline names are arbitrary strings anyway10:55
elodillesfrickler: you are right, the queue name is release-post o:) i guess we just write it 'post-release' as that is how it would read grammatically. though it's better to emphasize that it is a 'release' job, so i'm OK with the naming o:)10:56
elodillesfungi: +110:56
fricklerlooks like tarballs should be ready? 64 refs now in release pipeline11:30
fricklerzuul seems to get stuck when trying to show builds or buildsets though :-/11:31
hberaudIIRC we have to wait for end of these jobs11:32
hberaudnot sure all tarballs are already there at this point11:32
hberaudand AFAICS a couple jobs are just queued and not yet started11:33
elodilleshmmm, zuul shows that 'openstack-upload-github-mirror' jobs are running, but if i look into the jobs it shows e.g. "Build ID 900ba831db5d455d8b0b47a430fb168c not found  --- END OF STREAM ---"11:35
elodillesand the rest of the jobs haven't even started. strange.11:35
fungiseems like the pipeline status update is waiting for the management events to process11:35
fungicompleted builds open fine, but the in-progress builds aren't connecting to a log stream. yeah, that probably means they either haven't really started to run yet, or already finished but the result hasn't been popped off the queue yet11:37
fungiif i fabricate a build result url for one of them, it only has the start data recorded so far11:39
hberaudI haven't seen significant updates for ~20 min11:40
elodilleswhere the 'release-openstack-python' job started, the situation is the same: opening the job shows the same "Build ID 9ad86b1f9c654d198d609fd7b2df1a34 not found --- END OF STREAM ---"11:40
elodillesyes it seems it's not updating11:41
fungi"5 management events"11:41
fungii don't think it updates the queue state until it processes those11:41
funginode requests and in-use nodes definitely picked up at the same time the tags got pushed11:43
fricklerlet's hope that the sandbox bug reproducer doesn't have any global effect11:45
fungi2024-04-03 11:07:43,133 DEBUG zuul.Pipeline.openstack.release: [e: 561ba4a3f77049a1b4b12b98d8292489] Build <Build 9ad86b1f9c654d198d609fd7b2df1a34 of release-openstack-python voting:True> started11:46
fungi2024-04-03 11:07:53,278 DEBUG zuul.Scheduler: [e: 561ba4a3f77049a1b4b12b98d8292489] [build: 9ad86b1f9c654d198d609fd7b2df1a34] Processing result event <BuildStatusEvent build=9ad86b1f9c654d198d609fd7b2df1a34 job=c2702684fb5c4737b3ca095b62159032>11:46
fungithat's the adjutant release-openstack-python build that's still showing as in-progress11:47
fungijust the first example i searched the logs for11:48
fungiso yeah, it seems that result events are just queuing up, but also that build only ran 10 seconds11:49
* hberaud short dad taxi11:50
fungii think that project might have a problem, but perhaps unrelated to the overall situation11:50
fungihttps://pypi.org/project/barbican/ shows barbican did get released, but its result also hasn't been processed11:50
hberaudzaqar just showed some updates on my side11:51
fungioh, it's updating again11:51
fungithe management events count fell to 011:52
fungiso yeah, seems we were waiting for a pipeline reconfiguration, possibly due to something merging that changed job configs?11:52
fungias the results count falls, more of the statuses should get corrected11:53
fungifalse alarm on adjutant, it does seem to have completed successfully11:54
fungii forgot its name on pypi is different11:54
funginot sure why the started and result processing events were so close together in the debug log, the build itself took 2 minutes to complete11:55
fungiresults queue is caught back up now11:59
elodillesyepp, now the queue is shrinking12:00
elodillesso, fingers crossed12:00
* hberaud back12:08
hberaudwill have to do another dad taxi run in ~1h1512:08
elodilleshberaud: ACK12:08
hberaudWednesday is kids' day...12:09
elodilles:)12:09
elodillesalmost there: queue length is 4 :-o12:13
fricklernova upload seems to have failed https://zuul.opendev.org/t/openstack/build/567d434246344041a1a8a4f7a19fce0612:13
elodilles:S12:13
fungiyeah, looks like pypi barfed12:14
hberaud /o\12:14
fungii think we can reenqueue the tag, double-checking12:14
fricklersame for glance and possibly some others12:14
elodillesand some other, too: trove trove-dashboard skyline-console..12:14
elodillesetc :/12:14
fungiargh12:14
fungiterrible time for a pypi outage12:15
hberaudyes12:15
fungii guess we need to collect a complete list of those and then i can batch re-enqueue the corresponding tags12:15
elodilles12 recent ones: https://zuul.opendev.org/t/openstack/builds?job_name=release-openstack-python&result=POST_FAILURE&skip=012:15
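The same list can be pulled from Zuul's REST API rather than the web UI, which helps when collecting the full set of tags to re-release; a sketch against the builds endpoint that backs the page elodilles linked.

```python
# Sketch: collect recent failed release-openstack-python builds from the
# Zuul API so the affected deliverables/tags can be listed in one go.
import requests

API = "https://zuul.opendev.org/api/tenant/openstack/builds"
params = {"job_name": "release-openstack-python",
          "result": "POST_FAILURE",
          "limit": 50}

for build in requests.get(API, params=params, timeout=30).json():
    print(build["project"], build["ref"], build["end_time"])
```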
elodillesinterestingly the times match the time when the queue became unblocked12:17
elodilles(~11:50)12:17
hberaudwe surely hit a timeout...12:17
hberaudor something like that12:18
fungithose are start times12:18
hberaudindeed12:18
fungithat's when zuul's queue processing resumed and it started a lot of builds12:18
fungithe durations are all short enough that they shouldn't have encountered timeouts12:19
funginor do i see any evidence of job timeouts in the logs12:19
hberaudack12:19
fricklerso do we need to protect pypi uploads with a semaphore? maybe like max 5 or 10 in parallel?12:19
fungii would be surprised if we broke pypi12:19
hberaudAFAIK that's the first time we face such situation12:20
hberaudso IMO I don't think we need a semaphore12:20
fungihttps://status.python.org/ isn't indicating any problems12:20
fungibut it may only update when the admins find out something broke12:21
hberaudand I agree with fungi, I don't think we broke pypi, their infra seems robust12:21
fungilikely we just got unlucky with something else they've got going on12:21
fungii'll try to reenqueue one tag and see if it works a second time12:21
hberaudwfm12:22
fungiwait, no i won't12:22
fungihttps://pypi.org/project/nova/12:22
fungiwe're not going to be able to rerun these12:22
fungithe upload worked, but returned an error12:22
fungioh, actually it only uploaded the wheel and not the sdist12:23
fungihttps://pypi.org/project/nova/#files12:23
fungithis is going to be a real problem. pypi won't allow us to reupload any file with the same filename12:23
hberaudwoot...12:23
elodilles /o\12:24
fungiand we discarded the sdist and signatures when the build aborted due to the error12:24
hberaudcan't we remove existing artifacts manually and then reenqueue?12:24
fungiwe can remove them, but we can't reupload them. pypi blocks upload of any filename which ever previously existed, as a security measure12:25
hberaudI see12:26
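A quick way to see which artifacts actually made it up, given that PyPI will never accept the same filename twice, is its public JSON endpoint; a small sketch (the nova/29.0.0 pair is from the log, other deliverables would need their own versions).

```python
# Sketch: ask PyPI which files it already has for a release, to see whether
# the sdist, the wheel, or both made it up before the job errored out.
import requests

def uploaded_files(project: str, version: str) -> list[str]:
    url = f"https://pypi.org/pypi/{project}/json"
    data = requests.get(url, timeout=30).json()
    return [f["filename"] for f in data.get("releases", {}).get(version, [])]

print(uploaded_files("nova", "29.0.0"))   # expected here: wheel only, no sdist
```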
fricklerso we need new tags for these?12:26
fungihttps://pypi.org/project/trove/#files is in the same situation (wheel but no sdist)12:26
funginew tags would be the fastest solution, yes, bump them all by a patchset of .112:26
hberaudwfm12:27
fungiother options include hacking up the job to only upload an sdist, or manually building the missing artifacts and signatures and uploading them manually, all of which will take more time than we have12:27
hberaudall jobs are now finished so I think 12 is the final number for the failing series12:28
fungialso the .0 tags are going to get forever marked as erroneous in our consistency checker12:28
fungittx: ^ heads up in case you're not following closely12:28
elodillesso a PATCH bump for the 12 failed deliverables, using the same hash as was in the Final patch, right?12:28
hberaudas this is an error I think the best way is to bump the patchset12:28
fungielodilles: correct12:29
hberaudpatchset would remove ambiguity12:29
fungiso basically release nova 29.0.1 with the same commit that 29.0.0 pointed at12:30
fricklerpatchset = patch version or am I misunderstanding something?12:30
hberaudbugfix version12:30
fungiyes, sorry, patch level component of the version string12:30
fungimy fingers like to type patchset for obvious reasons12:30
hberaudlol12:31
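In openstack/releases terms, the fix is to append a release entry that reuses the previous hash with the patch component bumped. A rough sketch of that data change; the deliverable YAML layout shown matches the usual releases/projects structure but should be treated as an approximation, and in practice the change is proposed through Gerrit rather than scripted like this.

```python
# Sketch: append an x.y.(z+1) entry that reuses the hashes of the last
# (failed) x.y.z entry in a deliverables/<series>/<name>.yaml file.
# Note: yaml.safe_dump drops comments and reflows the file, so this only
# illustrates the data change, not the real workflow.
import yaml

def bump_patch(path: str) -> None:
    with open(path) as f:
        data = yaml.safe_load(f)

    last = data["releases"][-1]
    major, minor, patch = (int(x) for x in last["version"].split("."))
    data["releases"].append({
        "version": f"{major}.{minor}.{patch + 1}",
        "projects": [dict(p) for p in last["projects"]],  # same repos, same hashes
    })

    with open(path, "w") as f:
        yaml.safe_dump(data, f, default_flow_style=False, sort_keys=False)

bump_patch("deliverables/caracal/nova.yaml")
```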
fricklerelodilles: do you want to make that patch? or shall I?12:32
elodillesi'm doing it right now12:32
ttxI'm here, just having a hard time to follow12:33
fungittx: 12 of the deliverables had successful wheel uploads to pypi but then it errored back during sdist uploading12:34
ttxso we have partial uploads to PyPI and are forced to change the version number to do a new one12:34
fungisince pypi won't allow re-uploading of files that already exist, we can't reenqueue those tags12:34
fungiso, yes, fastest solution is x.y.1 versions of the 12 deliverables that were impacted12:34
ttxyeah that makes sense12:34
opendevreviewElod Illes proposed openstack/releases master: Re-release final RC for 12 failing deliverables  https://review.opendev.org/c/openstack/releases/+/91495812:35
elodillesplease review carefully ^^^12:35
fungiand live with the fact that the x.y.0 versions have missing artifacts12:35
fungiwe can also yank them on pypi if we want and/or remove them from the releases version list i guess, but that's less urgent12:36
fungimainly just hiding them so they cause less confusion12:36
ttxelodilles: patch looks good, happy to push W+1 if everyone is ok12:38
elodillesACK12:38
ttxare we ready to approve that one? previous jobs completed ?12:38
hberaudlgtm12:38
hberaudI think that yes we are ready12:39
fungiyes, release pipeline is empty12:39
hberaudprevious jobs completed12:39
ttxAlright here it comes12:39
hberaudthanks elodilles 12:40
elodillesnp12:40
fricklerfungi: seems the check pipeline has quite some backlog, do we want to move ^^ directly into gate for timing reasons?12:41
fungiyes, we can enqueue it to the gate12:41
fungido you want to do that, or shall i?12:42
fricklerfungi: if you have the command handy please go ahead12:42
fungidone12:44
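For reference, a sketch of what the manual promotion to the gate roughly looks like via zuul-client; the exact flags (and the operator auth that enqueueing requires) are assumptions from memory, so verify against `zuul-client enqueue --help` before relying on them.

```python
# Sketch only: flags and URL are assumptions from memory, and enqueueing
# requires operator credentials -- verify with `zuul-client enqueue --help`.
import subprocess

subprocess.run(
    [
        "zuul-client", "--zuul-url", "https://zuul.opendev.org",
        "enqueue",
        "--tenant", "openstack",
        "--pipeline", "gate",
        "--project", "openstack/releases",
        "--change", "914958,1",
    ],
    check=True,
)
```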
fricklerwhat about the manuals patch, do we want to proceed with that, too, or does that have to wait until the end really?12:44
fungii'll also dequeue it from check in order to avoid confusion12:44
fungi9 minutes eta until it merges12:45
fungifrickler: i think the main reason the docs update is held to the end is to avoid implying it's released too far in advance, and in case the release is aborted for some reason12:47
elodillesyepp. though it took some time last occasion, so we should start merging it before 13:00 UTC. maybe the re-release patch will be merged by then :/12:49
elodillesthough if we want to wait for pypi ACK, then we can wait a bit more until every package appears well on pypi12:50
fungior at least be sure one of them uploads and that we don't have some persistent problem affecting those 12 sdists12:51
elodillesfungi: +112:51
elodillesanyway, please review the patch in advance, so that we just need a +W to proceed if everything turns out to be fine12:52
elodillesbtw, the "openstack-tox-docs" job has the same situation when looking at the job logs: 'Build ID 189ef8a138e6499aa147081dd0c03d06 not found' :S12:54
elodilles(0 management events)12:54
fungiyou caught it between when the build ended and when zuul reflected its status as completed, i suspect12:57
elodilleshmmm, validation runs longer than i thought :S13:03
hberaudskyline-console...13:03
hberaud12 deliverables to check...13:04
hberaudtrove-dashboard13:04
elodillescompared to 66 in the original patch13:04
elodillesthough nova might be the biggest deliverable / repo13:04
hberaudalmost there...13:05
hberaudtrove13:05
hberauddone13:07
opendevreviewMerged openstack/releases master: Re-release final RC for 12 failing deliverables  https://review.opendev.org/c/openstack/releases/+/91495813:08
elodillesfinally \o/13:08
elodilleslet's see the release-post queue o:)13:08
* hberaud grab pop corn13:08
fungiseems there's a management event being processed for the release-post pipeline, and the trigger event to enqueue the tagging job is waiting behind that13:09
hberaudnothing appears queued on my side13:14
fungiyeah, it's that "3 trigger events, 1 management events, ..." under the heading of the release-post pipeline13:15
hberaudok13:15
fungiseems there's another reconfiguration underway affecting that pipeline (the "management event") which has to complete before the trigger from the change-merged event (one of the "trigger events") gets processed13:16
fungiunfortunately, reconfigurations seem to be taking a very long time. not sure if it's that we have too many projects with too many branches and zuul is struggling to recheck all the configuration in them, or if we have some other condition making it take so long13:17
fungiit's something to look into after the release is done, for sure13:18
fungianyway, it's in there now13:19
fungiand tag-releases is running13:20
ttxshould we run missing-releases?13:20
hberaudI think we should wait for these late deliverables13:21
hberaud(IMO)13:21
ttxoh right, misunderstood current state13:21
fungiyeah, we're almost at the point where we should see the 12 tags start appearing in the release pipeline13:22
fungithere they are13:25
*** jbernard_ is now known as jbernard13:26
hberaud\o/13:28
hberaudI need to grab my son from his ping pong lesson13:30
* hberaud dad taxi part 3!13:30
elodillesACK13:30
elodilles:)13:30
hberaudback in minutes13:30
elodilles(management events keep turning up from time to time :S)13:31
fungiat least they seem to be running finally13:35
fungiwaiting for one of the release-openstack-python builds to kick up13:37
* hberaud back13:45
ttxthat tag pipeline is not moving fast13:50
ttxah, some jobs queued13:51
fungifinally13:51
fungiwatching glance to see if it uploads this time13:51
elodillesfingers crossed :X13:51
hberaudoars crossed even13:52
elodilles:D13:52
hberaudtoo much suspense today :)13:53
fricklerfrom the console log the glance pypi upload succeeded13:54
elodillesglance seems to be there: https://pypi.org/project/glance/#files13:54
fungiuploaded! https://pypi.org/project/glance/#files13:54
fungiyep13:54
hberaudsame thing for manila13:55
hberaudhttps://pypi.org/project/manila/#files13:55
elodillesnova, too \o/13:55
hberaudnetworking-bagpipe too13:56
elodillesit's 'nice' that, according to zuul, the jobs are not finishing, all showing 'Build ID not found' :P but at least the files are there13:57
frickleryes, zuul is taking its time to process results again13:57
fungiso the good news is that whatever pypi's problem was has cleared up, but these stalls for build processing are pretty crippling13:57
elodillesyepp :/13:58
ttxrunning missing-releases to see how far we are13:58
elodillesttx: ++13:58
hberaudok thanks ttx 13:59
fungialso we could probably consider approving the docs change at this point, since the pypi issue seems to not be persistent13:59
ttxyeah... just a sec as I complete the missing-releases13:59
elodilles(8 out of 12 release-openstack-python jobs have officially finished)14:00
hberaud\o/14:00
ttxso far so good, standing by to approve doc change14:03
hberaudnice14:03
elodilles(all 12 release-openstack-python jobs succeeded)14:04
hberaudawesome14:04
ttxI feel confident we can approve te docs patch at this point14:05
ttxWe can fix missing tarballs if any while the patch is processed14:05
hberaudttx: can we strike out the missing-releases task?14:05
ttxalmost,... in progress14:06
hberaudack14:06
ttxI can W+1 the docs change once you pile up your approvals14:06
elodilleshere is the patch: https://review.opendev.org/c/openstack/openstack-manuals/+/91460314:07
ttxhttps://review.opendev.org/c/openstack/openstack-manuals/+/91460314:07
fricklerI won't +2 since I submitted the patch, feel free to go ahead though14:08
ttxalright w+114:08
elodilles~o~14:09
ttxmissing-releases completed successfully14:10
elodilles\o/14:10
ttxhttps://review.opendev.org/c/openstack/releases/+/914858 is up next14:10
fungiso no new surprises from missing-releases?14:10
ttxit did not report any issue14:11
funginot even the broken partial/missing x.y.0 artifacts?14:11
ttxno...14:11
ttxbut then it only checks latest14:12
fungioh, got it14:13
fungithat makes sense then14:13
fricklerI guess one could run it on HEAD^1 to cross-check that14:14
* frickler goes to test that14:14
fungii thought there was a script that checked all the historical links, but maybe that's a different step14:15
elodilles(yepp, it reported the missing x.y.0 to me ~2 hrs ago)14:15
elodillesand we didn't even get false alarms like in the past cycles (due to py2py3 universal wheels, or something like that)14:16
fricklerelodilles: like this?   did not find python 3 wheel https://tarballs.openstack.org/ansible-role-atos-hsm/ansible_role_atos_hsm-7.0.0-py3-none-any.whl14:17
hberaudabsence of evidence is not evidence of absence 14:17
elodillesfrickler: nope, it was some error that was reported at the end of the run14:18
hberaudindeed we've seen these errors for a couple of series now14:18
hberaudbut at the same time it seems to me that several things have been done around these universal wheels14:19
hberaudon our side, on things like distutils/setuptools/pbr and on pypi14:20
fricklerseems there's only this single repo remaining with that situation. it does have a good py2/3 wheel though14:23
Clark[m]I don't know if it matters or is too late, but an x.y.0.post0 type version may be more accurate. However, figuring out the right format is probably more trouble than it's worth; .1 releases are cheap and easy14:25
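Clark's point can be checked with the packaging library, which implements PEP 440 ordering: a post release sorts after the base version but before the next patch release, so either scheme would have ordered correctly; .1 is simply what the existing tooling already produces.

```python
# Sketch: PEP 440 ordering of a post release vs. a patch bump.
from packaging.version import Version

base, post, patch = Version("29.0.0"), Version("29.0.0.post0"), Version("29.0.1")
assert base < post < patch
print(sorted([patch, base, post]))  # [29.0.0, 29.0.0.post0, 29.0.1]
```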
elodillesfrickler: my bad, those were exactly the errors (e.g.: https://paste.opendev.org/show/bAagRq3rkx39T10YwG1v/ )14:25
elodillesso who wants to push the button here? https://review.opendev.org/c/openstack/releases/+/914858 o:)14:26
* frickler likes pushing buttons14:27
hberaudAFAICS I'd argue that yes14:27
elodillesthere it goes, thanks frickler \o/14:28
fricklerI even have a big red button lying around here somewhere. always wanted to insert some circuitry to make it a USB keyboard with just the enter key ;) exactly for this use case14:29
elodilles:D14:29
hberaud:)14:29
elodillesmaybe next time ;)14:29
hberaudyou have 6 months14:29
elodilles:]14:29
fricklerthat's some motivation indeed :D14:30
elodilles:))14:30
elodillesbtw, i've prepared my part of the release announce mail, feel free to review: https://etherpad.opendev.org/p/relmgmt-weekly-emails14:30
fungiClark[m]: also i think we've never used post releases, so didn't seem like the time to experiment14:30
Clark[m]++14:31
Clark[m]Do we know why pypi failed in the first place? I wonder if they were throttling us14:31
elodillesmaybe we can test it with the release-test repo, so that we can use it next time (i really hope we'll never need it)14:32
hberaudand I just pasted my part at the bottom14:32
fricklerdu we want to mention the .1 releases in the mail?14:32
frickler*do14:32
hberaudmaybe in the openstack-discuss part14:32
elodilleshberaud: LGTM!14:32
fungiClark[m]: i suppose it could have been an account-level throttle14:32
hberaudat the bottom14:32
fungibut seems odd it would have been applied at the end of the upload rather than the start14:33
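If the failure was a transient server-side error, one purely hypothetical mitigation (not something the release job does today) would be retrying the upload with backoff; twine's --skip-existing keeps a retry from tripping over files that already landed.

```python
# Hypothetical sketch, not part of the actual release job: retry a twine
# upload with backoff; --skip-existing avoids failing on files that already
# made it to PyPI on an earlier attempt.
import glob
import subprocess
import time

def upload_with_retries(dist_dir: str = "dist", attempts: int = 3) -> None:
    files = glob.glob(f"{dist_dir}/*")
    for i in range(attempts):
        if subprocess.run(["twine", "upload", "--skip-existing", *files]).returncode == 0:
            return
        time.sleep(2 ** i * 10)  # 10s, 20s, 40s between attempts
    raise RuntimeError(f"upload still failing after {attempts} attempts")
```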
hberaudfrickler: or maybe in a separated threads directly targetted to the right team rather14:33
hberaudopinion?14:34
elodillesfrickler: IMO no need to mention in the 'announce' list, as hberaud says, maybe on openstack-discuss14:34
fungii'm standing by to approve the openstack-announce post when the time comes14:34
elodillesfungi: thx14:34
fungibut after that i need to take a break so i can get the shower i meant to take at 10:00z14:34
hberaudI think adding it to the official announce emails would dilute the broken release topic14:35
elodillesfungi: do you suggest then to send it now so that you can shower? :) or should we wait a bit to be closer to 15:00 UTC?14:35
fricklerok, I'm also fine with not mentioning, was just an idea of mine14:36
fungiwe should send it before 15:00 since that's when all the press releases go out14:36
hberaudsure np14:36
fungialso it could be mentioned in a follow-up reply on openstack-discuss rather than in the announcement message14:36
elodillesbtw, we usually wait until docs.o.o updates, right? is everything up to date there?14:37
hberaudwill send my part once openstack-announce has shown the first one14:37
fricklerzuul says the docs publishing will finish in about 18 mins14:38
elodilles14:56 UTC14:38
fungiand then there's a 0-5 minute wait for the cron that releases the afs volumes14:38
elodillesif that holds14:38
elodilleshmmm14:39
fungiit runs every 5 minutes to sync changes from the writeable volume to the read-only replicas which the site serves14:39
fungibasically if the "Synchronize files to AFS" task completes between 14:55 and 15:00 then the content of the site will update at 15:0014:44
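A tiny sketch of the timing fungi describes, assuming only that the vos-release cron fires on 5-minute boundaries: content becomes visible at the first boundary after the "Synchronize files to AFS" task completes.

```python
# Sketch: when a change that finished syncing at `done` becomes visible,
# assuming the read-only AFS volumes are released by a */5 cron.
from datetime import datetime, timedelta

def next_publish(done: datetime, period_min: int = 5) -> datetime:
    minutes = (done.minute // period_min + 1) * period_min
    return done.replace(minute=0, second=0, microsecond=0) + timedelta(minutes=minutes)

print(next_publish(datetime(2024, 4, 3, 14, 56)))  # 2024-04-03 15:00:00
```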
opendevreviewMerged openstack/releases master: Mark 2024.1 Caracal as released  https://review.opendev.org/c/openstack/releases/+/91485814:45
elodillesfungi: i've sent the announcement, feel free to approve any time14:46
*** whoami-rajat_ is now known as whoami-rajat14:47
fricklersadly there's no indication of progress at all on the job console14:48
fungicool, i'll give it a few minutes so hopefully the releases site update and docs update are close to being reflected publicly14:49
elodilles+114:50
fricklerfinished14:52
fungihttps://docs.openstack.org/ redirects to https://docs.openstack.org/2024.1/14:53
elodilles\o/14:53
fungibut some files may not be fully updated for another 1.5 minutes14:53
fungialso the publish-tox-doc-releases build waiting in release-post hasn't started yet14:53
fungiapproved the announcement now14:55
fungihaving the releases site trail the announcement by a few minutes won't hurt14:55
fungialso mailman takes at least that long to send copies to all the subscribers14:56
ttxshould we flip the switch on the openstack.org website?14:56
fungii suppose that can happen at any time too now14:56
hberaudupdated the template with the link from openstack-announce, please double check https://etherpad.opendev.org/p/relmgmt-weekly-emails14:57
ttxok will ask14:57
elodilleshberaud: link is working, mail LGTM!14:58
hberaudthx14:58
fricklerhberaud: +115:00
hberaudsent15:00
*** blarnath is now known as d34dh0r5315:00
frickler\o/ party time15:00
elodilles~o~15:01
fungibuilds are finally starting for the releases site update15:01
ttxZuul certainly feels lazy today15:05
hberaudI think we can break out the champagne15:08
elodilles🍾15:09
fungidiscussion on #opendev has turned up a possibility, we're getting a lot of web queries that might be overloading the relational database where it records build results and such15:09
fungiif db writes are taking a long time (or timing out) then zuul might be blocking in places where we just don't expect the db to behave pathologically15:12
fungior this could be a recent regression in zuul leading to some sort of cascade effect15:14
elodilles:S15:26
fungithe publish build is finally starting15:28
ttxIt's up now15:37
elodilles\o/15:37
fungiconfirmed, https://releases.openstack.org/ looks correct15:38
elodillesyepp15:38
elodillesthanks everyone for your work \o/15:39
hberaud\o/15:58
*** gthiemon1e is now known as gthiemonge15:59
clarkbthinking out loud here: if there is a throttle (or even an unintentional one due to how our release process works) we may want to reach out to pypi and ask if there is anything we should do differently to avoid issues in the future16:11
clarkbI wouldn't care so much except they don't let you easily modify existing content which puts us in a weird spot16:11
fungii suppose it's possible that whatever's going on with zuul caused those jobs to be more bursty than usual and we tripped some limit16:29
fungilots of articles are out, for people who enjoy seeing the press around releases:16:46
fungihttps://www.itprotoday.com/iaas-and-paas/openstack-caracal-release-focuses-ai-performance-security16:46
fungihttps://www.techzine.eu/news/infrastructure/118369/caracal-release-of-openstack-bets-on-ai-workloads-and-vmware-refugees/16:46
fungihttps://www.computerweekly.com/blog/Open-Source-Insider/OpenStack-Caracal-improves-agility-delivers-bite-as-VMware-alternative16:46
*** vishalmanchanda_ is now known as vishalmanchanda16:55
*** gmann_ is now known as gmann16:55
*** carloss_ is now known as carloss19:46
