Monday, 2022-03-28

04:57 *** marios is now known as marios|ruck
05:14 <elodilles> hberaud ttx: hi, when you are around: there are 2 things that need some reviews
05:15 <elodilles> hberaud ttx: the 1st one is an easy one, another reno links patch: https://review.opendev.org/c/openstack/releases/+/835045
05:17 <elodilles> hberaud ttx: the 2nd one is the difficult one: the yoga-final patch is failing due to 3 deliverables that are impacted by the latest setuptools releases
05:17 <elodilles> (the patch is: https://review.opendev.org/c/openstack/releases/+/835322 )
05:20 <elodilles> hmmm, i see there was a new setuptools release yesterday (61.2.0), so i've rechecked the patch to see the impact of that
05:21 <elodilles> (anyway, some words about the issue: https://meetings.opendev.org/irclogs/%23openstack-release/%23openstack-release.2022-03-26.log.html )
06:33 <hberaud> ack, will do soon
07:18 <elodilles> thx \o/
07:19 -opendevstatus- NOTICE: zuul isn't executing check jobs at the moment, investigation is ongoing, please be patient
07:20 <elodilles> so that's why the jobs haven't started ^^^
07:25 <ttx> looking
07:26 <elodilles> thx!
07:26 <ttx> probably simpler to wait for an infra fix anyway
07:28 <ttx> Also I would consider doing late releases for those 3 -- they are pretty peripheral anyway
07:29 <ttx> aren't those ansible roles release-trailing?
07:30 <ttx> ah, barbican stuff
07:50 <elodilles> ttx: releasing ansible-role-thales-hsm, ansible-role-atos-hsm and sahara-image-elements would probably not be so risky -- though 1st we need to get those workaround patches merged ( https://review.opendev.org/q/topic:setuptools-issue-3197 ) somehow
07:50 <elodilles> ttx: what about barbican?
07:52 <elodilles> oh, yes, those are the barbican team's deliverables
07:52 <elodilles> (the ansible-role-*-hsm ones)
07:56 <elodilles> i've tried to test whether the new setuptools release (61.2.0) helps, but it seems it does not
08:48 <opendevreview> Merged openstack/releases master: Add remaining release note links for Yoga  https://review.opendev.org/c/openstack/releases/+/835045
08:54 <elodilles> so the gate is working again ^^^
09:04 <elodilles> the result is the same with the latest (61.2.0) setuptools: https://zuul.opendev.org/t/openstack/build/3faec3c5dce94457916cadbe4b2da211
09:05 <elodilles> i've added reviewers to the 3 WA patches
09:12 <elodilles> also pinged the teams on their channels
09:13 <elodilles> the other option i guess (if we don't get answers from the teams) is to skip those from the yoga-final patch / release, as they are not even active projects it seems
09:13 <elodilles> hberaud ttx: what do you think? ^^^
09:14 <hberaud> WFM
09:15 <ttx> ideally we would release them
09:16 <ttx> since they are part of the release and we have no real process for not releasing them
09:16 <ttx> Would there be a way to somehow pin to a working setuptools?
09:16 <elodilles> so we now either do another 'final-rc' for ansible-role-atos-hsm, ansible-role-thales-hsm and sahara-image-elements, or leave them out of yoga if we can't get the patches merged and release them.
09:17 <elodilles> ttx: well, we can add this to tox.ini:
09:17 <elodilles> requires = setuptools<61.0.0
09:17 <ttx> Like, if that release had happened on Wednesday morning and broke 12 projects, what would we have done?
09:18 <elodilles> that's true
09:18 <ttx> ideally we would be isolated from that, but I guess setuptools is special
09:19 <elodilles> yes, most things are constrained, but not setuptools, virtualenv, pip, etc. afaik
09:20 <ttx> OK, so let's try to get those reviews some attention -- barring that, I would consider asking the TC to allow infra to merge them to save their release
09:21 <ttx> We can give teams most of today to react, then tomorrow we switch to TC/infra to get them in
09:22 <ttx> I'm fine with late releases for those since they are pretty minor
09:22 <elodilles> ttx: sounds like a plan!
09:22 <ttx> but they should still make it to the final release :)
09:22 <ttx> gmann and fungi: see ^
09:23 <elodilles> thanks, the plan looks good
09:24 <ttx> basically we need the changes in https://review.opendev.org/q/topic:setuptools-issue-3197 merged and a new release for the 3 affected deliverables before EOD Tuesday, which may require the TC authorizing Infra to force the patches in.
09:24 <elodilles> and we still have the 'setuptools constraint' in tox.ini as an option (that should work, but maybe we need some testing for that)
09:24 <elodilles> as you suggested
09:25 <ttx> elodilles: which tox.ini would that be?
09:25 <ttx> openstack/releases?
09:25 <elodilles> yes
09:25 <ttx> hmm, that is tempting
09:25 <elodilles> and after the release we should revert that
09:26 <elodilles> (i'll push a DNM testing patch for that now, just to make sure)
09:26 <ttx> OK, let's discuss today with gmann and fungi which plan B they prefer (forcing changes or constraining setuptools for release day). Plan A remains to get those patches in today
09:30 <opendevreview> Elod Illes proposed openstack/releases master: DNM: testing to cap setuptools  https://review.opendev.org/c/openstack/releases/+/835423
09:30 <elodilles> this is the DNM patch, let's see if it works ^^^
10:10 <opendevreview> Elod Illes proposed openstack/releases master: DNM: testing to cap setuptools  https://review.opendev.org/c/openstack/releases/+/835423
10:14 <elodilles> updated the DNM patch, this should work now (we need to constrain setuptools via virtualenv, as virtualenv bundles setuptools) ^^^
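[For context: a sketch of what the pin discussed above might look like in the openstack/releases tox.ini. The section layout and version boundary are illustrative, not the exact patch; the point is that tox's `requires` list is installed before test envs are created, so capping virtualenv (which bundles setuptools) is what actually caps the setuptools used inside the envs.]

```ini
# Hedged sketch, not the actual patch. virtualenv 20.14.0 was the
# first release bundling setuptools 61.x, so pinning below it keeps
# the older, working setuptools in tox-created environments.
[tox]
minversion = 3.2.0
requires =
    virtualenv<20.14.0
```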
10:28 <hberaud> it works \o/
10:29 <hberaud> kudos
10:32 <elodilles> so yes, this is another option \o/
11:24 <hberaud> This one seems like a good safety net for us, one that we can handle without asking an external team. We are autonomous with that
11:25 <hberaud> And we could drop it once the final yoga is out
11:38 <elodilles> hberaud: yepp
11:44 <fungi> elodilles: the new way is to call `python3 -m build` using https://pypi.org/project/build/
11:46 <fungi> but i wasn't aware that direct calls to setup.py were more than simply deprecated at the moment (i don't see any removal called out in the setuptools changelog)
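[For context: the build-frontend invocation fungi refers to looks roughly like this, run from a project checkout. This is a hedged sketch using the standard flags of the PyPA `build` project; the exact invocation used in the release scripts may differ.]

```console
$ python3 -m pip install build
$ python3 -m build --sdist --wheel --outdir dist/ .
```

This replaces the legacy `python setup.py sdist bdist_wheel` pattern, which only emits deprecation warnings at this point.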
11:49 <fungi> possibly more relevant is the virtualenv 20.14.0 release on friday, if these things are being run by tox, since that updated installed envs from setuptools 60.10.0 to 61.1.0
11:52 *** dviroel|pto is now known as dviroel
11:54 <fungi> https://zuul.opendev.org/t/openstack/build/3faec3c5dce94457916cadbe4b2da211 is really making my browser work hard
11:55 <fungi> oh wow, that's massive
12:29 <fungi> okay, after giving up and opening the log in a text editor, i see that the problem is the same one tripleo was dealing with on friday: "Multiple top-level packages discovered in a flat-layout"
12:31 <fungi> the warnings about calling setup.py are non-fatal, btw
12:36 <fungi> a number of potential workarounds are mentioned in https://github.com/pypa/setuptools/issues/3197 though the workaround tripleo went with was adding a py_modules=[] parameter in the setuptools.setup() of their setup.py files in the affected projects
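[For context: the tripleo workaround amounts to telling setuptools 61's new auto-discovery not to guess. A hypothetical minimal setup.py in the usual pbr style (a configuration fragment, not any specific project's file):]

```python
# Hypothetical setup.py fragment. Passing py_modules=[] short-circuits
# setuptools >= 61 automatic package discovery, avoiding the "Multiple
# top-level packages discovered in a flat-layout" error for repos that
# ship several top-level directories.
import setuptools

setuptools.setup(
    setup_requires=['pbr>=2.0.0'],
    pbr=True,
    py_modules=[],
)
```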
12:37 <fungi> thankfully this only impacts a very small number of our projects (those which add multiple packages in a single repo)
12:41 <elodilles> fungi: sorry, yes, only 3 projects are affected
12:41 <elodilles> fungi: and I've proposed similar workarounds (with some gate fixes): https://review.opendev.org/q/topic:setuptools-issue-3197
12:42 <elodilles> but we still need to merge these and release them
12:43 <elodilles> we have counted 3 possible ways forward so far (and 'python3 -m build' is maybe the 4th :))
12:43 *** dviroel is now known as dviroel|brb
12:45 <elodilles> (Sean actually mentioned a similar fix on Saturday, but i haven't tried that yet, and this one ^^^ seems closer to what the validator script does)
12:47 <elodilles> maybe now the easiest and least painful solution is to use the 'setuptools pinning via virtualenv in tox' patch temporarily, for the release, and then revert it (the patch would be something like this: https://review.opendev.org/c/openstack/releases/+/835423 )
12:47 <elodilles> but I'll try 'python3 -m build' soon as well
13:02 <fungi> well, using build is probably only going to silence the warnings about calling setup.py, it's not likely to address the multiple top-level packages problem
13:07 <fungi> we probably do want to change our release scripts to no longer call setup.py directly, but that's not urgent
13:07 <elodilles> if that is the new way, then it needs to be changed anyway, i agree
13:08 <elodilles> i'll propose a patch for that
13:08 <fungi> yeah, i've been using it on my personal projects which also rely on pbr, and it works fine there, so it should work for openstack as well
13:10 <fungi> this is how i build and test releases for my projects with tox: https://paste.opendev.org/show/bSzpKscw3K6mgjuPXGD4/
13:27 *** dviroel|brb is now known as dviroel
13:34 <opendevreview> Elod Illes proposed openstack/releases master: Replace old sdist and wheel build command in validate  https://review.opendev.org/c/openstack/releases/+/835450
13:44 <opendevreview> Elod Illes proposed openstack/releases master: Pin setuptools for Yoga release  https://review.opendev.org/c/openstack/releases/+/835423
13:46 <elodilles> i've updated the setuptools pinning patch as well, so that we can rebase the release patch onto this ^^^ if needed
13:53 <hberaud> 2'ed
13:57 <elodilles> another thing: zigo mentioned @ openstack_sahara that he sees issues with multiple projects, coming from oslo.context replacing 'tenant' with 'project_id'
13:58 <elodilles> he mentioned magnum, mistral, murano, sahara, trove and zaqar as having that error
13:58 <zigo> Yeah, at least those ...
13:58 <zigo> It's kind of blocking my packaging then.
13:59 <zigo> And if all of these projects have (pending?) patches, it's kind of hard to track.
13:59 <elodilles> i guess this issue needs to be fixed somehow before we release Yoga :S
13:59 <elodilles> hberaud ttx ^^^
14:03 <hberaud> the projects listed above are mostly not really active, so it could take a long time to see them fixed
14:05 <hberaud> another option for these deliverables could be to pin oslo.context below version 4.0.0 instead of moving tenant to project_id
14:05 <zigo> Magnum and Sahara got the fixes.
14:05 <zigo> https://review.opendev.org/c/openstack/magnum/+/834296
14:05 <zigo> The others I haven't investigated yet.
14:06 <hberaud> cool
14:06 <zigo> hberaud: *NO*, that's *NOT* an option, it just hides the dust under the carpet.
14:06 <zigo> Not everyone uses venvs ...
14:06 <zigo> The solution could be to fix oslo.context and release a 4.0.1 that doesn't yell with deprecation.
14:07 <hberaud> tenant has been deprecated for a long time
14:07 <zigo> 4.1.1, I mean.
14:07 <zigo> Yeah, but it wasn't just failing UT.
14:09 <zigo> Mistral got the fix too.
14:09 <hberaud> What do you mean by fixing oslo.context? I'm not a fan of going back and forth with this deprecated argument either...
14:09 <zigo> I'll investigate the others and let you know the state of things ... :)
14:11 <opendevreview> Vishal Manchanda proposed openstack/releases master: Release horizon 22.1.0(Yoga)  https://review.opendev.org/c/openstack/releases/+/835461
14:30 <elodilles> the main problem is that some not actively maintained projects missed the deprecation warning, and now that oslo.context has really removed the 'tenant' context argument they are broken. so i also think this is not something that oslo.context "should" fix, but something that the other projects should
14:32 <elodilles> the question is whether we now release them 'as broken packages' and they fix this via a stable release, OR we jeopardize the release by adding more and more changes, for which we don't even have the time
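[For context: the failure mode, and the deprecation cycle the affected projects missed, can be illustrated with stand-in classes. This is a hypothetical sketch of the pattern, not oslo.context's actual implementation.]

```python
import warnings

class RequestContextWithAlias:
    """Stand-in for the pre-4.0.0 behavior: 'tenant' still works as a
    deprecated alias of 'project_id', warning on every access."""
    def __init__(self, project_id=None):
        self.project_id = project_id

    @property
    def tenant(self):
        warnings.warn("'tenant' is deprecated, use 'project_id'",
                      DeprecationWarning, stacklevel=2)
        return self.project_id

class RequestContext4x:
    """Stand-in for 4.0.0: only project_id exists; code still reading
    ctx.tenant now raises AttributeError (the packaging failures above)."""
    def __init__(self, project_id=None):
        self.project_id = project_id

old = RequestContextWithAlias(project_id="abc123")
print(old.tenant)        # works, but emits a DeprecationWarning

new = RequestContext4x(project_id="abc123")
print(new.project_id)    # the supported attribute
# new.tenant would raise AttributeError, matching the reported errors
```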
14:33 <hberaud> Agree with you. oslo.context is not broken...
14:33 <elodilles> i tend to vote for the 1st option. it also indicates that those projects need maintainers
14:33 <hberaud> I'd suggest to fix them via stable releases
14:33 <hberaud> +1
14:33 <elodilles> hberaud: ++
14:34 <elodilles> ttx: do you agree? ^^^
14:34 <hberaud> i see no reason to prevent oslo from moving forward if the other projects are at a standstill
14:42 <elodilles> the 'api-break' change (oslo.context 4.0.0) was released almost 2 months ago... it would have been better to release it sooner, but if there are not enough maintainers then maybe releasing it 3 months ago would not have made any difference :/
14:45 <hberaud> yeah
14:49 <elodilles> and I also understand zigo that it is a pain for packaging :/
14:49 <elodilles> (as well)
14:53 *** amoralej|off is now known as amoralej
14:56 <ttx> sorry, I can't look right now
15:04 <elodilles> sorry, no worries, hberaud and I agree, just wanted to ask for your opinion as well o:)
15:17 <zigo> hberaud: The reason is: you submitted it WAY too late in the release cycle. That's the kind of breaking change to do at the beginning of a cycle.
15:18 <zigo> The best course would be: revert the change in the Yoga branch of oslo.context, but leave it as-is in master.
15:18 <zigo> If you look at the code of these projects, most of them have the word "tenant" a bit everywhere ...
15:19 <zigo> Just for the fun of it, I wrote this: https://review.opendev.org/c/openstack/trove/+/833186 (which of course breaks everything, as I just did sed / grep stuff without thinking...).
15:20 <zigo> Yes, it's a disaster, and projects should move forward, I very much agree with the reasoning.
15:20 <zigo> BUT, that's not the way to fix things.
15:20 <zigo> I mean, not at the end of a release cycle.
15:23 *** marios|ruck is now known as marios|ruck|call
15:54 <ttx> I think it's important that all projects use the same oslo.context release, so pinning project-by-project would not be good
15:55 <ttx> I'm not sure I understand how broken those packages are and how many are affected
15:55 <ttx> And yes, that shows that 2 months is no longer sufficient lead time for projects to adapt to lib changes
15:56 <ttx> at least for sub-maintained projects
15:58 <ttx> we should fix the ones we can today for a last-minute release... and the others in a stable release asap
15:59 <ttx> but again, I'd need a description of how broken they actually are -- they pass testing, so I assume they are not completely broken?
16:00 <elodilles> oslo.context was released between milestone-2 and milestone-3. the question is whether it would have made a difference for these projects if the release had been produced earlier. I think maybe a few less projects would be broken, but otherwise it could be the same. :/
16:01 <ttx> elodilles: are they unusable, or just weirdly incoherent?
16:01 <ttx> sorry, going into meetings again
16:02 <zigo> ttx: That's the issue, they do not pass testing with the newer oslo.context.
16:03 <elodilles> testing is broken (a simple example patch fixing such an issue: https://review.opendev.org/c/openstack/castellan/+/834669)
16:03 *** marios|ruck|call is now known as marios|ruck
16:03 <fungi> it's likely that they simply merged no new changes after the oslo.context bump in requirements broke them
16:04 <elodilles> but i'm not sure whether we have broken code in the projects themselves beyond testing
16:04 <fungi> if they're fairly unmaintained, then probably nobody even noticed until downstream packaging work tripped over it
16:05 <elodilles> that is definitely the case
16:05 * zigo heads back home
16:13 *** dviroel is now known as dviroel|lunch
16:15 *** marios|ruck is now known as marios|out
16:26 <fungi> elodilles: on further reading, it looks like updating tox.ini to force setuptools>=61.1 may also solve the problem
16:26 <fungi> the pr claiming to have solved it for at least some projects is included in that version from saturday
16:26 <fungi> it's just not the default version installed by tox yet
16:27 <fungi> er, well, bundled in virtualenv, i mean
16:27 <elodilles> yes, it needs virtualenv to bundle setuptools>=61.1
16:27 <elodilles> let me check if we have that already
16:28 <elodilles> still, virtualenv 20.14.0 is the latest, from Friday
16:28 <elodilles> * latest release
16:28 <hberaud> zigo: Well, I proposed my patch at the beginning of yoga (in october)... and the patch was merged 6 months later, so I don't think I submitted it too late https://review.opendev.org/c/openstack/oslo.context/+/815938
16:29 <elodilles> fungi: in case virtualenv gets released before Wednesday (with bundled setuptools 61.2) then we are OK :)
16:30 <fungi> elodilles: alternatively we can set https://tox.wiki/en/latest/config.html#conf-requires to require setuptools>=61.1 as a workaround
16:30 *** jbadiapa is now known as jbadiapa|off
16:31 <clarkb> if we do that in a base enough tox job it should have good coverage
16:32 <fungi> so on the oslo.context 4.x conflict, do we have a complete list of which projects are still impacted?
16:32 <elodilles> fungi: that does not work, we need to do it via virtualenv (see former patchsets of my patch)
16:32 <fungi> elodilles: tox updating setuptools in its virtualenv doesn't solve it? or are we not calling tox?
16:34 <elodilles> fungi: when i set setuptools, it still installs the latest virtualenv with the bundled setuptools (see PSets of this patch: https://review.opendev.org/c/openstack/releases/+/835423 )
16:35 <elodilles> fungi: but pinning virtualenv works
16:36 <fungi> https://github.com/pypa/virtualenv/pull/2324 doesn't seem to be getting fast-tracked
16:37 <elodilles> fungi: about oslo.context, we don't have the complete list, but zigo reported these as broken: magnum, mistral, murano, sahara, trove, zaqar. this might not be the complete list.
16:40 <fungi> so our options with oslo.context are: 1. roll back the removal to a deprecation in an emergency point release on stable/yoga, 2. try to get fixes merged to all affected projects, 3. don't include the above projects in the yoga release, or 4. release yoga with those untestable and possibly broken
16:40 <clarkb> fungi: I think you can also set the constraint to the older version?
16:41 <clarkb> I guess that implies caps in the affected projects and is part of 2?
16:41 <fungi> right, that's probably a sub-option of #2
16:42 <fungi> since just pinning oslo.context in global requirements won't really pin it in the projects themselves (but it will result in them getting tested with older oslo.context, and maybe also be a signal to package maintainers not to package oslo.context 4.x with yoga?)
16:43 <fungi> it gets to be a very mixed signal, though, with oslo.context 4.x appearing in the yoga release itself
16:43 <clarkb> ya
16:52 <elodilles> fungi: good summary. when discussing it with hberaud we agreed that maybe option 4 is the one we should go for, and the failing projects can propose stable releases afterwards
16:55 *** dviroel|lunch is now known as dviroel
16:56 <elodilles> fungi: https://meetings.opendev.org/irclogs/%23openstack-release/%23openstack-release.2022-03-28.log.html#t2022-03-28T14:30:27
16:58 <fungi> okay, so just to be clear, the consensus is that releasing broken services is preferable to rolling back oslo.context in the last hours before the release?
16:59 <elodilles> though maybe about your 4 options: option 1 is feasible, option 2 seems time consuming, option 3 is feasible too (if we find all the affected projects + we don't have core projects among them), option 4 is feasible, not nice, but indicates which projects are 'undermaintained' and can be fixed with stable releases
17:02 <elodilles> fungi: so far I think that is the consensus, at least me and hberaud said so. but i'm not super confident, and open to other options
17:06 <fungi> i do agree that holding back progress in maintained projects because of unmaintained projects is not great, but releasing services in a state where downstream consumers can't test them (and the broken tests may even indicate that the projects themselves are broken) is also not great. not including those projects in the release might help send a stronger signal as to the actual problem, in this case, but is something which might benefit from input from tc members as well
17:08 <elodilles> fungi: that's true, i can accept this
17:08 <elodilles> so you vote for option 3
17:08 <clarkb> If it were me, I think I would undeprecate that, since clearly it wasn't well coordinated
17:09 <clarkb> come back and update all of the places that need updating, and depends-on those changes in oslo.context to make the switch
17:09 <ttx> So a new oslo.context would fix it, but we'd lose another cycle
17:09 <elodilles> clarkb: so you vote for option 1
17:10 <clarkb> yes, option 1 is my vote. We have the tools to test this stuff and make these removals safe. We didn't do that, so it seems like a good idea to reset and try again using the tools
17:10 <ttx> i don't think (3) is an option. We promised those projects would be part of the release, so they have to be. Even if untestable (that would not be the first time)
17:11 <ttx> (2) would be best, but unlikely
17:11 <ttx> I would consider (1), but I'm not sure I understand all the implications
17:11 <fungi> 2 is particularly unlikely for the same reasons the fixes hadn't been done already, yeah
17:12 <ttx> (1) is probably the least disruptive
17:12 <fungi> 1 would need a revert in stable/yoga, a point release tagged, and a stable/yoga requirements update merged
17:12 <ttx> I assume it's a pretty localized change
17:13 <fungi> but yes, as to how much of oslo.context would need unrolling in order to revert the deprecation, i'm not sure
17:13 <ttx> If that's not acceptable (like we can't get it merged in oslo.context) then we'd go for (4)
17:13 <fungi> luckily it merged relatively late in the cycle, so there wasn't a lot of time to do other related things after it, i guess
17:13 <ttx> I feel like the TC should make a call between (1) and (4) since (2) did not happen in time
17:14 <ttx> yeah, I'd like to hear how feasible (1) is
17:15 <ttx> Because the initial change took 3 months to merge
17:17 <fungi> so, if i'm reading the oslo.context stable/yoga history correctly, the only changes to land after the tenant removal were constraints and .gitreview updates for stable/yoga earlier this month, mypy typehinting added last month, and a setuptools workaround merged in january
17:17 <ttx> hberaud did propose it at the start of the cycle, it was just not processed downstream fast enough
17:17 <zigo> fix for trove: https://review.opendev.org/c/openstack/trove/+/834373 (currently building the package)
17:18 <fungi> typehinting might need to be tweaked if 815938 is reverted on stable/yoga, but probably not much (or it may not be critical)
17:18 <zigo> Not enough, it still fails ... :(
17:18 <elodilles> the patches in oslo.context: https://paste.opendev.org/show/bzXX9zGo8Prb4N6SM50X/
17:18 <ttx> how many packages are broken?
17:18 <elodilles> zigo: do you have a list perhaps? ^^^
17:18 <zigo> elodilles: sahara has a patch and is fixed.
17:18 <ttx> If it's just a couple, I'd lean towards (4) and calling out for quick stable releases
17:19 <fungi> ttx: 815938 was proposed in october, had no revisions, and was eventually approved in mid-january
17:19 <elodilles> zigo: the problem is that we proposed the release patch for sahara, but the team did not react to it, so it was abandoned :(
17:19 <ttx> right -- it's just symptomatic of our inability to process such a change in 6 months
17:20 <elodilles> zigo: and that is the better case, because at least there we had a fix for the issue. :/
17:20 <zigo> ttx: For the moment, I believe the list is: mistral, murano, trove.
17:20 <fungi> well, the sahara example also points out that at least some of the projects involved may just be unmaintained for all intents and purposes
17:20 <ttx> i'd be tempted to release those as-is
17:20 <zigo> I backported the patch for Sahara, and it's fine.
17:20 <elodilles> fungi: yes, it seems so
17:21 <ttx> fungi: do you think the TC could step in and force those patches today? If we have a new release tomorrow it can make it
17:21 <ttx> so... strongarm option 2
17:21 <fungi> the tc should be able to approve the changes, yes
17:22 <clarkb> ya, if fixes can be landed making 2 happen, that seems reasonable
17:22 <fungi> i can grant tc-members approval rights in gerrit over the repos, or i can just elevate my own privs and approve those if they agree
17:22 <ttx> elodilles: could you make a case on #openstack-tc for forcing patches into mistral, murano and trove to fix their release in time for the Yoga date?
17:22 <ttx> Those patches are pretty uncontroversial, I suspect
17:22 <zigo> Agreed.
17:23 <zigo> It's just mostly s/tenant/project_id/
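[For context: a hypothetical illustration of the mechanical rename zigo describes, in a scratch directory with a made-up file. Real patches need review, since 'tenant' also appears in API fields and database columns that must not be renamed.]

```shell
# Demo only: rename context.tenant attribute accesses to
# context.project_id. Paths and file contents are illustrative.
mkdir -p /tmp/tenant_rename_demo
cd /tmp/tenant_rename_demo
printf 'tenant_id = context.tenant\n' > service.py

# The mechanical part of the fix (GNU sed assumed):
grep -rl 'context\.tenant' . | xargs sed -i 's/context\.tenant\b/context.project_id/g'

cat service.py
```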
17:23 <ttx> OK, so my vote is on forcing those patches into master and stable/yoga for mistral, murano and trove
17:23 <elodilles> ttx: so you mean on IRC? yes, i can
17:23 <ttx> elodilles: yes
17:24 <elodilles> OK, jumping over there
17:24 <ttx> If we have them all merged today we can do re-releases tomorrow and get the final release patch in time for Wednesday
17:25 <ttx> zigo: thanks for bringing it up
17:25 <ttx> sahara needs a re-release, right?
17:25 <ttx> (in addition to mistral, murano, trove)
17:26 <zigo> The sahara one is already in the stable/yoga branch: https://review.opendev.org/q/I1bc81b3c13d2c08bc175d0d4f4365de7b4f71cf9
17:26 <ttx> At this point, rolling back 6 months of getting that oslo.context change in is more work / more impactful than forcing 3 patches in
17:26 <zigo> Though the RC1 doesn't contain it.
17:27 <ttx> ack, so it needs an RC2
17:28 <zigo> Mistral is broken with:
17:28 <zigo> AttributeError: 'Retrying' object has no attribute 'call'
17:28 <zigo> So probably not oslo.context? I'm not sure ...
17:28 <fungi> zigo: have a link to the change?
17:28 <zigo> https://review.opendev.org/c/openstack/trove/+/834373 <--- This patch isn't enough, and I still get some errors.
17:29 <ttx> it feels like we should run all tests when requirements freeze, to spot those earlier
17:30 <clarkb> having a single tempest job that installs all libs from git might be a good canary too
17:30 <zigo> Then I haven't found a patch (yet?) for Murano.
17:30 <zigo> Murano says 143 times: AttributeError: 'RequestContext' object has no attribute 'tenant'
17:36 <fungi> zigo: that trove change seems to be passing testing and is approved
17:36 <fungi> i guess you mean you're seeing errors trying to use trove, not in upstream testing
17:38 <elodilles> ttx: yes, it sounds like we need such testing :S
17:38 <fungi> zigo: i see https://bugs.debian.org/1005467 which has example tracebacks for mistral at least
17:38 <elodilles> zigo: what about zaqar? (magnum seems to be fixed, as we were informed on the tc channel)
17:39 <fungi> the mistral bug looks like it might have to do with testtools
17:40 <fungi> that was with testtools==2.5.0 according to the included freeze
17:42 * zigo attempts to build zaqar
17:43 <zigo> Oh, zaqar looks fine, it seems.... :)
17:46 <ttx> elodilles: PTG discussion!
17:46 <elodilles> ttx: for sure a good topic
17:47 <ttx> I'll have to drop, but it seems the TC will slowly get persuaded, and hopefully fungi will fix as many of them as possible
17:47 <ttx> FWIW, worst case scenario we release them as broken. That happened in the past, and we fixed them in stable releases
17:48 <fungi> yeah, i'm happy to use gerrit admin access to approve changes if tc-members give the go-ahead
17:51 *** amoralej is now known as amoralej|off
17:53 <ttx> ok, it seems to be going in the right direction
17:53 <ttx> I'm dinnering
17:53 <elodilles> bon appetit!
17:53 <elodilles> i'll try to add the details here: https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes
17:54 <elodilles> so that we can follow the progress
17:59 <fungi> zigo: looks like that's the same testtools version we're using in stable/yoga upper-constraints.txt, so we should have hit the error upstream if that were the problem. i'll have to look closer at the code in mistral to see where that Retrying class is coming from
18:01 <fungi> elodilles: are you going to send a formal proposal to openstack-discuss for bypassing the usual core reviewer and ptl approvals for these changes and release candidates, or would you like me to do so on behalf of the release team? i'm sure it's getting well into dinner time in your locale
18:03 <elodilles> fungi: if you could do that, that would be awesome :)
18:03 <elodilles> fungi: i'm also in the middle of my dinner ;)
18:03 <fungi> okay, i'm happy to do so. i'll send something now. please get back to enjoying your evening!
18:04 <elodilles> meanwhile i'm also adding details to the etherpad -- https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes
18:04 <elodilles> (patch list to see where we are)
18:08 <elodilles> zigo: if you could also double-check the list, that would be awesome: ^^^
18:08 <elodilles> (though i'm still adding the patches there)
18:14 <fungi> zigo: are there any upstream test failure examples for mistral you're aware of? otherwise i'm inclined to leave that off the formal proposal to the tc for now while we investigate its relevance further
18:27 *** lajoskatona_ is now known as lajoskatona
18:32 <fungi> proposal posted here: https://lists.openstack.org/pipermail/openstack-discuss/2022-March/027864.html
18:33 <fungi> for those not following along in #openstack-tc, an "emergency" meeting was convened in which the tc members in attendance agreed to that proposal
18:40 <elodilles> fungi: ack, thanks!
18:42 <elodilles> fungi zigo: as I see it, mistral and magnum already have the oslo.context tenant->project_id fix in their releases (if the patches I've found are the ones we are looking for -- https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes )
18:43 <elodilles> fungi zigo: so i guess those shouldn't be broken due to our oslo.context issue. maybe there is something else?
18:47 <zigo> fungi: Trove's patch is *not enough* and it continues to fail with 'tenant' failures.
18:47 <fungi> so the Retrying objects in those tracebacks are coming from tenacity's Retrying class
18:47 <fungi> which would make sense in light of https://bugs.debian.org/1005467
18:47 <zigo> fungi: Yeah, right.
18:48 <zigo> It needs 8.0.1 compat, it's likely only working with 6.x
18:48 <fungi> we're testing openstack projects with tenacity===6.3.1 on stable/yoga
18:48 <fungi> yep
18:48 <fungi> so that explains why we're not seeing those errors
18:49 <fungi> so while i agree that openstack projects should be working on support for newer tenacity versions, that train has already sailed for yoga, i think
18:52 <fungi> the trove fix does seem to have merged on the master branch 9 days ago and passed testing
18:53 <fungi> i guess we'll see if there are errors on the backport to stable/yoga (835492)
18:54 <fungi> all the voting jobs for that change have already passed too
18:55 <fungi> looks like coverage and functional-mysql have failed, but they're non-voting. tempest tests are also all non-voting on trove at the moment, it looks like?
18:56 <fungi> looks like the only trove jobs which are voting on proposed changes are requirements-check, openstack-tox-pep8, openstack-tox-py36, openstack-tox-py39, and openstack-tox-docs
19:00 <elodilles> it's a typical sign of a not well maintained project :S
19:09 <fungi> looks like there are a lot of mappings from tenant to context.project_id references, so they seem to have tackled a lot of it
19:10 <fungi> though in trove/configuration/service.py i still see a tenant_id=context.tenant instead of tenant_id=context.project_id in at least one place
19:10 <fungi> kwargs['tenant_id'] = context.tenant
19:10 <fungi> there's another
19:11 <fungi> same file
19:11 <fungi> zigo: where you're seeing the errors, does it seem to be raising in trove/configuration/service.py or somewhere else?
19:11 <gmann> without test coverage / voting jobs it will be hard to trace them manually
19:14 <fungi> given the complete lack of voting integration test jobs for trove, i feel pretty confident that project is entirely broken at this point
19:14 <fungi> and probably not just from the oslo.context tenant removal
19:17 <fungi> so next steps are probably someone proposing context.tenant->project_id fixes for murano and zaqar?
19:17 <fungi> is anyone working on those yet, or should i take a swing at it?
19:20 <fungi> 15 files need editing in murano
19:26 <elodilles> fungi: i tried to look into it, but i won't have time for it today, i fear :S
19:27 <fungi> i'm working on the murano master patch now, about to push it, but i'm coming into this somewhat blind and have little idea what i'm actually doing, so we'll see what happens ;)
19:28 <elodilles> :S crossing fingers
19:30 <elodilles> and yes, the problem for me was the same, i wanted to understand what i am changing and it seemed a bit too much, given that i should sleep already as I woke up early today o:)
19:31 <fungi> okay, an attempt at fixing murano's master branch is now linked in the etherpad. i'll put together something similar for zaqar while i await test results
19:31 <elodilles> fungi: thanks! \o/
19:32 <fungi> zaqar has places where it's doing things like context.RequestContext(project_id...
19:34 <fungi> uh, that line dates from 2014-10-22 according to git blame
19:35 <fungi> zigo: do you have any examples of failures for zaqar? a naive git grep isn't turning up obvious places where oslo.context is being called into incorrectly
19:35 <zigo> As I wrote, it just built fine, so it probably should be removed from the list.
19:39 <fungi> oh, zaqar? okay, i'll strike it off
19:39 <elodilles> yes, unit test jobs seem to be passing for zaqar, though the gate has a failing tripleo-ci-centos8 job: https://review.opendev.org/c/openstack/zaqar/+/833321/1
19:39 <zigo> http://shade.infomaniak.ch/trove_17.0.0~rc1-2_build.log
19:44 <zigo> http://shade.infomaniak.ch/trove_17.0.0~rc1-2_build.log.txt if you prefer to have it in the browser...
19:44 <zigo> This is *after* applying https://review.opendev.org/c/openstack/trove/+/834373
19:51 <fungi> elodilles: looks like the tripleo-ci-centos-8-scenario002-standalone job may be generally broken for stable/yoga (barbican has set theirs to non-voting): https://zuul.opendev.org/t/openstack/builds?job_name=tripleo-ci-centos-8-scenario002-standalone&branch=stable%2Fyoga&skip=0
19:54 <elodilles> oh, i see. actually there are failing centos8 jobs here and there, so i guess that's some common issue
19:57 <elodilles> zigo: btw, it seems there are 2 patches in trove that are merged on master
19:58 <elodilles> zigo: did you apply both?
20:01 <elodilles> fungi: fyi ^^^ added the second patch to the list. i wonder if those need to be squashed... if both are needed for the fix then the backport should be squashed, too :-o
20:03 <fungi> well, since trove isn't testing things, squashing is probably unnecessary
20:03 <fungi> they could be merged independently
20:04 <elodilles> yes, true, you wrote that already, sorry
20:18 <zigo> elodilles: Only one... :/
20:18 <zigo> I probably missed the 2nd one then.
20:19 <zigo> Thanks, I'll try it tomorrow.
20:29 <elodilles> +1
20:29 <fungi> elodilles: so we need a backport of 834373 to stable/yoga then?
20:29 <elodilles> let me cherry-pick that
20:29 <elodilles> yes, i think so
20:29 <fungi> the other trove backport has a +1 from zuul now, at least
20:30 *** dviroel is now known as dviroel|out
20:32 <fungi> note that we've got an unrelated problem with zuul, so it hasn't started testing my murano change yet, but it's being actively investigated and we at least know how to get things moving again without disruption once we've collected some logs and poked at it for a bit
20:32 <elodilles> :S
20:32 <elodilles> fungi: ack
20:37 <elodilles> (added a 'nit' comment on one of the trove patches, as oslo.context>=4.0.0 in requirements.txt is not necessary, i think)
20:40 <fungi> agreed, older versions of oslo.context should work with project_id (that's the point of the deprecation)
20:59 <opendevreview> Tobias Urdin proposed openstack/releases master: Release tooz 2.11.0  https://review.opendev.org/c/openstack/releases/+/835517
21:05 <elodilles> fungi: do you want me to remove the change from requirements.txt? do we have time for that?
21:09 <fungi> will it be a problem if it's in there?
21:10 <fungi> and yeah, you can if you like, i'm indifferent on it
21:20 <elodilles> well, most probably not a problem. OK, i won't waste CI time on it then. and anyway, i have to leave now for some hours of sleep. see you tomorrow! thanks for working on this!
21:21 <elodilles> just leave a note here in IRC if i need to do anything during (my) morning
21:21 <fungi> will do!
21:21 <elodilles> o/
21:48 <fungi> as soon as zuul gets back on track, i'll make sure we've got good check results on the outstanding (non-release) changes so far, and then approve them
23:14 <fungi> my murano patch failed a bunch of jobs, need to look into the cause(s)

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!