Friday, 2023-04-21

02:26 *** iurygregory is now known as iurygregory|holiday
06:08 *** amoralej|off is now known as amoralej
07:55 <opendevreview> Merged openstack/releases master: [cloudkitty] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878864
08:34 <frickler> reviewers please have a look at https://review.opendev.org/c/openstack/releases/+/880944 , this is one of the blockers for using latest oslo.db in openstack
09:00 <opendevreview> Amit Uniyal proposed openstack/releases master: [yoga][zed][antelope] Nova stable branch release  https://review.opendev.org/c/openstack/releases/+/881200
09:05 <opendevreview> Merged openstack/releases master: python-aodhclient 3.3.0  https://review.opendev.org/c/openstack/releases/+/880944
12:43 *** amoralej is now known as amoralej|lunch
12:45 <opendevreview> Amit Uniyal proposed openstack/releases master: nova: Release yoga 25.1.1  https://review.opendev.org/c/openstack/releases/+/881209
12:45 <opendevreview> Amit Uniyal proposed openstack/releases master: nova: Release zed 26.1.1  https://review.opendev.org/c/openstack/releases/+/881210
12:45 <opendevreview> Amit Uniyal proposed openstack/releases master: nova: Release antelope 27.0.1  https://review.opendev.org/c/openstack/releases/+/881211
12:48 <hberaud> ttx, elodilles: is one of you OK to replace me as chair for our meeting of Mai 5th? I forgot that my children will be on PTO during that same week, so I'll surely be AFK that day. I'm OK to swap my week with one of your upcoming ones.
12:48 <hberaud> s/Mai/May
12:53 <elodilles> hberaud: i think i can, but let me check the dates
12:57 <elodilles> hberaud: May 5th looks OK to me, added my name as chair. we haven't filled out all the chair slots yet, so no need to decide now which one is good for whom :)
13:02 <elodilles> hberaud: if you have some time could you please review these, as some of them can be merged i think: https://review.opendev.org/q/topic:xena-em+is:open+label:PTL-Approved
13:03 <hberaud> elodilles: thanks
13:03 <hberaud> OMW
13:14 <opendevreview> Merged openstack/releases master: [kolla] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878883
13:17 <opendevreview> Merged openstack/releases master: [glance] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878893
13:17 <elodilles> thanks!
13:20 <opendevreview> Merged openstack/releases master: [winstackers] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878876
13:22 <opendevreview> Elod Illes proposed openstack/releases master: [OpenStackAnsible] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878887
13:26 <opendevreview> Merged openstack/releases master: [monasca] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878898
13:33 <auniyal> hello, gate question: I have three patches and they are all kind of stuck at the same place
13:34 <auniyal> https://zuul.openstack.org/stream/cb10cc84c07a476fb7e5ec2f5e920220?logfile=console.log
13:34 <auniyal> https://zuul.openstack.org/stream/52c638dab78a4e5f876c21aab82728fc?logfile=console.log
13:34 <auniyal> okay, the third one just went further
13:36 <auniyal> it took some time, but one has completed now; I'll just wait for the others to complete
13:37 <opendevreview> Merged openstack/releases master: [sahara] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878888
13:37 <opendevreview> Merged openstack/releases master: [swift] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878891
13:42 <auniyal> in case the logs get refreshed and you want to know what happened:
auniyal"13:42
auniyalDEBUG: + cd /home/zuul/src/opendev.org/openstack/releases/.tox/validate/tmp/releases-acl5aa8w/openstack/nova13:42
auniyalDEBUG: + git checkout -f 47b6850bb1f3681e6af7d8248152a979bf5051d113:42
auniyalDEBUG: Previous HEAD position was 2bb8689835 Merge "Revert "Add logging to find test cases leaking libvirt threads""13:42
auniyalDEBUG: HEAD is now at 47b6850bb1 [stable-only] Update TOX_CONSTRAINTS_FILE for stable/2023.113:43
auniyalDEBUG: cwd = /home/zuul/src/opendev.org/openstack/releases/.tox/validate/tmp/releases-acl5aa8w/openstack/nova13:43
auniyalDEBUG: $ python3 setup.py sdist13:43
auniyal"13:43
13:49 *** amoralej|lunch is now known as amoralej
13:51 <opendevreview> Amit Uniyal proposed openstack/releases master: nova: Release yoga 25.1.1  https://review.opendev.org/c/openstack/releases/+/881209
13:55 <elodilles> auniyal: hmmm, good question, i'm looking, but don't see why yet. we did not have such an issue before
14:00 <opendevreview> Amit Uniyal proposed openstack/releases master: nova: Release zed 26.1.1  https://review.opendev.org/c/openstack/releases/+/881200
14:05 <auniyal> elodilles, it's tox validate; I updated the patch so it will run again
14:06 <auniyal> again here - https://zuul.openstack.org/stream/982a6041e65a42d6ae00f156361625c1?logfile=console.log
14:08 <fungi> catching up... looks like the build took ~34 minutes at this step: https://zuul.opendev.org/t/openstack/build/cb10cc84c07a476fb7e5ec2f5e920220/log/job-output.txt#19980-19981
14:09 <fungi> probably someone should profile sdist building for the releases package
14:09 <fungi> oh, that's for openstack/nova actually
14:10 <fungi> so creating an sdist of nova is taking a very long time
14:10 <fungi> there was a new pip release over the weekend which i think changed some things about its dep solver, so i suppose that could be involved
14:11 <fungi> we don't call setup.py when we upload sdists any more; we probably should switch to calling build in the validate job too?
14:13 <fungi> https://pip.pypa.io/en/stable/news/#v23-1
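(For reference, the two sdist-building approaches under discussion differ roughly as follows. This is an illustrative sketch, not code from the releases repository; the repository path is a made-up placeholder.)

    # Rough illustration of the two ways to build an sdist discussed above.
    import subprocess

    repo_dir = "/tmp/nova"  # hypothetical clone location, for illustration only

    # Legacy, deprecated approach: invoke setuptools directly via setup.py.
    # This is the call that was observed taking ~34 minutes in the gate job.
    subprocess.run(["python3", "setup.py", "sdist"], cwd=repo_dir, check=True)

    # PEP 517 approach: let the `build` frontend create an isolated build
    # environment and produce the sdist (and optionally a wheel).
    subprocess.run(["python3", "-m", "build", "--sdist", "--wheel", repo_dir], check=True)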
14:23 <elodilles> i could not reproduce it locally even with the latest pip+virtualenv+tox
14:25 <elodilles> on the other hand, we were close to 1hr with the validation at the Antelope release, but the timeout was way more than 1 hr: https://etherpad.opendev.org/p/release-job-timeouts
14:26 <fungi> openstack_releases.pythonutils.build_sdist() already calls `python3 -m build --sdist --wheel` so i wonder where that `python3 setup.py sdist` from the build log is coming from. i can't seem to find it
14:26 <elodilles> the timeout clearly changed btw: https://zuul.opendev.org/t/openstack/build/cb10cc84c07a476fb7e5ec2f5e920220/log/zuul-info/inventory.yaml#177
14:28 <elodilles> oh, i see, the gate job has a bigger timeout value than the check job: https://zuul.opendev.org/t/openstack/build/2960d0d64469464289de83f5ccc5773e/log/zuul-info/inventory.yaml#214
14:29 <elodilles> so in this case, it seems we reached the timeout with nova
14:31 <elodilles> on the other hand, 'python3 setup.py sdist' should be replaced with the build command, yes
14:32 <fungi> yeah, at this point i'm struggling to find where that's being called
14:33 <elodilles> yes, me too. it seems to be coming from tools/clone_repo.sh maybe?
14:34 <elodilles> nah, this must be it: https://opendev.org/openstack/releases/src/branch/master/openstack_releases/requirements.py#L94
14:35 <fungi> aha, yep that's it
14:35 <fungi> get_requirements_at_ref is called around that point in the validate script
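(A minimal sketch of the kind of change being discussed, assuming the per-ref requirements check shells out to build an sdist. The function body and names here are simplified and hypothetical, not the literal contents of openstack_releases/requirements.py; the actual fix is proposed in review 881229 below.)

    # Hypothetical, simplified sketch: swap the deprecated setup.py call for
    # the PEP 517 build frontend inside a per-ref requirements check.
    import subprocess

    def build_sdist_at_ref(workdir, ref):
        """Check out a ref and build its sdist so requirements can be inspected."""
        subprocess.run(["git", "checkout", "-f", ref], cwd=workdir, check=True)
        # Old: subprocess.run(["python3", "setup.py", "sdist"], cwd=workdir, check=True)
        # New: use the build frontend, which finished in seconds in the same job.
        subprocess.run(["python3", "-m", "build", "--sdist"], cwd=workdir, check=True)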
14:36 <fungi> as an aside, we seem to build the same packages more than once in this job, which itself is inefficient
14:36 <elodilles> :S
14:37 <fungi> https://zuul.opendev.org/t/openstack/build/cb10cc84c07a476fb7e5ec2f5e920220/log/job-output.txt#6401
14:37 <fungi> for reference
14:38 <fungi> that happened near the beginning of the job and completed quickly
14:38 <fungi> i wonder if get_requirements_at_ref could reuse the already built sdist
14:42 <opendevreview> Elod Illes proposed openstack/releases master: Replace old sdist and wheel build command in validate  https://review.opendev.org/c/openstack/releases/+/881229
14:42 <elodilles> yes, reuse would be useful
14:46 <opendevreview> Jon Bernard proposed openstack/releases master: Final Cinder team releases for stable/xena  https://review.opendev.org/c/openstack/releases/+/881133
14:54 <opendevreview> Jon Bernard proposed openstack/releases master: [cinder] Transition Xena to EM  https://review.opendev.org/c/openstack/releases/+/878882
14:55 <opendevreview> Slawek Kaplonski proposed openstack/releases master: New neutron-tempest-plugin release  https://review.opendev.org/c/openstack/releases/+/881230
15:00 <opendevreview> Dmitrii Shcherbakov proposed openstack/releases master: New neutron-lib release  https://review.opendev.org/c/openstack/releases/+/881231
15:31 <auniyal> elodilles, fungi: tox validate failed for me with - ERROR: Could not find 26.1.1: Command '['git', 'show', '26.1.1']' returned non-zero exit status 128.
15:33 <auniyal> I had the same error earlier too
15:36 <fungi> i suppose that could be a problem fetching git refs
15:39 <fungi> i've checked the cacti graphs for all our current gitea backends and am not seeing any signs of resource exhaustion nor load imbalance
15:40 <fungi> oh, 26.1.1 is the release you're adding
15:40 <fungi> it doesn't exist upstream
15:41 <auniyal> so I think it's trying to run the show command on the master branch instead of stable/zed - https://zuul.opendev.org/t/openstack/build/c595a37b5b30479a8a63155f9a54e3b0/log/tox/validate/validate-request-results.log?severity=0#62
15:43 <clarkb> git show doesn't care about its current branch. It will find whatever ref you ask it to show if it is present
15:43 <auniyal> okay
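(To illustrate clarkb's point: `git show` resolves any ref present in the local repository regardless of which branch is checked out, and the exit-status-128 error above is simply what you get when the requested tag does not exist yet. A small standalone demonstration; the repository path and the existing tag used for comparison are assumptions for illustration.)

    # Demonstration that `git show` is branch-independent: it fails with exit
    # status 128 only when the requested ref does not exist in the repository.
    import subprocess

    repo = "/tmp/nova"  # hypothetical local clone

    for ref in ["26.1.0", "26.1.1"]:  # assumed existing tag vs. the not-yet-created one
        result = subprocess.run(
            ["git", "show", "--no-patch", ref],
            cwd=repo, capture_output=True, text=True,
        )
        if result.returncode != 0:
            # Mirrors the "returned non-zero exit status 128" error above:
            # the ref is not present, no matter what branch is checked out.
            print(f"{ref}: not found (exit {result.returncode})")
        else:
            print(f"{ref}: resolved OK")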
15:53 <auniyal> it's probably correct, but still asking: is it correct that it's deleting the target tag and then checking for it again?
15:53 <auniyal> from here https://zuul.opendev.org/t/openstack/build/c595a37b5b30479a8a63155f9a54e3b0/log/tox/validate/validate-request-results.log?severity=0#13444
15:54 <elodilles> auniyal: that is a bit misleading, but it does not always mean that it is a 'real' ERROR. the validate script collects all the valid ERRORs and WARNINGs at the end of its run
15:56 *** amoralej is now known as amoralej|off
15:56 <auniyal> ack, so it just hits the timeout on python3 setup.py sdist
15:57 <auniyal> no other issue w.r.t. the patch
15:57 <auniyal> elodilles, so should I just recheck again?
15:58 <elodilles> fungi: i've checked whether we could avoid building the sdist again, and i think we can't :( the get_requirements_at_ref command is called for different hashes and the sdist needs to be built for both cases: https://opendev.org/openstack/releases/src/branch/master/openstack_releases/requirements.py#L38-L39
15:59 <elodilles> auniyal: not until this fix has merged: https://review.opendev.org/c/openstack/releases/+/881229
15:59 <clarkb> do we know why it is taking so long? setup.py shouldn't use pip so no dep solving should happen. Is pbr spending a bunch of time dealing with git to figure out authors and version stuff?
15:59 <auniyal> ack
15:59 <elodilles> auniyal: if that does not decrease the time, then we have to increase the timeout
16:00 <auniyal> okay
16:00 <clarkb> elodilles: well you should probably figure out why nova is slow too
16:00 <elodilles> clarkb: no clue at all; for me locally, the sdist builds in a few seconds (even with the latest pip+virtualenv)
16:01 <clarkb> right, i understand we don't currently know why. I'm just saying that understanding it before making a determination to increase timeouts is probably a good idea
16:02 <clarkb> you can probably add instrumentation to the setup.py sdist command
16:02 <fungi> maybe even just turn on some verbosity, but regardless, directly calling setup.py sdist is deprecated and increasingly discouraged by setuptools as time goes on
16:03 <elodilles> clarkb: true, i just mentioned the timeout value increase as a last resort, otherwise yes, we should figure out why it is that slow in zuul
16:03 <clarkb> fungi: ya, or monkeypatch the print() function to add timestamp prefixes
16:04 <fungi> if using build instead of directly invoking setuptools makes things faster, i'd be inclined to not look much deeper into misbehavior of something that its maintainers are telling people not to run
16:04 <clarkb> ya, I'm more concerned if the problem ends up being in PBR's git handling
16:05 <clarkb> because that will only get worse as more git commits are made
16:06 <clarkb> elodilles: actually `PYTHONUNBUFFERED=1 python setup.py sdist` may be sufficient; then in theory we would get the output logged by zuul with reasonable timestamps
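(For completeness, the kind of instrumentation being suggested could look roughly like this: run the deprecated command with unbuffered output so zuul's per-line timestamps become meaningful, or wrap print() to prefix a timestamp. This is a generic sketch under those assumptions, not something that exists in the releases tooling; the repo path is a placeholder.)

    # Sketch of the two instrumentation ideas above, for illustration only.
    import builtins
    import datetime
    import os
    import subprocess

    # Option 1: unbuffered output so each logged line gets a useful timestamp.
    env = dict(os.environ, PYTHONUNBUFFERED="1")
    subprocess.run(["python3", "setup.py", "sdist"], cwd="/tmp/nova", env=env)  # hypothetical path

    # Option 2: monkeypatch print() to prefix every line with a timestamp,
    # useful if the slow code is imported and driven from Python directly.
    _orig_print = builtins.print

    def timestamped_print(*args, **kwargs):
        stamp = datetime.datetime.utcnow().strftime("%H:%M:%S")
        _orig_print(f"[{stamp}]", *args, **kwargs)

    builtins.print = timestamped_print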
16:06 <elodilles> fungi is right though, as with 'build --sdist' the sdist is built in ~30 secs
16:06 <fungi> in the same job
16:06 <elodilles> yes
16:07 <fungi> basically it calls build and generates an sdist and a wheel in seconds, then calls setup.py to make an sdist and waits more than half an hour
16:07 <fungi> so something clearly pathological is going on there
16:08 <fungi> my money's on setuptools calling out to pip since it dropped easy_install a while back, and something's going sideways with pip's dep solving under that specific condition
16:08 <clarkb> oh, I didn't realize setuptools will exec pip now.
16:09 <clarkb> that would be for setup_requires only, right? Still possible to have it go crazy though
16:09 <fungi> yeah, at least for the setup_requires
16:09 <fungi> i suppose install_requires isn't relevant when creating packages
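(For context on the setup_requires path being discussed: a typical pbr-based OpenStack project ships a minimal setup.py along these lines, and the setup_requires entry is the hook where setuptools may fetch build dependencies on its own; whether it shells out to pip there is the open question above. Illustrative, not copied from nova.)

    # Typical pbr-style setup.py used across OpenStack projects (illustrative).
    import setuptools

    setuptools.setup(
        setup_requires=['pbr>=2.0.0'],
        pbr=True)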
16:11 <opendevreview> Elod Illes proposed openstack/releases master: nova: Release zed 26.1.1  https://review.opendev.org/c/openstack/releases/+/881200
16:11 <opendevreview> Elod Illes proposed openstack/releases master: Speed up validate with skipping rebuilding sdist  https://review.opendev.org/c/openstack/releases/+/881242
16:12 <opendevreview> Elod Illes proposed openstack/releases master: nova: Release zed 26.1.1  https://review.opendev.org/c/openstack/releases/+/881200
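(One way a rebuild could in principle be avoided is to cache the built sdist per git ref and reuse it when the same ref comes up again in the job. This is only a hypothetical sketch of the idea behind "skipping rebuilding sdist"; the names are made up and it is not necessarily what review 881242 implements.)

    # Hypothetical per-ref sdist cache, illustrating the reuse idea only.
    import subprocess

    _sdist_cache = {}

    def sdist_for_ref(workdir, ref):
        """Build the sdist for a ref once and reuse the result on later calls."""
        if ref not in _sdist_cache:
            subprocess.run(["git", "checkout", "-f", ref], cwd=workdir, check=True)
            subprocess.run(
                ["python3", "-m", "build", "--sdist", "--outdir", f"dist-{ref}"],
                cwd=workdir, check=True)
            _sdist_cache[ref] = f"{workdir}/dist-{ref}"
        return _sdist_cache[ref]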
16:13 <elodilles> fungi: btw, i now see another weird PTL-Approved behavior :S https://review.opendev.org/c/openstack/releases/+/881200/4#message-9db0664ea54283bd6d8c8c0b5c42c57528147a53
16:14 <elodilles> fungi: my bad, it works properly
16:14 <fungi> oh good ;)
16:14 <elodilles> fungi: i forgot that auniyal was added as release liaison already o:)
16:15 <fungi> and as for the delay in zuul believing that it could merge the problem change from yesterday, i suspect some sort of cached lookup from before we altered the submit rule
16:15 <fungi> so it didn't evaluate it again until it thought the cached results were stale/invalidated
16:16 <fungi> if we continue to see it on new changes, then we can look deeper
16:16 <elodilles> ack, thanks for the info!
16:17 <auniyal> thanks for looking into this elodilles fungi clarkb
16:17 <auniyal> fungi++ elodilles++ clarkb++
16:23 <elodilles> auniyal: np
16:25 <elodilles> fungi: 'build --sdist' did its job: the job ran 8 mins instead of 1hr: https://zuul.opendev.org/t/openstack/build/93f0f4a1cd3842c1beb9273346a03e16
16:25 <fungi> was it consistently running longer before, or just occasionally? wondering if there's some additional outside factor
16:26 <fungi> i guess it may just be slow for releases of specific projects, e.g. nova
16:27 <fungi> so hard to tell by looking at the build history for the job itself
16:27 <elodilles> as i checked back on my etherpad about the long-running jobs during the Antelope release, it showed 57 mins for the validate job and as far as i remember it was nova, so i think yes
16:28 <fungi> fingers crossed this clears it up for good then
16:28 <elodilles> fungi: yepp, it only took long for huge repositories, afair
16:28 <elodilles> and nova was far worse (again, afair)
17:37 <opendevreview> Elod Illes proposed openstack/releases master: Set Xena status to Extended Maintenance  https://review.opendev.org/c/openstack/releases/+/881254

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!