Tuesday, 2021-04-06

*** mlavalle has quit IRC00:01
melwittoh hm. I've been having to go and manually set bugs to Fix Committed and Fix Released so I didn't realize that should have been working00:05
fungiyeah, this has always been running: https://opendev.org/opendev/system-config/src/branch/master/playbooks/roles/gerrit/files/hooks/change-merged00:07
fungimelwitt: if you spot one which didn't get set to released when it should have, let me know and i can check logs (as long as it was recent enough to fit in our last month of gerrit error log retention)00:09
*** dpawlik5 has quit IRC00:10
*** dpawlik7 has joined #opendev00:11
*** hamalq has quit IRC00:28
*** mailingsam has quit IRC00:41
*** kopecmartin has quit IRC00:57
*** dpawlik7 has quit IRC00:57
*** kopecmartin has joined #opendev00:57
*** dpawlik5 has joined #opendev00:57
*** dpawlik2 has joined #opendev01:23
*** dpawlik5 has quit IRC01:23
*** dpawlik2 has quit IRC01:33
*** dpawlik3 has joined #opendev01:36
*** brinzhang has joined #opendev01:40
*** brinzhang_ has joined #opendev01:56
*** brinzhang has quit IRC01:58
*** dpawlik9 has joined #opendev02:35
*** dpawlik3 has quit IRC02:36
openstackgerritxinliang proposed opendev/system-config master: Enable openEuler mirroring  https://review.opendev.org/c/opendev/system-config/+/78487403:23
openstackgerritxinliang proposed opendev/system-config master: Enable openEuler mirroring  https://review.opendev.org/c/opendev/system-config/+/78487403:29
*** ysandeep|away is now known as ysandeep03:44
*** ykarel has joined #opendev03:51
*** dpawlik9 has quit IRC04:35
*** dpawlik6 has joined #opendev04:35
*** whoami-rajat has joined #opendev04:38
*** marios has joined #opendev05:03
*** dpawlik7 has joined #opendev05:34
*** dpawlik6 has quit IRC05:35
*** kopecmartin has quit IRC05:36
*** DSpider has joined #opendev05:41
*** sboyron has joined #opendev06:01
*** sboyron has quit IRC06:03
*** ralonsoh has joined #opendev06:06
*** sboyron has joined #opendev06:09
*** eolivare has joined #opendev06:35
*** dpawlik5 has joined #opendev06:42
*** dpawlik7 has quit IRC06:42
*** SWAT has joined #opendev06:48
*** dpawlik5 has quit IRC06:52
*** dpawlik07 has joined #opendev06:55
*** dpawlik07 has quit IRC06:57
*** dpawlik5 has joined #opendev06:58
*** sshnaidm is now known as sshnaidm|afk06:58
*** dpawlik5 has quit IRC07:12
*** dpawlik5 has joined #opendev07:13
*** dpawlik5 is now known as dpawlik07:14
*** amoralej|off is now known as amoralej07:15
*** andrewbonney has joined #opendev07:19
*** sboyron has quit IRC07:21
*** sboyron has joined #opendev07:22
*** dpawlik has quit IRC07:28
*** dpawlik6 has joined #opendev07:28
*** avass has quit IRC07:29
*** avass has joined #opendev07:30
*** avass has quit IRC07:32
*** avass has joined #opendev07:34
*** tosky has joined #opendev07:38
*** dpawlik6 has quit IRC07:43
*** jpena|off is now known as jpena07:51
*** ysandeep is now known as ysandeep|lunch08:21
openstackgerritTakashi Kajinami proposed openstack/project-config master: Add puppet-cinder-core and puppet-glance-core  https://review.opendev.org/c/openstack/project-config/+/78488808:23
openstackgerritTakashi Kajinami proposed openstack/project-config master: Add puppet-cinder-core and puppet-glance-core  https://review.opendev.org/c/openstack/project-config/+/78488808:28
*** dtantsur|afk is now known as dtantsur08:45
*** ykarel is now known as ykarel|lunch08:50
openstackgerritAlbin Vass proposed zuul/zuul-jobs master: Use openstacksdk 0.45.0 for python2.7  https://review.opendev.org/c/zuul/zuul-jobs/+/78489408:52
openstackgerritAlbin Vass proposed zuul/zuul-jobs master: Use openstacksdk 0.45.0 for python2.7  https://review.opendev.org/c/zuul/zuul-jobs/+/78489408:54
*** ysandeep|lunch is now known as ysandeep09:04
openstackgerritMerged zuul/zuul-jobs master: ensure-podman: Use official podman repos for ubuntu  https://review.opendev.org/c/zuul/zuul-jobs/+/76517709:17
*** lpetrut has joined #opendev09:21
*** zoharm has joined #opendev09:22
*** DSpider has quit IRC10:05
*** sshnaidm|afk has quit IRC10:09
*** ykarel|lunch is now known as ykarel10:11
*** sshnaidm has joined #opendev10:18
*** kopecmartin has joined #opendev10:21
*** fdegir4 is now known as fdegir10:22
*** ykarel_ has joined #opendev10:32
*** ykarel has quit IRC10:34
*** ykarel_ is now known as ykarel11:13
*** jpena is now known as jpena|lunch11:31
*** brinzhang0 has joined #opendev11:38
*** brinzhang_ has quit IRC11:41
*** amoralej is now known as amoralej|lunch12:02
*** jpena|lunch is now known as jpena12:27
openstackgerritMerged zuul/zuul-jobs master: Document algorithm var for remove-build-sshkey  https://review.opendev.org/c/zuul/zuul-jobs/+/78398812:50
*** ykarel_ has joined #opendev12:53
*** ykarel has quit IRC12:56
*** zoharm has quit IRC13:03
*** amoralej|lunch is now known as amoralej13:07
*** mailingsam has joined #opendev13:26
priteauHello. I had a kayobe patch for stable/stein (https://review.opendev.org/c/openstack/kayobe/+/783621) which was working fine last Tuesday but is now completely broken. pip2 is trying to install the latest versions of packages, which are not compatible with python2. I've been trying to identify a package release which would explain this change of behaviour, but no luck. Has there been any change in infra since last Tuesday which would explain it?13:30
toskywe are discussing it on #openstack-qa13:31
toskyzbr: can you please repeat here what you discovered?13:31
priteauHeh, thanks :)13:31
priteauI see, it's a metadata issue due to mirrors themselves13:34
toskythe tl;dr, if I got it correctly, is that there was a major pypi outage over the weekend, which caused some infrastructural issue on their side13:34
*** ykarel_ is now known as ykarel13:36
*** rh-jelabarre has joined #opendev13:37
priteauzbr: I believe the python version requirement issue is resolved if using PyPI servers directly, is this correct?13:37
zbrmy guess is that this may not be solved by our mirrors; the holy-trinity of pip/setuptools/virtualenv all dropped support for py27, and it can become very tricky not to fail installing stuff on py27, for lots of reasons.13:38
zbri would say that the only way to keep supporting py27 would be to manually constrain versions of these packages, always adding <x.y where x.y is the version that dropped support for py27.13:39
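
As a rough illustration of the pinning zbr describes, a py27 job could cap the packaging tools at their last python2-compatible series before installing anything else. The boundaries below follow from the versions mentioned elsewhere in this discussion (pip 20.3.4 and setuptools 44.1.1 as the last py27-capable releases); the virtualenv comment is an assumption, since the right cap there depends on how it is invoked.

    # last py27-compatible series for pip and setuptools
    pip install 'pip<21' 'setuptools<45'
    # virtualenv would need a similar upper bound, chosen per job
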
priteauAt least in kayobe, our py27 jobs still use pip 20.3.4, which would install setuptools 44.1.1 and work fine13:40
zbrhttps://zuul.opendev.org/t/openstack/build/e98465dd84d84cd8b4da48aff000c7c8 -- based on what I read on that job, it used pip 9.0.3 :p13:41
zbrvirtualenv creation failed and i do not know which version of it was used, clearly a system one13:44
zbrcurrent versions of virtualenv do *not* download  pip/setuptools/wheel from pypi unless you tell them to13:44
priteauDunno about tempest, but our stable jobs used to work fine. See https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_fe4/783621/6/check/openstack-tox-py27/fe44c59/tox/py27-0.log for versions.13:47
openstackgerritMerged zuul/zuul-jobs master: ensure-zookeeper: add use_tmpfs parameter  https://review.opendev.org/c/zuul/zuul-jobs/+/78398213:48
*** avass has quit IRC13:48
*** avass has joined #opendev13:50
*** mlavalle has joined #opendev13:58
zbrpriteau: but i can see virtualenv being called with very specific options in your case, ones that are less likely to break.14:13
fungizbr: tosky: priteau: the pypi maintainers have a fallback bandersnatch-based mirror which lacks python-requires metadata in its simple indices, so if fastly directs traffic there instead of to warehouse, you'll see pip pulling too-new versions of things which don't support older python interpreters. if that's happening again, we don't really have a solution on our end for this, it impacts everyone not just our ci jobs14:14
zbrfungi: it is still happening today, long after the outage from the last weekend.14:15
priteauI understand. But AFAIK PyPI is working normally again, but opendev mirrors aren't?14:15
zbrpriteau: mirrors are static, which means they do work kinda similar to the pypi-fallback. still, my expectation was that once the pypi.org issue was addressed, we would see the issue as fixed for us too.14:17
fungiour pypi "mirrors" aren't mirrors at all, they're apache mod_proxy with a very short cache time on the indices14:21
fungisomething like 5 minutes14:21
avassfungi: thanks that explains the issue in #zuul as well14:22
fungiso if the pypi outage is over, then the errors are likely unrelated14:22
fungion the other hand, if what the pypi maintainers are saying is that the "outage" is over because they've pointed fastly at their bandersnatch mirror, then that's still a possibility14:23
priteauI can pip2 install fine when I use PyPI14:23
fungipriteau: remember there's a cdn involved, so what you see as "pypi" isn't necessarily the same pypi worldwide14:24
toskythe same jobs were working fine on April 1, and I'm pretty sure they (or one of their parents) haven't been updated since then14:24
fungiwe've seen plenty of times where some of fastly's endpoints are serving from warehouse and some are serving from the bandersnatch mirror14:25
zbrfungi: what I have recommended so far is to ensure pinning of pip/setuptools deps for py27 jobs, to "help" pick only compatible deps and avoid this issue14:29
zbrsadly doing this is not as easy as it appears, as many simple commands like "virtualenv ..." need updates.14:29
fungiyou'd need to pin all dependencies14:29
zbri mentioned that too, you fix one and find 10 more broken ones.14:30
zbrif it were up to me I would remove py27 from everywhere and allow whoever can afford the cost to attempt to fix it in a stable branch.14:31
zbrto keep the challenge fun, we can even use the #sisyphus hashtag to track the progress ;)14:32
* mordred hands zbr a rock14:33
fungicurrently view-source:https://mirror.ord.rax.opendev.org/pypi/simple/setuptools/ is providing data-requires-python=">=3.5" with the setuptools-45.0.0-py2.py3-none-any.whl entry14:35
priteaufungi: Just tested it, this one works locally for me14:37
priteauBut https://mirror.mtl01.inap.opendev.org/pypi/simple/ doesn't14:37
fungi"You are using pip version 9.0.3..." maybe that's too old to support requires-python metadata?14:37
clarkbif you examine the responses in your browser debug tools you get all the cache info and what pypi server served it14:38
clarkbwe are very transparent about what you are getting14:38
clarkbyou can take that info to pypi and say these frontends are still serving stale data if you confirm that the metadata is missing14:38
fungiview-source:https://mirror.mtl01.inap.opendev.org/pypi/simple/setuptools/ is also including data-requires-python entries14:40
priteauindeed, that's what I noticed too14:41
fungihttps://pip.pypa.io/en/stable/news/#id531 "Implementation of pep-503 data-requires-python. When this field is present for a release link, pip will ignore the download when installing to a Python version that doesn’t satisfy the requirement."14:41
fungiso 9.0.3 is new enough for requires-python metadata14:41
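
A quick way to reproduce the check fungi is doing via view-source, using only the mirror URL already quoted above: count how many entries in the simple index carry the requires-python attribute (zero strongly suggests a stale copy served from the metadata-less fallback).

    curl -s https://mirror.ord.rax.opendev.org/pypi/simple/setuptools/ \
        | grep -c 'data-requires-python'
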
priteauWell, of course it works now :D14:42
fungiare the failures consistent or random?14:42
priteauI had this in my pip.conf and I swear it wasn't working until minutes ago:14:42
clarkbnote the ttl on indexes was 10 minutes last I checked and typically one of our mirrors will be served by at least 2 regions (and multiple servers) from the cdn14:42
clarkbthis means every 10 minutes or so you roll the dice14:43
priteau[global]14:43
priteauindex-url = https://mirror.mtl01.inap.opendev.org/pypi/simple/14:43
clarkbit can go back to failing again in 10 minutes if the cdn is still stale14:43
fungiwe've also seen some endpoints in the same region serve from different backends14:43
fungiyeah, that14:43
fungiso basically our "mirror" (proxy) requests an index and fastly sends it to the bandersnatch backend and we cache a copy without the metadata, a few minutes later our cache time is up and we request it from fastly again and this time we're served a copy from warehouse with the metadata included14:44
fungidepending on which fastly endpoint answers that particular request14:45
fungiwe've tracked it down by manually searching the cache content on our proxies looking for copies of a particular index containing no metadata, and then checked the headers which indicate which fastly endpoint served that14:46
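
fungi's description above is of searching the cache on the proxy servers themselves; a hedged outside-in approximation is to ask the mirror for an index and look at the cache headers, assuming the Fastly headers are passed through unchanged.

    curl -sI https://mirror.ord.rax.opendev.org/pypi/simple/setuptools/ \
        | grep -iE '^(x-served-by|x-cache|age):'
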
openstackgerritAlbin Vass proposed zuul/zuul-jobs master: Use openstacksdk 0.45.0 for python2.7  https://review.opendev.org/c/zuul/zuul-jobs/+/78489414:55
*** lbragstad has quit IRC14:58
*** hrw has joined #opendev14:59
hrwmorning14:59
hrwdoes someone remember/know where we are when it comes to Debian Bullseye support on nodes?14:59
clarkbhrw I think fungi has been working on it, not sure if images have been built and uploaded yet though15:00
clarkbyou can check the labels available in zuul15:00
hrwzuul lacks bullseye nodes so I started looking15:00
*** lbragstad has joined #opendev15:00
hrwok, I see that patches are in review. will look15:01
hrwnoonedeadpunk and fungi work on it15:02
fungiyeah, i can't remember where we ended up with it... may have been waiting for a dib release?15:03
hrwiirc dib 3.8 had it but would need to check15:03
noonedeadpunkpatch to dib hasn't merged15:06
*** lpetrut has quit IRC15:06
noonedeadpunkI stopped on this https://zuul.opendev.org/t/openstack/build/f37d1601ca0c4d16a69a1949978c5291/log/job-output.txt#2586515:06
noonedeadpunk`Root partition of test-image does not appear to have grown: 1817051136 < 5000000000`15:06
noonedeadpunkdidn't have time to look into the cause yet15:07
hrwthanks for info15:08
melwittfungi: here's a recent example where the bug Status didn't get updated, it still says In Progress: https://bugs.launchpad.net/nova/+bug/1882094 when the patch merged on March 23 https://review.opendev.org/c/openstack/nova/+/73362715:08
openstackLaunchpad bug 1882094 in OpenStack Compute (nova) "[queens][OSP13] An overcloud reboot will sometimes leave nova_api broken" [Undecided,In progress] - Assigned to melanie witt (melwitt)15:08
clarkbnoonedeadpunk: hrw: the growroot element is responsible for that and the integration tests test it as it has had problems in the past15:08
clarkbif you need somewhere to look15:09
noonedeadpunkyeah, that's obviously growroot, just had no ideas on top of the head what exactly might be wrong with it15:10
clarkbthe integration tests should hopefully capture the console log for the instance which should log growroot15:11
clarkb(we have been looking at growroot in another context and the console has it there, but we access it via journald so not sure if it ends up where nova can get it too)15:12
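
If the dib functest instance is reachable through nova, one way to look for the growroot output clarkb mentions is the server console log; the server name here is taken from the error message quoted above and is only a guess at what the test actually calls it.

    openstack console log show test-image | grep -i grow
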
*** hberaud has joined #opendev15:12
fungiunrelated, but i think our arm64 centos images are still failing to boot15:14
fungimelwitt: thanks, this is what got logged...15:16
fungi[2021-03-23T16:56:16.503+0000] [HookQueue-1] ERROR com.googlesource.gerrit.plugins.hooks.HookTask : hook[change-merged] exited with error status: 2 [CONTEXT PLUGIN="hooks" SUBMISSION_ID="733627-reset_conf-1616518548794-559060ab" project="openstack/nova" ]15:16
fungiwe'll probably need to do some local debugging to see why that's breaking15:16
fungidoesn't look like hook script stdout/stderr are getting included in the error log15:17
*** lbragstad has quit IRC15:17
melwittfungi: ok, thanks for taking a look15:17
fungii should be able to test manually running it, though it runs in the context of a container so testing it will be a little circuitous15:18
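
A rough sketch of the manual test fungi describes; the container name, hook path, and argument values are placeholders for illustration, while the flags themselves are the standard ones Gerrit passes to a change-merged hook.

    sudo docker exec -it <gerrit-container> \
        /path/to/hooks/change-merged \
        --change <change-id> --project openstack/nova --branch <branch> \
        --submitter <submitter> --commit <sha> --newrev <sha>
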
*** lbragstad has joined #opendev15:19
yoctozeptomorning infra15:20
yoctozeptoZuul is acting weirdly on https://review.opendev.org/c/openstack/project-team-guide/+/78400515:21
yoctozeptocan't get it to gate15:21
yoctozeptoand no logs :/15:21
hrwlong time since last time I used dib...15:22
clarkbyoctozepto: are there file filters on the jobs for that repo and if so does the change not trigger via those?15:26
clarkbyoctozepto: it ran the gate jobs.15:26
yoctozeptoclarkb: sorry, I was not precise15:26
yoctozeptojust saw what I wrote15:27
yoctozeptoI meant it failed twice with no logs15:27
clarkbI see it is failing with a POST_FAILURE and has no logs15:27
yoctozeptoyes, yes indeed15:27
clarkbwe can check the logs on the zuul executors to see what happened there15:27
clarkbI'm in a meeting but can do that after15:27
yoctozeptothanks! :-)15:27
clarkbthere are a number of post failures in the zuul status so likely not job specific15:28
clarkbfungi: ^ fyi15:28
clarkbI wonder if one of the regions we upload logs to is having issues15:29
clarkbas some recent job completions seem fine15:29
fungiyeah, i'm looking on ze07 where that build ran to see if i can spot what broke15:29
yoctozeptoah, yes, from Zuul status I can see lots of (middle) fingers15:30
fungilooks like the swift upload generated a traceback15:31
fungiHttpException: 503: Server Error for url: https://storage.bhs.cloud.ovh.net/v1/...15:33
fungiso ovh bhs swift raising 503 errors15:34
fungiwe may need to disable log uploads there15:34
fungi682 occurrences of that failure just on ze0715:36
fungimost recent was 15:22:37 so probably still ongoing15:36
yoctozeptouhoh15:37
fungino similar errors for storage.gra.cloud.ovh.net15:37
openstackgerritJeremy Stanley proposed opendev/base-jobs master: Temporarily disable log uploads to OVH BHS Swift  https://review.opendev.org/c/opendev/base-jobs/+/78499715:40
fungiinfra-root: ^ urgent, will probably need to bypass zuul to merge that asap15:40
clarkblooking15:41
clarkbI've approved it if you want to do the merge (or wait for it to fail first?)15:41
fungion it15:42
openstackgerritMerged opendev/base-jobs master: Temporarily disable log uploads to OVH BHS Swift  https://review.opendev.org/c/opendev/base-jobs/+/78499715:45
fungithere we go15:45
fungigertty is struggling to sync with gerrit at the moment15:45
yoctozeptogerrit struggles to sync with itself you mean15:46
yoctozeptoit's slow this time of day15:46
yoctozeptoEurope and Americas overlap time15:46
*** d34dh0r53 has quit IRC15:47
fungi#status notice POST_FAILURE results between 14:00 and 15:50 UTC can be safely rechecked, and were due to authentication problems in one of our storage donor regions15:52
openstackstatusfungi: sending notice15:52
-openstackstatus- NOTICE: POST_FAILURE results between 14:00 and 15:50 UTC can be safely rechecked, and were due to authentication problems in one of our storage donor regions15:52
fungiearliest error i found across all the executors was 14:01:26 utc15:53
fungioh, technically not true with the end time there, it's jobs started before ~15:45 since that's when the storage location is decided15:54
fungiso some currently running jobs might still fail but jobs started after 15:45 utc should not try to use it15:55
openstackstatusfungi: finished sending notice15:55
*** lpetrut has joined #opendev16:00
*** d34dh0r53 has joined #opendev16:08
*** ykarel has quit IRC16:11
*** hamalq has joined #opendev16:13
*** ysandeep is now known as ysandeep|away16:24
*** marios is now known as marios|out16:25
zbrfungi: how crazy does it sound to drop py27 from pbr https://review.opendev.org/c/openstack/pbr/+/784938 ?16:46
clarkbas we have explained over and over and over again it is very crazy :)16:46
clarkbthe problem is it is a setup_requires dep and you can't reliably set version specifiers on those16:46
clarkbthis means existing python2 and older python3 users of pbr will start getting a new pbr if you drop python2 and then break16:47
clarkbthis includes things like openstack's stable branches16:47
clarkbbut also individuals like mordred and fungi have indicated they are users of pbr with python2.7 code bases outside of openstack contexts16:47
clarkbI suspect that if openstack decides to make a python3 only pbr that it will require a fork and new package name that can be transitioned to16:48
yoctozeptoalias python2='sudo rm -rf /'16:50
mordredyeah - basically I think there would need to be a REALLY good reason to do that. pbr itself doesn't change very much (and shouldn't) - there's no reason to actively remove py27 support given how far down the stack it sits16:52
*** ysandeep|away is now known as ysandeep16:53
mordredthat said - I think minimizing what py27 testing is done should be perfectly acceptable16:54
zbr... now includes a full deployment with py27, at least attempts to do it.16:56
clarkbyup because it wants to not break openstack train which is still a supported release?16:56
zbrTBH, i do like the idea of lowering the coverage on py27 without removing it.16:56
mordredI think it would be a good compromise16:56
mordredmake sure it's not regressing syntactically (the biggest risk)16:56
clarkbis the problem that the train job doesn't work for other reasons?16:57
zbrmost py27 jobs are failing due to the python_requires issue, all in different places.16:57
clarkbif so then ya, removing that job may make sense. If it is failing because pbr broke python2 compat then we shouldn't remove it as it caught an issue we need to cover in the unittests first :)16:57
clarkbzbr: not sure I understand16:57
mordredwe're still officially supporting train on 2.7?16:57
clarkbis this the mirror issue?16:57
clarkbif so that should be fixed in the near future aiui16:57
clarkbmordred: extended maintenance and all that16:58
mordredyeah - the mirror issue isn't just a 2.7 thing16:58
clarkbmordred: the stable branches go way back these days16:58
mordredclarkb: gross16:58
clarkbmordred: also train was the last python2 release so I expect it will be the one people want to keep around if they have to make a choice16:58
clarkbparticularly since centos 8 isn't an option into the future but centos 7 is16:58
*** jpena is now known as jpena|off16:58
zbrthe lack of python_requires will never be fully fixed; when pypi goes into downgraded mode, that hits py27 users in particular.16:59
clarkbbut making this change doesn't help with that problem17:00
clarkbit just makes it worse because setup_requires isn't aware17:00
clarkbthey are completely separate problems17:01
zbrin fact I suspect it will only get worse in one month https://status.python.org/ as I am not sure what the status of SNI is even with py27. I've seen pip 9.0.3 used in some of the failing jobs...17:01
clarkbthe fix for python_requires is either to use constraints or a lockfile or for pypi to actually solve the issue once and for all17:01
clarkbthe fix for setup_requires is to transition everything to pyproject.toml which we know won't happen (so we are stuck with keeping pbr python2 support)17:02
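
For the constraints half of that, the usual pattern is to pass the published upper-constraints file on every install; the branch name here is just an example.

    pip install -c https://releases.openstack.org/constraints/upper/train -r requirements.txt
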
zbrnot sure who has the time to maintain those constraints, especially as you end up going down the dependency tree. Before you merge one fix, someone else will break your job by dropping py27 on another dependency.17:03
*** amoralej is now known as amoralej|off17:03
clarkbopenstack does daily updates of constraints17:03
zbrthese constraints must be inside the setup.cfg of each package, if you don't want the install to randomly fail.17:05
zbrwhile openstack-specific packages can use our constraints, we have a good number of generic libraries that are used outside openstack.17:05
clarkbok, I'm just pointing out that these are completely separate concerns17:05
clarkbbecause pbr is a setup_requires17:05
zbrthose need tuning of setup.cfg17:05
clarkblets not conflate them and make things more confusing17:06
fungithere are likely also openstack users of pbr on python 2.7, as i pointed out in my comment, installing old versions of e.g. nova will still pull the latest release of pbr17:06
clarkbyup in particular train, which i expect to stick around due to its being the last python2 release and centos 8 being cancelled17:07
fungii don't actually have current projects which support python 2.7 using pbr. plenty of people however have projects which use pbr and supported python 2.7 at some point in the past with releases their users may still be installing17:07
*** whoami-rajat has quit IRC17:07
fungizbr: pip constraints don't apply to pbr17:08
fungiat least not with older setuptools versions which use easy_install as a backend17:09
fungisetuptools only started to use pip to retrieve packages in the past few years17:09
fungiyou can't use new python ecosystem features to work around problems with old python ecosystem tools which pre-date those solutions17:10
zbrin short, pbr is expected to live in its current form until a new version of python/pip/setuptools forces its users to drop pbr in favour of the newer packaging system.17:13
clarkbI'm not sure I understand that statement, but roughly pbr must continue to support old software that relies on it. It can also support new software17:15
clarkbthe trouble is when you try and drop old17:15
*** andrewbonney has quit IRC17:16
fungizbr: "new version of python/pip/setuptools will force its users to drop pbr" but people using old python/pip/setuptools to install old package versions will still use the latest pbr release. making new things doesn't mean old things vanish or stop being used17:21
fungipriteau: are you still seeing new incidents of pip choosing too-new packages in your py27 jobs? wondering if pypi has cleared up whatever the cause was17:27
*** eolivare has quit IRC17:30
clarkbfungi: re centos arm64 images maybe we should email kevinz about trying fat32 config drives instead of iso?17:30
*** dtantsur is now known as dtantsur|afk17:31
*** ralonsoh has quit IRC17:31
fungido we think that suddenly changed on ~friday?17:31
fungihttp://travaux.ovh.net/?do=details&id=49850& seems to suggest that the swift problem from earlier should be resolved17:33
fungiis that how others read it?17:33
clarkbthe images may have changed ~friday which caused them to not handle iso17:33
clarkbthe other idea I had was to boot the upstream cloud arm64 image and see if it works17:33
clarkbfungi: "error rate back to normal at all locations" lgtm17:33
fungii'll push up a revert17:34
clarkbif you want to double check you can reenable it on test (or did we not remove it there) and trigger the base-test path17:34
openstackgerritJeremy Stanley proposed opendev/base-jobs master: Revert "Temporarily disable log uploads to OVH BHS Swift"  https://review.opendev.org/c/opendev/base-jobs/+/78501917:34
fungiwe didn't disable it on test, so yeah i can push a dnm change for that17:35
clarkbfungi: another idea: did we disable it around that same time? (maybe we made it happy in the process)17:35
clarkbjust keep an eye out for issues when reenabling if we suspect ^ we may have been part of the fix17:35
fungiheh, maybe17:35
fungii sure hope not17:35
fungiwe merged it at 15:45, they indicated that errors seemed to have ceased as of 15:32, so i think it was clearing up while we were pushing that17:36
clarkb++17:37
fungialso i think the timestamps which don't say utc are in cest or similar17:37
fungibecause that comment is timestamped 10 minutes from now17:38
fungiif it were utc17:38
clarkbah17:39
*** marios|out has quit IRC17:43
*** lpetrut has quit IRC18:13
*** rh-jelabarre has quit IRC18:31
openstackgerritJeremy Stanley proposed opendev/system-config master: [DNM] testing log uploads  https://review.opendev.org/c/opendev/system-config/+/78502518:39
*** sboyron has quit IRC18:46
*** auristor has quit IRC18:53
*** ysandeep is now known as ysandeep|away18:56
*** auristor has joined #opendev18:58
priteaufungi: there were still pip issues in my last recheck which ended one hour before your message. Rechecking again.19:09
fungii can try to prod folks in #pypa-dev if nobody has done so yet19:10
clarkbI haven't19:11
fungibut to be clear, pretty much anyone can do that19:12
priteauI tried to pin some deps to known good versions, but then I hit issues with deps of deps which are not even in our requirements list19:13
priteauI mean the global-requirements list19:13
fungihttps://status.python.org/incidents/rw171ylf8jw319:14
clarkbpriteau: are you not using constraints?19:15
clarkbconstraints should be global19:15
fungi"Unexpected behavior: Diverted requests to the Simple API internal mirror experienced additional confusing behavior due to missing data-requires-python metadata may have led to some users reliant on that feature receiving confusing errors or incorrect versions of unpinned dependencies."19:15
fungiclarkb: the errors seem to arise from installing too-new setuptools, which isn't covered by constraints19:21
priteauWe are using upper-constraints of course. But for example this job failed because it pulled arrow 1.0.3: https://41a7b4b4e7581ee9052b-e2efce41b3edfa8b1781e11d852066bf.ssl.cf2.rackcdn.com/783621/8/check/kayobe-tox-ansible/e55e392/job-output.txt19:21
priteauarrow is not in stein u-c19:21
clarkbthat seems like a bug in stein u-c constraints. But also makes sense that setuptools would cause problems too19:21
fungier, at least some of the errors i saw were pip on python 2.7 installing setuptools 45.0.0 even though that requires python 3.5 or newer19:22
fungiso clearly not obeying the data-requires-python metadata (because it seems to have been missing from the indices sometimes)19:22
fungianother reassuring entry in that incident writeup... "Updates to Internal Mirror: Our internal mirror will be upgraded to mirror all current Simple API features so that in the event of future outages, installs will not be impacted."19:25
fungiso maybe we'll stop seeing this problem at some unspecified point in the future19:25
clarkbhopefully our previous reports of this issue helped them figure it out quickly when problems started today19:27
clarkbI know that when we first reported the behavior it was news to them19:28
priteauMaybe it's just luck but my recheck has not failed yet: https://zuul.openstack.org/status#78362119:28
clarkbpriteau: the issue above reports they fixed it19:28
priteauWell, that was updated yesterday, and our jobs were still failing just a few hours ago19:30
clarkbah yup so someone should report to them that this was being observed today as well19:30
fungiright, they thought they fixed it yesterday, we're discussing in #pypa-dev currently19:31
clarkbfungi: thanks!19:31
fungithey're asking if we have a list of problem indices so they can manually purge them from the cdn19:32
*** mailingsam has quit IRC19:33
clarkbI suspect it's likely to overlap with constraints though maybe not completely so19:33
fungiright19:34
clarkbsince our jobs would've been pulling all those packages. Maybe add setuptools pbr and hacking/flake8/pycodestyle for things not in the list?19:34
fungiwe're certainly not going to see the extent of the problem19:34
fungiit's more that they're asking for a list of which packages we've seen stale/incorrect indices for, since they'd be manually clearing each one in the cdn19:36
clarkbright, I'm suggesting that we don't bother trying to find that list and give them constraints + a few not in constraints instead19:37
fungiheh19:37
clarkbotherwise we'll just be doing whack a mole as we find new ones19:37
clarkbfungi: any indication if that API is still publicly accessible?19:37
fungiwhich api?19:37
clarkbbecause we could do it too if so (we had a purge api call in our old mirroring script iirc)19:38
clarkbthe one to flush cdn data for particular packages19:38
fungii don't think i'm going to ask them to manually clear hundreds of indices. at this point it could just be a few which are stuck for some reason. if we can demonstrate that it's more than a few, they'll likely look at more drastic measures19:38
fungioh that api19:38
fungii can ask19:38
clarkbbasically the old mirror script would scan for errors, if found, purge, then retry19:38
clarkband if that is what they intend on doing that maybe it is easier for us to script that up19:39
clarkband/or address the specific problems as we find them rather than asking them to do it19:39
fungilooks like we did a purge request to the stale url, judging from https://opendev.org/opendev/puppet-bandersnatch/src/branch/master/files/run_bandersnatch.py19:43
fungiis that how you read it?19:43
clarkbya19:43
clarkbcurl -XPURGE https://pypi.org/simple/package/ and curl -XPURGE https://pypi.org/simple/package/json looks like19:44
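
A minimal sketch of the scan-and-purge loop clarkb describes, built around the purge call quoted just above; the package list and the missing-metadata test are assumptions, not taken from the old mirror script.

    for pkg in setuptools pbr virtualenv; do
        if ! curl -sf "https://pypi.org/simple/$pkg/" | grep -q data-requires-python; then
            curl -sf -XPURGE "https://pypi.org/simple/$pkg/"
        fi
    done
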
*** hamalq has quit IRC19:44
*** hamalq has joined #opendev19:45
fungiclarkb: priteau: di says that still works if we just want to purge the ones we spot19:55
clarkbwfm19:56
melwittclarkb: do you think this is good to approve now? the Depends-On patch has merged https://review.opendev.org/c/opendev/system-config/+/78254020:21
clarkbmelwitt: yes, I think we can go ahead with that. I need to pop out for a bit but can hit +A when I return20:22
clarkbor fungi can hit it if around20:22
priteauAll jobs of https://review.opendev.org/c/openstack/kayobe/+/783621 passed, except one. New error this time: distutils.errors.DistutilsError: Could not find suitable distribution for Requirement.parse('pbr')20:31
priteauThis might be a different issue20:31
melwittcool thanks clarkb20:32
*** rh-jelabarre has joined #opendev20:32
*** slaweq has quit IRC20:49
*** slaweq has joined #opendev20:51
fungipriteau: yeah, i'm quickly getting lost in the spaghetti of tenks deployment scripting and ansible there20:53
fungisomething is not finding pbr, that much is clear, but the logging from those scripts is not very thorough20:53
fungiwhen did this last work?20:54
fungiearlier today apparently, according to https://zuul.opendev.org/t/openstack/builds?job_name=kayobe-overcloud-centos&project=openstack/kayobe20:55
*** slaweq has quit IRC20:57
openstackgerritMerged opendev/system-config master: Run update-bug on patchset-created again  https://review.opendev.org/c/opendev/system-config/+/78254021:23
*** avass has quit IRC21:44
clarkbmelwitt: ^  fungi must've gotten it22:11
melwittclarkb: he did :) thanks22:11
clarkbpriteau: that error is coming from setuptools easy_install which is far less reliable than pip generally. Unfortunately it doesn't say which urls it attempted. However, from memory it doesn't use our mirrors at all and talks directly to pypi (since we don't configure mirror setup for easy_install anymore)22:46
clarkbfungi: ^ does that sound correct to you?22:46
fungiwe used to configure it but i think we dropped that some time ago22:52
clarkbya so I wonder if that is a case of the pypi cdn failing entirely22:53
clarkbI wonder if that easy_install failed on the sni brownouts23:10
clarkbthat job ran on centos-7 and was using python2.723:13
clarkbcentos-7 python2 is 2.7.5-89 according to a package listing23:15
clarkb2.7.0 through 2.7.8 don't have sni according to https://github.com/pypa/pypi-support/issues/978 xenial is 2.7.12 as a comparison23:16
clarkbI have no idea if rhel/centos have backported sni support into python2 though23:16
clarkb(would be in things like urllib?)23:16
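
One quick way to answer that on a centos-7 node, using nothing beyond the stock interpreter: ssl.HAS_SNI only appeared upstream in 2.7.9, so seeing it present on 2.7.5 would point at a Red Hat backport.

    python2 -c 'import ssl; print(getattr(ssl, "HAS_SNI", "no HAS_SNI attribute"))'
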
clarkbpriteau: fungi: I'm beginning to suspect this is the issue since our mirror will bypass this on the pip side23:18
clarkbbasically our mirror apache proxy doesn't care if you sni or not since we use different ports and don't rely on sni to address vhosts23:18
clarkbthen it will sni to the backend23:19
clarkbI also wonder if this is the sort of thing where rhel7/centos7 might update; it does have 3 more years of shelf life23:21
