16:58:57 #startmeeting keystone
16:58:58 Meeting started Tue Jan 19 16:58:57 2021 UTC and is due to finish in 60 minutes. The chair is knikolla. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:58:59 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:59:01 o/
16:59:02 The meeting name has been set to 'keystone'
17:00:46 o/
17:04:24 cmurphy: gagehugo: around?
17:04:32 o/
17:04:43 o/
17:05:02 #topic Follow-up on gate instability
17:05:43 I read the discussion in last week's keystone meeting and saw that there were concerns about dropping lower-constraints testing from the gates
17:06:22 In the last TC meeting, we discussed the topic and reached out to packagers for RDO, ubuntu, and debian; they don't use lower-constraints in their packaging requirements, instead relying on upper-constraints
17:07:35 #link http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-01-14-15.00.log.html#l-105
17:08:40 good to know
17:08:59 #link http://lists.openstack.org/pipermail/openstack-discuss/2021-January/019877.html
17:09:46 with regards to the timeouts, i have encountered those with the new pip dependency resolver when trying to fix l-c
17:09:59 the new dependency resolver will try all versions of a package to try and figure out which one works
17:10:18 and that can take a long time, between fetching, compiling, and installing all known versions of a single package
17:10:37 fun fact - i kicked off a job to do that last week and it took 8 hours
17:10:58 before i stopped it :)
17:11:36 haha, yeah, i ran into the same issues
17:12:16 if you use a constraints file, you have to pin almost all the dependencies, otherwise it will try all the possible combinations of what you haven't pinned
17:12:42 iiuc - that includes transitive dependencies
17:13:41 i think including all transitive dependencies would be a good goal anyway? we've had problems where a dependency that wasn't in the constraints file broke the build and had to be added
17:14:22 but i don't mind going with the flow if other projects are dropping that job
17:14:48 that would incur a maintenance burden to keep
17:15:22 ideally, i understand and would like such a thing
17:15:36 i don't have an opinion - i just noticed it's going to take a while to figure out what sane list of dependencies we need (including transitive) to get things working again
17:15:37 but i feel that in a world where everything is containerized, that matters less nowadays
17:16:06 i could see it making a difference for deployments that need to verify versions of software for security reasons
17:16:42 (e.g., we can't deploy something until we comb through the source)
17:17:45 hmmm, i feel that in that scenario they would have the manpower to maintain their own pinned versions
17:17:57 yeah - that's probably true
17:19:13 for now, we can disable the l-c job in stable branches that are broken, and see how the openstack-wide discussion progresses
17:22:07 #topic oslo.db 8.5 update breaks keystone
17:22:28 prometheanfire reported that in the -keystone channel
17:22:28 I'm in another meeting, but it looks like it's failing on the deprecation warning?
17:23:17 bnemec: do you have a link to zuul logs?
17:25:36 i have a spare hour after the meeting to do some investigating
17:28:15 we can circle back to the topic during open discussion
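If the gate failure really is this deprecation warning being promoted to an error, one immediate workaround is a targeted warnings filter that lets this single message through while other deprecations still fail the run. The sketch below uses only Python's standard warnings module; whether and where keystone's test setup installs such a filter is an assumption here, not something confirmed in the meeting, and the message pattern is taken from the warning text quoted later in the log.

```python
import warnings

# Assumed, not confirmed: some test fixture promotes DeprecationWarning
# to an error, which would be what makes the oslo.db 8.5 warning fatal.
warnings.filterwarnings('error', category=DeprecationWarning)

# Hypothetical exemption: filters added later are matched first, so this
# ignore rule lets only the sqlalchemy-migrate deprecation message pass
# while any other DeprecationWarning keeps failing the run.
warnings.filterwarnings(
    'ignore',
    message=r'.*sqlalchemy-migrate support in oslo_db is deprecated.*',
    category=DeprecationWarning,
)
```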
17:28:54 #topic PTL in Xena
17:28:58 Sorry, I naturally got called on in my other meeting.
:-)
17:29:10 The logs are here: https://zuul.opendev.org/t/openstack/build/93390590a90f4ff59b3dc4d008371c41
17:29:24 * bnemec runs away from the PTL topic
17:29:45 Lately, I have been unable to devote the required amount of time and attention to keystone
17:30:04 I am not sure if I will (or should) be continuing in the role during the Xena cycle
17:31:37 thank you for putting in the time you have, knikolla
17:31:55 ++
17:32:04 ^
17:32:12 ++ thanks knikolla!
17:32:28 Thank you all. I appreciate that.
17:33:06 I wish I could have done a much better job, as I owe a lot to keystone and OpenStack.
17:33:36 If anyone is interested in the role, talk to me. Otherwise, we can experiment with distributed leadership, or figure something else out.
17:36:33 If you have questions about DPL, I know a guy. ;-)
17:38:12 the tc resolution is here https://governance.openstack.org/tc/resolutions/20200803-distributed-project-leadership.html :)
17:39:47 bnemec: oslo is using distributed leadership right?
17:40:01 Yeah, we started this cycle.
17:41:17 cool
17:41:32 how is that working?
17:42:58 Pretty well so far.
17:43:08 Herve has taken over a lot of the PTL duties in practice.
17:43:27 The rest of the liaisons mostly say "nothing to report" every week in the meeting. :-)
17:43:36 i see
17:45:24 alright, circling back to the oslo.db 8.5 topic
17:45:37 #topic oslo.db 8.5 update breaks keystone (part 2)
17:45:44 #link https://zuul.opendev.org/t/openstack/build/93390590a90f4ff59b3dc4d008371c41
17:46:03 DeprecationWarning: Using function/method 'db_version()' is deprecated in version '8.3.0': sqlalchemy-migrate support in oslo_db is deprecated; consider migrating to alembic
17:46:42 it does seem to be related to the deprecation of sqlalchemy-migrate support
17:47:02 It's the only thing I see in the logs that looks at all wrong.
17:47:15 yeah
17:47:42 i am unfamiliar with the amount of work required to migrate to alembic from sqlalchemy-migrate
17:47:58 i know we had a person working on it a while back, vishakha
17:48:35 we have a backlogged spec at https://specs.openstack.org/openstack/keystone-specs/specs/keystone/backlog/alembic.html
17:50:13 can we non-error on deprecation warnings for now while we figure this out?
17:52:28 That seems like the immediate path forward. I'm not immediately aware of what makes the deprecation warning fatal, though.
17:54:19 i am not aware either
17:56:38 * lbragstad has to drop early
17:57:14 guess i'll debug while i cook lunch, thanks all! thanks bnemec for your time.
17:57:27 np
17:57:59 #endmeeting
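For readers following up on the alembic spec mentioned at 17:48:35: an alembic-based migration replaces each sqlalchemy-migrate version script with a small revision module defining upgrade (and optionally downgrade) steps. The revision below is only a hypothetical illustration; the revision identifiers, table, and columns are made up for the sketch and are not keystone's schema or part of the backlogged spec.

```python
"""Hypothetical alembic revision; all names here are illustrative only."""
from alembic import op
import sqlalchemy as sa

# Revision identifiers used by alembic to order migrations.
revision = '0001_example'
down_revision = None


def upgrade():
    # The alembic counterpart of a sqlalchemy-migrate upgrade script:
    # schema changes are expressed through the op API.
    op.create_table(
        'example_resource',
        sa.Column('id', sa.String(64), primary_key=True),
        sa.Column('name', sa.String(255), nullable=False),
    )


def downgrade():
    # Keystone may not support downgrades; included here only to show the
    # standard shape of an alembic revision module.
    op.drop_table('example_resource')
```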