14:01:15 #startmeeting glance
14:01:16 Meeting started Thu Jul 9 14:01:15 2020 UTC and is due to finish in 60 minutes. The chair is abhishekk. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:01:17 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:01:19 The meeting name has been set to 'glance'
14:01:20 #topic roll call
14:01:24 o/
14:01:29 #link https://etherpad.openstack.org/p/glance-team-meeting-agenda
14:01:31 o/
14:01:31 o/
14:01:44 welcome all :D
14:01:51 short agenda
14:01:58 let's start
14:02:16 #topic release/periodic jobs update
14:02:27 V2 is 2 weeks away
14:02:50 We need to review all the specs for Victoria on a priority basis
14:03:26 Image encryption is merged, sparse image upload and dansmith's policy specs are in good shape
14:03:47 yes please :)
14:03:59 We need to focus on Duplicate downloads, cinder multiple stores support and the remaining specs, in order
14:04:59 Regarding periodic jobs, of the 4 tips jobs, the py36 unit tests job failed yesterday for glance_store; the fix is already merged, green now
14:05:08 So all good at the moment
14:05:25 moving ahead
14:05:43 #topic Should we change the spec approval policy?
14:05:57 I want your opinion on this
14:06:40 Earlier we started this because we had enough contributors to take care of the work
14:07:00 the situation has been different for the last 2-3 cycles
14:07:52 the way it is now, only the PTL can merge a spec ... i think leave it up to the PTL how many +2s to require
14:07:57 everyone is busy with enough work in hand, which is delaying the review process and might result in losing the interest of new contributors
14:08:12 i mean, on a per-spec basis
14:08:20 Yeah, I think it's probably good to not require every core's +2
14:08:20 there may be uncontroversial specs
14:08:34 but also some where you want everyone to have looked at them
14:08:55 yes, in that case I will personally ping all the members to have a look
14:09:01 we do that in nova, controversial specs require lots of buy-in
14:09:20 such as unified limits, duplicate downloads etc
14:09:30 my spec has had four cores vote +2 at various times across minimal changes, but not all at once
14:09:52 getting them all to +2 at the same time seems like it might be difficult
14:10:08 correct
14:10:33 i think it makes sense to be flexible about this
14:11:04 I think we've all been there. I'm fine leaving it to the PTL, just want to make sure rosmaita & smcginnis don't feel left out. It was indeed a bit different store when I introduced the requirement when we were all more or less full-time glance
14:11:27 or full time openstack at least, with more time for glance ;)
14:11:48 So IMO we should come back to the traditional way of having two +2s for simple/straightforward specs and more than two for controversial specs
14:11:50 s/store/story/
14:12:24 yes, no offense here, I know everyone is giving their best here
14:12:39 abhishekk: That sounds like a good plan to me.
14:12:47 smcginnis, thank you
14:13:07 rosmaita, jokke ??
14:13:18 sounds good to me
14:13:29 Cool
14:14:26 Moving ahead
14:14:41 abhishekk: yes, I'm good with the plan. Maybe make sure the spec exists at least over a weekly meeting so everyone interested has a heads-up and time to review. So how about no 1-day specs with a quick read and merge?
14:15:15 jokke, sounds good to me
14:15:22 jokke: ++
14:15:51 I will make sure to highlight important specs and their status in our weekly meeting (starting from the next one)
14:16:08 abhishekk: sounds great.
14:16:28 ok, moving to open discussion
14:16:35 #topic Open discussion
14:17:14 As discussed in PTG, I have noted down a TODO list for glance
14:17:24 #link https://review.opendev.org/#/c/738675/
14:17:32 #link https://docs.google.com/spreadsheets/d/1akuBhqmJfAC13Oi8-PjVZCwpX4KjotworWU_2w03QZA/edit?usp=sharing
14:17:45 I need to provide access though
14:17:46 smcginnis: rosmaita: ^^ we would need you two, as that was a collab between myself and abhishekk
14:18:05 I just updated the commit message as I had a bug open for it, just forgot the tag
14:18:27 I will try to copy this spreadsheet using my personal mail id and will open it for all
14:18:27 ok, i can look at that today
14:18:36 Will add TODOs from store and client also
14:18:55 jokke: rosmaita smcginnis: yeah, please, I need that to be merged before my full stack of 9000 dependencies will run :D
14:19:08 We will discuss more about this in our next meeting
14:19:25 Ah, just the bug link added. I could have sworn I already reviewed that.
14:19:42 abhishekk: good job, will you just open the doc or do you want us to request access from that link?
14:19:54 smcginnis: you did indeed
14:20:05 smcginnis: and yes, just the missing bug link changed
14:20:14 jokke, I am going to open it for all
14:20:25 abhishekk: ++
14:21:02 we have plenty of time, so I would like to finalize how we should proceed on the race condition thing
14:21:10 #link Fix race condition in copy image operation
14:21:19 #link https://review.opendev.org/737596
14:22:39 The unique exception comment is easy to fix though, the problem is where do we fix the race condition :D
14:23:05 yeah, I feel pretty strongly that it's important for the API to be consistent here,
14:23:42 because when I was trying to write the client side of this, it's difficult to detect the difference between "things are slow" and "the API told me I should wait a long time for a thing to be done, because I got my 20x"
14:24:35 obviously fixing the race is the important part for correctness, but multi-boot of lots of instances at a time *will* hit the situation where two novas ask and don't realize that one request is being silently ignored
14:25:50 the other way to deal with this is to add a revert to each task which might fail and clean up those properties there
14:26:19 abhishekk: I will be very generous with my -2 for that :P
14:26:25 :D
14:27:55 smcginnis, rosmaita could you please also have a look at this patch?
14:28:33 Will do.
14:28:39 ok, will put it on my list
14:28:42 smcginnis, thank you
14:29:14 abhishekk: you mean https://review.opendev.org/737596 ?
14:29:20 rosmaita, yes
14:29:22 So smcginnis rosmaita, just for context: the problem is that copy-image has a race, and currently we're just in a swamp with it. abhishekk & dansmith want to put locking in place, which I'm not fond of, but it looks like one of the best options to tackle this.
14:29:41 The question is when and how do we trigger that lock and clean it up
14:29:50 ok
14:30:22 You will also get a clearer picture when you go through the comments
14:30:39 OK, thanks. That helps.
14:30:56 Cool, that's it for today
14:31:26 quick question: how are we looking on prospective new glance cores?
14:32:30 rosmaita, that's difficult to say as we don't have many new reviewers at the moment
14:32:45 i was afraid you'd say that :(
14:32:48 I am trying to connect with some folks
14:33:18 or even old reviewers outside of this group
14:33:55 will ask them if they are interested in the same
14:33:57 rosmaita: feeling like getting too busy to keep up with glance anymore?
14:34:07 i am right at this minute!
14:34:31 let's revisit this again two weeks from now
14:34:36 ouch ... please try to avoid burning yourself out
14:35:32 rosmaita, I would suggest staying onboard at least for this cycle
14:35:50 i can do that
14:35:50 I will ping you for important patches only
14:36:00 just want to see light at the end of the tunnel
14:36:01 rosmaita: we'll try to keep your load as minimal as we can
14:36:25 I will stand holding the torch there :D
14:36:46 jokke, ++
14:36:57 Just FYI, I will not be around tomorrow, my little sister is getting engaged this weekend
14:37:02 I'll keep abhishekk supplied with fresh batteries for his torch :D
14:37:10 :)
14:37:11 lol
14:37:27 most probably not able to check mails as well
14:37:48 abhishekk: enjoy and relay our best wisher for her
14:37:54 thank you
14:37:55 wishes
14:38:09 could you please restore your +2 on Dan's spec
14:38:53 anything else, guys?
14:39:26 not from me, I'll reread that spec
14:39:39 ack
14:39:45 thank you all
14:39:50 have a nice weekend
14:40:28 thanks all
14:40:45 #endmeeting
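
For context on the copy-image race discussed above (https://review.opendev.org/737596), here is a minimal sketch of the general approach being debated: an atomic test-and-set on a reserved image property acting as a lock, with cleanup/revert on failure so a crashed copy does not leave the image locked. This is illustrative only, not the patch under review or Glance's actual implementation; every name in it (ImageStore, acquire_copy_lock, release_copy_lock, LOCK_PROPERTY, CopyAlreadyInProgress) is hypothetical, and the in-process mutex merely stands in for whatever atomicity a real backend would provide (e.g. a conditional DB update).

# Hypothetical sketch -- not Glance code. Models the idea of serializing
# concurrent copy-image requests via a reserved image property, so a second
# caller gets an explicit signal instead of being silently ignored.

import threading
import uuid

LOCK_PROPERTY = "os_hypothetical_copy_task"  # reserved property used as the lock


class CopyAlreadyInProgress(Exception):
    """Raised when another copy-image task already holds the lock."""


class ImageStore:
    """Toy stand-in for the image properties store; a real backend would do
    the compare-and-swap as a single atomic operation (e.g. one DB UPDATE)."""

    def __init__(self):
        self._properties = {}           # image_id -> {property: value}
        self._mutex = threading.Lock()  # emulates backend-level atomicity

    def acquire_copy_lock(self, image_id, task_id):
        """Atomically claim the copy lock, or fail if another task holds it."""
        with self._mutex:
            props = self._properties.setdefault(image_id, {})
            holder = props.get(LOCK_PROPERTY)
            if holder is not None and holder != task_id:
                raise CopyAlreadyInProgress(
                    "image %s is being copied by task %s" % (image_id, holder))
            props[LOCK_PROPERTY] = task_id

    def release_copy_lock(self, image_id, task_id):
        """Release the lock; also usable from a task's revert/cleanup path."""
        with self._mutex:
            props = self._properties.get(image_id, {})
            if props.get(LOCK_PROPERTY) == task_id:
                del props[LOCK_PROPERTY]


def copy_image(store, image_id, do_copy):
    """Run a copy under the lock and always clean up, even on failure."""
    task_id = str(uuid.uuid4())
    store.acquire_copy_lock(image_id, task_id)
    try:
        do_copy()
    finally:
        # Cleanup on success and failure, so a failed copy does not leave
        # the image permanently "locked" -- the revert concern raised above.
        store.release_copy_lock(image_id, task_id)

The point of the sketch is the API-consistency concern dansmith raised: a second nova asking for the same copy gets an explicit "already in progress" answer (or a clear error) rather than a 20x response for a request that is being silently dropped, and the cleanup in the finally block corresponds to the "when and how do we clean it" question jokke posed.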