09:00:28 #startmeeting ptl_sync 09:00:29 Meeting started Tue Mar 17 09:00:28 2015 UTC and is due to finish in 60 minutes. The chair is ttx. Information about MeetBot at http://wiki.debian.org/MeetBot. 09:00:30 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote. 09:00:33 The meeting name has been set to 'ptl_sync' 09:00:36 #topic Heat 09:00:43 ttx sorry, i missed last week's 09:00:48 #link https://launchpad.net/heat/+milestone/kilo-3 09:00:56 asalkeld: not a problem, I skipped most of them anyway 09:01:08 So... crunch time 09:01:16 6 BPs up 09:01:19 so most are very close to getting merged 09:01:29 Let's go through them one by one 09:01:30 so i am not very stressed about them 09:01:34 ok 09:01:36 https://blueprints.launchpad.net/heat/+spec/convergence-resource-table 09:01:51 Only needing https://review.openstack.org/#/c/156693/ ? 09:01:58 that's a one-patch spec 09:02:00 yip 09:02:05 should get in soon 09:02:08 https://blueprints.launchpad.net/heat/+spec/decouple-nested 09:02:14 that's mine 09:02:25 *should* get it in time 09:02:39 3 patches up, all working well 09:02:54 some dissent amongst the minions ;) 09:03:06 https://blueprints.launchpad.net/heat/+spec/convergence-lightweight-stack 09:03:18 that might well miss 09:03:32 but it's only one patch 09:03:43 so willing to wait until the death 09:03:49 I see two patches 09:04:11 If that one misses, do you intend to request FFE for it ? 09:04:26 ttx: it can just wait 09:04:27 If not, maybe prioritize to Medium 09:04:31 ok 09:04:40 I assume the first two would get exceptions 09:04:46 if they missed 09:04:58 https://blueprints.launchpad.net/heat/+spec/convergence-push-data 09:05:20 that should be ok 09:05:26 again one patch 09:05:44 OK, it's linked to the other 09:05:56 https://blueprints.launchpad.net/heat/+spec/mistral-resources-for-heat 09:06:06 Looks like 2 patches 09:06:24 so this is a bp for code that goes into contrib/ 09:06:36 so IMO it can go in any time 09:06:46 (does not affect strings or deps) 09:06:51 yeah, at least could get an exception as "self-contained" 09:07:00 and gets installed separately 09:07:00 https://blueprints.launchpad.net/heat/+spec/stack-breakpoint 09:07:23 there are 2 or 3 guys trying hard to get that in 09:07:29 it should make it 09:07:44 needed for tripleo 09:07:52 ok we'll see how it goes 09:08:10 yeah, we can wait 09:08:20 On the bugs side... anything k3-critical that we would not just defer to rc1 ? 09:08:24 they are all on the edge of getting merged 09:08:47 i think just defer everything that's not merged 09:09:04 #info k3 bugs can be deferred to rc1 if incomplete 09:09:26 #info 6 BPs left, all still standing a reasonable chance of making it 09:09:37 * asalkeld is hopeful 09:10:11 OK, that's all I guess. We'll make a new status point on Thursday (your evening, my morning) 09:10:26 hopefully for the final tag / early FFE decisions 09:10:29 ok - same time? 09:10:46 (just want to make a reminder) 09:10:46 sounds good. Ping me if I don't talk 09:10:59 ok, chat then 09:11:21 see you! 09:11:40 zz_johnthetubaguy: around?
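(The contrib/ point asalkeld makes above is what makes mistral-resources-for-heat low-risk: heat discovers out-of-tree resources through a module-level resource_mapping() hook, so such code lands without touching translated strings or dependencies. A minimal sketch of that plugin convention — the Workflow class and its type name are illustrative, not the actual blueprint code:)

```python
# Minimal sketch of a heat contrib resource plugin (illustrative names;
# resource_mapping() is the hook heat's plugin loader looks for).
from heat.engine import resource


class Workflow(resource.Resource):
    """Stand-in resource: real property schemas and handlers omitted."""

    def handle_create(self):
        # A real plugin would call out to the backing service here.
        pass


def resource_mapping():
    # Maps template resource type names to implementing classes.
    return {'OS::Mistral::Workflow': Workflow}
```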
09:17:06 johnthetubaguy: o/ 09:17:25 ttx: hi 09:17:28 #topic Nova 09:17:35 #link https://launchpad.net/nova/+milestone/kilo-3 09:17:52 That's still 12 BPs in progress 09:18:05 Let's see how far they are 09:18:11 https://blueprints.launchpad.net/nova/+spec/v2-on-v3-api 09:18:29 I see 4 patches on https://review.openstack.org/#/q/topic:bp/v2-on-v3-api,n,z 09:18:53 But the last says it partially implements so it looks like there is more to it 09:19:49 hmm, I thought they were done there now, pretty much, but I am rarely online the same time as those folks so it's tricky relaying info 09:20:19 Those might be add-on patches 09:20:20 ah, yes, I think a lot of this is now unit test cleanup 09:20:52 ok, if that's the case it would be good to stop mentioning the BP in the patches at some point and consider it "finished" 09:20:53 well, general cleanups anyways 09:21:02 https://blueprints.launchpad.net/nova/+spec/api-microversions 09:21:05 agreed, but folks do like the tracking 09:21:07 seems to be in the same situation 09:21:12 yes 09:21:37 #info v2-on-v3-api and api-microversions are "mostly finished" 09:21:46 https://blueprints.launchpad.net/nova/+spec/cells-v2-mapping 09:22:00 This one will be partial IIUC 09:22:24 in which case it is "mostly finished" too, I suspect 09:22:51 yes, there are a few more patches I think they wanted, but as you say, it's kinda partial anyways 09:23:00 https://blueprints.launchpad.net/nova/+spec/kilo-objects 09:23:15 this one is basically tracking cleanups 09:23:21 Feels like another "what's in kilo is kilo" 09:23:27 right 09:23:47 it's a tracking placeholder to help with reviews more than anything 09:23:51 #info cells-v2-mapping and kilo-objects are partial work anyway, so what's in kilo is kilo 09:24:02 https://blueprints.launchpad.net/nova/+spec/online-schema-changes 09:24:12 one last patch there I think 09:24:26 pending on https://review.openstack.org/#/c/154521/ 09:24:43 #info online-schema-changes pending on https://review.openstack.org/#/c/154521/ -- looking good 09:24:55 it says "partially implements" though 09:25:57 https://blueprints.launchpad.net/nova/+spec/make-resource-tracker-use-objects 09:26:11 yeah, that's true, I will check, but I thought that was it 09:26:27 Lots of patches up 09:26:46 This one sounds more at risk 09:27:00 although it's marked partial too 09:27:17 oh yes, I think you are right, that one was close to getting kicked out already 09:27:37 #info make-resource-tracker-use-objects might be so partial it could be considered deferred 09:27:46 https://blueprints.launchpad.net/nova/+spec/detach-service-from-computenode 09:28:07 I will check out that one closer after the meeting 09:28:23 Not sure what it's waiting on 09:28:37 it might be done now, as much as it will be in kilo, let me check 09:29:05 you can check after the meeting, no worry 09:29:16 yep, it is done, although it's technically a bit partial 09:29:22 #info detach-service-from-computenode might be considered finished 09:29:38 OK, medium stuff now 09:29:43 https://blueprints.launchpad.net/nova/+spec/keypair-x509-certificates 09:30:21 Remaining patches are novaclient and a bug fix 09:30:37 yes, they need work really 09:30:43 so we could move them to RC1 if necessary 09:30:56 and consider this finished 09:31:12 yes, I think that's probably easiest, I might open bugs for those patches 09:31:14 #info keypair-x509-certificates could just be considered finished and remaining bugs pushed to RC1 if need be 09:31:26 https://blueprints.launchpad.net/nova/+spec/v3-api-policy 09:31:39 This one is marked partial,
so probably another of those "what's in kilo is kilo" 09:32:08 possibly, there are a few patches they really want in kilo, it's on the etherpad I think 09:32:08 Loads of patches there 09:32:22 yeah, it's a stretch to get those all in 09:32:33 #info v3-api-policy needs work, will be partial 09:32:46 https://blueprints.launchpad.net/nova/+spec/ec2-api 09:33:04 Isn't that finished? 09:33:53 I think it is 09:33:56 #info ec2-api is probably finished 09:34:02 https://blueprints.launchpad.net/nova/+spec/servicegroup-api-is-up-host-topic 09:34:34 A bit unclear what it's waiting on 09:34:42 also marked partial 09:35:22 I think it's this one: https://review.openstack.org/#/c/149924/ 09:36:00 #info servicegroup-api-is-up-host-topic pending on https://review.openstack.org/#/c/149924/ 09:36:08 it's a cleanup though, so I might just remove the blueprint tag, let's see 09:36:29 https://blueprints.launchpad.net/nova/+spec/hyper-v-test-refactoring 09:36:37 That one looks forever partial 09:37:05 and candidate for deferral 09:37:15 #info hyper-v-test-refactoring may or may not make it 09:37:29 yeah, I vote we defer the BP, I mean it's unit tests, so they can merge later if needed I guess 09:37:45 yeah, I'm more afraid of the distraction than of the impact 09:38:34 #info 10 blueprints left, mostly partial & "what's in kilo will be kilo" work anyway 09:38:56 Quick look on the bugs side, see if any of those "critical" are actually k3-critical 09:39:09 like regressions 09:39:14 or showstoppers 09:40:12 https://bugs.launchpad.net/nova/+bug/1383465 is unlikely to be solved in the next two days 09:40:13 Launchpad bug 1383465 in OpenStack Compute (nova) "[pci-passthrough] nova-compute fails to start" [Critical,In progress] - Assigned to Yongli He (yongli-he) 09:40:26 nothing that stops kilo-3 from a quick look, right, that's the only big one 09:40:35 it got discussed in the nova meeting again 09:40:43 trying to move that forward before kilo ships 09:40:57 it was targeted to kilo-1 I think 09:41:16 and carried forward 09:41:34 https://bugs.launchpad.net/nova/+bug/1296414 is another of those long-standing issues 09:41:35 Launchpad bug 1296414 in OpenStack Compute (nova) "quotas not updated when periodic tasks or startup finish deletes" [Critical,In progress] - Assigned to Rajesh Tailor (rajesh-tailor) 09:42:06 ok, so I'll push to RC1 anything not merged by tag time 09:42:14 #info Bugs to be deferred if not merged 09:42:34 sounds good 09:42:44 I will ask around to see if there are any exceptions to that 09:42:48 johnthetubaguy: how about we meet again on Thursday and formally close all those partial things, then see if anything in the pipe justifies waiting ? 09:42:59 that sounds like a plan 09:43:04 same time ? 09:43:13 yes, I will pop that in my calendar now 09:43:23 sounds good, have a good day! 09:43:37 you too!
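(On the api-microversions work tracked above: the mechanism lets a client opt in to newer API behavior per request via a version header, while requests without the header keep the base v2.1 behavior. A minimal sketch — the endpoint and token are placeholders, not real values:)

```python
# Illustrative only: pinning a Nova API microversion on a request.
# Endpoint and token are placeholders.
import requests

NOVA_ENDPOINT = 'http://controller:8774/v2.1'
HEADERS = {
    'X-Auth-Token': 'REPLACE_WITH_TOKEN',
    # Opt in to a specific microversion; omit to get base 2.1 behavior.
    'X-OpenStack-Nova-API-Version': '2.2',
}

resp = requests.get(NOVA_ENDPOINT + '/servers', headers=HEADERS)
resp.raise_for_status()
for server in resp.json()['servers']:
    print(server['id'], server['name'])
```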
12:24:46 ttx: ready when you are 12:25:16 dhellmann: you might be one hour early, but we can probably do it nevertheless 12:25:37 ttx: oh, right, forgot you're not on DST yet 12:25:57 no problem, if you're available now, let's do this 12:26:03 sure thing 12:26:05 #topic Oslo 12:26:24 A quick look at milestone pages 12:26:30 #link https://launchpad.net/oslo/+milestone/kilo-3 12:26:58 there is a bug targeted: https://bugs.launchpad.net/cinder/+bug/1417018 12:26:59 Launchpad bug 1417018 in oslo.db "Cinder encounters dbapi error: NoSuchColumnError: "Could not locate column in row for column '%(140070887032400 anon)s.volumes_id'"" [Medium,In progress] 12:27:07 is that something we want to fix and sync in kilo ? 12:27:10 yeah, that one is actually done in oslo.db -- I'll be making a release later today 12:27:30 we haven't settled on the best way to fix it in the incubator, but the oslo.db fix should remove the issue from cinder 12:27:35 I'll update the targets 12:27:40 ok cool 12:28:17 https://launchpad.net/oslo/+milestone/next-kilo shows a bunch of PBR bugfixes 12:28:23 is there a PBR release planned ? 12:28:43 or would those actually appear in next-liberty ? 12:28:46 ttx: not one that would include those items. I'll retarget them too 12:28:54 ok 12:29:09 Looking at https://review.openstack.org/#/c/162656/ 12:29:12 actually, I don't think whoever has been doing pbr releases has been using the script, so I'll verify that they're not already done 12:29:37 do you think we should collect more +2s ? Or should I just W+1 it ? 12:30:09 not sure why the subsequent reviewers didn't w+1 12:30:32 yeah, sdague expressed some reservations and that may have made people hold off on the w+1 12:30:45 I would like to get that in asap so we can also land the subsequent requirements updates to the projects 12:31:02 the requirements freeze is this week, right? 12:31:03 yeah, timing is tight 12:31:04 jogo is +1 on that now, I'll approve 12:31:17 sdague: thanks, I was going to check in with you on that in a bit anyway 12:31:22 that's going to trigger a ton of patch / test churn, btw 12:31:23 requirements freeze is the same as feature freeze, yes 12:31:37 sdague: yeah, that's why I did it all in one patch :-) 12:31:42 :) 12:31:51 sdague: should we generally try to do those one week earlier ?
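(The "requirements updates to the projects" mentioned here are mechanical syncs of each project's requirements.txt against the openstack/requirements global list, which the proposal bot then pushes as per-project patches — hence the churn discussed in this exchange. A rough sketch of the idea, not the real tooling, with illustrative version pins:)

```python
# Rough sketch of a global-requirements sync (what the proposal bot
# automates for each project); not the real openstack/requirements code.
import re


def dist_name(line):
    """Extract the distribution name from a requirement line, if any."""
    match = re.match(r'[A-Za-z0-9._-]+', line.strip())
    return match.group(0).lower() if match else None


def sync(project_lines, global_lines):
    """Replace each known requirement with its global version."""
    pins = {dist_name(l): l.strip() for l in global_lines if dist_name(l)}
    return [pins.get(dist_name(line), line.strip()) for line in project_lines]


# Illustrative pins only -- not the actual kilo global-requirements.
print(sync(['pbr>=0.6,!=0.7,<1.0', 'six>=1.7.0'],
           ['pbr>=0.11,<2.0', 'six>=1.9.0']))
```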
12:31:59 ttx: probably 12:32:28 I expect we'll consume all the test nodes on the proposal bot changes for that one 12:32:31 note that those could technically land post-requirements freeze, since requirements freeze is mostly to guard against additions 12:32:32 I think we would have been ready a week earlier this cycle except for the deprecation warning issue 12:32:52 ttx: ok, as long as it lands before the branches are made I think we're ok 12:32:56 so I'm fine with those landing post-FF pre-RC1 12:33:07 I'm OK with that if we want to preserve test nodes 12:33:09 if we think approving this now is a bad idea 12:33:20 I can update the policy document to reflect that change, too 12:33:24 (requirements sync is part of the RC1 process) 12:33:31 zuul still has capacity this morning 12:33:47 so anyway, it's approved 12:33:54 ok, thanks, sdague 12:34:36 dhellmann: I also wanted to pick your brains on which projects actually could be considered having a final release per cycle (for the release:at-6mo-cycle-end tag application) 12:34:48 First, looking at Oslo libs 12:35:25 I suspect some of them don't really apply 12:35:30 for example openstack-dev/pbr 12:35:33 I want them all to be considered as following that plan, although we had a couple of stragglers this time. pbr and taskflow, I think. 12:35:50 I need to get the pbr team in line with the rest of oslo 12:35:55 what about openstack-dev/cookiecutter ? 12:36:04 (and openstack-dev/oslo-cookiecutter) 12:36:33 oh, we don't release that, that's just a template for making new projects 12:36:41 you use it straight from the git URL 12:36:46 right, so those would not have release:* tags 12:36:52 right 12:37:22 so what should I do about taskflow/pbr ? Consider them in ? 12:37:36 yes, let's consider them in and I'll talk to the maintainers 12:37:47 Second question is about python-*client, to piggyback on your discussion with David Lyle recently 12:38:05 I'm a bit confused about whether we should consider them in (as in having stable branches) 12:38:24 I think they might be release:independent, really 12:38:37 well, we need to start doing that, I think. We're capping the versions we use in the stable branches, so if we have to fix something the only way to release a new version would be to have a stable branch of the client 12:39:06 I believe the d-g tests are already installing the services and tempest in separate site-packages (one is in a virtualenv, I'm not sure which) 12:39:22 that lets us test new clients from the outside of the cloud while using stable clients inside the cloud 12:39:47 hmm, ok. So they should be at-6mo-end 12:39:57 even if technically they don't have stable branches until needed 12:40:09 yes, unless someone else comes up with an alternative to my stable-library spec 12:40:31 Applying tags results in uncovering interesting corner cases to say the least 12:40:44 true 12:41:01 at the very least helps clarify things, even for us 12:41:18 ok, I think that's all I had in store for you 12:41:27 ok, I had a couple for you 12:41:58 do we have any lib left to release for kilo ? You mentioned pbr ? 12:42:03 that policy on stable branches for libs should be prioritized: https://review.openstack.org/155072 12:42:22 yeah, I added it to today's meeting agenda 12:42:23 I have a backported fix for oslo.db that I will release today, but I don't anticipate any other releases 12:42:27 ok, good 12:42:35 (cross-project meeting) 12:42:37 and do we know the dates for the fall summit yet?
12:42:47 in silence is consent mode 12:43:00 I think we do, let me check 12:43:17 I didn't see them on the event calendar on openstack.org, but I don't know how far out that actually goes 12:43:27 https://www.openstack.org/summit/tokyo-2015/ 12:43:40 October 26 - 30, 2015 12:43:41 ah, ok thanks 12:43:54 That may not be linked to yet, but the page is public 12:44:01 and the URL is not that hard to guess 12:44:10 yeah, I should have tried that :-) 12:44:20 actually it's linked 12:44:25 https://www.openstack.org/summit/ 12:44:47 so consider it public :) 12:45:08 I was looking on https://www.openstack.org/community/events/ 12:45:31 the calendar includes the spring summit, so I thought if the dates were set for fall that would be there, too, but maybe it's too far out 12:45:38 anyway, thanks, I should have looked harder :-) 12:45:59 that's all I had for this week 12:46:48 dhellmann: so in summary, next steps are merging the requirements patch (but that's arguably non-oslo-specific) 12:46:55 and get that policy approved 12:47:05 otherwise you're mostly done release-wise ? 12:47:20 yes, I believe that's it 12:47:34 dhellmann: we could push the kilo-3 tag on oslo-incubator now 12:47:37 we'll obviously do patch releases if we need to, but the branches are created so we can do that 12:47:44 since we did a 2015.1.0b2 12:48:19 sure, we can push the tag. I think we said we would wait to do the stable branch on the incubator until the other projects had theirs, but the tag is easy 12:48:33 do you want to do that, or shall I? 12:48:39 I'll do it now 12:48:41 ok 12:49:41 done 12:49:47 cool, thanks 12:50:07 ok, thx and have a good day! 12:50:12 you, too! 13:05:22 ttx, /me ready when you are 13:05:47 SergeyLukjanov: hi! 13:05:53 #topic Sahara 13:05:56 ttx, hello! 13:05:58 #link https://launchpad.net/sahara/+milestone/kilo-3 13:06:14 so, we have mostly everything planned done 13:06:20 8 still up 13:06:27 except a few things that are now candidates for FFE 13:06:44 stuff that is under review now is expected to be merged today/tomorrow 13:07:02 SergeyLukjanov: ok 13:07:16 What are the FFE candidates ? edp-job-types-endpoint ? 13:07:51 ttx, yup, this one for sure, the core stuff is merged already, but plugin-related things are not yet merged, so, we'll need some more time for it 13:08:19 #info edp-job-types-endpoint likely FFE 13:08:20 the same for "Provide default templates for each plugin" https://blueprints.launchpad.net/sahara/+spec/default-templates 13:08:53 #info default-templates likely FFE 13:08:53 and probably for edp-datasource-placeholders as well 13:09:07 #info edp-datasource-placeholders likely FFE 13:09:16 the others shall merge before the k3 date 13:09:22 yup 13:09:40 OK, any k3-blocking bug ? Or can I just defer them to rc1 if they miss the target ? 13:09:43 I expect sahara to be ready for tag cut on Thu EU morning 13:09:59 no k3-blocking bugs 13:10:33 I've cleaned up the bugs list, so, there are only a few of them and I think they will be merged by k3 13:10:42 OK. I had a question for you on dashboard, extra and image-elements 13:10:50 and anyway they aren't important 13:10:53 yeah 13:10:53 how many of those are actually still in use ? 13:11:05 dashboard is only for stable/icehouse 13:11:13 starting with Juno it moved to the horizon repo 13:11:26 extra and s-i-e are in use 13:11:29 what is the release model for extra and image-elements ? 13:11:39 once at the end of the 6-month cycle ?
13:11:54 the same as for sahara - tagging every 6 months, ~ the same day as sahara 13:12:05 ok, that works 13:12:11 it's needed for coupling images built for sahara and the tools used inside of them 13:12:19 I'll apply the corresponding tag as needed :) 13:12:25 okay, thx :) 13:12:38 you don't use intermediary tags on those, right ? 13:12:41 * SergeyLukjanov lurking for gov changes :) 13:12:58 ttx, we're using the kilo-X tags as well 13:13:05 ok 13:13:19 That's all the questions I had, thx 13:13:23 ttx, in fact all tags done for sahara are done for both extra and image-elements repos as well 13:13:34 ack. have a good day! 13:13:35 oh, I have a thing to share 13:13:44 listening 13:13:50 we have an issue with the src juno job for the sahara client 13:14:01 maybe #info it 13:14:20 we're now investigating it and we'd like to make it non-voting 13:14:27 for some time to avoid blocking client dev for kilo 13:14:50 heat-engine fails to load because saharaclient is installed from source and heat checks versions 13:15:05 and it's extremely strange that only the sahara job is failing 13:15:16 we're now trying to understand it 13:15:34 https://review.openstack.org/#/c/165038/ 13:16:24 ok 13:16:27 #info gate-tempest-dsvm-neutron-src-python-saharaclient-juno is now failing due to a very strange issue - heat-engine fails to load because saharaclient is installed from source and heat checks versions and finds an incorrect version (not <=0.7.6 defined in requirements.txt) 13:16:38 nothing else from my side 13:16:39 thx 13:16:44 cheers 13:17:08 ttx, have a good day! 13:36:27 dhellmann: https://review.openstack.org/#/c/162656/ failed gate on some horizon startup failure 13:38:36 Interesting: http://logstash.openstack.org/#eyJzZWFyY2giOiJcIm9wdC9zdGFjay9uZXcvZGV2c3RhY2svZXhlcmNpc2VzL2hvcml6b24uc2g6Mzk6ZGllXCIiLCJmaWVsZHMiOltdLCJvZmZzZXQiOjAsInRpbWVmcmFtZSI6IjE3MjgwMCIsImdyYXBobW9kZSI6ImNvdW50IiwidGltZSI6eyJ1c2VyX2ludGVydmFsIjowfSwic3RhbXAiOjE0MjY1OTk1MDQ0ODd9 14:25:53 thingee: if you want to go at your usual USA DST time, we can do it now 14:26:11 since I'm around anyway 14:26:40 ha I was just thinking of that 14:26:47 your call 14:27:01 sure sounds good to me 14:27:06 #topic Cinder 14:27:12 #link https://launchpad.net/cinder/+milestone/kilo-3 14:27:25 So you seem to be all-set featurewise 14:27:47 but you might want to include a few more bugfixes before tagging ? 14:28:27 yea, we still got some we have to take care of. 14:28:32 and me as well ;) 14:28:40 Are any of those targeted bugs k3-critical ? Or can I just move them if they happen not to make it at tag time ? 14:29:17 I just went through the ones targeted and think they have a fair shot. They're all in review. 14:29:32 I removed the ones that have no reviews at this point. 14:29:37 OK, let's try to use the list as targets then 14:29:52 and tag when the list is empty, and revise that on Thursday if necessary 14:30:03 sounds good 14:32:46 see PM for a few security issue pings 14:32:52 That's all I had 14:33:06 it's the calmest Cinder FF in a long time :) 14:33:46 :) 14:34:07 see the ML. I'm not a popular person 14:34:08 I'll let you go back to debunking security issues and reviewing bugfix patches. 14:34:23 though I'm not sad to say no to patches coming in on March 10th for new features. 14:35:13 ok, have a great day then 14:35:41 have a great rest of your day!
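(On the src-juno saharaclient failure SergeyLukjanov flagged in the Sahara slot above: a client installed from git gets a pbr-generated dev version string, which can fall outside a consumer's declared cap, and heat-engine's dependencies are checked at load time. A minimal reproduction of that failure mode, using the <=0.7.6 cap from the log; the dev version shown is illustrative:)

```python
# Minimal reproduction of the failure mode described in the Sahara slot:
# heat-engine's dependencies are resolved through pkg_resources, which
# enforces declared caps. A saharaclient installed from git reports a
# pbr-generated dev version (e.g. 0.7.7.dev4) that falls outside the
# '<=0.7.6' cap, so the check raises before the service even starts.
import pkg_resources

try:
    pkg_resources.require('python-saharaclient<=0.7.6')
except (pkg_resources.VersionConflict,
        pkg_resources.DistributionNotFound) as exc:
    print('heat-engine would fail to load: %s' % exc)
```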
14:35:44 nikhil_k: interested in speaking now ? I might have a call interrupting our usual time 14:35:53 mestery: same 14:36:05 ttx: sure, trying to do a double meeting 14:36:13 the other is audio 14:36:20 I'll go slowly 14:36:23 #topic Glance 14:36:27 thanks 14:36:37 #link https://launchpad.net/glance/+milestone/kilo-3 14:36:44 3 BPs left on the docket 14:37:06 you sent an email on those, let me check 14:37:50 ok, so those are all candidates for a FFE if they don't make it 14:38:01 How far are they from completion ? 14:38:43 artifact-repository seems to be a half-dozen patches away, but mostly proposed 14:38:48 ttx: yes, I'm tentative on all three at this point. Catalog Index Service is still pending some last-minute discussion/blockers. Artifacts is stuck on a lot of reviews and the policy strengthening is waiting on one confirmation due to a possible conflict/overlap with another feature. 14:39:20 ok, would be good to have at least one of those in by the deadline to limit the number of exceptions 14:39:28 #info last 3 BPs are likely to require exceptions 14:39:54 ttx: how long can we go with FFE? 14:40:02 nikhil_k: on the targeted bugs side, I assume nothing is k3-critical and everything can be deferred to rc1 if not fixed by Thursday 14:40:09 1 week / 2 weeks? 14:40:10 nikhil_k: ideally max 10 days 14:40:15 gotcha 14:40:22 (i.e. end of month) 14:40:29 sure thing 14:40:42 ttx: here when ready 14:40:43 ttx: about the bugs 14:40:44 #info Bugs can be deferred if needed 14:40:59 ttx: this one https://review.openstack.org/#/c/119824/11 might be tricky https://launchpad.net/bugs/1365436 14:41:00 Launchpad bug 1365436 in Glance "Database schema differs from migrations" [High,In progress] - Assigned to Oleksii Chuprykov (ochuprykov) 14:41:13 I will try to review soon 14:41:20 (that's it from me) 14:41:48 nikhil_k: ok, keeping an eye on it 14:42:26 #info Bug 1365436 should be solved before k3 ideally 14:42:27 bug 1365436 in Glance "Database schema differs from migrations" [High,In progress] https://launchpad.net/bugs/1365436 - Assigned to Oleksii Chuprykov (ochuprykov) 14:42:59 thanks! 14:42:59 nikhil_k: that is all I had. We'll revisit progress on Thursday 14:43:03 cheers 14:43:11 sure 14:43:13 mestery: o/ 14:43:19 #topic Neutron 14:43:23 ttx: o/ 14:43:40 #link https://launchpad.net/neutron/+milestone/kilo-3 14:43:52 hmm, lots of open stuff there 14:44:01 Yup, so many are very close 14:44:08 But the reality is I need to start moving some of them out 14:44:23 We had a gate issue which killed our Fri-Mon the past weekend which was awful :( 14:44:30 you need to sort them so that those that are 99% there get priority reviews 14:44:39 Yes. 14:45:05 because otherwise they will all be at 98% on Thursday and just generate more pain 14:45:26 #info 34 open BPs 14:45:38 Indeed, I'm on that right after this meeting and will get it down to the actual 5-8 that will make it 14:45:52 #info need to sort between FFE candidates, nearly-there and wont-make-it 14:46:32 How about we discuss again tomorrow and see how much you managed to clean up ? 14:46:43 Works for me. 14:46:53 The javelin/grenade issue just killed us, apologies for being behind a bit 14:47:02 I spent yesterday shepherding that and trying to recheck some stuff. *sigh* 14:47:06 On the bugs side, anything that should be k3-blocking ? 14:47:15 * mestery finds the one 14:47:15 Or just move them all to RC1 as-needed ?
14:47:23 Well, actually, no, none are K3-blocking 14:47:27 We have a few that may be RC blockers, but not K3 14:47:35 they can be RC1-blocking alright 14:47:43 ok, so I'll just defer the open ones when we tag 14:48:02 Thanks ttx! 14:48:04 ++ to that 14:48:12 That is all I had 14:48:18 Talk to you tomorrow then 14:48:23 Me too, back to cleanup now 14:48:23 yes sir 14:48:25 later! 14:48:30 cheers 15:40:42 david-lyle: I'll be 5min late 15:40:55 ttx: no worries 15:44:29 #topic Horizon 15:44:33 david-lyle: hi! 15:44:46 ttx: hi 15:44:49 #link https://launchpad.net/horizon/+milestone/kilo-3 15:45:12 22 still on the docket, time for some pre-FF cleanup 15:45:33 yes indeed 15:45:41 will cull today, the process continues 15:45:52 had to resharpen the cleaver 15:46:10 yeah, would be good to split between FFE candidates, nearly-there and wont-make-it 15:46:28 so that you can actually focus on the 2nd category rather than dilute review effort 15:46:51 I have some status updates to roll in too 15:47:13 How about you process that and we discuss the status of the remaining items tomorrow ? 15:47:16 I believe the launch instance bp finished merging last night 15:47:22 sure 15:47:40 On the bugs side I can't find anything k3-critical 15:47:47 so I'll just defer whatever didn't make it at tag time 15:48:03 yeah, nothing critical in the current list 15:48:10 I'm sure something will spring up 15:48:50 I had a question about the tuskar-ui release model, if any 15:48:56 since it's a repo listed under horizon 15:49:27 IIRC I just tag it around release time 15:49:52 ok, so one tag every 6 months around cycle end ? 15:50:12 that's been the cadence 15:50:17 ok thx 15:51:03 david-lyle: and in that discussion here yesterday we said we'd probably need stable branches for django_openstack_auth, right 15:51:49 yes, trying to sort out what that means global-requirements wise 15:51:58 ok 15:52:26 so I'll be talking to you again tomorrow, in the meantime please refine and cut the k3 list 15:52:44 I could create a stable branch for 1.1.7 and use that for icehouse and juno, but the keystoneclient dependency isn't exact with the global-requirements for those branches 15:52:48 I created l1 milestones if you want to defer stuff 15:53:06 does that have to precisely match? 15:53:30 it requires keystoneclient slightly newer than the minimum 15:53:36 david-lyle: I think it does, but might want to cross-check with dhellmann 15:53:51 that's going to be a difficult shift then 15:54:00 will continue to look into it 15:54:31 might be nice to pin in requirements for current stable and cut a stable branch for kilo moving forward 15:54:31 ack, off-sync discussion 15:54:35 sure 15:54:48 sure, for kilo we can certainly converge 15:55:43 david-lyle: will talk to you more tomorrow 15:55:57 ttx: sounds good, thanks! 16:51:23 morganfainberg: around? 16:51:41 ttx: o/ 16:51:47 #topic Keystone 16:51:54 #link https://launchpad.net/keystone/+milestone/kilo-3 16:52:01 3 BPs on the list 16:52:09 https://blueprints.launchpad.net/keystone/+spec/klw-tokens 16:52:57 Thought that was mostly in 16:53:01 Klw-tokens is almost done. We need a couple items reviewed; should gate today. 16:53:06 ok 16:53:14 https://blueprints.launchpad.net/keystone/+spec/mapping-enhancements 16:53:35 looks done to me ? 16:53:46 or maybe pending https://review.openstack.org/#/c/163172/ 16:53:50 Not sure. Might be 1 patch 16:53:57 It's also very close.
16:54:07 https://blueprints.launchpad.net/keystone/+spec/model-timestamps 16:54:17 also seems one patch away 16:54:27 The last one is a cleanup bp and should have no impact on the API. It can move to post k3 16:54:46 and it will move to post k3 or l1 16:54:56 #info 2 BPs very close, should make it today/tomorrow, last one (model-timestamps) can be deferred 16:55:05 Yep. 16:55:29 OK, sounds pretty close 16:55:45 On the bug side, I assume nothing is k3-critical and all non-merged can be deferred to rc1 16:56:05 There will be 2 FFEs requested. One has all the code ready, just needs review, can't land it all before freeze though. The second is landing a couple final patches for a feature. 16:56:23 I suggest we reconvene on Thursday to tag, or ping me if earlier 16:56:26 Bugs are fairly clear, nothing critical on my radar yet. 16:56:38 I had a few questions for you 16:56:44 Sure. 16:57:08 first would you be available at the cross-project meeting today to discuss progress (or absence thereof) on the new client in openstack-sdk 16:57:23 we discussed that back in December and promised a status update in Feb/Mar 16:57:35 can be quick :) 16:57:52 Sure. And the answer is lack of progress afaik. I'll double-check with jamielennox though. 16:58:07 second, I need to clarify a few release models for your secondary repositories 16:58:22 keystonemiddleware, pycadf and python-keystoneclient-* 16:58:52 keystonemiddleware seems released as-needed + one at the end of the cycle, is that a fair assumption ? 16:59:04 Pycadf is like Oslo packages 16:59:13 ok so same as above 16:59:18 Middleware is like client and typically released as needed. 16:59:32 And yes, one at end of cycle. 16:59:39 and same for the client-* 16:59:49 Yep. 17:00:01 OK makes all sense 17:00:05 That is all I had 17:00:10 anything on your side ? 17:00:21 Nope. 17:00:30 alright then, talk to you Thursday if not earlier 17:00:32 notmyname: around? 17:00:40 good morning 17:00:42 #topic Swift 17:00:45 notmyname: hi! 17:00:53 hello 17:00:54 Same question as for Morgan 17:01:03 would you be available at the cross-project meeting today to discuss progress (or absence thereof) on the new client in openstack-sdk 17:01:14 we discussed that back in December and promised a status update in Feb/Mar 17:01:25 can be quick 17:01:38 if it's the first agenda item. 17:01:46 ok, that's the plan 17:01:48 I was double-booked with something else for the bottom of the hour 17:02:22 notmyname: second question would be on swift-bench -- I assume it's using the release:independent model, is that a good assumption ? 17:02:38 same as python-swiftclient 17:02:51 fwiw swift seems to be release:managed, release:at-end-of-6mo-cycle *and* release:independent 17:03:13 notmyname: ok 17:03:19 the managed one is surprising. is that because of the end of cycle one? 17:03:38 I'd expected end-of-6mo and independent 17:03:42 notmyname: well, currently I still do the swift releases 17:03:48 ah, I see 17:04:02 that makes sense then 17:04:07 I agree that "managed" is tied to milestones, so I might not apply that 17:04:14 and do courtesy handling 17:04:24 and/or give you the tools and have you run with them 17:04:30 we'll refine as we go 17:04:41 ok. that's something to consider (after kilo) 17:04:52 my added value in the swift release process is pretty slim 17:05:10 if I can get my tools properly documented, that would be an option 17:05:18 anyway, post-kilo 17:05:37 hmm... rotating that among swift-core would be interesting 17:05:40 anyway, post-kilo 17:05:42 how is development coming along ?
17:05:49 since last week ? 17:05:57 for now, we're furiously working on EC stuff 17:06:04 tons to do 17:06:07 but here's the plan: 17:06:29 when it's ready for master (end of next week-ish), we'll freeze master and work on the merge to master 17:07:03 like with storage policies, we'll tag the current feature branch (for history) and build a new patchset for the merge. i.e. cleaned-up work 17:07:29 and, like with storage policies, I'd like to work with -infra to land it as one merge commit instead of a fast-forward merge 17:07:49 and that gets us to the April 10-ish target for an RC 17:07:56 sounds good 17:08:02 we'll keep track of that 17:08:09 there are a few other non-EC things I'm tracking 17:08:16 but the focus is EC 17:08:39 OK... anything else ? 17:08:59 I proposed a bunch of tests to defcore 17:09:07 the in-tree functional tests 17:09:15 to the objectstore capability 17:09:44 in addition to the existing 15 tempest tests listed, I added references to 279 swift functional tests 17:09:54 heh, ok 17:10:14 #link https://review.openstack.org/#/c/164865/ 17:10:32 any other news ? 17:10:36 so I think that's an important pattern for the TC->code->BoD interaction 17:11:06 no, I think that's it. tons of work on EC. scary. committed to a beta. scope will be cut before the feature 17:11:20 alright, I'll let you run then 17:11:28 devananda: you around ? 17:11:34 notmyname: have a good week 17:11:38 thanks. have a good day 17:11:39 ttx: hi! 17:11:43 here he is 17:11:46 #topic Ironic 17:11:53 #link https://launchpad.net/ironic/+milestone/kilo-3 17:12:45 reading your email 17:13:09 I can paste the cliff notes here for record-keeping 17:13:15 sure 17:13:29 we now have 7 BPs still open 17:14:00 https://blueprints.launchpad.net/ironic/+spec/implement-cleaning-states hit an issue with devstack configuration that's blocking landing the code in ironic 17:14:24 once we can get that in (or work around it temporarily) that should be ready to merge 17:14:50 we can easily grant FFE to cover for the landing issue if that needs a few days 17:15:04 3 more BPs are driver-specific and I'd really like to get them in, if possible 17:15:14 we've already bumped several other driver-specific pieces to Liberty 17:15:30 devananda: are they likely to hit the Thursday deadline ? 17:15:36 or will require FFE 17:15:59 right now, I'm about 50/50 on whether all of this will land by thursday 17:16:33 but - if it's not in by friday, I may just cut things that are driver-specific 17:16:34 the driver-specific ones are ilo-cleaning-support ilo-properties-capabilities-discovery and... ? 17:16:46 uefi-secure-boot 17:17:20 actually, sorry, uefi secure boot touches all the drivers 17:17:34 what about the last two (local-boot-support-with-partition-images & ipa-as-default-ramdisk) ? How close are they ? 17:17:34 it's the other one that might delay the release 17:17:55 automate-uefi-bios-iso-creation is the other driver-specific one, but it's also very small at this point and will probably land today 17:18:25 those two, I think, are done, though they are tagged on a UEFI-related review, and I'm waiting for feedback from the developers as to why 17:19:04 ok, let's discuss progress tomorrow (your morning) and see where we are at that point 17:19:11 sounds good 17:19:31 Had a question about ironic-python-agent 17:19:37 sure 17:19:45 I suspect it's released as-needed 17:20:00 any reason we'd do a stable branch for it ? 17:20:00 indeed 17:20:08 possibly 17:20:09 i.e. have a release coinciding with the 6-month cycle ?
17:20:24 one of the story arcs of this cycle has been adding microversion support 17:20:43 there are two open reviews related to that on the server side, and a -lot- of work still in the client to fully utilize it 17:21:09 as we're now versioning the server API, it might make sense to tag clients at the same time 17:21:19 hmm, ok 17:21:27 OTOH, no future client release should ever be unable to talk to an old server 17:21:34 (within the supported window, which is TBD) 17:21:56 so making a statement like "to use the juno server, you need the juno client" -- we expressly do not want to do that 17:22:01 trying to see if release:at-6mo-cycle-end would apply to it 17:22:42 or if we'd need something finer for clients/libraries/agents 17:22:50 I will no doubt tag a client somewhere around the time we tag kilo rc, and probably again around kilo-final, and after that i'll tag it again ... 17:23:21 right, sounds like "having a stable branch" should be separated from "releasing at the end of the cycle" 17:23:49 I don't see a need to release the client and server in lock-step. and I don't want to, explicitly or implicitly, require folks to install an older client to talk to an older server 17:24:07 the more i think through it, the less i think a stable branch of python-ironicclient makes sense 17:24:20 yeah, so including clients on the tag might communicate the wrong intent 17:24:26 right 17:24:27 anyway, side discussion 17:24:31 thx for the input though 17:24:35 sure! 17:24:51 devananda: questions on your side ? 17:25:12 ttx: nope. I'm going to keep at the status page, trying to get everything green 17:25:20 tty tmw. same time? 17:25:22 alright, talk to you tomorrow 17:25:24 yes 17:25:39 SlickNik: around? 17:27:01 ttx: o/ 17:27:04 #topic Trove 17:27:11 #link https://launchpad.net/trove/+milestone/kilo-3 17:27:21 Looks like 6 BPs still on the fence 17:27:31 https://blueprints.launchpad.net/trove/+spec/mgmt-show-deleted 17:27:44 This one looks relatively close 17:28:11 Yes, that should make it in by today, I'm guessing. 17:28:21 Close to the finish line. 17:28:22 https://blueprints.launchpad.net/trove/+spec/replication-v2 -- not very far either afaict 17:29:28 https://blueprints.launchpad.net/trove/+spec/vertica-db-support -- one patch pending too, but admittedly a large one ? 17:29:54 same for https://blueprints.launchpad.net/trove/+spec/couchdb-plugin-trove 17:30:31 https://blueprints.launchpad.net/trove/+spec/implement-vertica-cluster -- depends on the other, so likely to be late 17:30:50 https://blueprints.launchpad.net/trove/+spec/guest-rpc-ping-pong -- close to merge 17:30:52 https://blueprints.launchpad.net/trove/+spec/implement-vertica-cluster I'm concerned about. 17:31:11 yes, it's also a feature needing testing, so deferring might be the right call 17:31:18 rather than pushing it in late 17:32:03 How about we revisit tomorrow, once most of those are in ? 17:32:11 ttx: agreed. I'll talk with the author (sushil) to see what his take on that is. 17:32:27 Would be great to push most in in the meantime 17:32:30 ttx: sounds good. I think at least 4-5 of those will have merged by then. 17:32:36 ack 17:32:50 On the bugs side, anything k3-critical ? 17:32:53 I'm working on getting folks to review the patches by then 17:32:58 Or can I just defer to RC1 anything that doesn't make it ? 17:33:41 ttx: We've fixed all the ones that are critical already.
17:34:06 So I'd defer anything in kilo-3 that doesn't make it (bugwise) to RC1 17:34:15 alright then, I'll let you babysit patches and talk to you tomorrow 17:34:52 sounds good. will ping you if something comes up. 17:35:07 wfm 17:35:08 #endmeeting