09:00:28 <ttx> #startmeeting ptl_sync
09:00:29 <openstack> Meeting started Tue Mar 17 09:00:28 2015 UTC and is due to finish in 60 minutes.  The chair is ttx. Information about MeetBot at http://wiki.debian.org/MeetBot.
09:00:30 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
09:00:33 <openstack> The meeting name has been set to 'ptl_sync'
09:00:36 <ttx> #topic Heat
09:00:43 <asalkeld> ttx sorry, i missed last week's
09:00:48 <ttx> #link https://launchpad.net/heat/+milestone/kilo-3
09:00:56 <ttx> asalkeld: not a problem, I skipped most of them anyway
09:01:08 <ttx> So... crunch time
09:01:16 <ttx> 6 BPs up
09:01:19 <asalkeld> so most are very close to getting  merged
09:01:29 <ttx> Let's go through them one by one
09:01:30 <asalkeld> so i am not very stressed about them
09:01:34 <asalkeld> ok
09:01:36 <ttx> https://blueprints.launchpad.net/heat/+spec/convergence-resource-table
09:01:51 <ttx> Only needing https://review.openstack.org/#/c/156693/ ?
09:01:58 <asalkeld> that's a one patch spec
09:02:00 <asalkeld> yip
09:02:05 <asalkeld> should get in soon
09:02:08 <ttx> https://blueprints.launchpad.net/heat/+spec/decouple-nested
09:02:14 <asalkeld> that's mine
09:02:25 <asalkeld> *should* get it in time
09:02:39 <asalkeld> 3 patches up, all working well
09:02:54 <asalkeld> some dissent amongst the minions ;)
09:03:06 <ttx> https://blueprints.launchpad.net/heat/+spec/convergence-lightweight-stack
09:03:18 <asalkeld> that might well miss
09:03:32 <asalkeld> but it's only one patch
09:03:43 <asalkeld> so willing to wait until the death
09:03:49 <ttx> I see two patches
09:04:11 <ttx> If that one misses, do you intend to request FFE for it ?
09:04:26 <asalkeld> ttx: it can just wait
09:04:27 <ttx> If not, maybe prioritize to Medium
09:04:31 <asalkeld> ok
09:04:40 <ttx> I assume the first two would get exceptions
09:04:46 <ttx> if they missed
09:04:58 <ttx> https://blueprints.launchpad.net/heat/+spec/convergence-push-data
09:05:20 <asalkeld> that should be ok
09:05:26 <asalkeld> again one patch
09:05:44 <ttx> OK, it's linked to the other
09:05:56 <ttx> https://blueprints.launchpad.net/heat/+spec/mistral-resources-for-heat
09:06:06 <ttx> Looks like 2 patches
09:06:24 <asalkeld> so this is a bp for code that goes into contrib/
09:06:36 <asalkeld> so IMO it can go in any time
09:06:46 <asalkeld> (does not affect strings or deps)
09:06:51 <ttx> yeah, at least could get an exception as "self-contained"
09:07:00 <asalkeld> and gets installed separately
09:07:00 <ttx> https://blueprints.launchpad.net/heat/+spec/stack-breakpoint
09:07:23 <asalkeld> there are 2 or 3 guys trying hard to get that in
09:07:29 <asalkeld> it should make it
09:07:44 <asalkeld> needed for tripleo
09:07:52 <ttx> ok we'll see how it goes
09:08:10 <asalkeld> yeah, we can wait
09:08:20 <ttx> On the bugs side... anything k3-critical that we would not just defer to rc1 ?
09:08:24 <asalkeld> they are all on the edge of getting merged
09:08:47 <asalkeld> i think just defer everything that's not merged
09:09:04 <ttx> #info k3 bugs can be deferred to rc1 if incomplete
09:09:26 <ttx> #info 6 BPs left, all still standing a reasonable chance of making it
09:09:37 * asalkeld is hopeful
09:10:11 <ttx> OK, that's all I guess. We'll make a new status point on Thursday (your evening, my morning)
09:10:26 <ttx> hopefully for the final tag / early FFE decisions
09:10:29 <asalkeld> ok - same time?
09:10:46 <asalkeld> (just want to make a reminder)
09:10:46 <ttx> sounds good. Ping me if I don't talk
09:10:59 <asalkeld> ok, chat then
09:11:21 <ttx> see you!
09:11:40 <ttx> zz_johnthetubagu: around?
09:17:06 <ttx> johnthetubaguy: o/
09:17:25 <johnthetubaguy> ttx: hi
09:17:28 <ttx> #topic Nova
09:17:35 <ttx> #link https://launchpad.net/nova/+milestone/kilo-3
09:17:52 <ttx> That's still 12 BPs in progress
09:18:05 <ttx> Let's see how far they are
09:18:11 <ttx> https://blueprints.launchpad.net/nova/+spec/v2-on-v3-api
09:18:29 <ttx> I see 4 patches on https://review.openstack.org/#/q/topic:bp/v2-on-v3-api,n,z
09:18:53 <ttx> But the last says it partially implements so it looks like there is more to it
09:19:49 <johnthetubaguy> hmm, I thought they were done there now, pretty much, but I am rarely online the same time as those folks so its tricky relaying info
09:20:19 <ttx> Those might be add-on patches
09:20:20 <johnthetubaguy> ah, yes, I think a lot of this is now unit test clean up
09:20:52 <ttx> ok, if that's the case it would be good to stop mentioning the BP in the patches at some point and consider it "finished"
09:20:53 <johnthetubaguy> well, general cleanups anyways
09:21:02 <ttx> https://blueprints.launchpad.net/nova/+spec/api-microversions
09:21:05 <johnthetubaguy> agreed, but folks do like the tracking
09:21:07 <ttx> seems to be in the same situation
09:21:12 <johnthetubaguy> yes
09:21:37 <ttx> #info v2-on-v3-api and api-microversions are "mostly finished"
09:21:46 <ttx> https://blueprints.launchpad.net/nova/+spec/cells-v2-mapping
09:22:00 <ttx> This one will be partial IIUC
09:22:24 <ttx> in which case it is "mostly finished" too, I suspect
09:22:51 <johnthetubaguy> yes, there are a few patches, I think they wanted, but as you say, its kinda partial anyways
09:23:00 <ttx> https://blueprints.launchpad.net/nova/+spec/kilo-objects
09:23:15 <johnthetubaguy> this one is basically tracking cleanups
09:23:21 <ttx> Feels like another "what's in kilo is kilo"
09:23:27 <johnthetubaguy> right
09:23:47 <johnthetubaguy> its a tracking placeholder to help with reviews more than anything
09:23:51 <ttx> #info cells-v2-mapping and kilo-objects are partial work anyway, so what's in kilo is kilo
09:24:02 <ttx> https://blueprints.launchpad.net/nova/+spec/online-schema-changes
09:24:12 <johnthetubaguy> one last patch there I think
09:24:26 <ttx> pending on https://review.openstack.org/#/c/154521/
09:24:43 <ttx> #info online-schema-changes pending on https://review.openstack.org/#/c/154521/ -- looking good
09:24:55 <ttx> it says "partially implements" though
09:25:57 <ttx> https://blueprints.launchpad.net/nova/+spec/make-resource-tracker-use-objects
09:26:11 <johnthetubaguy> yeah, thats true, I will check, but I thought that was it
09:26:27 <ttx> Lots of patches up
09:26:46 <ttx> This one sounds more at risk
09:27:00 <ttx> although it's marked partial too
09:27:17 <johnthetubaguy> oh yes, I think you are right, that one was close to getting kicked out already
09:27:37 <ttx> #info make-resource-tracker-use-objects might be so partial it could be considered deferred
09:27:46 <ttx> https://blueprints.launchpad.net/nova/+spec/detach-service-from-computenode
09:28:07 <johnthetubaguy> I will check out that one closer after the meeting
09:28:23 <ttx> Not sure what it's waiting on
09:28:37 <johnthetubaguy> it might be done now, as much as it will be in kilo, let me check
09:29:05 <ttx> you can check after meeting, no worry
09:29:16 <johnthetubaguy> yep is done, although its technically, a bit partial
09:29:22 <ttx> #info detach-service-from-computenode might be considered finished
09:29:38 <ttx> OK, medium stuff now
09:29:43 <ttx> https://blueprints.launchpad.net/nova/+spec/keypair-x509-certificates
09:30:21 <ttx> Remaining patches are novaclient and a bug fix
09:30:37 <johnthetubaguy> yes, they need work really
09:30:43 <ttx> so we could move them to RC1 if necessary
09:30:56 <ttx> and consider this finished
09:31:12 <johnthetubaguy> yes, I think thats probably easiest, I might open bugs for those patches
09:31:14 <ttx> #info keypair-x509-certificates could just be considered finished and remaining bugs pushed to RC1 if need be
09:31:26 <ttx> https://blueprints.launchpad.net/nova/+spec/v3-api-policy
09:31:39 <ttx> This one is marked partial, so probably another of those "what's in kilo is kilo"
09:32:08 <johnthetubaguy> possibly, there are a few patches they really want in kilo, its on the etherpad I think
09:32:08 <ttx> Loads of patches there
09:32:22 <johnthetubaguy> yeah, it's a stretch to get those all in
09:32:33 <ttx> #info v3-api-policy needs work, will be partial
09:32:46 <ttx> https://blueprints.launchpad.net/nova/+spec/ec2-api
09:33:04 <ttx> Isn't that finished?
09:33:53 <johnthetubaguy> I think it is
09:33:56 <ttx> #info ec2-api is probably finished
09:34:02 <ttx> https://blueprints.launchpad.net/nova/+spec/servicegroup-api-is-up-host-topic
09:34:34 <ttx> A bit unclear what it's waiting on
09:34:42 <ttx> also marked partial
09:35:22 <johnthetubaguy> I think its this one: https://review.openstack.org/#/c/149924/
09:36:00 <ttx> #info servicegroup-api-is-up-host-topic pending on https://review.openstack.org/#/c/149924/
09:36:08 <johnthetubaguy> its a clean up though, so I might just remove the blueprint tag, lets see
09:36:29 <ttx> https://blueprints.launchpad.net/nova/+spec/hyper-v-test-refactoring
09:36:37 <ttx> That one looks forever partial
09:37:05 <ttx> and candidate for deferral
09:37:15 <ttx> #info hyper-v-test-refactoring may or may not make it
09:37:29 <johnthetubaguy> yeah, I vote we defer the BP, I mean its unit tests, so they can merge later if needed I guess
09:37:45 <ttx> yeah, I'm more afraid of the distraction than of the impact
09:38:34 <ttx> #info 10 blueprints left, mostly partial & "what's in kilo will be kilo" work anyway
09:38:56 <ttx> Quick look on the bugs side, see if any of those "critical" are actually k3-critical
09:39:09 <ttx> like regressions
09:39:14 <ttx> or showstoppers
09:40:12 <ttx> https://bugs.launchpad.net/nova/+bug/1383465 is unlikely to be solved in the next two days
09:40:13 <openstack> Launchpad bug 1383465 in OpenStack Compute (nova) "[pci-passthrough] nova-compute fails to start" [Critical,In progress] - Assigned to Yongli He (yongli-he)
09:40:26 <johnthetubaguy> nothing that stops kilo-3 from a quick look, right, thats the only big one
09:40:35 <johnthetubaguy> it got discussed in the nova meeting again
09:40:43 <johnthetubaguy> trying to move that forward before kilo ships
09:40:57 <johnthetubaguy> it was targeted to kilo-1 I think
09:41:16 <ttx> and passed forward
09:41:34 <ttx> https://bugs.launchpad.net/nova/+bug/1296414 is another of those long-standing issues
09:41:35 <openstack> Launchpad bug 1296414 in OpenStack Compute (nova) "quotas not updated when periodic tasks or startup finish deletes" [Critical,In progress] - Assigned to Rajesh Tailor (rajesh-tailor)
09:42:06 <ttx> ok, so I'll push to RC1 anything not merged by tag time
09:42:14 <ttx> #info Bugs to be deferred if not merged
09:42:34 <johnthetubaguy> sounds good
09:42:44 <johnthetubaguy> I will ask around to see if there are any exceptions to that
09:42:48 <ttx> johnthetubaguy: how about we meet again on Thursday and formally close all those partial things, then see if anything in the pipe justifies waiting ?
09:42:59 <johnthetubaguy> that sounds like a plan
09:43:04 <ttx> same time ?
09:43:13 <johnthetubaguy> yes, I will pop that in my calendar now
09:43:23 <ttx> sounds good, have a good day!
09:43:37 <johnthetubaguy> you too!
12:24:46 <dhellmann> ttx: ready when you are
12:25:16 <ttx> dhellmann: you might be one hour early, but we can probably do it nevertheless
12:25:37 <dhellmann> ttx: oh, right, forgot you're not on dst yet
12:25:57 <ttx> no pb, if you're available now, let's do this
12:26:03 <dhellmann> sure thing
12:26:05 <ttx> #topic Oslo
12:26:24 <ttx> A quick look at milestone pages
12:26:30 <ttx> #link https://launchpad.net/oslo/+milestone/kilo-3
12:26:58 <ttx> there is a bug targeted: https://bugs.launchpad.net/cinder/+bug/1417018
12:26:59 <openstack> Launchpad bug 1417018 in oslo.db "Cinder encounters dbapi error: NoSuchColumnError: "Could not locate column in row for column '%(140070887032400 anon)s.volumes_id'"" [Medium,In progress]
12:27:07 <ttx> is that something we want to fix and sync in kilo ?
12:27:10 <dhellmann> yeah, that one is actually done in oslo.db -- I'll be making a release later today
12:27:30 <dhellmann> we haven't settled on the best way to fix it in the incubator, but the oslo.db fix should remove the issue from cinder
12:27:35 <dhellmann> I'll update the targets
12:27:40 <ttx> ok cool
12:28:17 <ttx> https://launchpad.net/oslo/+milestone/next-kilo shows a bunch of PBR bugfixes
12:28:23 <ttx> is there a PBR release planned ?
12:28:43 <ttx> or would those actually appear in next-liberty ?
12:28:46 <dhellmann> ttx: not to include those items. I'll retarget them too
12:28:54 <ttx> ok
12:29:09 <ttx> Looking at https://review.openstack.org/#/c/162656/
12:29:12 <dhellmann> actually, I don't think whoever has been doing pbr releases has been using the script, so I'll verify that they're not already done
12:29:37 <ttx> do you think we should collect more +2s ? Or should I just W+1 it ?
12:30:09 <ttx> not sure why the subsequent reviewers didn't w+1
12:30:32 <dhellmann> yeah, sdague expressed some reservations and that may have made people hold off on the w+1
12:30:45 <dhellmann> I would like to get that in asap so we can also land the subsequent requirements updates to the projects
12:31:02 <dhellmann> the requirements freeze is this week, right?
12:31:03 <ttx> yeah, timing is tight
12:31:04 <sdague> jogo is +1 on that now, I'll approve
12:31:17 <dhellmann> sdague: thanks, I was going to check in with you on that in a bit anyway
12:31:22 <sdague> that's going to trigger a ton of patch / test churn, btw
12:31:23 <ttx> requirements freeze is same as feature freeze yes
12:31:37 <dhellmann> sdague: yeah, that's why I did it all in one patch :-)
12:31:42 <sdague> :)
12:31:51 <ttx> sdague: should we generally try to do those one week earlier ?
12:31:59 <sdague> ttx: probably
12:32:28 <sdague> I expect we'll consume all the test nodes on the proposal bot changes for that one
12:32:31 <ttx> note that those could technically land post-requirements freeze, since requirements freeze is mostly to guard against additions
12:32:32 <dhellmann> I think we would have been ready a week earlier this cycle except for the deprecation warning issue
12:32:52 <dhellmann> ttx: ok, as long as it lands before the branches are made I think we're ok
12:32:56 <ttx> so I'm fine with those landing post-FF pre-RC1
12:33:07 <dhellmann> I'm OK with that if we want to preserve test nodes
12:33:09 <ttx> if we think approving this now is a bad idea
12:33:20 <dhellmann> I can update the policy document to reflect that change, too
12:33:24 <ttx> (requirements sync is part of the RC1 process)
12:33:31 <sdague> zuul still has capacity this morning
12:33:47 <sdague> so anyway, it's approved
12:33:54 <dhellmann> ok, thanks, sdague
12:34:36 <ttx> dhellmann: I also wanted to pick your brains on which projects actually could be considered having a final release per cycle (for the release:at-6mo-cycle-end tag application)
12:34:48 <ttx> First, looking at Oslo libs
12:35:25 <ttx> I suspect some of them don't really apply
12:35:30 <ttx> for example openstack-dev/pbr
12:35:33 <dhellmann> I want them all to be considered as following that plan, although we had a couple of stragglers this time. pbr and taskflow, I think.
12:35:50 <dhellmann> I need to get the pbr team in line with the rest of oslo
12:35:55 <ttx> what about openstack-dev/cookiecutter ?
12:36:04 <ttx> (and openstack-dev/oslo-cookiecutter)
12:36:33 <dhellmann> oh, we don't release that, that's just a template for making new projects
12:36:41 <dhellmann> you use it straight from the git url
12:36:46 <ttx> right, so those would not have release:* tags
12:36:52 <dhellmann> right
12:37:22 <ttx> so what should I do about taskflow/pbr ? Consider them in ?
12:37:36 <dhellmann> yes, let's consider them in and I'll talk to the maintainers
12:37:47 <ttx> Second question is about python-*client, to piggyback on your discussion with David Lyle recently
12:38:05 <ttx> I'm a bit confused if we should consider them in (as in having stable branches)
12:38:24 <ttx> I think they might be release:independent, really
12:38:37 <dhellmann> well, we need to start doing that, I think. We're capping the versions we use in the stable branches, so if we have to fix something the only way to release a new version would be to have a stable branch of the client
12:39:06 <dhellmann> I believe the d-g tests are already installing the services and tempest in separate site-packages (one is in a virtualenv, I'm not sure which)
12:39:22 <dhellmann> that lets us test  new clients from the outside of the cloud while using stable clients inside the cloud
12:39:47 <ttx> hmm, ok. So they should be at-6mo-end
12:39:57 <ttx> even if technically they don't have stable branches until needed
12:40:09 <dhellmann> yes, unless someone else comes up with an alternative to my stable-library spec
12:40:31 <ttx> Applying tags results in uncovering interesting corner cases to say the least
12:40:44 <dhellmann> true
12:41:01 <ttx> at the very least helps clarify things, even for us
12:41:18 <ttx> ok, I think that's all I had in store for you
12:41:27 <dhellmann> ok, I had a couple for you
12:41:58 <ttx> do we have any lib left to release for kilo ? You mentioned pbr ?
12:42:03 <dhellmann> that policy on stable branches for libs should be prioritized: https://review.openstack.org/155072
12:42:22 <ttx> yeah, I added it to today's meeting agenda
12:42:23 <dhellmann> I have a backported fix for oslo.db that I will release today, but I don't anticipate any other releases
12:42:27 <dhellmann> ok, good
12:42:35 <ttx> (cross-project meeting)
12:42:37 <dhellmann> and do we know the dates for the fall summit yet?
12:42:47 <ttx> in silence is consent mode
12:43:00 <ttx> I think we do, let me check
12:43:17 <dhellmann> I didn't see them on the event calendar on openstack.org, but I don't know how far out that actually goes
12:43:27 <ttx> https://www.openstack.org/summit/tokyo-2015/
12:43:40 <ttx> October 26 - 30, 2015
12:43:41 <dhellmann> ah, ok thanks
12:43:54 <ttx> That may not be linked to yet, but the page is public
12:44:01 <ttx> and the URL is not that hard to guess
12:44:10 <dhellmann> yeah, I should have tried that :-)
12:44:20 <ttx> actually it's linked
12:44:25 <ttx> https://www.openstack.org/summit/
12:44:47 <ttx> so consider it public :)
12:45:08 <dhellmann> I was looking on https://www.openstack.org/community/events/
12:45:31 <dhellmann> the calendar includes the spring summit, so I thought if the dates were set for fall that would be there, too, but maybe it's too far out
12:45:38 <dhellmann> anyway, thanks, I should have looked harder :-)
12:45:59 <dhellmann> that's all I had for this week
12:46:48 <ttx> dhellmann: so in summary, next steps are merging requirements patch (but that's arguably non-oslo-specific)
12:46:55 <ttx> and get that policy approved
12:47:05 <ttx> otherwise you're mostly done release-wise ?
12:47:20 <dhellmann> yes, I believe that's it
12:47:34 <ttx> dhellmann: we could push the kilo-3 tag on oslo-incubator now
12:47:37 <dhellmann> we'll obviously do patch releases if we need to, but the branches are created so we can do that
12:47:44 <ttx> since we did a 2015.1.0b2
12:48:19 <dhellmann> sure, we can push the tag. I think we said we would wait to do the stable branch on the incubator until the other projects had theirs, but the tag is easy
12:48:33 <dhellmann> do you want to do that, or shall I?
12:48:39 <ttx> I'll do it now
12:48:41 <dhellmann> ok
12:49:41 <ttx> done
12:49:47 <dhellmann> cool, thanks
12:50:07 <ttx> ok, thx and have a good day!
12:50:12 <dhellmann> you, too!
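The kilo-3 tag push on oslo-incubator discussed above can be sketched roughly as follows. Hedged assumptions: the tag name 2015.1.0b3 is inferred from the 2015.1.0b2 mentioned earlier (not stated in the log), and the demo runs in a throwaway repository rather than the real oslo-incubator checkout.

```python
# Simulated milestone tagging. Assumptions: kilo-3 follows the
# 2015.1.0b2 tag mentioned above, so it would be 2015.1.0b3; the real
# release flow uses a signed tag (-s) and pushes it to the gerrit remote.
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args):
    """Run a git command inside the demo repository and return its output."""
    result = subprocess.run(("git", "-C", repo) + args,
                            check=True, capture_output=True, text=True)
    return result.stdout.strip()

git("init", "-q")
git("config", "user.email", "demo@example.com")
git("config", "user.name", "demo")
git("commit", "-q", "--allow-empty", "-m", "incubator history")
git("tag", "-a", "2015.1.0b3", "-m", "kilo-3 milestone")  # real tags are signed
print(git("describe", "--tags"))  # -> 2015.1.0b3
```

In the real flow the signed tag is pushed to gerrit (roughly `git push gerrit 2015.1.0b3`), which is why the step only takes a minute once the milestone content is settled.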
13:05:22 <SergeyLukjanov> ttx, /me ready when you are
13:05:47 <ttx> SergeyLukjanov: hi!
13:05:53 <ttx> #topic Sahara
13:05:56 <SergeyLukjanov> ttx, hello!
13:05:58 <SergeyLukjanov> #link https://launchpad.net/sahara/+milestone/kilo-3
13:06:14 <SergeyLukjanov> so, we have mostly everything planned done
13:06:20 <ttx> 8 still up
13:06:27 <SergeyLukjanov> except few things that are now candidates for FFE
13:06:44 <SergeyLukjanov> stuff that is under review now is expected to be merged today/tomorrow
13:07:02 <ttx> SergeyLukjanov: ok
13:07:16 <ttx> What are the FFE candidates ? edp-job-types-endpoint ?
13:07:51 <SergeyLukjanov> ttx, yup, this one for sure, there is core stuff merged already, but plugin-related things are not yet merged, so, we'll need some more time for it
13:08:19 <ttx> #info edp-job-types-endpoint likely FFE
13:08:20 <SergeyLukjanov> the same for "Provide default templates for each plugin" https://blueprints.launchpad.net/sahara/+spec/default-templates
13:08:53 <ttx> #info default-templates likely FFE
13:08:53 <SergeyLukjanov> and probably for edp-datasource-placeholders as well
13:09:07 <ttx> #info edp-datasource-placeholders likely FFE
13:09:16 <ttx> the others shall merge before the k3 date
13:09:22 <SergeyLukjanov> yup
13:09:40 <ttx> OK, any k3-blocking bug ? Or can I just defer them to rc1 if they miss target ?
13:09:43 <SergeyLukjanov> I expect sahara to be ready for tag cut on Thu EU morning
13:09:59 <SergeyLukjanov> no k3-blocking bugs
13:10:33 <SergeyLukjanov> I've cleaned up the bugs list, so, there are only a few of them and I think they will be merged before k3
13:10:42 <ttx> OK. I had a question for you on dashboard, extra and image-elements
13:10:50 <SergeyLukjanov> and anyway they aren't important
13:10:53 <SergeyLukjanov> yeah
13:10:53 <ttx> how many of those are actually still in use ?
13:11:05 <SergeyLukjanov> dashboard is only for stable/icehouse
13:11:13 <SergeyLukjanov> starting from Juno it's moved to horizon repo
13:11:26 <SergeyLukjanov> extra and s-i-e are in use
13:11:29 <ttx> what is the release model for extra and image-elements ?
13:11:39 <ttx> once at the end of the 6-month cycle ?
13:11:54 <SergeyLukjanov> the same as for sahara - tagging each 6 month ~ the same day with sahara
13:12:05 <ttx> ok, that works
13:12:11 <SergeyLukjanov> it's needed for coupling images built for sahara and tools used inside of them
13:12:19 <ttx> I'll apply the corresponding tag as needed :)
13:12:25 <SergeyLukjanov> okay, thx :)
13:12:38 <ttx> you dont use intermediary tags on those, right ?
13:12:41 * SergeyLukjanov lurking for gov changes :)
13:12:58 <SergeyLukjanov> ttx, we're using the kilo-X tags as well
13:13:05 <ttx> ok
13:13:19 <ttx> That's all the questions I had, thx
13:13:23 <SergeyLukjanov> ttx, in fact all tags done for sahara are done for both extra and image-elements repos as well
13:13:34 <ttx> ack. have a good day!
13:13:35 <SergeyLukjanov> oh, I have a thing to share
13:13:44 <ttx> listening
13:13:50 <SergeyLukjanov> we have an issue with src juno job for sahara client
13:14:01 <ttx> maybe #info it
13:14:20 <SergeyLukjanov> we're now investigating it and we'd like to make it non-voting
13:14:27 <SergeyLukjanov> for some time to avoid blocking client dev for kilo
13:14:50 <SergeyLukjanov> heat-engine fails to load because saharaclient installed from source and heat checks versions
13:15:05 <SergeyLukjanov> and that's extremely strange why only sahara job is failing
13:15:16 <SergeyLukjanov> we're now trying to understand it
13:15:34 <SergeyLukjanov> https://review.openstack.org/#/c/165038/
13:16:24 <ttx> ok
13:16:27 <SergeyLukjanov> #info gate-tempest-dsvm-neutron-src-python-saharaclient-juno is now failing due to the very strange issue - heat-engine fails to load because saharaclient installed from source and heat checks versions and finds incorrect version (not <=0.7.6 defined in requirements.txt)
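The failure mode described in the #info above — a capped requirement rejecting a source-installed client — can be sketched with pkg_resources. The version strings below are illustrative: when pip installs from git, pbr derives a dev version (something like 0.7.7.dev42) from the latest tag, which falls outside a `<=0.7.6` cap even though the code is newer.

```python
# Illustrative requirement check in the pkg_resources style.
# A released version satisfies the cap; a pbr-generated dev version
# from a source install does not, which is how a service doing version
# checks can refuse to load against a source-installed client library.
from pkg_resources import Requirement

req = Requirement.parse("python-saharaclient<=0.7.6")

print("0.7.6" in req)        # released version: satisfies the cap
print("0.7.7.dev42" in req)  # dev version from source: outside the cap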
13:16:38 <SergeyLukjanov> nothing else from my side
13:16:39 <SergeyLukjanov> thx
13:16:44 <ttx> cheers
13:17:08 <SergeyLukjanov> ttx, have a good day!
13:36:27 <ttx> dhellmann: https://review.openstack.org/#/c/162656/ failed gate on some horizon startup failure
13:38:36 <ttx> Interesting: http://logstash.openstack.org/#eyJzZWFyY2giOiJcIm9wdC9zdGFjay9uZXcvZGV2c3RhY2svZXhlcmNpc2VzL2hvcml6b24uc2g6Mzk6ZGllXCIiLCJmaWVsZHMiOltdLCJvZmZzZXQiOjAsInRpbWVmcmFtZSI6IjE3MjgwMCIsImdyYXBobW9kZSI6ImNvdW50IiwidGltZSI6eyJ1c2VyX2ludGVydmFsIjowfSwic3RhbXAiOjE0MjY1OTk1MDQ0ODd9
14:25:53 <ttx> thingee: if you want to go at your usual USA DST time, we can do it now
14:26:11 <ttx> since I'm around anyway
14:26:40 <thingee> ha I was just thinking of that
14:26:47 <ttx> your call
14:27:01 <thingee> sure sounds good to me
14:27:06 <ttx> #topic Cinder
14:27:12 <ttx> #link https://launchpad.net/cinder/+milestone/kilo-3
14:27:25 <ttx> So you seem to be all-set featurewise
14:27:47 <ttx> but you might want to include a few more bugfixes before tagging ?
14:28:27 <thingee> yea, we still got some we have to take care of.
14:28:32 <thingee> and me as well ;)
14:28:40 <ttx> Are any of those targeted bugs k3-critical ? Or can I just move them if they happen not to make it at tag time ?
14:29:17 <thingee> I just went through the ones targeted and think they have a fair shot. They're all in review.
14:29:32 <thingee> I removed the ones that have no reviews at this point.
14:29:37 <ttx> OK, let's try to use the list as targets then
14:29:52 <ttx> and tag when list is empty, and revise that on Thursday if necessary
14:30:03 <thingee> sounds good
14:32:46 <ttx> see PM for a few security issues pings
14:32:52 <ttx> That's all I had
14:33:06 <ttx> it's the calmest Cinder FF in a long time :)
14:33:46 <thingee> :)
14:34:07 <thingee> see the ML. I'm not a popular person
14:34:08 <ttx> I'll let you go back to debunking security issues and review bugfix patches.
14:34:23 <thingee> though I'm not sad to say no to patches coming in on March 10th for new features.
14:35:13 <ttx> ok, have a great day then
14:35:41 <thingee> have a great rest of your day!
14:35:44 <ttx> nikhil_k: interested in speaking now ? I might have a call interrupting our usual time
14:35:53 <ttx> mestery: same
14:36:05 <nikhil_k> ttx: sure, trying to do a double meeting
14:36:13 <nikhil_k> another is audio
14:36:20 <ttx> I'll go slowly
14:36:23 <ttx> #topic Glance
14:36:27 <nikhil_k> thanks
14:36:37 <ttx> #link https://launchpad.net/glance/+milestone/kilo-3
14:36:44 <ttx> 3 BPs left on the docket
14:37:06 <ttx> you sent an email on those, let me check
14:37:50 <ttx> ok, so those are all candidates for a FFE if they don't make it
14:38:01 <ttx> How far are they from completion ?
14:38:43 <ttx> artifact-repository seems to be half-dozen patches away, but mostly proposed
14:38:48 <nikhil_k> ttx: yes, I'm tentative on all three at this point. Catalog Index Service is still pending some last minute discussion/blockers. Artifacts is stuck on a lot of reviews and the policy strengthening is waiting on one confirmation due to a possible conflict/overlap with another feature.
14:39:20 <ttx> ok, would be good to have at least one of those in by the deadline to limit the number of exceptions
14:39:28 <ttx> #info last 3 BPs are likely to require exceptions
14:39:54 <nikhil_k> ttx: how long can we go with FFE?
14:40:02 <ttx> nikhil_k: on the targeted bugs side, I assume nothing is k3-critical and everything can be deferred to rc1 if not fixed by Thursday
14:40:09 <nikhil_k> 1 week / 2 weeks?
14:40:10 <ttx> nikhil_k: ideally max 10 days
14:40:15 <nikhil_k> gotcha
14:40:22 <ttx> (i.e. end of month)
14:40:29 <nikhil_k> sure thing
14:40:42 <mestery> ttx: here when ready
14:40:43 <nikhil_k> ttx: about the bugs
14:40:44 <ttx> #info Bugs can be deferred if needed
14:40:59 <nikhil_k> ttx: this one https://review.openstack.org/#/c/119824/11 might be tricky https://launchpad.net/bugs/1365436
14:41:00 <openstack> Launchpad bug 1365436 in Glance "Database schema differs from migrations" [High,In progress] - Assigned to Oleksii Chuprykov (ochuprykov)
14:41:13 <nikhil_k> I will try to review soon
14:41:20 <nikhil_k> (that's it from me)
14:41:48 <ttx> nikhil_k: ok, keeping an eye on it
14:42:26 <ttx> #info Bug 1365436 should be solved before k3 ideally
14:42:27 <openstack> bug 1365436 in Glance "Database schema differs from migrations" [High,In progress] https://launchpad.net/bugs/1365436 - Assigned to Oleksii Chuprykov (ochuprykov)
14:42:59 <nikhil_k> thanks!
14:42:59 <ttx> nikhil_k: that is all I had. We'll revisit progress on Thursday
14:43:03 <ttx> cheers
14:43:11 <nikhil_k> sure
14:43:13 <ttx> mestery: o/
14:43:19 <ttx> #topic Neutron
14:43:23 <mestery> ttx: o/
14:43:40 <ttx> #link https://launchpad.net/neutron/+milestone/kilo-3
14:43:52 <ttx> hmm, lots of open stuff there
14:44:01 <mestery> Yup, so many are very close
14:44:08 <mestery> But the reality is I need to start moving some of them out
14:44:23 <mestery> We had a gate issue which killed our Fri-Mon the past weekend which was awful :(
14:44:30 <ttx> you need to sort them so that those that are 99% there get priority reviews
14:44:39 <mestery> Yes.
14:45:05 <ttx> because otherwise they will all be at 98% on Thursday and just generate more pain
14:45:26 <ttx> #info 34 open BPs
14:45:38 <mestery> Indeed, I'm on that post this meeting and will get it down to the actual 5-8 that will make it
14:45:52 <ttx> #info need to sort between FFE candidates, nearly-there and wont-make-it
14:46:32 <ttx> How about we discuss again tomorrow and see how much you managed to clean up ?
14:46:43 <mestery> Works for me.
14:46:53 <mestery> The javelin/grenade issue just killed us, apologies for being behind a bit
14:47:02 <mestery> I spent yesterday shepherding that and trying to recheck some stuff. *sigh*
14:47:06 <ttx> On the bugs side, anything that should be k3-blocking ?
14:47:15 * mestery finds the one
14:47:15 <ttx> Or just move them all to RC1 as-needed ?
14:47:23 <mestery> Well, actually, no none are K3 blocking
14:47:27 <mestery> We have a few that may be RC blockers, but not K3
14:47:35 <ttx> they can be RC1-blocking alright
14:47:43 <ttx> ok, so I'll just defer the open ones when we tag
14:48:02 <mestery> Thanks ttx!
14:48:04 <mestery> ++ to that
14:48:12 <ttx> That is all I had
14:48:18 <ttx> Talk to you tomorrow then
14:48:23 <mestery> Me too, back to cleanup now
14:48:23 <mestery> yes sir
14:48:25 <mestery> later!
14:48:30 <ttx> cheers
15:40:42 <ttx> david-lyle: I'll be 5min late
15:40:55 <david-lyle> ttx: no worries
15:44:29 <ttx> #topic Horizon
15:44:33 <ttx> david-lyle: hi!
15:44:46 <david-lyle> ttx: hi
15:44:49 <ttx> #link https://launchpad.net/horizon/+milestone/kilo-3
15:45:12 <ttx> 22 still on the docket, time for some pre-FF cleanup
15:45:33 <david-lyle> yes indeed
15:45:41 <david-lyle> will cull today, the process continues
15:45:52 <david-lyle> had to resharpen cleaver
15:46:10 <ttx> yeah, would be good to split between FFE candidates, nearly-there and wont-make-it
15:46:28 <ttx> so that you can actually focus on the 2nd category rather than dilute review effort
15:46:51 <david-lyle> I have some status updates to roll in too
15:47:13 <ttx> How about you process that and we discuss status of the remaining items tomorrow ?
15:47:16 <david-lyle> I believe  the launch instance bp finished merging last night
15:47:22 <david-lyle> sure
15:47:40 <ttx> On the bugs side I can't find anything k3-critical
15:47:47 <ttx> so i'll just defer whatever didn't make it at tag time
15:48:03 <david-lyle> yeah, nothing critical in the current list
15:48:10 <david-lyle> I'm sure something will spring up
15:48:50 <ttx> I had a question about tuskar-ui release model, if any
15:48:56 <ttx> since it's a repo listed under horizon
15:49:27 <david-lyle> IIRC I just tag it around release time
15:49:52 <ttx> ok, so one tag every 6 months around cycle end ?
15:50:12 <david-lyle> that's been the cadence
15:50:17 <ttx> ok thx
15:51:03 <ttx> david-lyle: and in that discussion here yesterday we said we'd probably need stable branches for django_openstack_auth, right
15:51:49 <david-lyle> yes, trying to sort out what that means global-requirements wise
15:51:58 <ttx> ok
15:52:26 <ttx> so I'll be talking to you again tomorrow, in the mean time please refine and cut the k3 list
15:52:44 <david-lyle> I could create a stable branch for 1.1.7 and use that for icehouse and juno, but the keystoneclient dependency isn't exact with the global-requirements for those branches
15:52:48 <ttx> I created l1 milestones if you want to defer stuff
15:53:06 <david-lyle> does that have to precisely match?
15:53:30 <david-lyle> it requires keystoneclient slightly newer than the minimum
15:53:36 <ttx> david-lyle: I think it does, but might want to cross-check with dhellmann
15:53:51 <david-lyle> that's going to be a difficult shift then
15:54:00 <david-lyle> will continue to look into it
15:54:31 <david-lyle> might be nice to pin in requirements for current stable and cut a stable branch for kilo moving forward
15:54:31 <ttx> ack, off-sync discussion
15:54:35 <david-lyle> sure
15:54:48 <ttx> sure, for kilo we can certainly converge
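[Editor's note: a minimal sketch of the version-pinning question discussed above — whether a library's minimum dependency falls inside a stable branch's global-requirements range. All version numbers here are made up for illustration, not the real kilo/juno values.]

```python
# Illustrative check: django_openstack_auth's keystoneclient minimum
# must fall inside the stable branch's allowed range, otherwise the
# branch's global-requirements and the library conflict.
def parse(v):
    """Turn '0.11.1' into (0, 11, 1) for tuple comparison."""
    return tuple(int(part) for part in v.split("."))

def fits(minimum, global_min, global_cap):
    """Does a library's minimum fall inside the branch's allowed range?"""
    return parse(global_min) <= parse(minimum) < parse(global_cap)

# A minimum slightly newer than the branch floor still fits...
print(fits("0.11.1", "0.10.0", "1.2.0"))  # -> True
# ...but a minimum beyond the branch cap would not.
print(fits("1.3.0", "0.10.0", "1.2.0"))   # -> False
```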
15:55:43 <ttx> david-lyle: will talk to you more tomorrow
15:55:57 <david-lyle> ttx: sounds good, thanks!
16:51:23 <ttx> morganfainberg: around,
16:51:25 <ttx> ?
16:51:41 <morganfainberg> ttx: o/
16:51:47 <ttx> #topic Keystone
16:51:54 <ttx> #link https://launchpad.net/keystone/+milestone/kilo-3
16:52:01 <ttx> 3 Bps on the list
16:52:09 <ttx> https://blueprints.launchpad.net/keystone/+spec/klw-tokens
16:52:57 <ttx> Thought that was mostly in
16:53:01 <morganfainberg> Klw-tokens is almost done. We need a couple items reviewed should gate today.
16:53:06 <ttx> ok
16:53:14 <ttx> https://blueprints.launchpad.net/keystone/+spec/mapping-enhancements
16:53:35 <ttx> looks done to me ?
16:53:46 <ttx> or maybe pending https://review.openstack.org/#/c/163172/
16:53:50 <morganfainberg> Not sure. Might be 1 patch
16:53:57 <morganfainberg> It's also very close.
16:54:07 <ttx> https://blueprints.launchpad.net/keystone/+spec/model-timestamps
16:54:17 <ttx> also seems one patch away
16:54:27 <morganfainberg> The last one is a cleanup bp and should have no impact to api. It can move to post k3
16:54:46 <morganfainberg> and it will move to post k3 or l1
16:54:56 <ttx> #info 2 BPs very close should make it today/tomorrow, last one (model-timestamps) can be deferred
16:55:05 <morganfainberg> Yep.
16:55:29 <ttx> OK, sounds pretty close
16:55:45 <ttx> On the bug side, I assume nothing is k3-critical and all non-merged can be deferred to rc1
16:56:05 <morganfainberg> There will be 2 FFEs requested. One has all the code ready just needs review, can't land it all before freeze though. The second is landing a couple final patches for a feature.
16:56:23 <ttx> I suggest we reconvene on Thursday to tag, or ping me if earlier
16:56:26 <morganfainberg> Bugs are fairly clear, nothing critical on my radar yet.
16:56:38 <ttx> I had a few questions for you
16:56:44 <morganfainberg> Sure.
16:57:08 <ttx> first would you be available at the cross-project meeting today to discuss progress (or absence thereof) on new client in openstack-sdk
16:57:23 <ttx> we discussed that back in December and promised a status update in Feb/Mar
16:57:35 <ttx> can be quick :)
16:57:52 <morganfainberg> Sure. And the answer is lack of progress afaik. I'll 2x check with jamielennox though.
16:58:07 <ttx> second, I need to clarify a few release models for your secondary repositories
16:58:22 <ttx> keystonemiddleware, pycadf and python-keystoneclient-*
16:58:52 <ttx> keystonemiddleware seems released as-needed + one at the end of the cycle, is that a fair assumption ?
16:59:04 <morganfainberg> Pycadf is like Oslo packages
16:59:13 <ttx> ok so same as above
16:59:18 <morganfainberg> Middleware is like client and typically released as needed.
16:59:32 <morganfainberg> And yes, one at end of cycle.
16:59:39 <ttx> and same for the client-*
16:59:49 <morganfainberg> Yep.
17:00:01 <ttx> OK makes all sense
17:00:05 <ttx> That is all I had
17:00:10 <ttx> anything on your side ?
17:00:21 <morganfainberg> Nope.
17:00:30 <ttx> alright then, talk to you Thursday if not earlier
17:00:32 <ttx> notmyname: around?
17:00:40 <notmyname> good morning
17:00:42 <ttx> #topic Swift
17:00:45 <ttx> notmyname: hi!
17:00:53 <notmyname> hello
17:00:54 <ttx> Same question as for Morgan
17:01:03 <ttx> would you be available at the cross-project meeting today to discuss progress (or absence thereof) on new client in openstack-sdk
17:01:14 <ttx> we discussed that back in December and promised a status update in Feb/Mar
17:01:25 <ttx> can be quick
17:01:38 <notmyname> if it's the first agenda item.
17:01:46 <ttx> ok, that's the plan
17:01:48 <notmyname> I was double booked with something else for the bottom of the hour
17:02:22 <ttx> notmyname: second question would be on swift-bench -- I assume it's using the release:independent model, is that a good assumption ?
17:02:38 <notmyname> same as python-swiftclient
17:02:51 <ttx> fwiw swift seems to be release:managed, release:at-end-of-6mo-cycle *and* release:independent
17:03:13 <ttx> notmyname: ok
17:03:19 <notmyname> the managed one is surprising. is that because of the end of cycle one?
17:03:38 <notmyname> I'd expected end-of-6mo and independent
17:03:42 <ttx> notmyname: well, currently I still do the swift releases
17:03:48 <notmyname> ah, I see
17:04:02 <notmyname> that makes sense then
17:04:07 <ttx> I agree that "managed" is tied to milestones, so I might not apply that
17:04:14 <ttx> and do courtesy handling
17:04:24 <ttx> and/or give you the tools and have you run with them
17:04:30 <ttx> we'll refine as we go
17:04:41 <notmyname> ok. that's something to consider (after kilo)
17:04:52 <ttx> my added value in the swift release process is pretty slim
17:05:10 <ttx> if I can get my tools properly documented, that would be an option
17:05:18 <ttx> anyway, post-kilo
17:05:37 <notmyname> hmm... rotating that among swift-core would be interesting
17:05:40 <notmyname> anyway, post-kilo
17:05:42 <ttx> how is development coming up ?
17:05:49 <ttx> since last week ?
17:05:57 <notmyname> for now, we're furiously working on EC stuff
17:06:04 <notmyname> tons to do
17:06:07 <notmyname> but here's the plan:
17:06:29 <notmyname> when it's ready for master (end of next week (-ish), we'll freeze master and work on the merge to master
17:07:03 <notmyname> like with storage policies, we'll tag the current feature branch (for history) and build a new patchset for the merge. i.e. cleaned-up work
17:07:29 <notmyname> and, like with storage policies, I'd like to work with -infra to land it as one merge commit instead of a ff merge
17:07:49 <notmyname> and that gets us to the april 10 (-ish) target for a RC
17:07:56 <ttx> sounds good
17:08:02 <ttx> we'll keep track of that
17:08:09 <notmyname> there's a few other non-ec things I'm tracking
17:08:16 <notmyname> but the focus is ec
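[Editor's note: a throwaway-repo sketch of the landing strategy described above — merging a feature branch as a single merge commit (`--no-ff`) rather than a fast-forward, so the branch point stays visible in history. Repo and branch names are illustrative, not Swift's real ones.]

```python
# Demonstrate --no-ff in a temporary repo: the resulting merge commit
# has two parents, unlike a fast-forward which leaves no merge commit.
import os
import subprocess
import tempfile

def git(*args):
    # Pass identity inline so the sketch works in a clean environment.
    subprocess.run(["git", "-c", "user.email=a@example.com",
                    "-c", "user.name=a", *args],
                   check=True, capture_output=True, text=True)

os.chdir(tempfile.mkdtemp())
git("init", "-q")
git("commit", "-q", "--allow-empty", "-m", "base")
git("checkout", "-q", "-b", "feature/ec")
git("commit", "-q", "--allow-empty", "-m", "ec work")
git("checkout", "-q", "-")  # back to the original branch
git("merge", "--no-ff", "-q", "feature/ec", "-m", "Merge EC branch")

out = subprocess.run(["git", "rev-list", "--parents", "-n1", "HEAD"],
                     capture_output=True, text=True, check=True).stdout
print(len(out.split()))  # merge commit + 2 parents = 3 fields
```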
17:08:39 <ttx> OK... anything else ?
17:08:59 <notmyname> I proposed a bunch of tests to defcore
17:09:07 <notmyname> the in-tree functional tests
17:09:15 <notmyname> to the objectstore capability
17:09:44 <notmyname> in addition to the existing 15 tempest tests listed, I added references to 279 swift functional tests
17:09:54 <ttx> heh, ok
17:10:14 <notmyname> #link https://review.openstack.org/#/c/164865/
17:10:32 <ttx> any other news ?
17:10:36 <notmyname> so I think that's an important pattern for the TC->code->BoD interaction
17:11:06 <notmyname> no, I think that's it. tons of work on EC. scary. committed to a beta. scope will be cut before the feature
17:11:20 <ttx> alright, I'll let you run then
17:11:28 <ttx> devananda: you around ?
17:11:34 <ttx> notmyname: have a good week
17:11:38 <notmyname> thanks. have a good day
17:11:39 <devananda> ttx: hi!
17:11:43 <ttx> here he is
17:11:46 <ttx> #topic Ironic
17:11:53 <ttx> #link https://launchpad.net/ironic/+milestone/kilo-3
17:12:45 <ttx> reading your email
17:13:09 <devananda> I can paste the cliff notes here for record keeping
17:13:15 <ttx> sure
17:13:29 <devananda> we now have 7 BP open still
17:14:00 <devananda> https://blueprints.launchpad.net/ironic/+spec/implement-cleaning-states hit an issue with devstack configuration that's blocking landing the code in ironic
17:14:24 <devananda> once we can get that in (or work around it temporarily) that should be ready to merge
17:14:50 <ttx> we can easily grant FFE to cover for the landing issue if that needs a few days
17:15:04 <devananda> 3 more BPs are driver-specific and I'd really like to get them in, if possible
17:15:14 <devananda> we've already bumped several other driver-specific pieces to Liberty
17:15:30 <ttx> devananda: are they likely to hit the Thursday deadline ?
17:15:36 <ttx> or will require FFE
17:15:59 <devananda> right now, I'm about 50/50 on whether all of this will land by thursday
17:16:33 <devananda> but - if it's not in by friday, I may just cut things that are driver-specific
17:16:34 <ttx> the driver-specific ones are ilo-cleaning-support ilo-properties-capabilities-discovery and... ?
17:16:46 <devananda> uefi-secure-boot
17:17:20 <devananda> actually, sorry, uefi secure boot touches all the drivers
17:17:34 <ttx> what about the last two (local-boot-support-with-partition-images & ipa-as-default-ramdisk) ? How close are they ?
17:17:34 <devananda> it's the other one that might delay the release
17:17:55 <devananda> automate-uefi-bios-iso-creation is the other driver-specific one, but it's also very small at this point and will probably land today
17:18:25 <devananda> those two, I think, are done, though they are tagged on a UEFI-related review, and I'm waiting for feedback from the developers as to why
17:19:04 <ttx> ok, let's discuss progress tomorrow (your morning) and see where we are at that point
17:19:11 <devananda> sounds good
17:19:31 <ttx> Had a question about ironic-python-agent
17:19:37 <devananda> sure
17:19:45 <ttx> I suspect it's released as-needed
17:20:00 <ttx> any reason we'd do a stable branch for it ?
17:20:00 <devananda> indeed
17:20:08 <devananda> possibly
17:20:09 <ttx> i.e. have a release coinciding with the 6-month cycle ?
17:20:24 <devananda> one of the story arcs of this cycle has been adding micro version support
17:20:43 <devananda> there are two open reviews related to that on the server side, and a -lot- of work still in the client to fully utilize it
17:21:09 <devananda> as we're now versioning the server API, it might make sense to tag clients at the same time
17:21:19 <ttx> hmm, ok
17:21:27 <devananda> OTOH, no future client release should ever be unable to talk to an old server
17:21:34 <devananda> (within the supported window, which is TBD)
17:21:56 <devananda> so making a statement like "to use the juno server, you need the juno client" -- we expressly do not want to do that
17:22:01 <ttx> trying to see if release:at-6mo-cycle-end would apply to it
17:22:42 <ttx> or if we'd need something finer for clients/libraries/agents
17:22:50 <devananda> I will no doubt tag a client somewhere around the time we tag kilo rc, and probably again around kilo-final, and after that i'll tag it again ...
17:23:21 <ttx> right, sounds like "having a stable branch" should be separated from "releasing at the end of the cycle"
17:23:49 <devananda> I don't see a need to release the client and server in lock-step. and I don't want to, explicitly or implicitly, require folks to install an older client to talk to an older server
17:24:07 <devananda> the more i think through it, the less i think a stable branch of python-ironicclient makes sense
17:24:20 <ttx> yeah, so including clients on the tag might communicate the wrong intent
17:24:26 <devananda> right
17:24:27 <ttx> anyway, side discussion
17:24:31 <ttx> thx for the input though
17:24:35 <devananda> sure!
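[Editor's note: a hypothetical sketch of the microversion idea in the exchange above — the client pins the highest API version it understands and settles on whatever an older server supports, so a newer client never refuses to talk to an older server. Version numbers and names are illustrative, not Ironic's real API versions.]

```python
# Highest microversion this (hypothetical) client understands.
CLIENT_MAX = (1, 9)

def negotiate(server_min, server_max):
    """Pick the highest microversion both sides support."""
    if CLIENT_MAX < server_min:
        # Only fails in the direction that is acceptable: a server
        # too new for the client, never a server that is too old.
        raise RuntimeError("server requires a newer client")
    return min(CLIENT_MAX, server_max)

# Kilo-era client against an older server: falls back gracefully.
print(negotiate((1, 0), (1, 4)))   # -> (1, 4)
# Same client against a server that has moved past it: caps at its own max.
print(negotiate((1, 0), (1, 12)))  # -> (1, 9)
```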
17:24:51 <ttx> devananda: questions on your side ?
17:25:12 <devananda> ttx: nope. I'm going to keep at the status page, trying to get everything green
17:25:20 <devananda> tty tmw. same time?
17:25:22 <ttx> alright, talk to you tomorrow
17:25:24 <ttx> yes
17:25:39 <ttx> SlickNik: around?
17:27:01 <SlickNik> ttx: o/
17:27:04 <ttx> #topic Trove
17:27:11 <ttx> #link https://launchpad.net/trove/+milestone/kilo-3
17:27:21 <ttx> Looks like 6 BPs still on the fence
17:27:31 <ttx> https://blueprints.launchpad.net/trove/+spec/mgmt-show-deleted
17:27:44 <ttx> This one looks relatively close
17:28:11 <SlickNik> Yes, that should make it in by today, I'm guessing.
17:28:21 <SlickNik> Close to the finish line.
17:28:22 <ttx> https://blueprints.launchpad.net/trove/+spec/replication-v2 -- not very far either afaict
17:29:28 <ttx> https://blueprints.launchpad.net/trove/+spec/vertica-db-support -- one patch pending too, but admittedly a large one ?
17:29:54 <ttx> same for https://blueprints.launchpad.net/trove/+spec/couchdb-plugin-trove
17:30:31 <ttx> https://blueprints.launchpad.net/trove/+spec/implement-vertica-cluster -- depends on the other, so likely to be late
17:30:50 <ttx> https://blueprints.launchpad.net/trove/+spec/guest-rpc-ping-pong -- close to merge
17:30:52 <SlickNik> https://blueprints.launchpad.net/trove/+spec/implement-vertica-cluster I'm concerned about.
17:31:11 <ttx> yes, it's also a feature needing testing, so deferring might be the right call
17:31:18 <ttx> rather than pushing it late in
17:32:03 <ttx> How about we revisit tomorrow, once most of those are in ?
17:32:11 <SlickNik> ttx: agreed. I'll talk with the author (sushil) to see what his take on that is.
17:32:27 <ttx> Would be great to push most of them in in the meantime
17:32:30 <SlickNik> ttx: sounds good. I think at least 4-5 of those will have merged by then.
17:32:36 <ttx> ack
17:32:50 <ttx> On the bugs side, anything k3-critical ?
17:32:53 <SlickNik> I'm working on getting folks to review the patches by
17:32:58 <ttx> Or can I just defer to RC1 anything that doesn't make it ?
17:33:41 <SlickNik> ttx: We've fixed all the ones that are critical already.
17:34:06 <SlickNik> So I'd defer anything in kilo-3 that doesn't make it (bugwise) to RC1
17:34:15 <ttx> alright then, I'll let you babysit patches and talk to you tomorrow
17:34:52 <SlickNik> sounds good. will ping you if something comes up.
17:35:07 <ttx> wfm
17:35:08 <ttx> #endmeeting