21:01:58 <ttx> #startmeeting project
21:01:59 <openstack> Meeting started Tue Aug  6 21:01:58 2013 UTC and is due to finish in 60 minutes.  The chair is ttx. Information about MeetBot at http://wiki.debian.org/MeetBot.
21:02:00 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
21:02:02 <openstack> The meeting name has been set to 'project'
21:02:06 <ttx> #link http://wiki.openstack.org/Meetings/ProjectMeeting
21:02:08 <mordred> can redhat pay for hot hot heat to play at the summit?
21:02:31 <ttx> #topic General stuff
21:02:36 <ttx> redhot?
21:02:43 <ttx> I documented the optional FeatureProposalFreeze at:
21:02:48 <ttx> #link https://wiki.openstack.org/wiki/FeatureProposalFreeze
21:02:59 <ttx> So far only Nova (Aug 21) and Neutron (Aug 23) declared they would use this
21:03:08 <ttx> Let me know if you plan to have one and I'll make it appear on https://wiki.openstack.org/wiki/Havana_Release_Schedule
21:03:11 <dolphm> interesting...
21:03:29 <ttx> apevec: status for the 2013.1.3 release at this point ?
21:03:48 <apevec> ttx, on track for Thursday release
21:04:03 <apevec> call-for-testing sent to openstack-dev
21:04:09 <dolphm> ttx: the wiki makes no mention of how the date for this relates to the rest of the cycle?
21:04:28 <dolphm> nvm! "generally happens one or two weeks ahead of FeatureFreeze."
21:04:33 <ttx> dolphm: you pick the date.
21:04:33 <apevec> no feedback yet, assuming no news==good news
21:05:01 <ttx> apevec: no news==no tester in my book
21:05:33 <ttx> sdague, annegentle, mordred: news from QA/Docs/Infra programs ?
21:05:39 <mordred> requirements - requirements are gating now, and devstack is homogenizing them. the list is a little bit behind because we were waiting on gating for it... soon you'll all be getting automatic proposals on requirements updates
21:05:39 <jeblair> our test jobs are now running on possibly the world's first multi-master jenkins setup!  this explains 'jenkins0[12].openstack.org' urls from zuul if you see them.  I'll write a blog post later this week.
21:05:47 <sdague> global requirements is now in devstack
21:05:59 <sdague> per mordred's comment ^^^^
21:06:05 <jeblair> we'll begin making use of requirements repo branches (eg for stable/) soon (thanks fungi!)
21:06:15 <mordred> client libs - setuptools upgrade nightmare is almost over - but it would be really great if everyone could land the recent client lib sync requests and then cut new releases
21:06:32 <annegentle> o/
21:06:45 <mordred> which removes the d2to1 requirement, which will unbreak lots of things out in the world
21:06:49 <sdague> testr for tempest runs is getting close. We'll probably switch to single-threaded testr this week for the regular jobs, and to parallel as soon as the races are fixed
21:06:58 <mordred> sdague: w00t
21:07:08 <jd__> mordred: noted
21:07:14 <sdague> still optimistic we get parallel for h3
21:07:21 <ttx> annegentle: anything to report ?
21:07:25 <jgriffith> mordred: in progress on cinder side
21:07:28 <annegentle> Just 2 things - 1) Next Tues. is the monthly doc team meeting.
21:08:00 <annegentle> 2) I've been asked to post my "What's Up Doc?" status report to openstack-docs and openstack-dev, it's basically a roundup of the week (or weeks). Sound okay to cross post?
21:08:19 <annegentle> or do I just post to -dev?
21:08:23 <mordred> jgriffith: thank you
21:09:00 <ttx> annegentle: it's fine to cross-post... just make sure you point follow-ups to one list only
21:09:10 <annegentle> ttx: ah good guidance, thanks
21:09:13 <ttx> i.e. "please follow-up on $list"
21:09:23 <fungi> if only more MUAs supported the Mail-Followup-To (MFT) header
21:09:39 <ttx> sigh yes
21:09:44 <ttx> #topic Oslo status
21:09:48 <ttx> markmc: hi!
21:09:48 <markmc> yo
21:09:56 <ttx> #link https://launchpad.net/oslo/+milestone/havana-3
21:10:01 <markmc> I don't think I've much to report since last week
21:10:06 <ttx> 40% done, 40% under review, 20% in progress, 0% not started
21:10:09 <ttx> Looking good
21:10:18 <markmc> main thing is I've started porting Nova to oslo.messaging
21:10:21 <ttx> Would be great to be all done before Nova and Neutron's FeatureProposalFreeze so that the final syncs can get in
21:10:33 <markmc> so there's reasonable hope oslo.messaging will get done
21:10:41 <markmc> when is FeatureProposalFreeze?
21:10:55 <ttx> for Nova (Aug 21) and Neutron (Aug 23)
21:10:55 <mordred> we added oslo.messaging git head to devstack, but have not added it to the gate
21:11:07 <markmc> #link https://review.openstack.org/39929 - nova port to oslo.messaging
21:11:08 <mordred> if you're adding it to nova now, should we add it to the gate?
21:11:12 <markmc> mordred, yep, that's awesome
21:11:22 <ttx> markmc: trusted-messaging is marked 'Needs code review' but I couldn't find a review about it ?
21:11:29 <ttx> I'm becoming a bit skeptical about our ability to deliver such a key feature so late in the cycle, especially while still chasing the key distribution server part
21:11:32 <markmc> mordred, I promised sdague getting it in the gate is a pre-req for that patch going out of WIP
21:11:44 <mordred> markmc: cool
21:11:51 <markmc> ttx, I updated the links in the secure messaging BP a few minutes ago
21:11:57 <markmc> ttx, the patches are actually really close now
21:12:01 <ttx> looking
21:12:13 <markmc> ttx, several rounds of review, I'm close to being ready to merge the big one
21:12:23 * markmc checks on the kds patch
21:13:00 <markmc> hmm, https://review.openstack.org/39350
21:13:02 <ttx> markmc: I fear the KDS patch will take time
21:13:24 <markmc> oh wait, that's ayoung's copy of the patch?
21:13:50 <markmc> https://review.openstack.org/#/c/37118/
21:14:00 <ttx> right, that one
21:14:13 <markmc> dunno what to say
21:14:19 <markmc> I'm not ready to write it off yet
21:14:24 <markmc> maybe dolphm is
21:14:54 <markmc> apart from that, ...
21:14:58 <ttx> markmc: does it make sense to land anything in oslo if that's not going in ?
21:15:05 <markmc> I just realized I don't know if any new bps have come in lately
21:15:15 <markmc> don't really know how to start triaging
21:15:27 <ttx> Given how much post-integration it requires (think: https://blueprints.launchpad.net/ceilometer/+spec/use-new-rpc-messsage), time is running short for this on havana
21:15:55 <ttx> no new stuff apparently
21:16:00 <markmc> time is running short, yes
21:16:12 <ttx> ok, will talk to dolphm in keystone section
21:16:17 <ttx> markmc: anything you wanted to raise ?
21:16:21 <markmc> it probably makes sense to merge into oslo-incubator, keystone and then nova
21:16:23 <markmc> in that order
21:16:32 <ttx> Questions about Oslo ?
21:16:35 <markmc> if it goes into oslo-incubator and kds doesn't go in, no big deal
21:16:35 <dolphm> (not sure i have much to add -- haven't been following KDS too closely)
21:16:51 <ttx> markmc: ack
21:16:58 <ttx> #topic Keystone status
21:17:02 <ttx> dolphm: hello!
21:17:05 <dolphm> o/
21:17:05 <ttx> #link https://launchpad.net/keystone/+milestone/havana-3
21:17:25 <ttx> 0% done, 28% under review, 57% in progress, 14% not started
21:17:38 <ttx> Getting a bit late
21:17:48 <ttx> If endpoint-filtering has no assignee it should probably be removed from havana-3 at this point ?
21:18:07 <dolphm> ah, i can fix that
21:18:19 <dolphm> it's been making good progress, and an impl is in review
21:18:28 <ttx> ok, fix assignee and status then
21:18:30 <ttx> Two proposed blueprints need triaging (priority set): domain-quota-management-and-enforcement and unified-logging-in-keystone
21:19:15 <ttx> dolphm: anything you wanted to raise ?
21:19:40 <dolphm> yes..
21:19:53 <dolphm> i'm definitely interested in pursuing FeatureProposalFreeze for keystone (today is the first i've heard of it)
21:20:12 <dolphm> i'll need to discuss with keystone-core and pick a date that makes sense for us
21:20:18 <dolphm> but it seems to be a perfect fit for keystone
21:20:22 <ttx> dolphm: was discussed at last summit during the release cycle session
21:20:35 <dolphm> sad i missed it :(
21:20:40 <ttx> dolphm: ok, just let me know (and communicate to everyone on your blueprints)
21:20:56 <dolphm> will do, and that is all from me
21:21:01 <ttx> Questions anyone ?
21:21:16 <ttx> #topic Ceilometer status
21:21:19 <ttx> jd__: hey
21:21:22 <jd__> o
21:21:23 <ttx> #link https://launchpad.net/ceilometer/+milestone/havana-3
21:21:43 <ttx> 14% done, 7% under review, 57% in progress, 21% not started
21:21:49 <ttx> Not a lot of progress since last week... still feeling lucky ?
21:22:19 <jd__> yes, I've stolen a blueprint from eglynn to make sure we'll make it
21:22:28 <jd__> so I'm increasing bandwidth :)
21:22:40 <ttx> ...not... sure ...I should be happy with that
21:23:14 <jd__> well I say I've stolen, but I won't handle it, sileht will actually :)
21:23:23 <jd__> that's the bandwidth increase
21:23:24 <ttx> jd__: next week if you don't land stuff we'll definitely need to cut stuff
21:23:35 <jd__> ttx: fair enough
21:23:55 <ttx> jd__: I see, it's your name on it... but your minions are doing it for you :)
21:24:00 <ttx> jd__: anything you wanted to mention ?
21:24:10 <jd__> hehe
21:24:14 <ttx> Questions on Ceilometer ?
21:24:15 <jd__> nop, all good
21:24:24 <ttx> #topic Swift status
21:24:27 <notmyname> hi
21:24:27 <ttx> notmyname: o/
21:24:30 <ttx> #link https://launchpad.net/swift/+milestone/1.9.1
21:24:38 <ttx> notmyname: the "missing" patch should be proposed tomorrow before you get up...
21:24:49 <ttx> That leaves you with the rest of the day to push in whatever else you want and get the changelog aligned
21:24:57 <notmyname> ttx: I've got all that ready to go
21:25:02 <ttx> Then I can cut milestone-proposed and tag rc1 early Thursday morning (or late Wednesday depending on your TZ), if you give me the go-ahead before going to bed
21:25:16 <notmyname> ttx: I think it will be proposed around 8am pacific tomorrow
21:25:32 <notmyname> ttx: I'll give you the sha ASAP
21:25:49 <notmyname> looks like I need to update some LP stuff maybe
21:26:06 <ttx> notmyname: then I /might/ be able to cut earlier, we'll see if I can jump on irc during the evening
21:26:14 <notmyname> and so I guess it's not a secret now that we'll cut 1.9.1 RC tomorrow?
21:26:16 <notmyname> ;-)
21:26:23 <ttx> notmyname: no it's not :)
21:26:43 <ttx> notmyname: anything you wanted to raise ?
21:26:48 <notmyname> ya, one thing
21:27:11 <notmyname> SwiftStack is sponsoring a Swift hackathon in October in Austin. Details coming shortly
21:27:23 <ttx> Cool. Questions about Swift ?
21:27:42 <ttx> #topic Glance status
21:27:43 * markwash blacks out
21:27:45 <ttx> markwash: o/
21:27:46 <notmyname> heh
21:27:52 <ttx> WAKE UP
21:27:56 <ttx> #link https://launchpad.net/glance/+milestone/havana-3
21:27:58 <markwash> o/
21:28:02 <ttx> 28% done, 14% under review, 42% in progress, 14% not started
21:28:10 <ttx> This looks on track so far...
21:28:18 <markwash> but not a ton of improvement since last week
21:28:21 <ttx> I'm growing a bit worried about async-glance-workers, since it appears to be a prerequisite of new-upload-workflow... what's the status of that ?
21:28:27 <markwash> so I'll look at adjusting
21:28:39 <markwash> a lot of active discussion on async glance workers, if you've seen the ML
21:28:47 <markwash> however, I'm not sure folks are quite branch ready
21:28:58 <ttx> seen it, it's just a bit late for discussion now :)
21:29:10 <markwash> so we'll be discussing this week and next to see what we can actually get done now that we see how much still needs to be resolved
21:29:30 <markwash> ttx: I guess that's why it's "high" and not "critical" ;-)
21:29:37 <ttx> markwash: if code is not proposed by next week, we'll probably have to skip new-upload-workflow
21:29:42 <markwash> I agree
21:29:50 <ttx> How is api-v2-property-protection progressing ?
21:29:57 <markwash> a bit better, active coding
21:30:14 <ttx> NB: glance-tests-code-duplication needs to be triaged
21:30:19 <ttx> markwash: we still need a python-glanceclient release to ship out the recent security fix
21:30:28 <ttx> let me know if I can help you with that
21:30:46 <markwash> ttx: I know :-( I've been hopelessly derelict, but I've been getting some help recently from @DeanTroyer
21:30:51 <markwash> and trove docs
21:30:57 <ttx> markwash: anything you wanted to mention ?
21:30:59 <markwash> should not be hard, probably just need to feel comfortable writing the commit message
21:31:10 <markwash> nothing at this time
21:31:15 <ttx> Questions on Glance ?
21:31:28 <ttx> markwash: thx!
21:31:28 <devananda> question on glanceclient review status
21:31:38 <ttx> devananda: go for it
21:31:47 <devananda> anything we can do to facilitate https://review.openstack.org/#/c/33327/ being reviewed more actively?
21:32:12 <markwash> devananda: I've been in talk with some other glance-core about that recently
21:32:26 <markwash> I'll add it to the agenda for this weeks glance meeting
21:32:30 <devananda> thanks :)
21:32:35 <ttx> #topic Neutron status
21:32:39 <ttx> markmcclain: hi!
21:32:40 <markwash> it would be great if someone could help us with any questions we have
21:32:41 <markmcclain> hi
21:32:48 <ttx> Good news is that the neutron gate has been reenabled, thanks to nati_ueno's work
21:32:59 <ttx> It needs to be closely watched though as it may quickly degrade again (think bug 1208661)
21:33:00 <uvirtbot> Launchpad bug 1208661 in neutron "floating ip exercise fails because IP not available" [Critical,In progress] https://launchpad.net/bugs/1208661
21:33:12 <ttx> #link https://launchpad.net/neutron/+milestone/havana-3
21:33:21 <ttx> 23% done, 51% under review, 18% in progress, 6% not started
21:33:29 <ttx> looks like you might pull it off if you keep up this review pace
21:34:01 <markmcclain> yeah.. today's gate brokenness has been slowing us down
21:34:14 <ttx> Removing a few targets couldn't hurt though:
21:34:29 <ttx> like the few "not started" you still have
21:34:41 <markmcclain> I'm really close to deferring those
21:34:54 <markmcclain> I don't think there will be consensus fast enough
21:34:56 <ttx> ok, do it before next week if there is no movement on those
21:35:07 <ttx> they are unlikely to make it (but likely to create a disturbance in the review queue when proposed later)
21:35:30 <ttx> 10 proposed blueprints are still in need of triaging... my suggestion if you accept them is to set most of them to "Low" priority
21:35:36 <ttx> markmcclain: anything you wanted to raise ?
21:36:04 <markmcclain> will triage them.. nothing new to raise
21:36:06 <ttx> Questions on Neutron ?
21:36:21 <ttx> #topic Cinder status
21:36:24 <ttx> jgriffith: hola!
21:36:28 <jgriffith> ttx: hey ya
21:36:31 <ttx> #link https://launchpad.net/cinder/+milestone/havana-3
21:36:35 <ttx> 26% done, 6% under review, 46% in progress, 20% not started
21:36:45 <ttx> Need more code proposed, otherwise there will be a review traffic jam in a few weeks
21:36:56 <jgriffith> ttx: already have that :(
21:36:59 <jgriffith> ttx: but yes
21:37:03 <ttx> read-only-volumes (Medium) is marked as depending on volume-acl (Low)... if its a true dep, volume-acl should be >=Medium
21:37:05 <jgriffith> ttx: working on stepping things up
21:37:18 <jgriffith> ttx: that's been changed
21:37:31 <jgriffith> ttx: there's a r/o impl under review that doesn't rely on acl
21:37:40 <jgriffith> ttx: trying to resolve some infighting on that one though
21:37:46 <ttx> jgriffith: ok, I'll remove the dep then
21:37:51 <jgriffith> ttx: thanks
21:38:02 <ttx> clone-image-imageid is marked implemented but has https://review.openstack.org/#/c/38037/ going on
21:38:39 <jgriffith> ttx: I'll fix that
21:38:42 <ttx> thx
21:38:49 <ttx> fwiw 3 proposed blueprints need triaging: coraid-driver-refactoring-for-havana, cinder-volume-driver-optional-iscsi-support, windows-storage-driver-extended
21:38:56 <ttx> jgriffith: anything on your mind ?
21:39:07 <jgriffith> ttx: nope, just hitting the rush
21:39:14 <ttx> Questions on Cinder ?
21:39:30 <ttx> #topic Nova status
21:39:33 <ttx> russellb: hey
21:39:39 <jgriffith> ttx: hold on a sec
21:39:45 <russellb> hey
21:39:47 * russellb holds
21:39:50 * ttx holds
21:39:54 <jgriffith> ttx: so the problem is there are multiples of that clone_image
21:40:01 <jgriffith> ttx: I'll sort out the deltas and update
21:40:04 <jgriffith> ttx: that's all
21:40:09 <ttx> ok, thx :)
21:40:09 <jgriffith> russellb: sorry :)
21:40:14 <russellb> all good
21:40:17 <ttx> #link https://launchpad.net/nova/+milestone/havana-3
21:40:22 <ttx> 10% done, 30% under review, 59% in progress, 0% not started
21:40:32 <ttx> Not too bad, but we need more implemented and more under review if you want to hit FeatureProposalFreeze in good shape
21:40:34 <russellb> i've moved some to low, but more are still coming in
21:40:44 <russellb> yeah, at least a lot are under review
21:40:51 <russellb> so hopefully we can do a review push on those in the near term
21:40:51 <ttx> A few other remarks:
21:40:58 <ttx> https://blueprints.launchpad.net/nova/+spec/baremetal-havana looks a bit sketchy
21:41:06 <ttx> Can't really see what needs to be done there... and it depends on cinder/bare-metal-volumes which is not targeted to any milestone nor assigned to anyone
21:41:27 * ttx hugs his UPS in the middle of the storm
21:41:54 <russellb> ttx: yeah, i can't figure that one out either
21:42:03 <russellb> ttx: actually, there used to be nova blueprints
21:42:04 <devananda> i'm not aware of any work on baremetal-volumes
21:42:06 <russellb> and they all got deferred
21:42:18 <russellb> devananda: can you look at that nova blueprint and let me know if we should mark it implemented?  or what?
21:42:21 <ttx> russellb: maybe clarify with devananda and remove if not relevant anymore
21:42:26 <russellb> ttx: ack
21:42:31 <ttx> in other news I'm not confident that encrypt-cinder-volumes will be unblocked -- https://review.openstack.org/#/c/30974/ has been stalling forever now
21:42:31 <devananda> looking
21:42:47 <ttx> db-slave-handle seems to be struggling in review too.
21:42:58 <ttx> (just in case you look for potential defers)
21:43:06 <russellb> we're pushing db-slave-handle pretty hard, i feel ok about that one
21:43:13 <russellb> the other ... not as much, especially since it's blocked
21:43:27 <ttx> Only one blueprint left needing triage: db-compute-node-stats-remove
21:43:30 <russellb> i've been checking in with the cinder encryption owner each week
21:43:45 <russellb> he seems confident each time, or at least hopeful ...
21:43:52 <devananda> nova/pxeboot-ports is relevant and in progress, but the going has been very slow. different negative feedback at each turn, since most of the work is in neutron but it does tie into nova
21:44:10 <ttx> russellb: maybe lower the priority to indicate you have no idea if it will make it
21:44:20 <russellb> ttx: ok, will do so
21:44:21 <devananda> but that's the only nova BP still under the baremetal-havana umbrella.
21:44:36 <devananda> may be worth just dropping the umbrella since everything else was deferred?
21:44:38 <russellb> devananda: if it's the only one, maybe we should just close the tracker
21:44:39 <russellb> heh
21:44:41 <ttx> devananda: yes
21:44:44 <ttx> russellb: anything else you wanted to mention ?
21:44:49 <russellb> nope, thanks
21:44:52 <ttx> Any question on Nova ?
21:45:10 <ttx> #topic Heat status
21:45:15 <ttx> stevebaker: o/
21:45:19 <stevebaker> yop
21:45:19 <ttx> #link https://launchpad.net/heat/+milestone/havana-3
21:45:25 <ttx> 32% done, 16% under review, 48% in progress, 4% not started
21:45:32 <ttx> Same remark as Cinder, not looking too bad but need more code proposed :)
21:45:46 <ttx> Has work started on multiple-engines yet ?
21:45:52 <stevebaker> yes, we'll decide on a FeatureProposalFreeze tomorrow, which will give us a new stick to wave
21:46:04 <ttx> more sticks!
21:46:19 <stevebaker> I think some multiple-engines pre-work has happened, but will confirm tomorrow if it has a chance
21:46:33 <ttx> Is there more to be done on watch-ceilometer ?
21:46:41 <stevebaker> we could force-defer some bps, but possibly not without offending some first-time contributors
21:46:43 <ttx> got reviews linked but all landed
21:47:01 <stevebaker> I think watch-ceilometer still has some mopping up
21:47:05 <ttx> ok
21:47:16 <ttx> open-api-dsl (targeted to h3) depends on template-inputes and param-constraints, which have been deferred to 'next' ?
21:47:41 <ttx> or is that not a strong dep ?
21:48:01 <stevebaker> open-api-dsl is a bit of a catch-all anyway. I'll discuss with randal tomorrow
21:48:20 <ttx> it's better to have actionable bits as blueprints
21:48:35 <ttx> themes and catch-alls don't work very well when it comes to "landing" them
21:48:43 <ttx> heat-trusts looks in jeopardy with delegation-impersonation-support not being completed yet
21:48:44 <stevebaker> yes, that one needs to be more fine-grained
21:49:05 <ttx> but I guess that's shardyland
21:49:26 <stevebaker> indeed it is, i believe there are some keystoneclient reviews in
21:49:56 <ttx> as a general rule of thumb, stuff depending on other projects doesn't work very well in the last milestone of the cycle
21:50:15 <ttx> it's better to nail them all by X-2
21:50:17 <stevebaker> yeah
21:50:39 <ttx> because in X-3 everyone has their own cats to care for
21:50:46 <ttx> stevebaker: anything else you want to raise ?
21:50:58 <stevebaker> At least FeatureProposalFreeze gives us a concrete date to make defer decisions
21:51:11 <stevebaker> that is it I think
21:51:15 <ttx> right, let me know your chosen date and I'll document it
21:51:19 <stevebaker> ok
21:51:19 <ttx> Questions about Heat ?
21:51:35 <ttx> #topic Horizon status
21:51:39 <ttx> gabrielhurley: o/
21:51:44 <gabrielhurley> \o
21:51:45 <ttx> #link https://launchpad.net/horizon/+milestone/havana-3
21:51:50 <ttx> 6% done, 43% under review, 37% in progress, 12% not started
21:51:59 <ttx> Need to make progress on reviews and start landing stuff, otherwise the following weeks will be a bit busy
21:52:04 <gabrielhurley> agreed
21:52:18 <gabrielhurley> there are definitely some that will start merging this week
21:52:21 <ttx> didn't have anything special scaring me though.
21:52:40 <gabrielhurley> the openstack-requirements fiasco for django-openstack-auth has been a long road
21:52:57 <gabrielhurley> we're at the end though
21:53:03 <ttx> gabrielhurley: indeed. anything else you wanted to mention ?
21:53:17 <gabrielhurley> not this week. I'll start clamping down on things next week
21:53:24 <ttx> Questions on Horizon ?
21:53:50 <ttx> #topic Incubated projects
21:54:02 <ttx> hub_cap, devananda: news, questions ?
21:54:32 <hub_cap> hello. no news really. got 3 essential bps and 2 are in code review, 1 is heat and will be done by sep1
21:54:37 <ttx> hub_cap: how is the heat integration going ?
21:54:59 <hub_cap> its been siderailed for a while due to onboarding / other work
21:55:05 <hub_cap> but i expect to pick it up this wk
21:55:07 <ttx> "will be done by sep1" ok
21:55:13 <hub_cap> correct :)
21:55:17 <hub_cap> done _and_ merged
21:55:29 <hub_cap> do we have to do feature freeze this go round?
21:55:30 <ttx> heh
21:55:35 <devananda> ttx: neither here. things are coming along, but nothing deployable yet
21:55:53 <ttx> hub_cap: you don't *have* to
21:55:57 <devananda> ttx: i wont be putting in a feature freeze or any such, as it wouldn't make sense for ironic at this point
21:56:13 <hub_cap> i might after heat lands, lets say sep1 for us
21:56:35 <ttx> hub_cap: it's sep 4 for almost everyone else.
21:56:48 <hub_cap> oh well i guess im an overachiever
21:56:51 <ttx> (feature freeze)
21:56:51 <hub_cap> sep4 it is!
21:57:07 <hub_cap> so feature freeze is 1 day before h3 is cut?
21:57:08 <ttx> feature proposal freeze is an extra deadline some projects will use to defer features early
21:57:22 <ttx> two days before, yes
21:57:37 <hub_cap> ok cool
21:57:40 <ttx> we do a friday cut due to some holiday in some country that week
21:57:44 <hub_cap> put me down for a feature freeze
21:57:58 <hub_cap> ill take a double shot of freeze plz
21:58:22 <ttx> "Sep 2	Labor Day"
21:58:36 <med_> US Holiday at least
21:58:50 <ttx> "feature freeze" should be the name of a cocktail.
21:59:00 <hub_cap> def
21:59:05 <hub_cap> lets make it so at HK
21:59:07 <med_> Mike's Hard Feature Freeze
21:59:16 <ttx> ok, if you don't have more questions we'll close now
21:59:24 <hub_cap> just hugs
21:59:29 <ttx> #endmeeting