17:00:50 <jlvillal> #startmeeting ironic_qa
17:00:50 <openstack> Meeting started Wed Jan 20 17:00:50 2016 UTC and is due to finish in 60 minutes.  The chair is jlvillal. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:00:51 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:00:54 <openstack> The meeting name has been set to 'ironic_qa'
17:01:14 <rajinir> o/
17:01:15 <cdearborn> o/
17:01:19 <krtaylor> o/
17:01:24 <jlvillal> Hello all.
17:01:26 <rpioso> o/
17:01:32 <jlvillal> As always the agenda is at: https://wiki.openstack.org/wiki/Meetings/Ironic-QA
17:01:46 <jlvillal> #topic Announcements
17:02:02 <jlvillal> I don't have any announcements. Does anyone else?
17:02:33 <mjturek1> hey \o
17:02:39 <jlvillal> Okay. Moving on in 5
17:02:42 <jlvillal> 4
17:02:43 <jlvillal> 3
17:02:45 <jlvillal> 2
17:02:46 <jlvillal> 1
17:02:54 <jlvillal> #topic Grenade testing
17:04:03 <jlvillal> So I am continuing to work on Grenade testing. Currently at the point where five tempest tests fail for stable/liberty portion of Grenade. All say node unavailable.
17:04:26 <jlvillal> #info jlvillal continues to work on Grenade testing. Currently at the point where five tempest tests fail for stable/liberty portion of Grenade. All say node unavailable.
17:04:30 * jroll lurking
17:04:47 <jroll> jlvillal: node unavailable as in out of capacity or?
17:05:06 <jlvillal> jroll: Let me see if I can find message
17:05:31 <jroll> ok we can dig in after the meeting too if you like
17:06:21 <jlvillal> jroll: Okay, let's do that.
17:06:31 <jlvillal> Rather than me hunting around for log files right now :)
17:06:44 <jlvillal> Any other comments/questions on grenade before moving on?
17:07:05 <jlvillal> Okay moving on
17:07:10 <jlvillal> #topic Functional testing
17:08:15 <jlvillal> So I haven't heard any updates since last week
17:08:38 <jlvillal> Does anyone have anything to add?
17:08:53 <jlvillal> I think Serge was investigating but he had some higher priority things to work on.
17:09:13 <jlvillal> #info No updates this week
17:09:18 <jlvillal> If nothing else, moving on
17:09:40 <jlvillal> #topic 3rd Party CI (krtaylor)
17:09:50 <jlvillal> krtaylor: For you :)
17:10:00 <krtaylor> No updates this week for me either
17:10:19 <krtaylor> unfortunately, I have had other priorities as well
17:10:24 <jlvillal> Okay.
17:10:32 <jlvillal> Anyone else have anything for 3rd Party CI?
17:10:34 <jroll> so, M-2 this week.
17:10:45 <jroll> supposedly third party CIs are supposed to have accounts
17:10:52 <jroll> and be sandbox commenting by M-3
17:11:05 <jroll> krtaylor: wondering if you can send a reminder to the dev list cc vendor peeps
17:11:33 <krtaylor> jroll, sure, but I don't know the list of drivers that have responded
17:11:40 <jlvillal> #info M-2 is this week. 3rd Party CIs should have accounts and be doing sandbox commenting by M-3. krtaylor to send out email reminder
17:11:47 <krtaylor> but I can send a global reminder
17:11:48 <rajinir> There is an etherpad and we updated the status. https://etherpad.openstack.org/p/IronicCI
17:11:57 <jroll> krtaylor: yeah, thingee might know, not sure
17:12:30 <jlvillal> #info Etherpad available at: https://etherpad.openstack.org/p/IronicCI
17:12:33 <krtaylor> rajinir, y, but not sure how current that is
17:12:36 <thingee> yeah might need to double check sandbox and do some emailing
17:13:03 <thingee> let me get back to you. I won't be present next week, but can drop a note on the ML to reference in this meeting
17:13:22 <jlvillal> Anything else from anyone?
17:13:31 <krtaylor> thingee, great, and I'll follow up on that, thanks again
17:13:51 <jlvillal> krtaylor: You good?
17:13:56 <jlvillal> Okay to move on?
17:14:01 <krtaylor> yep, all for me
17:14:08 <jlvillal> Thanks
17:14:11 <jlvillal> Okay moving on
17:14:20 <jlvillal> #topic Open discussion / General QA topics
17:14:31 <jlvillal> If anyone has anything speak up :)
17:14:47 <jlvillal> I'll give it a couple minutes
17:14:51 <jroll> so how about that gate?
17:15:19 <jroll> http://tinyurl.com/j5yc4yr
17:15:47 <jroll> there's the devstack bug hurting us, but our fail rate is very high without that
17:15:50 <jlvillal> It doesn't look that much better. Maybe a little better than last week
17:16:17 <jroll> IMO if we're going to have a QA subteam, the gate should be that subteam's #1 focus
17:16:17 <mjturek1> jroll: is there a known root cause at the moment?
17:16:22 <jlvillal> Do we have any people who are interested in trying to figure out why our gate fails so much? People with free time?
17:16:29 <jroll> no sense in working on other things if the gate is falling apart
17:16:45 <jroll> mjturek1: timeouts, timeouts everywhere
17:16:53 <jroll> tl;dr nested virt is slow
17:17:01 <jlvillal> #info Looking for people to help troubleshoot gate issues as we have very high failure rate: http://tinyurl.com/j5yc4yr
17:17:03 <mjturek1> right
17:17:19 <krtaylor> mjturek1, re: timeouts  ^^
17:17:37 <jroll> it makes me sad to see people constantly rechecking things instead of actually working on the real issue :(
17:17:55 <jlvillal> #info Gate has many failures due to timeouts.
17:18:22 <jlvillal> part of this is nested virt, but Ironic has been using nested virt for a long time. So something else has also changed most likely
17:18:23 <mjturek1> fair enough, I have a patch in review that failed the gate I'll use that as an opportunity to see if I notice anything
17:19:01 <jroll> #link https://review.openstack.org/#/c/259089/
17:19:03 <jroll> #link https://review.openstack.org/#/c/234902/
17:19:11 <jroll> ^ tinyipa work in an effort to speed things up
17:19:26 <jlvillal> jroll: I thought they added new cloud providers for the build. Is that true?
17:19:34 <jlvillal> It used to just be RackSpace and HP
17:19:42 <jlvillal> But I think they have added OVH and maybe others?
17:19:47 <jroll> jlvillal: over the last 6 months or so, yes
17:19:58 <jroll> I think that may be slightly related, but there's nothing we can do about that
17:20:27 <jlvillal> Okay, so timeline doesn't quite line-up with our failures.
17:20:51 <jlvillal> Any other opens?
17:20:58 <mjturek1> new question: is there a timeframe for the tempest plugin to land/is that definitely going to happen?
17:21:05 <jroll> oh right
17:21:10 <jroll> devananda has been working on that a bit
17:21:15 <jroll> we'd like it in very soon
17:21:16 <devananda> ohhai
17:21:19 <mjturek1> :)
17:21:27 <jroll> a wild deva
17:21:28 <devananda> it's blocked on the devstack fix that has been in the gate
17:21:40 <mjturek1> devananda: which fix specifically??
17:21:52 <devananda> once our gate is unblocked, I want to land the tempestlib changes immediately
17:21:53 <jroll> oh, right, for the agent_ssh job
17:22:13 <devananda> mjturek1: https://review.openstack.org/#/c/268960/
17:22:24 <mjturek1> devananda: cool thanks!
17:22:27 <jlvillal> #info tempest plugin is waiting to land. Currently blocked waiting for fix to devstack to get merged. devananda will try to get it merged ASAP
17:22:38 <jlvillal> #link https://review.openstack.org/#/c/268960/
17:22:55 <jlvillal> Anything else?
17:23:07 <jroll> devananda: you still need reviews on tempest things yeah?
17:23:18 <devananda> jroll: the first two patches look good
17:23:36 <devananda> I am still a bit nervous that we haven't tested them in the gate env yet, but that's a chicken-and-egg problem right now
17:23:39 <jroll> devananda: let me rephrase, do they have +A? :)
17:23:56 <devananda> no - but they have enough +2's
17:24:01 <jroll> cool
17:24:14 <devananda> no need to put them in the queue until they have a chance of passing
17:24:16 <jroll> we need yuiko to un-wip the one it seems
17:24:18 <jroll> right
17:24:53 * jroll leaves a comment
17:25:05 <jlvillal> Feel free to add a #info if needed for meeting minutes. If not already covered.
17:25:20 <jroll> so going back a moment
17:25:35 <jlvillal> Any other comments?
17:25:38 <jroll> I would really like to request anyone working on QA things focus on the gate problems, rather than something else
17:25:45 <devananda> ++
17:26:08 <jlvillal> jroll: Okay. I can switch over to that from Grenade.
17:26:17 <jroll> it's the biggest detriment to our velocity
17:26:18 <devananda> our gate's failure rate from simple timeouts is really troubling
17:26:19 <jlvillal> See what I can figure out.
17:26:41 <devananda> do we have any indication how much tinyipa will help with that?
17:26:43 <jlvillal> But would love to have multiple people collaborating on this! :)
17:26:58 <jroll> devananda: I don't have data but it's a non-negligible improvement
17:27:02 <devananda> cool
17:27:30 <devananda> we should, you know, get data on that :)
17:27:58 <jroll> yah
17:28:02 <jlvillal> Of course the devstack gate breakage is making it hard to troubleshoot at the moment :(
17:28:07 <jroll> the n-v job will help with that
17:28:15 <jroll> jlvillal: that only affects the agent_ssh job
17:28:25 <jroll> or rather, things using the agent driver
17:28:29 <jroll> pxe driver is fine
17:28:58 <jlvillal> Ah right. I guess that is good then, that they affect two different jobs.
17:29:11 <jlvillal> 'good' being debatable...
17:29:24 <jlvillal> Anything else?
17:29:28 <jroll> rather, the devstack breakage only affects agent driver
17:29:36 <jroll> no driver is safe from the timeout stuff
17:29:45 <jlvillal> #info devstack breakage only affects agent driver
17:30:06 <jlvillal> #info timeout issues are seen across multiple drivers
17:30:27 <jlvillal> Okay. I think we are done.
17:30:34 <jlvillal> Any objections to ending the meeting?
17:30:53 <jlvillal> Thanks everyone! Talk to you next week
17:31:04 <jlvillal> #endmeeting