17:00:07 <mtreinish> #startmeeting qa
17:00:08 <openstack> Meeting started Thu Aug 15 17:00:07 2013 UTC and is due to finish in 60 minutes.  The chair is mtreinish. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:00:09 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:00:11 <openstack> The meeting name has been set to 'qa'
17:00:19 <mtreinish> who do we have here?
17:00:24 <mkoderer> Hi
17:00:26 <mlavalle> hi
17:00:43 <adalbas> hi
17:00:46 <mtreinish> today's agenda:
17:00:49 <mtreinish> #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting
17:01:12 <mtreinish> it's pretty short today, just the usual things; no one added anything extra
17:01:22 <mtreinish> so I guess let's dive into it
17:01:27 <mtreinish> #topic testr status
17:01:41 <mtreinish> so my topic is up first
17:02:02 <mtreinish> earlier this week I got the fix for the last big blocking race condition merged
17:02:09 <mtreinish> and the runs seem fairly stable
17:02:22 <mtreinish> so I added a parallel run nonvoting to the gate queue on zuul
17:02:36 <mtreinish> I'm going to watch it for a while to see how stable it seems
17:02:46 <mkoderer> cool
17:02:51 <mtreinish> and hopefully make it the default sometime in the next week or 2
17:03:04 <mtreinish> right now we're tracking down some other races that have been popping up
17:03:21 <mtreinish> and I hope to get tenant isolation on swift and keystone testing before we green light it
17:03:31 <mtreinish> but we're really close here
17:03:33 <mkoderer> is there a plan to delete all the nose things in the code?
17:04:07 <mtreinish> mkoderer: yeah at some point, just right now I've been too distracted with trying to get it gating parallel to push the patch
17:04:30 <mkoderer> mtreinish: ok cool
17:04:32 <mtreinish> mkoderer: feel free to push it yourself if you'd like
17:04:51 <mkoderer> mtreinish: ok shouldn't be hard
17:05:17 <mtreinish> ok, does anyone have anything else on testr?
17:05:59 <adalbas> mtreinish, i saw you get the tenant_isolation for swift
17:06:20 <adalbas> do you still need to get the swift tempest tests locked up ?
17:06:20 <mtreinish> adalbas: yeah I pushed it out, but it's probably going to need a devstack change to get it working
17:06:44 <mtreinish> adalbas: probably not if it was a user conflict that was causing the fail
17:06:55 <adalbas> ok
17:07:08 <mtreinish> adalbas: we can pick this up on the qa channel after the meeting though
17:07:13 <adalbas> sure
17:07:20 <mtreinish> ok, then moving on to the next topic
17:07:26 <mtreinish> #topic stress tests status
17:07:32 <mtreinish> mkoderer: you're up
17:07:49 <mkoderer> so I want to introduce this decorator "stresstest"
17:08:01 <mkoderer> it will automatically set attr type=stress
17:08:24 <mkoderer> with this I could use subunit to discover all stress tests inside the tempest tests
17:08:36 <mkoderer> that's my plan and I am working on that
17:09:13 <mkoderer> if we have this we could go through the tests and search for good candidates
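[editor's note: a minimal sketch of the "stresstest" decorator mkoderer describes, assuming the testtools-style attribute mechanism (`__testtools_attrs`); the actual merged tempest code may differ.]

```python
# Sketch of a "stresstest" decorator: tag a test function with the
# attribute type=stress, the way testtools-style attr decorators do,
# so subunit/testr filters can later discover all stress tests.
def stresstest(func):
    """Mark a test case as a stress test candidate."""
    attrs = getattr(func, '__testtools_attrs', None)
    if attrs is None:
        attrs = set()
        func.__testtools_attrs = attrs
    attrs.add('stress')
    return func

@stresstest
def test_example_stress_candidate():
    pass
```

With such tagging in place, a runner can select only the tests carrying the `stress` attribute instead of maintaining a separate list.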
17:09:32 <mtreinish> ok, cool so things are in motion with that then
17:09:45 <mkoderer> yes
17:09:56 <mkoderer> about this https://review.openstack.org/#/c/38980/
17:09:56 <mtreinish> I imagine a lot of tests won't be good candidates for stress tests
17:10:02 <mtreinish> like the negative tests :)
17:10:18 <mtreinish> afazekas: ^^^ are you around?
17:10:21 <mkoderer> mtreinish: thats right.. but I am quite sure we will have some good ones
17:10:58 <mkoderer> could be that https://review.openstack.org/#/c/38980/ is not needed after my fix
17:11:13 <mkoderer> but anyway we could use it in the meanwhile
17:11:26 <mtreinish> mkoderer: ok, yeah might be, you should coordinate with afazekas about that
17:11:33 <mtreinish> but he doesn't seem to be around right now
17:11:37 <mkoderer> I think I will chat with afazekas when hes around
17:11:45 <mkoderer> np
17:12:05 <mkoderer> any other question?
17:12:09 <mtreinish> mkoderer: ok all sounds good to me. Anything else on the stress test front?
17:12:16 <mtreinish> mkoderer: nothing from me :)
17:12:39 <mkoderer> ok cool :)
17:13:02 <afazekas> re
17:13:02 <mtreinish> ok then we'll move on to the next topic
17:13:10 <mlavalle> mtreinish: I want to report status on https://blueprints.launchpad.net/tempest/+spec/fix-gate-tempest-devstack-vm-quantum-full
17:13:20 <mtreinish> #topic other blueprint status
17:13:31 <mtreinish> mlavalle: ok you're up
17:13:55 <mlavalle> mtreinish: we have achieved good progress on this. Several of the items in the BP are already fixed
17:14:16 <mlavalle> mtreinish: we have a shot at getting this done by H-3
17:14:37 <mtreinish> mlavalle: ok cool, that would be great to get the quantum full jobs passing
17:14:45 <mlavalle> mtreinish: I am working now on a nova patch, that is required by one of the tests I am fixing
17:14:53 <afazekas> mlavalle: do you know about a bug related to the failing fixed ip or interface tests ?
17:15:23 <mlavalle> mtresinish: any help I can get with this would be great: https://review.openstack.org/#/c/41329
17:15:35 <mlavalle> afazekas: no i don't
17:15:59 <mtreinish> mlavalle: sure I'll take a look
17:16:09 <mlavalle> that's it
17:16:09 <mtreinish> and point some nova cores at it too
17:16:55 <mtreinish> mlavalle: ok cool, one quick thing about neutron is last week we had to skip another neutron test
17:17:03 <afazekas> https://bugs.launchpad.net/neutron/+bug/1189671 another interesting related bug
17:17:06 <uvirtbot> Launchpad bug 1189671 in neutron "default quota driver not suitable for production" [Wishlist,In progress]
17:17:25 <mtreinish> because a broken change got merged while the gate was messed up and passing all the tests
17:17:26 <mlavalle> mtreinish: ok, i'll take a look
17:17:52 <mlavalle> afazekas: i'l take a look
17:18:23 <afazekas> unfortunately when that test case was skipped, several additional issues were introduced..
17:18:31 <mtreinish> mlavalle: I can dig up a link but it's one of the bugs marked as critical
17:18:45 <mtreinish> marun was working on it yesterday I think
17:18:54 <afazekas> though the known ones are fixed, something is still not ok with the ssh connectivity
17:19:11 <marun> afazekas: I am looking at it
17:19:11 <mlavalle> I'll ping marun
17:19:24 <marun> It's not ssh that's the problem - it's the metadata service.
17:19:33 <afazekas> marun: can you reproduce it ?
17:19:40 <marun> afazekas: trivially
17:19:46 <marun> but i don't know why it's happening.
17:19:56 <marun> working on it now
17:20:18 <afazekas> marun: cool
17:20:20 <mlavalle> marun: i'll let you run with it,…..
17:20:59 <mtreinish> mlavalle: ok, is there anything else on neutron status?
17:21:07 <mlavalle> i'm done
17:21:37 <mtreinish> ok then are there any other blueprints that need to be discussed
17:21:59 <afazekas> leaking
17:22:14 <mtreinish> afazekas: ok what's going on with that?
17:22:34 <afazekas> looks like the original concept was not liked, so I will introduce a different one
17:23:17 <afazekas> Which will be designed to clean up at run time, and also report the issues
17:23:45 <afazekas> but it will not cover the leakage that is not visible via the api
17:24:01 <afazekas> and it will rely on the tenant isolation
17:24:22 <mtreinish> afazekas: will it ensure that the isolated tenants will be cleaned up too?
17:24:50 <afazekas> just about the resources in the tenant
17:25:10 <mtreinish> afazekas: ok
17:26:01 <mtreinish> is there anything else about resource leakage detection?
17:27:01 <afazekas> I can restore the previous patch if anybody interested in the global leakage
17:27:22 <mtreinish> afazekas: do you have a link?
17:28:49 <afazekas> https://review.openstack.org/#/c/35516/
17:29:08 <mtreinish> afazekas: ok, I'll take a look at it later and let you know
17:29:33 <afazekas> thx
17:29:55 <mtreinish> ok then, moving on:
17:30:00 <mtreinish> #topic critical reviews
17:30:13 <mtreinish> does anyone have any reviews that they'd like to bring up?
17:30:51 <Ravikumar_hp> I have one concern
17:30:55 <Ravikumar_hp> on review process
17:31:02 <Ravikumar_hp> please bear with me
17:31:16 <Ravikumar_hp> I want to put up my point
17:31:25 <Ravikumar_hp> I want to raise one concern.
17:31:25 <Ravikumar_hp> Test development and contribution seems to be a real pain.
17:31:25 <Ravikumar_hp> It takes 10 patches to get through even for people contributing to Tempest for more than one year.
17:31:25 <Ravikumar_hp> We need to have some policy on reviews.
17:31:27 <Ravikumar_hp> It appears many times that late entrants offer new comments/suggestions when the code review seems done, adding one more review cycle.
17:31:30 <Ravikumar_hp> we need to refine the process .. Otherwise, it slows down the test development cycle and makes it difficult to maintain contributors' motivation
17:32:32 <afazekas> Ravikumar_hp: I agree, do you have a recommendation how to do it?
17:32:33 <mkoderer> Ravikumar_hp: I understand that point
17:32:41 <mkoderer> but it's hard to solve that
17:32:47 <mtreinish> Ravikumar_hp: I understand what you're saying but the review process is needed
17:32:53 <mtreinish> we need to ensure code quality
17:32:59 <mtreinish> and review resources are limited
17:33:11 <Ravikumar_hp> as a group, we need to fix this
17:33:12 <mtreinish> so sometimes it takes time
17:33:21 <Ravikumar_hp> nowadays, all the reviews
17:33:33 <Ravikumar_hp> I can agree for framework design
17:33:52 <Ravikumar_hp> if test development takes 10 reviews, then something is wrong
17:33:58 <Ravikumar_hp> It is not working well
17:34:06 <afazekas> mtreinish: the small nits can be fixed later
17:34:12 <marun> disagree
17:34:20 <marun> You fix it before merge or it doesn't get fixed
17:34:33 <mtreinish> marun: +1 that has been my experience as well
17:34:42 <Ravikumar_hp> we cannot move forward if one test contribution takes one month / 10 patch reviews
17:34:47 <marun> My suggestion is to have a guide for what needs to be done, so at least the criteria is clear.
17:34:55 <marun> If the submitter does not follow the guide, then it takes 10 reviews.
17:34:56 <patelna> this is over-engineering test code
17:34:57 <mkoderer> the question is why does it take 10 patchsets
17:35:05 <marun> If they do follow the guide, it gets in faster.
17:35:15 <patelna> in your own company, does QA have 10 or more reviews?
17:35:24 <mkoderer> it's just because of nits.. then it's too much
17:35:26 <afazekas> marun: IMHO the issue is not the 10 or 100 reviews, it's the 1 month
17:35:27 <Ravikumar_hp> many times , reviewers change from patch to patch
17:35:33 <marun> In your own company, do you have clear criteria for what is good?
17:35:42 <afazekas> marun: waiting on review reply
17:35:42 <patelna> we need to be agile in our test development process...I suggest we have 2 to 3 reviews
17:35:55 <marun> That's a different issue, though.
17:36:02 <Ravikumar_hp> if two reviewers take care of one code submission, we can finish in max 3 patches
17:36:10 <marun> afazekas: timely review vs quality of review
17:36:19 <patelna> yes - we trust our QA
17:36:31 <marun> Well, we need that same criteria in tempest.
17:36:41 <Ravikumar_hp> just 20 lines of code takes one month - ten patches
17:36:42 <marun> We need it in writing, so it isn't just in some people's head.
17:36:58 <afazekas> marun: yes
17:37:01 <patelna> so lets do 2 things to improve this process (a) define what will code review consists off? a checklist (b) reduce the reviewer to 2 to 3
17:37:02 <marun> Ravikumar_hp: So think of a way to fix the problem that does not result in lower review quality.
17:37:15 <mkoderer> I don't think that limiting the number of reviewer will be a good solution
17:37:27 <mkoderer> ^reviewers
17:37:28 <patelna> it is
17:37:30 <afazekas> IMHO every typical review issue should be in the HACKING.rst
17:37:38 <marun> +1000
17:37:46 <Ravikumar_hp> marun: my suggestion - only TWO reviewers per one submission
17:37:47 <marun> And over time that list should evolve
17:37:48 <mtreinish> afazekas: yes that should be the case
17:37:52 <patelna> does anyone know how many reviews dev code goes through before merge?
17:37:58 <mtreinish> but I think we might have some gaps I'm not sure
17:38:01 <mkoderer> Ravikumar_hp: only two core reviewers?
17:38:02 <marun> Ravikumar_hp: I don't think that's workable.
17:38:02 <afazekas> some additional style issue can be tested by flake8
17:38:06 <marun> Two core people, fine.
17:38:18 <patelna> +1 Ravikumar_hp
17:38:20 <marun> But there are often stakeholders outside of those two
17:38:29 <marun> Look, tempest is not unique
17:38:32 <mkoderer> -1
17:38:36 <marun> There are tons of core projects that have the same challenges.
17:38:38 <mtreinish> patelna, Ravikumar_hp: limiting the number of reviews is not the solution
17:38:40 <patelna> we have less QA contributors
17:38:44 <marun> Thinking that we are special and need to do something different?  Just silly.
17:39:22 <Ravikumar_hp> mtreinish: we want good quality , but need to refine so as to minimize patches/cycles, duration
17:39:33 <marun> How about maximizing patch quality?
17:39:36 <patelna> do you want to add more coverage, or fewer tests and more reviewers -- and then turn people away as they've been frustrated
17:39:49 <marun> Tough
17:40:00 <marun> If we don't screen for quality, things fall apart.
17:40:09 <mtreinish> ok I think that this topic has been played out enough. I think we should move on
17:40:09 <patelna> we really need to draft a guideline for reviewers/checklist
17:40:11 <marun> So let's focus on improving patch quality
17:40:13 <marun> NOT
17:40:18 <marun> reducing review quantity
17:40:19 <mtreinish> does anyone have any reviews they want to bring up
17:40:32 <patelna> no one is saying don't screen for the quality -- you are missing the point
17:40:35 <mkoderer> I think if someone feels frustrated the best way is to have direct communication via IRC...
17:40:48 <afazekas> https://review.openstack.org/#/c/35165/
17:40:49 <Ravikumar_hp> mtreinish: Thanks
17:41:39 <afazekas> looks like I got an opposite review response at the end, can I return to something closer to the original version?
17:41:40 <mtreinish> afazekas: that's marked as abandoned
17:42:27 <mtreinish> are there any other reviews? Otherwise we'll move on
17:42:43 <afazekas> mtreinish: restored
17:43:05 <mtreinish> afazekas: ok cool
17:43:07 <mkoderer> ;)
17:44:54 <afazekas> I will move back to the original new module / function style unless otherwise requested
17:44:57 <mtreinish> afazekas: I'll have to take a look in detail after the meeting
17:45:14 <mtreinish> ok if there aren't any other reviews that people want to bring attention to then let's move on to the open discussion
17:45:21 <mtreinish> #topic open discussion
17:45:27 <marun> I have a testr question
17:45:31 <mtreinish> marun: ok
17:45:44 <marun> As per discussion yesterday, it appears tempest is broken on py26
17:46:01 <marun> The fix would be moving away from setupClass?
17:46:20 <marun> Is there a plan/effort underway to accomplish that?
17:46:26 <afazekas> marun: IMHO the fixing patch is merged
17:46:35 <marun> since yesterday?
17:46:46 <mtreinish> marun: not currently, we use setupClass fairly extensively throughout tempest
17:46:59 <mtreinish> reworking things to avoid using it would be a huge undertaking
17:47:14 <afazekas> marun: the patch was older, but it contained the py 2.6 compatibility step
17:47:35 <marun> afazekas: are you running successfully on 2.6 then?
17:47:46 <marun> This is really important for Red Hat.
17:48:15 <marun> We need to run Tempest on RHEL 6.4 with py26.
17:48:36 <marun> If anyone is running on 2.6 then I'd be happy to talk offline about what I might be doing wrong.
17:49:04 <marun> But if not, then Red Hat will likely want to see a move away from setupClass and will devote resources to making it happen.
17:49:42 <mtreinish> marun: I know there have been troubles with py26 lately especially after we've been moving to testr
17:49:59 <mtreinish> but I don't have py26 on any of my systems so I haven't been able to test things
17:50:03 <afazekas> marun: are you using nosetests or testr ?
17:50:33 <marun> afazekas: testr + py26 is broken because it doesn't seem to invoke setupClass
17:50:50 <marun> afazekas: nose appears to work, but has to be manually invoked
17:50:52 <mtreinish> marun: but if you switch back to nose would it work?
17:50:57 <afazekas> marun: you can run it with nose on py 2.6
17:51:25 <marun> So manually run?
17:51:26 <mtreinish> marun: ok, then would adding a nondefault job in tox and run tests to use nose solve this
17:51:35 <marun> mtreinish: +1
17:51:49 <marun> It's not so much for me, but allowing non-developers to be able to run tempest trivially.
17:52:00 <mtreinish> marun: ok that's simple enough
17:52:03 <marun> Ok, cool.
17:52:29 <mtreinish> #action mtreinish to add options to use nose instead of testr for py26 compat
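[editor's note: a rough sketch of what that action item could look like in tox.ini; the env name and exact commands are assumptions, not the merged change.]

```ini
[testenv:py26-nose]
# Nondefault env: run tempest under nose instead of testr,
# for py26 compatibility (testr does not invoke setUpClass there).
deps = nose
commands = nosetests {posargs} tempest
```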
17:52:38 <mtreinish> mkoderer: so much for pulling out all the nose references then
17:52:48 <marun> :)
17:52:49 <mkoderer> mtreinish: yeah
17:53:12 <marun> rhel'd again ;)
17:53:31 <mtreinish> ok, are there any other topics to discuss in the last 7 min?
17:53:48 <malini1> do we really want both testr and nose to be invoked, guessing they do not cover exactly the same stuff
17:54:14 <marun> malini1: I think the alternative is removing the use of setupClass, which is desirable but costly.
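[editor's note: to illustrate the rework marun refers to, a class-level fixture can be pushed down into per-test setUp, which every runner invokes on py26; the class and resource names here are hypothetical.]

```python
import unittest

class ServerTestCase(unittest.TestCase):
    # Instead of creating shared resources in setUpClass (which some
    # runners on py26 never invoke), create them per test in setUp.
    def setUp(self):
        super(ServerTestCase, self).setUp()
        self.client = object()  # stand-in for a tempest REST client

    def test_client_available(self):
        self.assertIsNotNone(self.client)
```

The cost is repeating per-test setup work that setUpClass amortized across the whole class, which is why the move away from it is desirable but costly.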
17:54:39 <mtreinish> malini1: it would be either/or. Testr is still the default but if you run with py26 you'll have to use nose
17:55:00 <malini1> got it -- thanks
17:55:07 <mkoderer> is it a know bug in testr?
17:55:13 <mkoderer> ^known
17:55:14 <uvirtbot> mkoderer: Error: "known" is not a valid command.
17:55:36 <marun> mkoderer: yes, I talked with lifeless about it yesterday.
17:56:11 <mkoderer> ok - so if this would be fixed we could switch to testr
17:56:28 <afazekas> marun: the numbered tests should also be fixed when they start working
17:57:18 <marun> mkoderer: if setupClass was not used, then testr + py26 would play nicely
17:57:31 <marun> afazekas: Ah, yes.
17:57:31 <afazekas> marun: it is required to run as a stress test
17:57:51 <marun> afazekas: Why required for a stress test?
17:58:04 <marun> afazekas: I thought the way to handle order was simply to inline the functionality?
17:58:27 <afazekas> because you specify exactly one test case, i.e. one test method
17:58:51 <mtreinish> so we've got ~1 min left. afazekas, marun do you want to pick this up on qa?
17:59:00 <marun> mtreinish: sounds good
17:59:04 <afazekas> ok
17:59:22 <mtreinish> ok I guess this is as good a point as any to stop
17:59:38 <mtreinish> #endmeeting