04:00:35 <krtaylor> #startmeeting third-party
04:00:36 <openstack> Meeting started Wed Apr 22 04:00:35 2015 UTC and is due to finish in 60 minutes.  The chair is krtaylor. Information about MeetBot at http://wiki.debian.org/MeetBot.
04:00:37 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
04:00:39 <openstack> The meeting name has been set to 'third_party'
04:00:58 <krtaylor> anyone here for third party ci working group meeting?
04:01:34 <asselin_> hi
04:01:45 <krtaylor> hi asselin_
04:01:53 <asselin_> hi krtaylor
04:05:11 <asselin_> i guess it's just us?
04:11:44 <krtaylor> asselin_, ?
04:12:02 <asselin_> krtaylor, hi
04:12:32 <krtaylor> what did you miss? what was the last thing I said?
04:12:44 <asselin_> <krtaylor> hi asselin_
04:12:53 <krtaylor> wow
04:12:55 <asselin_> I guess I missed everything
04:12:55 <krtaylor> I'd really like to figure out who is kicking me right when I have a meeting
04:13:23 <krtaylor> I never have a problem with openstack meetings until it's my turn
04:14:05 <krtaylor> I didn't lose my network connection, just with freenode #openstack-meeting-4
04:14:10 * asselin_ wonders if the bot 'heard' anything
04:14:14 <krtaylor> whatever
04:14:31 <krtaylor> ok, replay
04:14:42 <krtaylor> well this might be a short meeting, since we are lacking a quorum from non-US timezones once again
04:14:51 <krtaylor> after summit we should re-examine all the different third party meetings and times
04:14:57 <asselin_> agree
04:15:08 <krtaylor> thanks for your reviews to https://review.openstack.org/#/c/175520
04:15:16 <krtaylor> I am excited to get the repo in place
04:15:25 <krtaylor> we have 4 or 5 different tools that we'll push (IBM PowerKVM)
04:15:48 <krtaylor> asselin_, did you have anything you wanted to bring up?
04:15:57 <asselin_> yes, looking forward to checking that out
04:16:10 <patrickeast> hi, sorry im late
04:16:12 <asselin_> nothing since the last 3rd party meeting :)
04:16:21 <krtaylor> hi patrickeast
04:16:24 <asselin_> we do have a zuul refactor patch up
04:16:26 <asselin_> up patrickeast
04:16:39 <krtaylor> I got kicked again, you didn't miss much :)
04:16:50 <asselin_> #link https://review.openstack.org/#/c/175970/
04:17:11 <asselin_> need to figure out the best way to test these changes
04:17:25 <asselin_> automated test I mean
04:18:01 <patrickeast> are there frameworks for puppet script testing kind of things?
04:18:42 <asselin_> yes, but clarkb doesn't like them b/c they basically make you rewrite the entire puppet script in another file
04:18:58 <asselin_> nibalizer's looking into envassert.
04:19:07 <patrickeast> ew yea if you have to rewrite the whole thing that sucks
04:19:18 <asselin_> I don't know the full details...I think he had mixed feelings about it
04:19:56 <asselin_> my approach is more functional, via bash script https://review.openstack.org/#/c/169117/
04:20:06 <asselin_> clarkb likes it
04:20:22 <asselin_> but it needs to be refined a bit to simplify and be more reusable
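    [The "functional, via bash script" approach amounts to applying the puppet
    manifest and then asserting the resulting system state from a small shell
    script. A minimal illustrative sketch only; the manifest and service names
    are placeholders, not the contents of change 169117:]

        #!/bin/bash
        # Apply the manifest under test, then check observable results instead
        # of re-describing every resource. Placeholder names throughout.
        set -e
        sudo puppet apply --modulepath=/etc/puppet/modules your_ci_manifest.pp
        # Assert the services the manifest is supposed to configure are running.
        for svc in zuul jenkins nodepool; do
            pgrep -f "$svc" > /dev/null || { echo "FAIL: $svc not running"; exit 1; }
        done
        echo "functional checks passed"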
04:20:38 <patrickeast> interesting
04:20:43 <patrickeast> that looks like it would work pretty well
04:20:48 <krtaylor> nice
04:21:15 <patrickeast> that envassert module looks pretty nifty too
04:22:00 <patrickeast> so it looks like we have some options then
04:22:01 <asselin_> I didn't get a chance to look at it, but it would simplify a lot of verification
04:22:40 <asselin_> sorry need to step away for 1-2 minutes
04:23:09 <krtaylor> np, not much happening this meeting
04:23:25 <krtaylor> patrickeast, did you have anything you needed to discuss?
04:23:47 <patrickeast> nope, nothing new from me
04:24:05 <patrickeast> day job takes most of my time from thirdparty stuff :(
04:24:20 <patrickeast> and chasing ghosts with our ci system
04:24:23 <krtaylor> yeah, tell me about it
04:24:32 <patrickeast> something wrong with our networking or openvswitch config
04:24:47 <krtaylor> oh
04:24:48 <asselin_> i'm back
04:24:50 <patrickeast> every once in a while the tests just can’t ssh into an instance
04:24:59 <patrickeast> but no errors in any log that i’ve found yet
04:25:09 <patrickeast> just a fixture timeout and test failure
04:25:38 <patrickeast> haven’t been able to reproduce it with a node that i have held onto yet either
04:26:00 <krtaylor> held onto?
04:26:06 <asselin_> nodepool hold
04:26:19 <krtaylor> ah, right
04:26:20 <patrickeast> yea keep the vm around so nodepool doesn’t recycle it
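    [For reference, "nodepool hold" is the nodepool CLI subcommand being
    referred to; a typical debugging sequence looks roughly like the following.
    The node id and ssh user are examples, not real values:]

        # find the node that ran the failing job, then keep it from being recycled
        nodepool list
        nodepool hold <node-id>
        # log in and poke at networking / openvswitch state while it is still alive
        ssh jenkins@<node-ip>
        # release it once the investigation is done
        nodepool delete <node-id>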
04:27:13 <patrickeast> it started happening when we switched our providers public network from vxlan to flat
04:27:18 <patrickeast> thats the only thing thats new
04:27:24 <patrickeast> unless its a new bug in openstack somewhere
04:28:20 <krtaylor> possible, we try to find any new problems in our development environment before we roll them out to production
04:28:40 <patrickeast> same here
04:28:53 <patrickeast> doesn’t happen frequently enough that we saw it until it was too late
04:28:57 <krtaylor> but our test targets are different
04:29:22 <patrickeast> i recently added additional test configurations so we have 3x the number of runs to smoke it out for each patchset
04:29:45 <patrickeast> soon to be 5x when we get our fc stuff underway
04:29:56 <patrickeast> so i have hope to piece together whats going on eventually
04:30:05 <krtaylor> you are running 3x on each patch? wow
04:30:25 <patrickeast> yea, one normal, one with multipath enabled, and one with CHAP authentication enabled
04:30:38 <patrickeast> different enough code paths in cinder that it can pass on one and fail the others
04:30:50 <patrickeast> had a bug that almost didn’t make it into kilo because we weren’t testing them all
04:31:12 <asselin_> we need to start testing with chap
04:31:38 <asselin_> but honestly the issue we have is that it doesn't scale
04:32:00 <patrickeast> yea scaling them is a pain
04:32:07 <patrickeast> such a big test matrix with all the config options
04:32:23 <patrickeast> toss in a handful of hypervisors and os targets…
04:32:25 <asselin_> ideally, all those paths would be in one job
04:32:37 <asselin_> b/c a lot of time is spent on setup
04:32:56 <patrickeast> hmm thats something worth looking into
04:33:14 <patrickeast> set up devstack gate, run one, reconfigure, run again, repeat
04:33:22 <asselin_> yes
04:33:44 <patrickeast> would definitely cut down the time quite a bit
04:33:48 <patrickeast> like 15 min per run
04:33:55 <patrickeast> per job*
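    [A rough sketch of what "reconfigure, run again, repeat" could look like in
    a post_test_hook, assuming a devstack-gate layout. The backend section name,
    the options toggled, and the tempest invocation are illustrative, and the
    volume-service restart step depends on how devstack launched the service:]

        #!/bin/bash
        set -e
        source /opt/stack/new/devstack/functions   # provides iniset; path may differ
        cd /opt/stack/new/tempest

        run_config () {
            local label=$1 option=$2 value=$3
            # flip one cinder option, restart the volume service, re-run the tests
            sudo iniset /etc/cinder/cinder.conf my_backend "$option" "$value"
            sudo systemctl restart devstack@c-vol 2>/dev/null || true
            tox -eall -- volume | tee "tempest_${label}.log"
        }

        run_config default    use_multipath_for_image_xfer False
        run_config multipath  use_multipath_for_image_xfer True
        run_config chap       use_chap_auth                True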
04:34:18 <asselin_> yeah...for some reason our job times are back to over 1 hour. they used to be 30 minutes....
04:34:51 <asselin_> that's another issue we should add to the summit
04:35:04 <asselin_> 1. multiple test configurations in a single run
04:35:06 <patrickeast> i recently cut out the boto tests from ours and it helped reduce the time
04:35:12 <patrickeast> but i think you already skip them, right?
04:35:26 <krtaylor> hm, ours have been dropping
04:35:36 <asselin_> 2. profiling ci jobs
04:35:47 <krtaylor> ++
04:36:12 <asselin_> patrickeast, yes, we're skipping. they fail anyways...not sure why, and not sure how important they are
04:36:46 <patrickeast> yea they were very prone to failing with the timeouts i’ve seen in our system
04:36:56 <patrickeast> i too am unsure how important they are
04:37:19 <patrickeast> my understanding is there is some compatibility layer so you can do openstack stuff via a boto interface
04:38:05 <patrickeast> but it looks like they end up just creating volumes/instances and what not like any of the other tests
04:40:12 <asselin_> I think the boto service is just not coming up / accessible in our case. but I need to take another look to see what's really going on.
04:40:39 <asselin_> but I have more important things to do...so skipping
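    [For a devstack-gate based job, one way the boto tests could be skipped is
    by narrowing the tempest selection regex. DEVSTACK_GATE_TEMPEST_REGEX is
    the actual devstack-gate variable; the regex itself is only an example:]

        # run only the volume-related tempest tests, which leaves out
        # tempest.thirdparty.boto entirely
        export DEVSTACK_GATE_TEMPEST_REGEX='(tempest\.(api|scenario).*volume)'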
04:41:06 <krtaylor> asselin_,  do you know if there is a cinder ci testing session proposed?
04:41:10 <asselin_> I added those 2 items to the etherpad
04:41:13 <patrickeast> haha yea… its on my backlog somewhere
04:41:26 <asselin_> krtaylor, I don't think so.....but could be wrong
04:41:37 <patrickeast> i don’t remember seeing one on the etherpad
04:41:42 <asselin_> I know there's a manila one
04:42:19 <krtaylor> so I guess we could discuss at the cross-project proposed session
04:42:43 <krtaylor> not sure if we'll get that and the infra session, but we'll see
04:43:32 <krtaylor> hehheh, so much for a short meeting  :)
04:44:40 <krtaylor> well, the only topics for discussion were topics for summit
04:44:49 <krtaylor> and the repo patchset
04:45:03 <krtaylor> #link https://review.openstack.org/#/c/175520
04:45:11 <krtaylor> thanks for your reviews
04:46:14 <patrickeast> looks like it is pretty much ready to go now, yea?
04:46:57 <krtaylor> I hope so, I can't imagine any more typos :)
04:47:05 <patrickeast> lol
04:49:23 <krtaylor> anything else to discuss asselin_  or patrickeast ?
04:49:34 <asselin_> that's it from me
04:49:41 <patrickeast> nothin else here
04:50:56 <krtaylor> ok, thanks everyone, I'll shut this down then
04:51:40 <krtaylor> #endmeeting