04:00:35 #startmeeting third-party
04:00:36 Meeting started Wed Apr 22 04:00:35 2015 UTC and is due to finish in 60 minutes. The chair is krtaylor. Information about MeetBot at http://wiki.debian.org/MeetBot.
04:00:37 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
04:00:39 The meeting name has been set to 'third_party'
04:00:58 anyone here for the third party ci working group meeting?
04:01:34 hi
04:01:45 hi asselin_
04:01:53 hi krtaylor
04:05:11 i guess it's just us?
04:11:44 asselin_, ?
04:12:02 krtaylor, hi
04:12:32 what did you miss? what was the last thing I said?
04:12:44 hi asselin_
04:12:53 wow
04:12:55 I guess I missed everything
04:12:55 I'd really like to figure out who is kicking me right when I have a meeting
04:13:23 I never have a problem with openstack meetings until it's my turn
04:14:05 I didn't lose my network connection, just with freenode #openstack-meeting-4
04:14:10 * asselin_ wonders if the bot 'heard' anything
04:14:14 whatever
04:14:31 ok, replay
04:14:42 well this might be a short meeting, since we are lacking a quorum from non-US timezones once again
04:14:51 after summit we should re-examine all the different third party meetings and times
04:14:57 agree
04:15:08 thanks for your reviews on https://review.openstack.org/#/c/175520
04:15:16 I am excited to get the repo in place
04:15:25 we have 4 or 5 different tools that we'll push (IBM PowerKVM)
04:15:48 asselin_, did you have anything you wanted to bring up?
04:15:57 yes, looking forward to check that out
04:16:10 hi, sorry i'm late
04:16:12 nothing since the last 3rd party meeting :)
04:16:21 hi patrickeast
04:16:24 we do have a zuul refactor patch up
04:16:26 up patrickeast
04:16:39 I got kicked again, you didn't miss much :)
04:16:50 #link https://review.openstack.org/#/c/175970/
04:17:11 need to figure out the best way to test these changes
04:17:25 automated test I mean
04:18:01 are there frameworks for puppet script testing kind of things?
04:18:42 yes, but clarkb doesn't like them b/c they basically make you rewrite the entire puppet script in another file
04:18:58 nibalizer's looking into envassert.
04:19:07 ew yea if you have to rewrite the whole thing that sucks
04:19:18 I don't know the full details...I think he had mixed feelings about it
04:19:56 my approach is more functional, via bash script https://review.openstack.org/#/c/169117/
04:20:06 clarkb likes it
04:20:22 but it needs to be refined a bit to simplify and be more reusable
04:20:38 interesting
04:20:43 that looks like it would work pretty well
04:20:48 nice
04:21:15 that envassert module looks pretty nifty too
04:22:00 so it looks like we have some options then
04:22:01 I didn't get a chance to look at it, but it would simplify a lot of verification
04:22:40 sorry need to step away for 1-2 minutes
04:23:09 np, not much happening this meeting
04:23:25 patrickeast, did you have anything you needed to discuss?
04:23:47 nope, nothing new from me
04:24:05 day job takes most of my time from thirdparty stuff :(
04:24:20 and chasing ghosts with our ci system
04:24:23 yeah, tell me about it
04:24:32 something wrong with our networking or openvswitch config
04:24:47 oh
04:24:48 i'm back
04:24:50 every once in a while the tests just can't ssh into an instance
04:24:59 but no errors in any log that i've found yet
04:25:09 just a fixture timeout and test failure
04:25:38 haven't been able to reproduce it with a node that i have held onto yet either
04:26:00 held onto?
04:26:06 nodepool hold
04:26:19 ah, right
04:26:20 yea keep the vm around so nodepool doesn't recycle it
04:27:13 it started happening when we switched our provider's public network from vxlan to flat
04:27:18 that's the only thing that's new
04:27:24 unless it's a new bug in openstack somewhere
04:28:20 possible, we try to find any new problems in our development environment before we roll them out to production
04:28:40 same here
04:28:53 doesn't happen frequently enough that we saw it until it was too late
04:28:57 but our test targets are different
04:29:22 i recently added additional test configurations so we have 3x the number of runs to smoke it out for each patchset
04:29:45 soon to be 5x when we get our fc stuff underway
04:29:56 so i have hope to piece together what's going on eventually
04:30:05 you are running 3x on each patch? wow
04:30:25 yea, one normal, one with multipath enabled, and one with CHAP authentication enabled
04:30:38 different enough code paths in cinder that it can pass on one and fail the others
04:30:50 had a bug that almost didn't make it into kilo because we weren't testing them all
04:31:12 we need to start testing with chap
04:31:38 but honestly the issue we have is that it doesn't scale
04:32:00 yea scaling them is a pain
04:32:07 such a big test matrix with all the config options
04:32:23 toss in a handful of hypervisors and os targets…
04:32:25 ideally, all those paths would be in one job
04:32:37 b/c a lot of time is spent on setup
04:32:56 hmm that's something worth looking into
04:33:14 set up devstack gate, run one, reconfigure, run again, repeat
04:33:22 yes
04:33:44 would definitely cut down the time quite a bit
04:33:48 like 15 min per run
04:33:55 per job*
04:34:18 yeah...for some reason our job times are back to over 1 hour. they used to be 30 minutes....
04:34:51 that's another issue we should add to the summit
04:35:04 1. multiple test configurations in a single run
04:35:06 i recently cut out the boto tests from ours and it helped reduce the time
04:35:12 but i think you already skip them, right?
04:35:26 hm, ours have been dropping
04:35:36 2. profiling ci jobs
04:35:47 ++
04:36:12 patrickeast, yes, we're skipping. they fail anyways...not sure why, and not sure how important they are
04:36:46 yea they were very prone to failing with the timeouts i've seen in our system
04:36:56 i too am unsure how important they are
04:37:19 my understanding is there is some compatibility layer so you can do openstack stuff via a boto interface
04:38:05 but it looks like they end up just creating volumes/instances and whatnot like any of the other tests
04:40:12 I think the boto service is just not coming up / accessible in our case. but I need to take another look to see what's really going on.
04:40:39 but I have more important things to do...so skipping
04:41:06 asselin_, do you know if there is a cinder ci testing session proposed?
04:41:10 I added those 2 items to the etherpad
04:41:13 haha yea… it's on my backlog somewhere
04:41:26 krtaylor, I don't think so.....but could be wrong
04:41:37 i don't remember seeing one on the etherpad
04:41:42 I know there's a manila one
04:42:19 so I guess we could discuss at the cross-project proposed session
04:42:43 not sure if we'll get that and the infra session, but we'll see
04:43:32 hehheh, so much for a short meeting :)
04:44:40 well, the only topics for discussion were topics for summit
04:44:49 and the repo patchset
04:45:03 #link https://review.openstack.org/#/c/175520
04:45:11 thanks for your reviews
04:46:14 looks like it is pretty much ready to go now, yea?
04:46:57 I hope so, I can't imagine any more typos :)
04:47:05 lol
04:49:23 anything else to discuss asselin_ or patrickeast?
04:49:34 that's it from me
04:49:41 nothin else here
04:50:56 ok, thanks everyone, I'll shut this down then
04:51:40 #endmeeting
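
[Editor's sketch] The "set up devstack gate, run one, reconfigure, run again, repeat" idea discussed at 04:33 could look roughly like the bash sketch below. This is not from the meeting or any of the linked reviews; it assumes an already-stacked devstack node with crudini installed, and the config file path, backend section name, option names, restart command, and tempest invocation are all illustrative placeholders that would need to match a real third-party CI setup.

    #!/bin/bash
    # Sketch: run the volume tests three times against different cinder
    # backend configurations in one job, so devstack is only set up once.
    set -e

    CINDER_CONF=/etc/cinder/cinder.conf   # assumed default devstack location
    BACKEND=my_backend                    # hypothetical backend section name
    TEMPEST_DIR=/opt/stack/tempest
    TEST_REGEX='volume'                   # restrict to volume tests

    run_volume_tests() {
        local label=$1
        pushd "$TEMPEST_DIR"
        # substitute whatever tempest invocation the job already uses
        tox -e all -- "$TEST_REGEX" 2>&1 | tee "/opt/stack/logs/tempest-${label}.log"
        popd
    }

    restart_cinder_volume() {
        # placeholder: how c-vol is restarted depends on the deployment
        # (screen session, systemd unit, etc.)
        sudo systemctl restart devstack@c-vol
    }

    # 1. default configuration
    run_volume_tests default

    # 2. multipath enabled (option name illustrative, driver-dependent)
    sudo crudini --set "$CINDER_CONF" "$BACKEND" use_multipath_for_image_xfer True
    restart_cinder_volume
    run_volume_tests multipath

    # 3. CHAP authentication enabled (option name illustrative, driver-dependent)
    sudo crudini --set "$CINDER_CONF" "$BACKEND" use_chap_auth True
    restart_cinder_volume
    run_volume_tests chap

As noted in the meeting, amortizing the devstack setup this way saves roughly the per-job setup time (quoted at ~15 minutes) for each extra configuration, at the cost of a longer single job and a less isolated failure signal per configuration.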