17:00:19 #startmeeting third-party
17:00:22 Meeting started Tue Mar 8 17:00:19 2016 UTC and is due to finish in 60 minutes. The chair is asselin. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:00:23 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:00:26 The meeting name has been set to 'third_party'
17:01:03 anyone here for 3rd party working meeting?
17:01:51 hi asselin
17:02:08 * mmedvede was fighting fires
17:02:24 hi mmedvede
17:02:31 fires under control?
17:02:41 yes
17:03:21 #topic announcements
17:03:29 any announcements?
17:03:55 #topic CI Watch
17:04:30 no news since last meeting
17:04:47 #link ciwatch unittests review queue https://review.openstack.org/#/q/status:open+project:openstack-infra/ciwatch+branch:feature/unit-tests+topic:ci-dashboard
17:05:32 Anything specific? or just review all of them?
17:05:40 I was not merging any of it yet, asselin feel free to +2 if you think it works well
17:05:48 will do
17:06:11 asselin: so I realized that we cannot merge from master or back to master yet
17:06:40 oh? what does that mean exactly?
17:06:50 there are fixes on master that I'd like to get backported
17:07:04 asselin: you cannot push merge commits, need special permissions
17:07:19 * mmedvede looks for patch
17:07:20 I thought we enabled that
17:07:50 #link https://review.openstack.org/#q,Ie645ef86551de74e3f53b8405e0d00f73f9c9b41,n,z
17:08:40 asselin: no, we did not. And you also need to be a member of the ciwatch-release group, which now only has infra in it
17:08:56 yeah, seeing that
17:10:04 ok, I can ask about adding us to that
17:10:15 anything else?
17:11:06 no, I think we can finally add some real unit tests next
17:11:39 ok cool!
17:11:57 #topic Common-CI Solution
17:13:06 only new issue is that the current docs basically ask users to reuse openstack-infra/system-config
17:13:34 and this points to some pip sites that aren't recommended or accessible outside of -infra.
17:13:36 for install_modules.sh and image building, correct?
17:13:42 yes
17:14:06 and install_puppet.sh
17:14:18 but that one is less of a concern than the ones you mention
17:14:45 I see no problems with reusing install_puppet and modules
17:14:47 so there's a need to perhaps make image building more easily reusable by 3rd party folks
17:14:53 image building is confusing matters though
17:15:54 there is dib image builds, and there is image update that e.g. uses prepare_node.sh
17:16:56 asselin: we could just say that image building is up to third-party CI maintainers, and see system-config for an example
17:17:34 yeah....that's basically what it says now....but folks are having trouble with 'how to debug and maintain' image builds.
17:17:35 another way is to create a separate spec for pulling image building out
17:17:46 (I think it is a big effort)
17:18:53 yeah. Not sure who'd do the work though....maybe if we write the spec we'll find some volunteers to drive the effort?
17:20:58 It is tricky to estimate what it would take. I am familiar with the prepare_node way. Less so with dib
17:21:14 it's basically the same thing
17:21:52 yes, both simply running a bunch of scripts to get images ready
17:22:12 one of them involves puppet though
17:22:23 they both do, actually
17:22:39 they do? I thought dib just used elements
17:22:47 yes, and the element calls puppet :)
17:22:54 do some of those elements apply some puppet manifests from system-config?
17:23:19 elements are just script snippets. DIB composes them, whereas with snapshot images you have to compose them yourself
17:23:28 yes
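To make the exchange at 17:22:47-17:23:28 concrete: a diskimage-builder element is essentially a directory of hook scripts that dib runs at fixed phases of the build, and one of those hooks is where puppet gets invoked. The sketch below is illustrative only; the element name, hook filename, and use of a plain puppet apply are assumptions, and the class it applies is the existing openstack_project::single_use_slave that the discussion later proposes moving into openstackci (see the prepare-node script linked just below for the real project-config version).

```bash
#!/bin/bash
# Illustrative dib element hook, e.g. elements/puppet-node/install.d/80-apply-puppet
# (hypothetical element name and path; the real setup lives in project-config's
# "puppet" element, linked below).
# Assumes puppet and the required modules were already installed on the image
# by an earlier element or hook (e.g. via install_puppet.sh / install_modules.sh).
set -e

# Apply the manifest that turns the image into a single-use test slave.
# Today this is openstack_project::single_use_slave from system-config; a
# reusable variant would come from the openstackci module instead.
puppet apply --verbose -e "class { 'openstack_project::single_use_slave': }"
```

dib composes hooks like this into the build automatically, which is the "DIB composes them" point above; with snapshot-style images the equivalent steps have to be wired into something like prepare_node.sh by hand.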
17:24:33 #link https://git.openstack.org/cgit/openstack-infra/project-config/tree/nodepool/elements/puppet/bin/prepare-node
17:24:42 you see, for me it is hard to imagine how to abstract the image-building puppet manifests from system-config. Our images have a lot of custom setup in them
17:25:40 basically class {'openstack_project::single_use_slave':} needs to become openstackci::single_use_slave
17:26:09 asselin: yes, but it might pull in a whole lot of things with it
17:26:33 yes, probably need to strip it down a bit
17:26:39 it is also very specific to building only a particular type of image
17:26:58 these are for the devstack images, which (I think) is the common case for 3rd party ci
17:27:33 I mean, some people might use a different OS, or arch (points finger at self)
17:28:10 actually maybe only this is needed: https://git.openstack.org/cgit/openstack-infra/system-config/tree/modules/openstack_project/manifests/single_use_slave.pp#n43
17:30:15 #link but slave_common.pp also looks important: https://git.openstack.org/cgit/openstack-infra/system-config/tree/modules/openstack_project/manifests/slave_common.pp
17:30:30 anyway, perhaps these can be done in project-config-example
17:30:54 instead of asking users to copy from project-config
17:32:13 but we should write a spec to detail the work and get more feedback
17:32:39 #action asselin write a spec to make image builds reusable by 3rd party ci
17:33:48 #topic Open Discussion
17:33:53 asselin: maybe start with an etherpad first instead, just to iterate the initial work estimate quicker?
17:34:06 mmedvede, sure, good idea
17:34:54 anything else to discuss?
17:35:32 I already discussed the apparent zuul memory leak during the Monday meeting
17:35:43 (there is also an email thread)
17:36:34 So if anyone else experiences a higher memory footprint of zuul-server, please speak up
17:37:13 I haven't noticed it on our end. I'll have to check which version we're using
17:37:41 it is slow (if it is a leak), about 500 MB a day
17:37:59 so on my VM with 8GB it could take a whole week or longer
17:38:07 before crashing
17:38:49 we're on this version: https://review.openstack.org/#/c/262579/
17:38:52 #link zuul memory leak thread http://lists.openstack.org/pipermail/openstack-infra/2016-March/003971.html
17:39:15 asselin: what version shows up on the bottom of the zuul ui?
17:39:40 Zuul version:
17:39:48 Zuul version: 2.1.1.dev121
17:41:01 looks like the one before the patch that should fix the leak. So if anything, you should see more problems
17:41:23 asselin: do you know if your zuul gets restarted regularly?
17:41:43 we did restart it 'regularly', but due to other issues
17:42:05 which patch fixes the leak?
17:42:29 asselin: allegedly fixes the leak
17:42:44 sorry....allegedly :)
17:42:53 #link https://review.openstack.org/#q,I81ee47524cda71a500c55a95a2280f491b1b63d9,n,z
17:43:08 asselin: ^
17:43:48 ok thanks
17:45:23 anyone else around have something to discuss?
17:45:34 I think zuul.o.o just recently got updated to the most recent version, so it would be interesting to see if they have the same problem
17:46:55 yeah. I'm going to pay more attention to ours.
17:48:34 mmedvede, anything else to discuss?
17:48:40 no
17:48:56 ok thanks for the good discussion
17:49:03 #endmeeting
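For anyone following up on the 17:36:34 request to watch for a higher zuul-server memory footprint, here is a minimal sketch of how an operator might log the process's resident memory between restarts, so a slow leak on the order of 500 MB a day becomes visible within a week. The log path, sampling interval, and process-name match are assumptions; the running version itself is shown in the Zuul status UI footer (e.g. "Zuul version: 2.1.1.dev121"), as noted at 17:39:15.

```bash
#!/bin/bash
# Hypothetical hourly cron job: append a timestamped RSS sample for zuul-server.
# Adjust the pgrep pattern and log path to match your deployment.
pid=$(pgrep -of zuul-server) || exit 0   # nothing to sample if zuul is down
echo "$(date -u +%FT%TZ) rss_kb=$(ps -o rss= -p "$pid")" >> /var/log/zuul-server-rss.log
```

A steadily climbing rss_kb column over several days would be worth reporting to the mailing-list thread linked at 17:38:52.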