15:01:30 <mwhahaha> #startmeeting puppet-openstack
15:01:30 <openstack> Meeting started Tue Nov 22 15:01:30 2016 UTC and is due to finish in 60 minutes.  The chair is mwhahaha. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:31 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:34 <openstack> The meeting name has been set to 'puppet_openstack'
15:01:36 <EmilienM> o/
15:01:37 <iurygregory> o/
15:01:43 <mwhahaha> #link https://etherpad.openstack.org/p/puppet-openstack-weekly-meeting-20161122
15:01:47 <mkarpin> Hi
15:01:47 <mwhahaha> ahoy peoples
15:01:56 <EmilienM> why google calendar says it's in one hour for me
15:02:05 <mwhahaha> daylight savings
15:02:07 <mwhahaha> or lack thereof
15:02:12 <mwhahaha> we switched 2 weeks ago
15:02:13 <EmilienM> awesome
15:02:16 <mwhahaha> you know when you were on PTO :D
15:02:25 <zhongshengping> o/
15:02:29 <EmilienM> I was probably sleeping
15:02:44 <mwhahaha> which is terrible cause the tripleo meeting is at 7am now for me
15:02:47 <mwhahaha> anyway
15:02:56 <mwhahaha> #topic past action items
15:03:05 <mwhahaha> EmilienM to collect openstack deprecations and file launchpad bugs: need to be postponed
15:03:41 <EmilienM> yeah this one is on my list
15:03:47 <mwhahaha> mwhahaha to propose a virtual midcycle for next year - anyone have any thoughts on when they'd like to do something? Feb/March? there weren't any sprints listed on the page last time i checked for Ocata
15:03:48 <EmilienM> but honestly I won't do it soonish
15:04:02 <mwhahaha> or Jan might be better
15:04:10 <EmilienM> mwhahaha: before PTG maybe?
15:04:22 <mwhahaha> k maybe i'll just pick a date
15:04:27 <EmilienM> or after, dunno
15:04:39 <mwhahaha> EmilienM sync with RDO about puppet upgrade for packstack/tripleo: blocker in packaging, facter3 (can't find the BZ)
15:04:53 <mwhahaha> it was some boost package, i saw the bz yesterday
15:05:10 <EmilienM> ok
15:05:15 <mwhahaha> so we'll just have to keep pushing on that
15:05:41 <EmilienM> i'm on it
15:05:42 <mwhahaha> #topic Moving version bumps to the beginning of the cycle
15:05:49 <mwhahaha> Unfortunately due to various CI issues, attempting to land just a simple version bump took over 4 days.  Since the milestones are date based (and assume versioning after tag), I propose that we pre-land all the metadata.json changes either immediately after the tag or several weeks in advance.  Thoughts?
15:06:01 <EmilienM> +1
15:06:14 <mwhahaha> not sure which is better, but waiting until the last minute was terrible this last release
15:06:15 <EmilienM> we could even automate it with the bot
15:06:28 <EmilienM> like a job that does it, in the pipeline of release
15:06:42 <mwhahaha> dhellmann wanted to chat about stopping the manual update, so i need to sync with him about that
15:06:42 <EmilienM> so every time we push the tag, right after we send an update to the metadata
15:06:51 <EmilienM> but it's hard to predict a new tag
15:06:58 <iurygregory> +1
15:07:10 <mwhahaha> well i've got scripts to do the minor/major version bump so maybe we can just hand those off
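(A minimal sketch of what such a metadata.json version-bump script could look like; this is an illustration, not the actual scripts mentioned above, and the file path and X.Y.Z versioning are assumptions.)

    #!/usr/bin/env python
    # Minimal sketch of a metadata.json version bump, assuming the usual
    # Puppet module layout (metadata.json at the repo root with a "version"
    # key holding an X.Y.Z string).  Not the real release tooling.
    import json
    import sys

    def bump(version, part):
        """Bump the major or minor component of an X.Y.Z version string."""
        major, minor, patch = (int(p) for p in version.split('.'))
        if part == 'major':
            return '%d.0.0' % (major + 1)
        return '%d.%d.0' % (major, minor + 1)

    def main(path='metadata.json', part='minor'):
        with open(path) as f:
            metadata = json.load(f)
        metadata['version'] = bump(metadata['version'], part)
        with open(path, 'w') as f:
            json.dump(metadata, f, indent=2, sort_keys=True)
            f.write('\n')
        print('bumped to %s' % metadata['version'])

    if __name__ == '__main__':
        main(*sys.argv[1:])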
15:07:44 <mwhahaha> #action mwhahaha to sync with openstack release team about metadata.json updates
15:07:58 <mwhahaha> for now i'll probably propose them earlier than the final week
15:08:03 <EmilienM> yeah but how do you predict the tag
15:08:15 <EmilienM> it sounds like a first good iteration
15:08:30 <mwhahaha> yea i think it'll require some more work on the release side
15:08:33 <mwhahaha> we'll see
15:08:47 <EmilienM> we can use a file that sets the tags / dates
15:08:49 <EmilienM> and query it, etc
15:09:34 <mwhahaha> well i guess technically the release job could propose the version bumps after we tag the current stuff
15:09:56 <mwhahaha> there would be a slight period of desync between the tarballs and the repos
15:10:00 <EmilienM> right, but at some point we need to bump the major tag
15:10:28 <EmilienM> why that?
15:10:42 <EmilienM> tarballs are named -master anyway
15:10:49 <EmilienM> it ignores the metadata.json
15:10:50 <mwhahaha> if you have the release job do the metadata.json update, you have to propose where you want the tag to be
15:11:03 <mwhahaha> they aren't -master, we use version numbers i thought
15:11:03 <EmilienM> let me find the script I wrote
15:11:10 <EmilienM> yes, only for tags
15:11:13 <EmilienM> a sec
15:11:26 <mwhahaha> right i'm not talking about the intermediate tarballs, just the release ones
15:11:32 <EmilienM> https://github.com/openstack-infra/project-config/blob/master/jenkins/scripts/run-tarball.sh#L28
15:12:17 <EmilienM> just to be clear in case I misunderstood: what I mean is that regardless of when you update metadata.json, the master tarball will be created at each new commit merged into master
15:12:27 <EmilienM> and the tagged tarball will only be created at the release time
15:12:39 <EmilienM> because of ZUUL_REFNAME in the tag job
15:13:10 <mwhahaha> yea i'm not sure of the exact ordering, i'm trying to figure out when we do the metadata.json update vs what's being released and what impact that has on the end user
15:13:27 <EmilienM> it shouldn't change anything
15:13:31 <mwhahaha> maybe we just need to get it so after a release we always update to the next number
15:13:35 <mwhahaha> so we're always ahead
15:13:38 <EmilienM> if you update metadata.json now versus later
15:13:40 <mwhahaha> i think it can if people are using librarian
15:13:54 <mwhahaha> instead of r10k
15:14:08 <mwhahaha> anyway I'll work on this a bit over the next couple of weeks
15:14:26 <EmilienM> but yeah, I would love a job that updates this file for us
15:14:32 <mwhahaha> moving on
15:14:33 <EmilienM> (and also the releasenote conf file)
15:14:38 <mwhahaha> yes
15:14:42 <mwhahaha> #topic CI scenarios
15:14:53 <mwhahaha> so we're hitting the 1h timeout on some of the scenarios
15:15:11 <mwhahaha> it seemed better yesterday but we're getting up there on normal runs, like 45-55 mins
15:15:27 <EmilienM> it's not directly related but stackviz could help us to see what tempest tests take more time than before
15:15:32 <mwhahaha> do we move some functionality out of the existing ones, or up the timeouts?
15:15:40 <EmilienM> and stackviz is broken on centos7 because npm can't be found when building the image in nodepool
15:15:47 <EmilienM> it's in my list of things to do (npm/centos)
15:16:19 <mwhahaha> yea that'll be helpful
15:16:31 <EmilienM> before moving features out, I would like to spend time on comparing a CI job from now and 3 weeks ago
15:16:39 <EmilienM> and see what is taking more time
15:16:48 <EmilienM> it could be a networking thing with RDO mirror
15:16:54 <EmilienM> or a tempest test that takes longer
15:16:57 <EmilienM> etc
15:17:03 <EmilienM> it's also in my list
15:17:12 <mwhahaha> yea i tried to look into it a bit and didn't notice anything that really stuck out
15:17:17 <mwhahaha> sometimes the puppet run would take longer
15:17:23 <mwhahaha> sometimes the tests would take longer
15:17:26 <mwhahaha> wasn't really consistent
15:17:41 <EmilienM> also we need to keep in mind our scenarios are really busy
15:17:48 <EmilienM> compared to regular devstack jobs
15:18:17 <EmilienM> we run a lot of things and we activate SSL everywhere, etc... so we might have increased the load slowly over the last few months
15:18:30 <EmilienM> and now reaching the limit randomly because we didn't see it growing
15:18:45 <EmilienM> maybe we could move some stuff to scenario004?
15:18:51 <EmilienM> iiuc only 001 and 002 are timing out?
15:18:52 <iurygregory> +1
15:19:05 <mwhahaha> yea so far i think i've only seen it on 001/002
15:19:16 <iurygregory> also new things should go to 004 i think
15:19:19 <mwhahaha> and primarily centos
15:19:23 <EmilienM> iurygregory: yes
15:19:47 <EmilienM> mwhahaha: right, so 2 areas to investigate: is it a mirror issue on centos? or just the fact we run more services on centos7 nodes
15:19:59 <mwhahaha> yup
15:20:00 <EmilienM> 1) is easy to find out
15:20:06 <EmilienM> 2) is more tricky
15:20:08 <EmilienM> i'll take some actions
15:20:30 <EmilienM> #action EmilienM to investigate scenario001/002 timeouts (mirror issue? too much services?)
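(For reference, a rough sketch of the kind of before/after log comparison described above, assuming the console logs carry a "YYYY-MM-DD HH:MM:SS.ffffff" timestamp prefix; the phase marker strings are illustrative guesses, not the real job output.)

    #!/usr/bin/env python
    # Rough sketch for comparing where the time goes between two CI console
    # logs, e.g. a run from this week versus one from three weeks ago.
    # Assumes each line starts with a timestamp; the MARKERS list is made up
    # for illustration and would need to match the real job output.
    from datetime import datetime
    import re
    import sys

    TIMESTAMP = re.compile(r'^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)')
    MARKERS = ['Running puppet', 'Running tempest', 'Collecting logs']  # assumed

    def marker_times(path):
        """Return the first timestamp seen for each marker string in the log."""
        seen = {}
        with open(path) as log:
            for line in log:
                match = TIMESTAMP.match(line)
                if not match:
                    continue
                stamp = datetime.strptime(match.group(1), '%Y-%m-%d %H:%M:%S.%f')
                for marker in MARKERS:
                    if marker in line and marker not in seen:
                        seen[marker] = stamp
        return seen

    def compare(old_log, new_log):
        old, new = marker_times(old_log), marker_times(new_log)
        start_old, start_new = min(old.values()), min(new.values())
        for marker in MARKERS:
            if marker in old and marker in new:
                print('%-20s old: +%s  new: +%s' % (
                    marker, old[marker] - start_old, new[marker] - start_new))

    if __name__ == '__main__':
        compare(sys.argv[1], sys.argv[2])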
15:20:35 <mwhahaha> cool thanks
15:20:42 <mwhahaha> moving on
15:20:43 <EmilienM> I don't think it's a networking issue
15:20:49 <EmilienM> otherwise scenario003 would hit it too
15:20:54 <mwhahaha> yea
15:20:55 <EmilienM> and afaik nothing changed in networking
15:21:01 <EmilienM> or dmsimard would have told us
15:21:09 <EmilienM> yeah let's move on
15:21:11 <mwhahaha> #topic Liberty EOL
15:21:14 <mwhahaha> #link http://lists.openstack.org/pipermail/openstack-dev/2016-November/107717.html
15:21:25 <mwhahaha> so we've got some modules not on the eol list (aodh/barbican)?
15:21:31 <EmilienM> FYI we're EOLing TripleO Liberty
15:21:36 <EmilienM> iberezovskiy: what about Fuel? ^
15:21:48 <EmilienM> mwhahaha: yeah, I saw... why that??
15:21:48 <mwhahaha> #link https://gist.github.com/tbreeds/93cd346c37aa46269456f56649f0a4ac#file-liberty_eol_data-txt-L308-L310
15:21:59 <mwhahaha> no idea, i don't know where that list is generated from
15:22:00 <EmilienM> mwhahaha: barbican? lol
15:22:07 <EmilienM> mwhahaha: ask tonyb
15:22:25 <EmilienM> but puppet-aodh is valid, we need to EOL it too
15:22:39 <mwhahaha> they might not have had a release for liberty
15:22:44 <mwhahaha> which may be why it's on the list
15:22:55 <iurygregory> if we have a branch we should EOL
15:22:58 <EmilienM> ah, just a branch?
15:23:03 <mwhahaha> so i'm going to check into it a bit more, i added it a few mins before the meeting
15:23:06 <iberezovskiy> EmilienM, we didn't switch liberty Fuel to the liberty puppets, so it's ok
15:23:15 <EmilienM> https://github.com/openstack/puppet-aodh/releases/tag/7.0.0
15:23:18 <EmilienM> it has a release ^
15:23:26 <EmilienM> iberezovskiy: ack
15:23:42 <mwhahaha> ok so i'll check those out and get them EOL'ed as needed
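(For reference, a quick sketch of one way to check which modules still carry a stable/liberty branch and so still need EOL'ing; the module list is a small example subset, not the authoritative set.)

    #!/usr/bin/env python
    # Quick sketch for spotting which modules still have a stable/liberty
    # branch on the remote.  The MODULES list is illustrative only.
    import subprocess

    MODULES = ['puppet-aodh', 'puppet-barbican', 'puppet-neutron']  # example subset

    def has_liberty_branch(module):
        """True if the repo still has a stable/liberty branch on the remote."""
        url = 'https://github.com/openstack/%s' % module
        refs = subprocess.check_output(
            ['git', 'ls-remote', '--heads', url, 'stable/liberty'])
        return bool(refs.strip())

    if __name__ == '__main__':
        for module in MODULES:
            status = 'still has' if has_liberty_branch(module) else 'no'
            print('%s: %s stable/liberty' % (module, status))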
15:23:53 <mwhahaha> just wanted to make sure that there weren't any outstanding liberty issues
15:24:03 <mwhahaha> looks like there might be two liberty reviews out there
15:24:09 <mwhahaha> so i'll take a look at those as well
15:24:20 <EmilienM> mwhahaha: feel free to share them
15:24:24 <mwhahaha> #link https://gist.github.com/tbreeds/93cd346c37aa46269456f56649f0a4ac#file-liberty_eol_data-txt-L118
15:24:27 <mwhahaha> looks like puppet-neutron
15:24:53 <EmilienM> https://review.openstack.org/#/q/branch:stable/liberty+project:%22%255Eopenstack/puppet-.*%2524%22+status:open
15:24:56 <EmilienM> #link https://review.openstack.org/#/q/branch:stable/liberty+project:%22%255Eopenstack/puppet-.*%2524%22+status:open
15:25:10 <EmilienM> i'll ping Lukas
15:25:22 <mwhahaha> ok cool
15:25:26 <EmilienM> done on #puppet-openstack
15:25:32 <mwhahaha> anyway that's all i have on that, just a FYI
15:25:34 <EmilienM> the other one can be dropped
15:25:48 <mwhahaha> #topic Open Discussion, Bug and Review triage
15:26:00 <mwhahaha> anyone have any general things they would like to chat about?
15:26:07 <EmilienM> it's snowing here
15:26:11 <mwhahaha> ditto
15:26:17 <mkarpin> guys, I wanted to ask for help with this strange error http://logs.openstack.org/01/397701/1/check/gate-puppet-ceph-puppet-unit-4.5-centos-7/1b52838/console.html.gz#_2016-11-15_11_52_36_640441
15:26:35 <mwhahaha> mkarpin: bad variable
15:26:54 <mwhahaha> something is using Package[$variable] and it's unset
15:27:00 <mkarpin> it's after switch to rspec-puppet-facts...
15:27:17 <mwhahaha> we might not be setting the package name in params for fedora22
15:27:24 <EmilienM> mkarpin: nice work (the switch)
15:27:39 <EmilienM> why fedora22?
15:28:03 <mkarpin> all supported os
15:28:05 <mwhahaha> https://github.com/openstack/puppet-ceph/blob/master/manifests/osd.pp#L138
15:28:15 <mwhahaha> maybe it's ceph::params::pkg_policycoreutils
15:28:22 <mwhahaha> i'd drop the fedora support
15:28:44 <mwhahaha> or might be https://github.com/openstack/puppet-ceph/blob/master/manifests/osd.pp#L142
15:29:06 <mkarpin> it looks like it's set https://github.com/openstack/puppet-ceph/blob/master/manifests/params.pp#L67
15:29:19 <mkarpin> for all redhat family
15:30:07 <mwhahaha> hmm we can look further, but my thought would be it's not being set for some reason
15:30:36 <mkarpin> ok, will try to dig into this deeper
15:31:38 <mwhahaha> ok cool, anything else?
15:32:03 <iurygregory> np
15:32:12 <EmilienM> i'm digging https://review.openstack.org/#/c/400760/
15:32:36 <EmilienM> we haven't had a packaging promotion for... a long time
15:32:42 <EmilienM> oh and I pinged canonical this morning
15:32:49 <EmilienM> they'll provide ocata packages by next week or so
15:32:55 <iurygregory> nice
15:32:55 <iurygregory> =D
15:33:00 <EmilienM> i'm done
15:33:13 <mwhahaha> cool thanks everyone
15:33:20 <mwhahaha> #endmeeting