15:00:44 <witek> #startmeeting monasca
15:00:45 <openstack> Meeting started Wed Jan 10 15:00:44 2018 UTC and is due to finish in 60 minutes.  The chair is witek. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:46 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:49 <openstack> The meeting name has been set to 'monasca'
15:01:16 <fouadben> Hi all,
15:01:21 <jgr_> Hello
15:01:27 <witek> Hello everyone
15:02:06 <witek> hello rhochmuth, Happy New Year
15:02:14 <rhochmuth> hi witek
15:02:36 <rhochmuth> i was out for a while
15:02:48 <witek> good to see you back
15:03:24 <witek> is joadavis around?
15:04:19 <witek> hi joadavis :)
15:04:46 <witek> #topic monasca-ceilometer
15:04:46 <joadavis> hello
15:04:57 <joadavis> ah, yes. :)
15:05:01 <witek> https://storyboard.openstack.org/#!/story/2001239
15:05:09 <witek> thanks for updating the story
15:05:26 <joadavis> So we do still have work to do in monasca-ceilometer to bring it fully up to date with pike
15:05:35 <witek> You write about the incompatibility of Ceilosca with the current Ceilometer middleware.
15:06:12 <witek> The last functional version was Newton, right?
15:06:43 <witek> It is over one year old now and has reached EOL.
15:06:53 <joadavis> Correct.  Before the holidays, aagate and I did some work to update the stable/ocata branch, but we haven't finished the polish on that yet
15:07:10 <witek> I'm wondering if it's the right approach to try to catch up in the Ceilosca repo.
15:07:25 <joadavis> both of us have been devoting some time to holidays and cassandra work
15:07:55 <joadavis> We had been working on the approach of catching up to ocata then pike.
15:08:38 <witek> You write yourself about the 'moving target'.
15:08:40 <joadavis> The bulk of the time went into just getting cherry-picks through Zuul. :P
15:08:56 <witek> Instead we could try a 'greenfield' implementation of the Monasca publisher in the Ceilometer repo.
15:09:27 <witek> And backport this implementation to the monasca-ceilometer repo if needed.
15:09:33 <joadavis> The current master is a moving target, due to the changes in the ceilometer master to do a lot of simplification and remove functions that aren't seen as being used
15:10:02 <joadavis> But we should be able to line up with ceilometer stable/pike and get to a working state again soon
15:11:30 <joadavis> once we have a pike version of ceilosca, it should be a smaller set of changes to work with the ceilometer master
15:11:49 <witek> are you sure it's not too much effort? I think implementing it upstream (in ceilometer) would be more efficient
15:11:54 <joadavis> the other side of this is that the python-monascaclient has also changed and we need to update to match it
15:12:58 <joadavis> I was thinking we needed to update it in monasca-ceilometer before we could pick it up and submit the publisher to ceilometer.  Hadn't thought about just reimplementing it directly in ceilometer...
15:13:46 <joadavis> based on the latest version of the Publisher class
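
(For concreteness, a minimal sketch of what such a publisher could look like, assuming the Pike-era Ceilometer publisher interface and the older endpoint-based python-monascaclient constructor; the class name, auth handling and dimension mapping are illustrative, not the actual Ceilosca code.)

    # Hypothetical Monasca publisher living in the Ceilometer tree; the
    # interface is assumed from Pike-era ceilometer.publisher.
    from monascaclient import client as mon_client

    from ceilometer import publisher


    class MonascaPublisher(publisher.ConfigPublisherBase):
        """Publish Ceilometer samples as Monasca metrics (sketch only)."""

        def __init__(self, conf, parsed_url):
            super(MonascaPublisher, self).__init__(conf, parsed_url)
            # Auth is elided here; a real publisher would build a
            # keystoneauth session, and newer python-monascaclient
            # releases switched to a session-based constructor.
            self.client = mon_client.Client('2_0', parsed_url.netloc)

        def publish_samples(self, samples):
            for s in samples:
                # A real implementation would convert the ISO timestamp
                # to the epoch milliseconds Monasca expects.
                self.client.metrics.create(
                    name=s.name,
                    timestamp=s.timestamp,
                    value=s.volume,
                    dimensions={'resource_id': s.resource_id,
                                'project_id': s.project_id})

        def publish_events(self, events):
            # Events are out of scope for this sketch.
            pass
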
15:14:05 <witek> just a point to consider: we could use the monasca-ceilometer repo to host backports
15:16:34 <witek> Please think about it, I think it would be the quickest way of getting support in current and future versions
15:17:03 <joadavis> Will consider it.
15:17:06 <witek> thanks
15:18:05 <witek> #topic oslo healthcheck middleware
15:18:56 <witek> there is an effort to provide a common healthcheck API for OpenStack services
15:19:14 <witek> https://review.openstack.org/531456
15:19:40 <witek> the idea is to implement core functionality in oslo.middleware
15:20:31 <witek> if you have ideas about what such an API should look like, please take a look at the spec review
15:21:21 <witek> once implemented, we could include it in our healthcheck APIs
15:21:47 <witek> and update detection plugins for OpenStack services
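
(oslo.middleware already ships a basic healthcheck filter that serves GET /healthcheck; a minimal sketch of what an agent-side check consuming such an endpoint could look like, with the path and the '200 means healthy' semantics assumed from today's middleware rather than from the spec under review.)

    # Sketch of an agent-side probe against the common healthcheck
    # endpoint; function name and timeout are illustrative.
    import requests


    def service_is_healthy(base_url, timeout=5):
        """Return True if the service reports itself as healthy."""
        try:
            resp = requests.get(base_url + '/healthcheck', timeout=timeout)
        except requests.RequestException:
            return False
        return resp.status_code == 200
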
15:22:26 <jgu> does that also include agent plugins to use the OpenStack service health check?
15:23:05 <joadavis> interesting question. now I want to read the spec for any performance requirements
15:23:51 <witek> when services implement this API, we should reflect it in the agent detection plugins
15:24:23 <jgu> yes our msgs crossed :-)
15:24:34 <witek> :)
15:25:06 <witek> rhochmuth: any thoughts on this?
15:25:27 <rhochmuth> not really
15:26:42 <witek> this topic is driven by the self-healing SIG
15:27:09 <witek> #topic  CA cert and insecure fix
15:28:24 <witek> Stefan Nica has apparently fixed this
15:28:45 <jgu> I just wanted to bring this to attention. The patch is from Stefan, an urgent fix for Keystone SSL verification in the Pike Monasca agent.
15:29:08 <witek> is the python client affected in a similar way?
15:29:32 <jgu> in my testing, mon cli seems to be working fine
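
(The gist of such a fix, sketched with keystoneauth1; the config attribute names below are hypothetical, not necessarily the agent's actual option names.)

    # Sketch: honouring ca_file / insecure when building the Keystone
    # session; the conf attributes are made up for illustration.
    from keystoneauth1 import identity, session


    def make_keystone_session(conf):
        auth = identity.Password(
            auth_url=conf.auth_url,
            username=conf.username,
            password=conf.password,
            project_name=conf.project_name,
            user_domain_id='default',
            project_domain_id='default')
        # verify=False disables TLS verification (the insecure case), a
        # CA bundle path verifies against that bundle, and True falls
        # back to the system defaults.
        verify = False if conf.insecure else (conf.ca_file or True)
        return session.Session(auth=auth, verify=verify)
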
15:31:00 <witek> jgr_: could you have a look as well?
15:32:00 <jgr_> witek: sure.
15:32:08 <witek> thanks
15:32:47 <witek> no more items on the agenda today
15:33:04 <witek> do you have anything else?
15:33:07 <jgr_> I've got one more thing, yes
15:33:30 <jgr_> I'm currently looking into why my patch breaks monasca-grafana-datasource on Devstack
15:33:40 <jgr_> The good news is: it breaks for me, too
15:33:48 <jgr_> The bad news: it breaks with a 502, not a 401 :-)
15:34:09 <jgr_> How exactly did you configure monasca-grafana-datasource and what credentials did you use?
15:34:53 <witek> default devstack env
15:35:00 <witek> mini-mon:password
15:35:07 <jgr_> (I'm using admin/admin, URL: http://192.168.10.6:8070, Access: proxy and Auth: Keystone)
15:35:18 <witek> correct
15:35:32 <witek> apart from the user
15:35:36 <jgr_> Ah, ok. So different user. That probably did it...
15:36:33 <jgr_> And one more thing: https://review.openstack.org/#/c/532486/ is a bit weird
15:36:34 <witek> that's strange, I mean I had tested this patch before merging
15:37:09 <jgr_> Tox jobs fail but I didn't even touch any of the Python parts (plus, the 'recheck' appears to get ignored)
15:37:33 <witek> there were some problems with zuul today
15:38:05 <witek> they have also restarted gerrit half an hour ago
15:38:07 <jgr_> Ok, that would explain it :-)
15:38:40 <jgu> jgr_: I saw the same weird tox failure in another patch :-)
15:40:25 <jgr_> jgu: alright. I guess I'll just try issuing another recheck tomorrow then... :-)
15:40:53 <timothyb89> hi witek, apologies for being late, but I'm available if you wanted to discuss the DCO for monasca-docker
15:41:46 <witek> jgr_: according to http://zuulv3.openstack.org/ the checks are passing now
15:42:08 <witek> hi timothyb89
15:42:26 <witek> we're done with the agenda, so the stage is yours
15:42:54 <timothyb89> sure, so we have our monasca-docker repo that currently lives on public github
15:43:11 <witek> https://github.com/monasca/monasca-docker
15:43:47 <timothyb89> our (=HPE's) open source legal team just finished a review and has asked us to start requiring a DCO or CLA for contributions
15:45:29 <timothyb89> a DCO would be the text at https://developercertificate.org/
15:45:39 <timothyb89> so far it seems that a DCO (as used by the Linux kernel, Docker, etc) is the preferred option of the two
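
(In practice a DCO is asserted per commit with git's sign-off flag:)

    git commit -s
    # appends a trailer to the commit message, e.g.:
    #   Signed-off-by: Jane Doe <jane.doe@example.com>
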
15:45:52 <witek> my suggestion was to move the repo to the openstack namespace and use the OpenStack CLA
15:45:59 <witek> would that be possible?
15:46:14 <timothyb89> sorry, was just typing the same :) did not mean to leave it out
15:47:01 <timothyb89> our team's feeling at the moment is that the amount of work required would be quite large
15:47:30 <timothyb89> particularly on the CI side of things, as to my knowledge openstack doesn't have facilities to e.g. publish docker images
15:48:10 <witek> I think Kolla does publish images
15:48:12 <timothyb89> obviously there are ways around that but in almost all cases that's additional work we don't really have the bandwidth to take on, unfortunately
15:48:54 <timothyb89> last I checked kolla explicitly didn't publish images, and the ones on dockerhub were somewhat unofficial?
15:50:03 <witek> I think integration of monasca-docker with the rest of the repos is important for the whole community
15:50:36 <witek> and I suspect there would be parties interested in implementing missing CI parts
15:51:34 <witek> publishing images is one thing to be checked for sure
15:52:16 <witek> can you think of other points?
15:53:03 <timothyb89> there are some additional issues with perhaps stepping on the toes of other projects also trying to build docker images
15:53:13 <timothyb89> though I don't know how much of an issue that would really be
15:53:46 <witek> I don't understand
15:54:06 <timothyb89> monasca-docker has some overlap with kolla itself, for example
15:54:38 <witek> oh, I don't think that would be a problem
15:55:05 <witek> they have a completely different naming scheme
15:56:01 <timothyb89> and there is also some personal preference from others on my team that we remain on github
15:56:39 <timothyb89> however amenable to openstack adoption I may personally be, it's a hard sell, unfortunately
15:57:19 <joadavis> Tell them I'll bake them some cookies if they come join the party. :)
15:58:27 <timothyb89> I can pass on the offer, but I can't guarantee it will change minds :)
15:59:17 <witek> I think we can gain a lot by reworking the integration tests to use docker containers
16:00:16 <witek> publishing images has to be checked, but I think it can also be handled more nicely with the OpenStack CI
16:00:25 <witek> we're out of time
16:00:31 <witek> but let's take it offline
16:00:39 <timothyb89> fair enough, thanks for hearing me out :)
16:00:47 <witek> timothyb89: will you be available tomorrow on IRC?
16:01:10 <timothyb89> yes, that shouldn't be an issue. any particular time you'd prefer?
16:01:16 <witek> #endmeeting