15:01:15 <slaweq> #startmeeting neutron_ci
15:01:16 <openstack> Meeting started Wed May 20 15:01:15 2020 UTC and is due to finish in 60 minutes.  The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:17 <slaweq> hi
15:01:18 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:20 <openstack> The meeting name has been set to 'neutron_ci'
15:01:36 <slaweq> hi
15:01:53 <slaweq> (sorry, too many meetings at once)
15:02:21 <ralonsoh> hi
15:02:28 <bcafarel> o/
15:02:49 <maciejjozefczyk> \o
15:03:18 <slaweq> ok, lets do that fast :)
15:03:29 <slaweq> Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:03:30 <slaweq> Please open now :)
15:04:42 <slaweq> #topic Actions from previous meetings
15:04:51 <slaweq> ralonsoh to continue checking ovn jobs timeouts
15:04:57 <ralonsoh> I'm on it
15:05:07 <ralonsoh> focusing on ovsdbapp and python-ovn
15:05:25 <slaweq> ok, thx ralonsoh
15:05:28 <ralonsoh> but no conclusions yet
15:05:41 <slaweq> so I will keep it for next week, just to track that
15:05:44 <ralonsoh> sure
15:05:47 <slaweq> #action ralonsoh to continue checking ovn jobs timeouts
15:05:49 <slaweq> thx a lot
15:05:55 <slaweq> next one
15:05:57 <slaweq> bcafarel to update stable branches grafana dashboards
15:07:05 <bcafarel> #link https://review.opendev.org/#/c/729291/
15:07:26 <bcafarel> as we had updated these recently, this time it was easier
15:08:05 <slaweq> thx bcafarel
15:08:33 <slaweq> and the last one from previous week
15:08:35 <slaweq> slaweq to switch functional uwsgi job to be voting
15:08:43 <slaweq> Patch proposed https://review.opendev.org/729588
15:08:59 <slaweq> I will also update grafana once that is merged
15:09:13 <slaweq> and that's all from last week
15:09:22 <bcafarel> ironic that this specific job failed on this patch :)
15:10:37 <slaweq> ouch
15:10:51 <slaweq> lets check it a couple of times before we merge it
15:11:11 <bcafarel> https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_9b8/729588/1/check/neutron-functional-with-uwsgi/9b8e96a/testr_results.html looks unrelated
15:12:14 <slaweq> yes, it doesn't look related to uwsgi
15:13:31 <slaweq> ok, lets move on
15:13:33 <slaweq> #topic Stadium projects
15:13:44 <slaweq> standardize on zuul v3
15:13:57 <slaweq> I started looking today at the neutron-ovn-grenade job
15:14:05 <slaweq> and I saw that it's not run in the neutron gate at all
15:14:27 <slaweq> so I proposed adding it as a non-voting job for now
15:14:29 <slaweq> https://review.opendev.org/#/c/729591/
15:14:42 <slaweq> and then I will also work on the migration to zuulv3
15:14:57 <slaweq> I just want to have some results from the legacy job to compare against during the migration
15:16:18 <slaweq> I don't think there is any other update about that
15:16:28 <slaweq> but if I am wrong, please tell now :)
15:17:32 <bcafarel> looks like the silent crowd agrees
15:17:42 <slaweq> yeah
15:17:45 <slaweq> so lets move on
15:17:55 <slaweq> #topic Stable branches
15:18:01 <slaweq> Train dashboard: http://grafana.openstack.org/d/pM54U-Kiz/neutron-failure-rate-previous-stable-release?orgId=1
15:18:03 <slaweq> Stein dashboard: http://grafana.openstack.org/d/dCFVU-Kik/neutron-failure-rate-older-stable-release?orgId=1
15:18:48 <bcafarel> ^ that will change soon :)
15:19:14 <slaweq> :)
15:19:50 <bcafarel> I did not see many failures last week, I think it was good on the stable front
15:20:04 <slaweq> yes, that's also my impression
15:20:13 <slaweq> more rechecks are on the master branch
15:21:09 <slaweq> anything else regarding stable branches for today?
15:22:49 <bcafarel> not from me
15:22:50 <slaweq> ok, so lets move on
15:22:58 <slaweq> #topic Grafana
15:23:07 <slaweq> #link http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:24:07 <slaweq> ovn-related jobs are a bit high in the check queue - around 40%
15:24:29 <slaweq> http://grafana.openstack.org/d/Hj5IHcSmz/neutron-failure-rate?viewPanel=16&orgId=1
15:24:58 <slaweq> now they are going down a bit but are still at the top of this graph
15:27:17 <slaweq> but I don't have any specific failure to check right now
15:27:36 <slaweq> maybe it's just because there were recently quite a lot of WIP or DNM patches related to the ovn driver
15:27:55 <slaweq> lets check that in the next days and we will see what to do with it
15:28:07 <bcafarel> +1
15:28:18 <ralonsoh> probably
15:28:25 <slaweq> other than that all looks ok'ish IMO
15:29:08 <slaweq> ok
15:29:21 <slaweq> anything else regarding grafana?
15:30:13 <ralonsoh> no
15:30:23 <slaweq> ok, lets move on
15:30:25 <slaweq> #topic fullstack/functional
15:30:50 <slaweq> I saw few issues in functional tests this week, like e.g.
15:30:53 <slaweq> https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/testr_results.html
15:31:15 <slaweq> but I only saw it once
15:31:22 <maciejjozefczyk> Yeah I noticed that too.
15:33:37 <slaweq> I don't see anything obvious in the log from this test: https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/controller/logs/dsvm-functional-logs/neutron.tests.functional.agent.l3.test_ha_router.L3HATestFailover.test_ha_router_failover.txt
15:34:06 <slaweq> but I will check it more deeply this week
15:34:22 <slaweq> #action slaweq to check failure in test_ha_router_failover: https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/testr_results.html
15:34:55 <slaweq> regarding fullstack tests, I still see the failing firewall test, like https://93b88ea1fe64fba121c6-f42d955827477dc68a274454ee4340d5.ssl.cf2.rackcdn.com/726168/2/check/neutron-fullstack/cb8503e/testr_results.html
15:35:11 <slaweq> I think we need to reopen the bug related to this and mark this test as unstable again
15:36:14 <slaweq> what do You think?
15:36:25 <ralonsoh> ok
15:36:55 <ralonsoh> shouldn't we have more debug info there?
15:37:24 <slaweq> yes, I will try to add some additional logging in this test
15:37:26 <ralonsoh> I remember we added the routes, the devices, etc.
15:37:32 <ralonsoh> (maybe I'm wrong)
15:37:44 <slaweq> ralonsoh: not to fullstack tests AFAIR
15:37:48 <slaweq> but I will check that
15:37:50 <ralonsoh> oops
15:38:10 <slaweq> #action slaweq to add additional logging for fullstack's firewall tests
15:38:26 <slaweq> #action slaweq to reopen bug related to failing fullstack firewall tests
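(Editor's note: a minimal sketch of what re-marking the flaky firewall test as unstable and adding extra debug logging could look like, assuming neutron's unstable_test decorator and oslo.log are used as in other neutron tests; the bug number, class name and test name below are placeholders, not the actual ones.)

    # Sketch only: placeholder bug number and test names.
    from oslo_log import log as logging

    from neutron.tests import base
    from neutron.tests.fullstack import base as fullstack_base

    LOG = logging.getLogger(__name__)


    class TestFirewallScenario(fullstack_base.BaseFullStackTestCase):

        @base.unstable_test("bug XXXXXXX: fullstack firewall test fails intermittently")
        def test_firewall_connectivity(self):
            # Log extra context up front so the next failure leaves more
            # to debug from than just the final assertion error.
            LOG.debug("Starting connectivity checks in %s", self.id())
            ...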
15:38:56 <slaweq> and that's all I have regarding functional and fullstack tests for today
15:39:16 <slaweq> lets move on
15:39:18 <slaweq> #topic Tempest/Scenario
15:39:31 <slaweq> here I found one new issue, "Address already allocated in subnet"
15:39:37 <slaweq> https://2f302d35d7c2b9201857-14c47b0c762b46266aadd7f2c624d382.ssl.cf5.rackcdn.com/665467/70/check/neutron-tempest-plugin-scenario-openvswitch/9514b9a/testr_results.html
15:40:04 <slaweq> maybe it's just an http request timeout on the client's side and when the request was retried, the allocation was already done in neutron
15:40:12 <slaweq> but IMO it's worth checking that
15:40:17 <slaweq> any volunteer for that?
15:40:28 <ralonsoh> me
15:40:33 <ralonsoh> but on Friday
15:40:48 <slaweq> ralonsoh: great, thx a lot
15:41:12 <slaweq> #action ralonsoh to check "Address already allocated in subnet" issue in tempest job
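(Editor's note: the client-side timeout theory mentioned above can be pictured with a short, purely illustrative openstacksdk sketch; the cloud name and the naive retry helper are made up for illustration and are not the actual tempest test code.)

    # Illustration of the suspected failure mode: the first port-create
    # commits in Neutron, but the HTTP response is lost to a timeout, so
    # the retried request re-sends the same fixed IP and gets rejected
    # with "Address already allocated in subnet".
    import openstack


    def create_port_with_retry(conn, network_id, subnet_id, ip_address, retries=1):
        """Create a port with a fixed IP, naively retrying on any SDK error."""
        for attempt in range(retries + 1):
            try:
                return conn.network.create_port(
                    network_id=network_id,
                    fixed_ips=[{'subnet_id': subnet_id,
                                'ip_address': ip_address}])
            except openstack.exceptions.SDKException:
                # If the first attempt committed before the response was
                # lost, this retry fails with a 409 conflict.
                if attempt == retries:
                    raise

    # conn = openstack.connect(cloud='devstack-admin')  # placeholder cloud name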
15:41:24 <slaweq> ok, and the last thing for today
15:41:26 <slaweq> #topic Periodic
15:41:36 <slaweq> thx maciejjozefczyk for fixing the fedora ovn job, it's now fine
15:41:49 <maciejjozefczyk> \o/
15:41:53 <slaweq> today I noticed that openstack-tox-py36-with-ovsdbapp-master is failing
15:42:05 <slaweq> Bug is reported https://bugs.launchpad.net/ovsdbapp/+bug/1879717
15:42:05 <openstack> Launchpad bug 1879717 in ovsdbapp "Neutron's unit tests are failing with ovsdbapp from master branch" [Undecided,New]
15:42:13 <slaweq> and otherwiseguy already proposed a patch for that
15:42:36 <maciejjozefczyk> Yes, looks like it's related to the autoindex feature recently merged...
15:42:56 <bcafarel> https://review.opendev.org/728306 ?
15:42:57 <slaweq> we caught it at the last minute because a new ovsdbapp release is proposed, and if we released it, the neutron gate would be doomed
15:43:36 <slaweq> bcafarel: https://review.opendev.org/729627
15:44:32 <slaweq> and basically that's all from me for today
15:44:38 <slaweq> anything else You want to discuss?
15:44:50 <bcafarel> thanks (and nice to see the -with-xx-master jobs catching this kind of issue!)
15:45:16 <slaweq> bcafarel: yes, good that we are checking them at least once a week :)
15:45:24 <bcafarel> that too :)
15:45:37 <bcafarel> and nothing else from me
15:45:38 <slaweq> btw I started a reorganisation of the zuul job definitions in neutron-tempest-plugin https://review.opendev.org/#/c/729567/
15:45:59 <slaweq> there is some issue there now, so I have to check it, but please be ready to review it when it passes zuul :)
15:47:33 <bcafarel> nice
15:47:33 <slaweq> if You don't have anything else, I will give You 13 minutes back
15:47:49 <slaweq> thx for attending and have a great rest of the week :)
15:47:51 <slaweq> o/
15:47:56 <bcafarel> o/
15:47:56 <slaweq> #endmeeting