17:02:02 #startmeeting refstack
17:02:03 Meeting started Thu Apr 24 17:02:02 2014 UTC and is due to finish in 60 minutes. The chair is davidlenwell. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:02:04 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:02:06 The meeting name has been set to 'refstack'
17:02:22 roll call?!?
17:02:30 hi
17:02:31 o/
17:02:39 Hello
17:02:42 o/
17:03:01 ___O___
17:03:09 ha ha rob is flying
17:03:27
17:03:51 hello
17:04:43 no praveen yet
17:05:00 What's the first topic?
17:05:22 on the agenda today: Tests, API Versioning, TCUP, packaging execute_test
17:05:51 #topic Tests
17:06:33 So it occurred to me yesterday while digging into some refstack code that we.. the people who are supposed to test things.. have written 0 tests to cover our own code
17:07:29 So I'm no longer going to approve code without test coverage as a starting point.. and we'll need to start working it in as we go for the rest of it ..
17:07:41 thoughts .. objections .. questions ???
17:07:49 Developers generally hate writing tests, and I understand. Unit tests are the first step.
17:08:26 not sure how to test some of the TCUP work
17:08:29 davidlenwell: will you be able to put in some sample tests?
17:08:38 I'd ask to hold off until after summit on the testing requirement
17:08:40 RaymondWong: yes, I will
17:08:57 and it is hard to test a GUI
17:09:00 (FWIW, I <3 tests)
17:09:53 davidlenwell, I am +1 on the requirement tho
17:09:54 unit testing just runs the code and ensures it has the right exceptions... the right values come in and out .. you can easily write tests for the web.py methods and the api methods .. mostly I'm interested in testing the api to stay in spec
17:10:06 GUI should be tested at the layer below the display (the interface to the rest of the system) until it is reasonably stable.
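[Editor's note] The sample tests davidlenwell offers to put in could follow the standard unittest pattern sketched below. The function under test here (`validate_result_payload`) is purely hypothetical — the log does not name refstack's actual API handlers — but it illustrates the "right values come in and out, right exceptions raised" style of unit test described above.

```python
import json
import unittest


def validate_result_payload(payload):
    """Hypothetical stand-in for an API-layer validator; the real
    refstack handler names are not shown in the log."""
    data = json.loads(payload)
    if "test_id" not in data:
        raise ValueError("missing test_id")
    return data


class TestResultPayload(unittest.TestCase):
    def test_valid_payload(self):
        # The right values come in and out.
        data = validate_result_payload('{"test_id": "abc", "passed": 998}')
        self.assertEqual(data["passed"], 998)

    def test_missing_test_id(self):
        # The right exception is raised on bad input.
        with self.assertRaises(ValueError):
            validate_result_payload('{"passed": 998}')


if __name__ == "__main__":
    unittest.main()
```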
17:10:12 +1 on API tests
17:10:13 zehicle_at_dell: we can discuss the timing of the requirement at a later time
17:10:26 davidlenwell, +1
17:10:39 +1 on unit tests and API tests
17:10:49 okay .. then next topic
17:11:03 #topic API Versioning
17:11:42 \o
17:12:29 So last night I threw this at the specs folder .. https://review.openstack.org/#/c/90044/
17:13:21 This might be related: I've been following the Tempest no-branch model discussion. I went to the QA IRC meeting and suggested a meeting between us and them to discuss how it would impact us, what to expect, how to work with it. I'm going to look at the spec now...
17:13:52 it's a first draft .. I'd appreciate a review from rob and catherineD and RaymondWong to make sure it's not missing any important api calls
17:14:23 rockyg: we should probably find time at the summit to sit down with them
17:14:56 davidlenwell: +1
17:15:17 davidlenwell, +1
17:15:52 catherineD and RaymondWong so you'll notice if you look at the spec I left out the get_script method..
17:16:47 as I mentioned last week I do not feel like it is a secure way of passing executable code into a container.
17:17:57 but it can guarantee the executable is at the same level as the Refstack code (in the case of local refstack w/ docker)
17:18:18 how about using a git pull?
17:18:23 RaymondWong: that is the entire point of versioning the api
17:18:39 zehicle_at_dell: I'd prefer it pip installed the tester
17:18:55 people may have downloaded and installed a local copy of refstack, and it may not match the one in git...
17:19:16 RaymondWong: where did they download it from ?
17:19:34 originally from github...
17:19:51 if I download it from master of github now... it may be changed tomorrow...
17:20:00 pip allows us to version control the test clients
17:20:13 if the api changes it will change the required version of the test client
17:20:16 I'm confused - git clone would be the way to get the code if you wanted it.
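[Editor's note] The pinning scheme davidlenwell describes — each API release ships a requirements pin for the tester it works with, so an out-of-date API keeps installing the tester it is known to match — can be sketched as below. The package name `refstack-tester` and the version numbers are assumptions for illustration, not real releases.

```python
# Hypothetical mapping from API version to the tester pin that would live
# in that API release's requirements.txt. An old API always resolves to
# the tester version it shipped with; updating the API updates the pin.
REQUIRED_TESTER = {
    "v1": "refstack-tester==0.9.2",  # assumed version, not a real release
    "v2": "refstack-tester==1.1.0",  # assumed version, not a real release
}


def pip_requirement_for(api_version):
    """Return the pip requirement line an installer would feed to pip."""
    return REQUIRED_TESTER[api_version]
```

This mirrors how python-novaclient stays in sync with the nova API: the client is versioned on PyPI and matched through requirements, rather than downloaded from the API itself.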
17:20:42 only if the code hasn't been updated in the same branch.
17:20:42 therefore if your api is out of date.. it's still requiring the older version of the tester
17:20:46 so they will be in line
17:21:04 does that make sense?
17:21:37 if you update your api.. it will then have the newer version of the tester in its requirements
17:21:57 but an old out of date api .. will use a version of the tester it works with .. always
17:22:18 davidlenwell: right, api and tester have to be matched.
17:22:42 it's how python-novaclient and the nova api stay in sync without having to force the client to be downloaded from the nova api
17:22:42 i am not sure how you specify it with git
17:22:49 Yeah. All APIs should be versioned. It reduces problems when you figure out what you did wrong the first time ;-)
17:23:03 you don't .. that's why we use pip and versioning
17:23:15 question - we're assuming that we'll keep both v1 and vNext working for now?
17:23:17 What davidlenwell said
17:23:24 * zehicle_at_dell jumps ahead
17:23:43 pip requirements allow you to be specific with the version of the client you are installing
17:23:55 It's possible the first GA release will be >1
17:24:10 we'll call it v1 anyways
17:24:21 Until v2
17:24:25 davidlenwell, +1
17:24:36 GA = API not changing contract
17:24:45 Yes.
17:24:56 RaymondWong: I know it sounds wacky but it does work and it's well practiced within openstack and in python in general
17:25:01 * zehicle_at_dell 's plane is starting to descend... may get booted out soon
17:25:19 so the next topic is
17:25:27 #topic TCUP
17:25:36 before rob gets booted
17:25:38 i agree with versioning, and to make sure the code versions match... we can work out the implementation details... probably i can understand it better when i see the code or spec.
17:25:41 10Q!
17:26:05 I've updated the spec based on Rocky's review and also added a graphic
17:26:06 So rob has put a lot of time into a spec for tcup
17:26:16 rob .. I reviewed it last night
17:26:18 Yay!
17:26:28 zehicle_at_dell: had a few notes
17:26:28 I'll review again.
17:26:47 cool, thanks. Should be able to update tonight
17:26:48 generally I think you are on the right track
17:26:59 right now, we're making sure that tempest runs manually from TCUP
17:27:05 catherineD: RaymondWong can you review his spec
17:27:07 so that I have a baseline
17:27:20 praveen_dell, is helping document that so people can test it
17:27:21 My requirements exercise has helped me focus a bit on possible holes in design
17:27:29 I want to make sure that we've got a baseline
17:27:41 +1 baseline
17:27:50 zehicle_at_dell: so you have baseline data?
17:27:55 and I'm working on executing davidlenwell 's code to run from env variables
17:28:01 zehicle_at_dell: have you tried using execute_test ?
17:28:06 So I can point to TCUP running Tempest
17:28:08 zehicle_at_dell: can you share that data?
17:28:14 w/o worrying about the config builder
17:28:22 it's documentation
17:28:36 basically, just how to run tempest, but in the context of the container
17:29:01 while I'm working on the config injection it occurs to me to make sure the Dockerbuild file is valid
17:29:02 #link https://review.openstack.org/88587
17:29:05 that can go in parallel
17:30:29 I think I'm very close to having David's code working - I just need to be in place to test
17:30:30 catherineD: review the spec and add your comments. If we can get all comments in today, Rob only needs one more pass, hopefully
17:30:33 and play
17:30:52 rockyg: Has your team been able to collect some Tempest data? I am interested in seeing some tempest test data
17:31:02 * zehicle_at_dell letting DefCore and that Dell stuff get in the way :(
17:31:09 In my env, I passed 998 test cases of 1296
17:31:18 I would like to see others' data
17:31:19 Not as yet. Is it ready to test again?
17:31:39 We have a Grizzly cloud, so we should get quite a number of fails
17:32:14 anyone try against trystack?
17:32:20 I was just going to complain that you guys are far behind .. but we're still on grizzly too
17:32:38 998 is a pretty good result.
17:32:48 davidlenwell, I have a topic for next meeting > we need to discuss the control process for some important JSON data like driver test info & must-pass test lists
17:32:58 davidlenwell: and rockyg: so you will not collect data?
17:33:00 I'd like to prep that w/ you
17:33:12 when david renamed execute_test, refstack (local w/ docker) doesn't work anymore. i can put in a temporary fix so it can run again, until someone works out the pip thing.
17:33:23 Yes, we will collect data. It just won't pass as many tests.
17:33:38 that's ok
17:33:48 catherineD: we're in the process of migrating to icehouse right now .. skipping havana .. piston will be posting data after that
17:34:08 rockyg: I can help your team to start testing ... I am eager to see data ..
17:34:11 rockyg: when refstack is working again, you can simply change the config to use the grizzly tempest url, and it can generate data for you.
17:34:13 * zehicle_at_dell thinks that's a reasonable idea
17:34:24 Kewl!
17:34:44 it would be really good for the API to accept which version you are testing - I'll check the spec and see
17:34:56 RaymondWong: I like your plan of temporarily fixing the docker build thing to work with the new code ..
17:35:29 question > there was a request for the output of TCUP to be available locally
17:35:48 should this be added to the requirements? I've been trying to have 0 leave-behind footprint
17:35:56 could easily be a flag
17:36:00 davidlenwell: I thought Refstack is based on Havana only for now ? When will Refstack support Icehouse?
17:36:02 agreed
17:36:04 Maybe we can get a havana devstack up quick to test against.
17:36:12 DefCore only needs Havana
17:36:13 catherineD: sooner than you think
17:36:27 We definitely need to be able to export TCUP data to a local location.
17:36:29 I suspect most of the Refstack users will want later releases
17:36:32 davidlenwell: great because we also need to test Icehouse
17:36:55 rockyg, I updated the spec to make it optional. Easy to add capability
17:36:56 zehicle_at_dell: catherineD icehouse is the present .. we have to support it
17:37:10 +1
17:37:25 * zehicle_at_dell putting seatbacks up....laptop closing
17:37:29 The requirements from the use cases say a user wants to compare runs. On a private cloud, that means local data
17:37:47 rockyg: that is covered by running your own copy of refstack
17:37:58 Right.
17:38:05 tcup doesn't need that
17:38:10 But saving the data...
17:38:38 i think it is better for users to install local refstack, then they can run multiple tests, store the data, and possibly download the data and compare.
17:38:38 the point of tcup is that it runs and posts its data back to a refstack api some place .. could be refstack.org .. could be your own copy of refstack
17:38:58 it's flexible that way ..
17:39:05 if you want to collate local data .. run refstack
17:39:09 Need to be able to specify where TCUP ships the data to when it finishes up....Ah. Okay.
17:39:15 if you just want to post test results to refstack .. use tcup
17:39:24 Is there a goal for Refstack (certification process) at summit
17:39:28 is tcup working?
17:39:31 We should make sure that is commented in the code for extraction to docs.
17:39:43 I left notes to that effect on the review of the spec
17:40:04 right now tcup doesn't work .. tcup is a spec
17:40:08 Thanks.
17:40:32 once the spec is complete we can make it work like designed in the spec
17:40:43 rob is toying with prototypes of tcup
17:40:45 then why don't we use something that works to collect data? Unless data collection is not important at this time?
17:41:07 I guess we are the only ones who have collected data now?
17:41:33 Is data collection important for DefCore to define "Core"?
17:41:53 data collection is important for a lot of reasons ..
that is one that defcore cares about
17:42:15 Yes. Data collection is very important. Which may be why it's taking so long to get to.
17:42:16 then can we concentrate on that aspect by using what is working
17:42:45 we wasted a lot of time (> 3 weeks) so far ..
17:42:52 i.e. install local refstack and run the test with docker.
17:43:11 I am going to try to convince a DB engineer to join our effort. That should help get some focus on data collection.
17:43:14 I know the code is not perfect and we can work on that but at the same time we should proceed with data collection
17:43:20 RaymondWong: that doesn't cover the requirements for massive data collection
17:43:34 rockyg: I will help as much as I can ...
17:43:54 catherineD: RaymondWong I understand that it is frustrating that we are not jumping in and using what we have already prototyped
17:43:54 davidlenwell: right, we are missing the "upload/sync data back to refstack.org" part.
17:44:04 RaymondWong: hold up
17:44:04 davidlenwell: right. catherineD, you a good db dev?
17:44:23 rockyg: you are going off topic .. hang on a sec
17:44:41 not a problem. TCUP.
17:44:44 the problem with the current code path is that it's not simple to install
17:45:03 davidlenwell: Exactly... Even though it is prototyped and not perfect it can produce data for DefCore
17:45:05 the idea of tcup is that we can have lots of people run it easily without having to install things
17:45:38 Really needs either a package or pip install
17:45:41 if we can get tcup working we'll have a lot more data than if we just collect data from the few people we can get to set up the current path
17:45:56 rockyg: that's actually the next topic
17:45:57 Raymond got a version of TCUP working .... but we blocked that
17:46:14 catherineD: that version of tcup didn't meet the requirements
17:46:21 did anyone look at Raymond's code?
17:46:36 catherineD: yes it tested .. but depended on too many things being set up right ..
17:46:39 I did review his code
17:46:43 But it is TCUP, it can collect data right?
17:46:47 catherineD: perhaps we can work two paths: use a working version that doesn't meet spec to experiment with data collection/format
17:46:57 rockyg: +1
17:47:06 rockyg: catherineD going two paths isn't the right answer
17:47:38 please read rob's latest spec for tcup .. he does a very good job explaining the requirements
17:47:41 We just want to focus on getting a usable data set, taking care of skipping tests, etc
17:47:46 i don't get which spec/requirement it is not meeting... maybe some simple fix.
17:47:58 Data is data ...
17:48:13 Okay. Will review. Maybe catherineD and I can come up with a data spec.
17:48:14 okay .. let's table this discussion until after you guys all read and review rob's latest spec
17:48:45 Okay. Next topic?
17:49:04 if your code meets the requirements I will approve it. but until the spec is approved and the requirements are agreed upon I am not approving any code that covers that use case
17:49:23 davidlenwell: +1
17:49:33 We're not talking code. We're talking spec.
17:49:37 yes
17:49:40 yes
17:49:53 so please .. use the spec review process to give your opinions
17:50:04 they are valid .. I want you to contribute ..
17:50:17 I do not want divergent code paths just so we can collect data faster
17:50:27 are we all on the same page?
17:50:37 Yes.
17:51:04 catherineD: RaymondWong?
17:51:25 #topic packaging execute_test and renaming it to refstack-tester
17:51:41 yes, no TCUP testing until later... but we can still help rocky or others to set up local refstack to start testing (if they want to)
17:51:57 Yes, packaging.
17:52:04 +1 for future code ... -1 for data collection ... I think we should execute both in parallel ... using the existing code to collect data
17:52:25 catherineD: we can discuss that offline if you wish
17:52:46 8 min. Let's get this packaging stuff out there.
17:52:47 phone call later today?
17:52:48 Let's do ...
17:53:11 maybe after this IRC ends ..
17:53:14 okay.. so packaging ..
17:53:28 catherineD: I have to commute to the office after this meeting .. so maybe in an hour or so ?
17:53:35 sure sure
17:53:39 agreed
17:53:53 alright ... let's talk about packaging execute_test..
17:54:08 I've already done some of the initial things that will be needed to spin this code off
17:54:29 it has its own requirements.txt file and setup.py
17:54:46 for now because having lots of repos is a pain we'll keep it in the tools folder in refstack
17:54:59 agreed.
17:55:17 great .. so you will return execute_test to the tools folder ?
17:55:25 it's already there in a folder
17:55:34 no, it is in the tools/execute_test folder
17:55:38 it will stay where it is
17:55:48 but the installer will put it in the path
17:55:59 so the full path doesn't need to be used to execute
17:56:00 I thought you moved that out ...
17:56:29 until we're further along I don't want a lot of repos to manage
17:56:43 so we'll leave it where it is and put some work into the installer
17:57:05 so the installer for execute_test requires tempest havana stable right now ..
17:57:17 So, when it gets moved the installer just needs an update for the new location?
17:57:33 sure .. it's just updating a path
17:58:23 it will just make a symlink in /usr/bin to /what/ever/path/python/libs/get/installed/to/refstack/tools/execute_test
17:58:26 K. What form will the installer be in? Python? pkg? yum?
17:58:38 python package
17:58:46 so it will end up eventually in pypi
17:58:50 Good. Not OS dependent
17:58:51 so it can be pip installed
17:58:59 Really good.
17:59:16 I'll even do the extra work on that package to make sure it's python 2.6 compatible
17:59:22 Will there be a way to get a copy for isolated clouds?
17:59:47 so rockyg.. the way that works is that you can download the egg from pypi and install it
17:59:56 That's good.
18:00:06 davidlenwell: +1
18:00:09 or you can do what a lot of folks do in isolated envs and use a pip wheel to store dependencies
18:00:11 I wondered what those eggs were :-)
18:00:38 which is essentially a precompiled cache of all the stuff you need installed
18:00:52 Quick before everyone leaves: I'd like to have a F2F to get the req matrix in better shape next week. Anyone interested?
18:01:01 hi
18:01:06 neutrons here?
18:01:12 SumitNaiksatam: hi
18:01:12 I guess we're out of time
18:01:19 SumitNaiksatam: hi
18:01:20 #endmeeting