08:00:12 #startmeeting Functest weekly meeting April the 19th
08:00:12 Meeting started Tue Apr 19 08:00:12 2016 UTC. The chair is morgan_orange. Information about MeetBot at http://wiki.debian.org/MeetBot.
08:00:12 Useful Commands: #action #agreed #help #info #idea #link #topic.
08:00:12 The meeting name has been set to 'functest_weekly_meeting_april_the_19th'
08:00:26 #topic call roll
08:00:32 #info Morgan Richomme
08:00:35 #info Viktor Tikkanen
08:00:41 #info Jose Lausuch
08:00:43 #info Juha Kosonen
08:00:49 #info David Blaisonneau
08:00:51 #info CG_Nokia (Colum Gaynor)
08:00:52 jose_lausuch: only people who do nothing do not break things...
08:00:53 #info Juha Haapavirta
08:00:55 #info
08:01:13 #info lhinds
08:01:14 #info agenda is here https://wiki.opnfv.org/display/functest/Functest+Meeting
08:01:21 #info serena
08:01:26 any topic you want to add?
08:01:30 I missed a quote
08:01:30 haha
08:01:30 #info raghavendrachari
08:01:36 do we have a GTM?
08:01:44 not planned for today
08:01:44 lhinds: nope
08:02:02 #topic action point follow-up
08:02:17 I think there were no APs last week
08:02:23 #link http://ircbot.wl.linuxfoundation.org/meetings/opnfv-testperf/2016/opnfv-testperf.2016-04-12-08.00.html
08:02:30 #info no APs last week
08:02:39 #topic test release criteria
08:02:45 I prepared some slides, I can share them in here
08:02:46 besides the current tasks everyone is working on :)
08:02:55 Is there an audio/video meeting? The one on https://wiki.opnfv.org/display/functest/Functest+Meeting doesn't work for me
08:03:10 jlinkes_: only IRC this week
08:03:19 ok
08:03:50 lhinds: can you put your slides on the wiki?
08:03:52 if we need to, maybe we can arrange a GTM (audio/video) for next week
08:04:05 regarding test release criteria
08:04:38 morgan_orange, ok, it's on Google Docs
08:04:47 #info page created for each project to provide input about the release criteria https://wiki.opnfv.org/display/SWREL/Test+Release+Criteria
08:05:01 maybe I should AP myself to send a reminder
08:05:03 jose_lausuch, I think that would be better for me, as you will likely all have questions
08:05:43 #action jose_lausuch send reminder to project test PTLs to fill in the tables for release criteria
08:06:04 morgan_orange: we need to take this up again during the next test group meeting on Thursday
08:06:17 hope we will get some feedback, and not only the week before the freeze :)
08:06:25 the idea is that the test subgroup has to come to a decision
08:06:28 about which tests have to pass
08:06:35 maybe we can talk now about what we want from Functest
08:06:52 #action morgan_orange prepare presentation for test meeting on test criteria
08:07:05 presentation?
08:07:26 shall we not explain what we did and hope the other test projects will do the same...
08:07:39 yes, but that's a separate thing
08:07:52 ok
08:07:54 we had the idea to present our tools/framework to the community
08:07:58 especially for the feature projects
08:08:14 yes, it is a good idea
08:08:15 but weeks are passing and no one speaks up :)
08:08:21 we proposed it 2 weeks ago
08:08:32 let's do it this week...
08:08:38 Bryan tried to come up with a page to collect input
08:08:50 https://wiki.opnfv.org/display/SWREL/Feature+Test+Input
08:09:00 the Functest table is pretty clear and reflects our roadmap
08:09:02 and those links have been sent to the community
08:09:08 but no one has put anything in yet
08:09:17 looks like in Brahmaputra
08:09:46 we did not get feedback, or only very late...
08:10:04 ok, let's focus on Functest
08:10:15 sorry, connection drop
08:10:31 as far as I can see, most of the criteria for the internal features are already defined
08:10:50 do we miss any test case?
08:10:53 there are still questions on new test cases, but they could be answered once the implementation is clearer
08:11:08 for rally-full and vIMS I got some feedback
08:11:13 about being part of the daily runs
08:11:14 regarding the discussions in Espoo, we will not get new open source VNFs from 01.org
08:11:30 I said that we would reduce the frequency, and they wouldn't be run every day
08:11:30 release planned at the end of the year
08:11:38 of course there are other open source VNFs
08:11:43 but then I got comments about being part of the criteria or not... if it makes sense..
08:11:49 but they will not come from the Intel project for the C release
08:12:14 morgan_orange: I saw the email, we can't include those VNFs for Colorado I think :)
08:12:19 maybe for D-river
08:12:53 jose_lausuch: regarding the question, for me it makes sense for them to be part of the release even if we do not run them systematically on each run
08:13:04 morgan_orange: yes but
08:13:18 release != release criteria
08:13:27 everything can be part of the release
08:13:32 but I'm talking about the criteria
08:13:33 for example
08:13:53 the decision about some performance test projects is still pending
08:13:57 about being part of the release criteria
08:14:04 of course they'll be part of the release
08:14:18 but we need to come up with something to evaluate the deployment
08:14:43 for performance I can understand the dilemma, but for us most of the tests are "it works or it does not work"... we do not evaluate performance
08:14:48 that's why I put at the beginning "IMPORTANT: Not being part of the release criteria does not mean not being part of the release. Those test cases that won't be part of the criteria can still be run on non-CI PODs and released in Colorado."
08:14:58 that's right
08:15:29 the problem is that our Functest job takes 3 hours :)
08:15:41 and we will include more features
08:15:55 but with our new mechanisms (tiers + trust indicator) we should reduce that
08:16:12 it should, and I think it will
08:16:33 juhak: showed yesterday that a basic Rally run may last 28 minutes...
08:17:00 of course the scope is reduced and we will still trigger Rally tests that last longer, but with the mechanism we should be able to save time...
08:17:09 I think that's too long for a smoke test, and as he pointed out, by removing some scenarios we can achieve 10 min for rally-smoke, which is perfect
08:17:23 so for the moment I would not consider that, just think of the test criteria as defined
08:17:46 if it takes too long (we should be reasonable)... then we will reduce our scope / the frequency of some tests / ...
08:17:47 (health / smoke / openstack ready / VNFs / features)
08:17:47 Is it so that the goal is to have 5 tiers?
08:17:54 I can assume that the security scan will also take time
08:18:03 viktor_nokia: yes
08:18:20 #info discussion on test criteria https://wiki.opnfv.org/display/SWREL/Test+Release+Criteria
08:18:31 it takes about 30 seconds per node
08:18:39 viktor_nokia: I started with this config file https://gerrit.opnfv.org/gerrit/#/c/12165/
08:18:42 #info release != release criteria
08:18:51 but that was against a virtual machine with just 1026 of RAM, 1 virtual CPU
08:18:59 #info everything can be in the release
08:18:59 *1028
08:19:13 lhinds: ok, that's not much
08:19:29 #info test criteria are indicated to give an indication of whether the test can be considered successful or not
08:19:53 #info we know that timing is challenging and currently 3h is too long
08:20:29 #info that is why 2 mechanisms are planned for Colorado to reduce the duration of the run and not run tests systematically (if results are stable)
08:20:46 #info test criteria = group of tests that need to succeed to validate the release
08:21:05 #info first mechanism: slicing tests into 5 tiers: health / smoke / openstack ready / VNFs / features
08:21:32 #info second mechanism: use a trust indicator for each test case to decide whether or not to trigger it
08:22:01 #info not possible to estimate the real impact of these mechanisms yet... but it would be one of the main contributions for the C release...
08:22:04 #info trust indicator to be used on the long test cases
08:22:10 yes
08:22:30 #info first tiers are short and will be run systematically
08:22:44 shall we keep policytest in the list?
08:22:57 my view is that the project is no longer active
08:23:21 maybe we have to remove it
08:23:25 and we got new requests
08:23:30 from movie, models, ...
08:23:35 #action morgan_orange contact policytest to see if they are still alive
08:24:08 that is good news... they contacted us early...
08:24:20 but I assume it is just to indicate that they want to perform tests...
08:24:32 yes, but there is not much information anywhere
08:24:35 just the intention :)
08:24:45 so we don't know how much work it will be for them/us?
08:24:49 to support, especially
08:24:55 yes, that is why we need to explain what we did with doctor/promise/bgpvpn...
08:25:39 #action morgan_orange review new demands from feature projects and try to define a way of working
08:26:00 there are still "to be decided" entries in the table
08:26:01 I already sent an email to the community asking for that
08:26:03 good luck :)
08:26:12 maybe it's better to address the PTLs directly..
08:26:14 shall we decide?
08:26:25 mailing list mails are often forgotten
08:26:50 there is a new release manager and a release meeting... let's try all the different options :)
08:27:01 we can decide, but I'd like to bring it also to the test group, to see their view as well
08:27:05 shall we add a line for rally_smoke / rally_full
08:27:12 like for tempest
08:27:16 yes, I forgot that!
08:27:17 :)
08:27:28 and for tempest smoke, I put 90%, but I'd say 100%!
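[Editor's note: the two mechanisms logged above (tier slicing and the per-test trust indicator) can be illustrated with a minimal sketch. The tier names come from the meeting minutes; everything else below — class and function names, the 0.8 skip threshold, the trust-update policy — is an assumption for illustration, not the actual Functest implementation.]

```python
# Sketch of a tier-based runner that may skip long, stable cases based on a
# trust indicator. Tier names are from the meeting; all other details are assumed.
from dataclasses import dataclass
from typing import Callable, List

TIERS = ["health", "smoke", "openstack_ready", "vnf", "features"]

@dataclass
class TestCase:
    name: str
    tier: str
    run: Callable[[], bool]          # returns True on success
    trust_indicator: float = 0.0     # 0.0 = new/unstable, 1.0 = fully trusted

    def update_trust(self, passed: bool, step: float = 0.1) -> None:
        # One possible policy: increase trust on success, reset it on failure.
        self.trust_indicator = min(1.0, self.trust_indicator + step) if passed else 0.0

def run_tiers(cases: List[TestCase], last_tier: str = "features",
              skip_threshold: float = 0.8) -> bool:
    """Run tiers in order; long cases above the trust threshold may be skipped."""
    overall_ok = True
    for tier in TIERS[: TIERS.index(last_tier) + 1]:
        for case in (c for c in cases if c.tier == tier):
            if tier in ("vnf", "features") and case.trust_indicator >= skip_threshold:
                print(f"SKIP {case.name} (trust={case.trust_indicator:.1f})")
                continue
            passed = case.run()
            case.update_trust(passed)
            overall_ok &= passed
            print(f"{'PASS' if passed else 'FAIL'} {case.name}")
        # the first tiers are short and always run; a failed health tier stops the run
        if tier == "health" and not overall_ok:
            return False
    return overall_ok

if __name__ == "__main__":
    cases = [
        TestCase("vping_ssh", "health", lambda: True),
        TestCase("tempest_smoke", "smoke", lambda: True),
        TestCase("vims", "vnf", lambda: True, trust_indicator=0.9),
    ]
    run_tiers(cases)
```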
08:28:20 100% should be the target... especially for smoke, we almost have 95% on the customized list
08:29:30 for the moment I would suggest including all the test cases; we will adapt if our 2 mechanisms do not save enough time
08:29:37 100% can be reached but requires some work from the installers
08:29:55 I'd like to see 100% too :)
08:30:34 for tempest-full maybe it's fine with 80%, but a smoke test should provide good results
08:30:43 if you like, 95%, but ideally it should be 100%
08:31:24 let's announce 100% (that is our view to ensure sustainability and homogeneous results) - we will see if we need to move it back to 95% to be pragmatic..
08:31:33 anyway the release will be pronounced.. :)
08:31:42 +1
08:32:02 #action viktor_nokia set tempest smoke to 100%
08:32:20 #action juhak add line rally_smoke / rally_full
08:32:30 idem for rally, rally smoke should be 100%
08:32:38 and 90% for the full suite
08:32:42 makes sense to me
08:32:46 oops sorry
08:32:47 too late
08:32:52 np
08:32:53 I added it :)
08:32:59 hi there
08:32:59 =)
08:33:05 hi boris-42
08:33:11 boris-42: :)
08:33:23 any question on test criteria for release C?
08:33:38 why not 100% success rate?)
08:34:05 it should be...
08:34:09 boris-42: we will have a rally smoke suite and it will be 100%
08:34:19 but for the full suite... problems can happen with the deployment
08:34:23 we want to be a bit flexible
08:34:28 (for now)
08:34:36 maybe in future releases we force it to 100%
08:35:16 #topic Security testing
08:35:38 lhinds: can you give us a status and share the link of the doc you want to share?
08:36:07 #link https://wiki.opnfv.org/display/functest/Functest+Security
08:36:09 #info code for the OpenSCAP scan merged yesterday https://gerrit.opnfv.org/gerrit/#/c/12411/
08:36:16 slide deck embedded ^^
08:37:41 Status: I have the full remote scanning working in my lab, the next APs are to integrate with Functest's main script, and work on how to hop across nodes either in my own manner or with flash test.
08:38:13 sounds good
08:38:27 so you connect to the controllers/compute nodes
08:38:28 This is where I need to ask for advice on setting up a test environment to do the second AP, in that can I replicate this locally or would I need a lab?
08:38:36 jose_lausuch, yes
08:38:52 and you check the keystone.cfg and sshd config
08:38:55 as a side note, we can scan anything we want, but plan to start with the OS nodes
08:39:01 ok
08:39:07 jose_lausuch, those are just two of hundreds
08:39:09 sounds good to start
08:39:12 we can include more things :)
08:39:18 sure
08:39:21 yes, those are just examples
08:39:43 when you say "install pkg deb" or centos
08:39:47 do you have access to OPNFV labs?
08:39:48 is that to be installed in the container?
08:39:48 it also checks for CVEs, up-to-date patches, pending reboots.. a lot of stuff
08:39:51 or on the nodes?
08:40:17 jose_lausuch, the plan is to support RPM and DEB, but starting with RPM
08:40:28 morgan_orange, not yet
08:40:45 jose_lausuch, no. Nothing in the container
08:41:05 the script remotely installs the RPM (scanner), performs the scan on the remote node
08:41:10 lhinds: how are we gonna handle installing stuff on the SUT?
08:41:13 ok
08:41:14 downloads the report to the container
08:41:21 and then scrubs it clean (if we want)
08:41:38 only need the requirements, which we discussed on gerrit
08:41:38 so we need root access to the nodes
08:41:53 jose_lausuch, yes, or a sudo user with WHEEL ALL
08:42:00 ok
08:42:14 nice
08:42:53 I think the connection topic just needs work, we will work something out I am sure.
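[Editor's note: the remote scan workflow described above (install the scanner RPM on the node, run the scan, pull the report into the container, scrub clean) can be summarised in a short sketch. The node address, the OpenSCAP profile name and the SCAP content path are assumptions for illustration; the real implementation is in the Gerrit change linked in the minutes.]

```python
# Hypothetical outline of the remote OpenSCAP scan flow discussed above.
import subprocess

NODE = "root@192.0.2.10"                                        # example node address
CONTENT = "/usr/share/xml/scap/ssg/content/ssg-centos7-ds.xml"  # assumed SCAP content
REPORT = "/tmp/openscap-report.html"

def ssh(cmd: str) -> int:
    """Run a command on the remote node over SSH and return its exit code."""
    return subprocess.call(["ssh", NODE, cmd])

def run_scan() -> None:
    # 1. install the scanner and security guide content (RPM-based node assumed)
    ssh("yum install -y openscap-scanner scap-security-guide")
    # 2. run the evaluation and write an HTML report on the node
    #    (oscap returns non-zero when rules fail, so the exit code is not checked here)
    ssh(f"oscap xccdf eval --profile standard --report {REPORT} {CONTENT}")
    # 3. download the report into the Functest container
    subprocess.call(["scp", f"{NODE}:{REPORT}", "."])
    # 4. optionally scrub the node clean again
    ssh(f"rm -f {REPORT}")
    ssh("yum remove -y openscap-scanner scap-security-guide")

if __name__ == "__main__":
    run_scan()
```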
08:43:27 regarding the lab: the Intel lab dedicated to Apex is maybe the best place to start?
08:43:42 also, if we want to add checks, we can upstream them to the SCAP repos
08:44:02 morgan_orange, sounds good. would that be Tim Rozet and co?
08:44:07 it would be great to have a list of what we can include, and then we pick some things
08:44:26 jose_lausuch, that may be possible
08:44:33 the report is nicely formatted HTML
08:44:40 but there is also an XML version
08:44:56 lhinds: yes, trozet and dan are the right contacts
08:44:57 we can then parse those delimiters into the dashboard
08:44:59 lhinds: that HTML can be pushed to artifacts
08:45:08 +1
08:45:10 and then we need a simple text report for our output on Jenkins
08:45:14 jose_lausuch, that would be an easy way to start
08:45:18 and also a way to push it to the dashboard
08:45:24 on the dashboard, we can just push the scan duration and an overall status
08:45:25 maybe duration and % of tests passed/failed
08:45:34 the HTML artifact looks fine
08:45:54 yep, it has pie charts and other elements to show the overall status
08:46:03 lots of eye candy :P
08:46:48 let's first fix the integration and the automation on one installer, then try to generalize it. Dashboarding will not be difficult :)
08:46:50 #action luke to contact Intel labs / Tim Rozet for security test integration
08:47:01 morgan_orange, agree
08:47:31 that would be a good way forward
08:47:41 yes
08:47:48 with David_Orange we should be able to test it locally on a joid solution
08:48:05 jose_lausuch: could probably test it on fuel, and May-meimei on compass
08:48:16 morgan_orange: yes
08:48:40 morgan_orange, there are DEB packages for Ubuntu, just the actual SCAP content might need some attention
08:48:51 but there is SCAP content out there
08:49:02 ok
08:49:15 or someone could show it some love and give it a review. I will try of course too
08:49:32 any other question on the security suite?
08:49:45 10 minutes left
08:49:53 #topic functest offline
08:50:12 I added the topic because we started discussing with viktor_nokia about having Functest offline for Colorado
08:50:18 an old story based on some frustration during the Düsseldorf event where there was no connection
08:50:24 it is possible, we just need to do some adaptations in some scripts
08:50:37 so it's a request for all the "test owners" to try to pre-install as much as possible
08:50:47 for vIMS I sent an email to Valentin, but he is not available
08:51:01 for vIMS, as far as I know the way everything is done, it does not sound very easy... but I have to discuss with Valentin
08:51:05 for rally and tempest we already pre-install the stuff
08:51:27 ok
08:51:37 BTW, more generally, "offline" can be a goal for installers as well
08:51:46 I pushed a patch yesterday about it as well
08:51:59 viktor_nokia: some of them can deploy offline already
08:52:15 but I don't know all the details
08:52:19 viktor_nokia: do you think that it should be reported to Genesis as a requirement?
08:52:34 it may make sense
08:52:47 we can try :D
08:53:14 #action morgan_orange viktor_nokia contact Frank to see if offline could be considered as a requirement for the C release
08:53:16 but I don't think it's that easy for some of them
08:53:21 but let's see
08:53:25 but for us, I think we can manage
08:53:49 we need to identify what else needs internet
08:53:58 but then the first security scan will tell you that there are security issues in your "old" libraries
08:54:21 ?
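[Editor's note: pushing only the scan duration and an overall pass/fail ratio to the dashboard, as suggested above, could look roughly like the sketch below. The endpoint URL and the payload fields are assumptions for illustration, not the actual results API used by the project.]

```python
# Hypothetical sketch: push a scan summary (duration + pass rate) to a results API.
import requests

RESULTS_API = "http://testresults.example.org/api/v1/results"   # assumed endpoint

def push_scan_summary(duration_s: float, passed: int, failed: int) -> None:
    total = passed + failed
    payload = {
        "project_name": "functest",
        "case_name": "security_scan",
        "details": {
            "duration": duration_s,
            "pass_rate": round(100.0 * passed / total, 1) if total else 0.0,
            "status": "PASS" if failed == 0 else "FAIL",
        },
    }
    resp = requests.post(RESULTS_API, json=payload, timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    push_scan_summary(duration_s=42.0, passed=180, failed=3)
```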
08:54:41 didn't get that
08:55:13 I just imagined that if we pre-install everything, we will not necessarily be up to date on all the libraries
08:55:27 but np
08:55:30 mmm, we build the Docker image constantly
08:55:36 like every day
08:55:42 that should be fine I think
08:56:07 ok
08:56:11 #topic AoB
08:56:26 #info yardstick planned to create a CLI, I think it is a good idea...
08:56:32 functest show testcases
08:56:37 functest show tools
08:56:44 functest run rally-smoke
08:56:50 functest getenv
08:56:51 ...
08:57:04 that would be nice
08:57:07 we did not plan that but it could be interesting
08:57:25 functest run tier-1
08:57:26 ./run_tests.sh --> functest
08:57:27 :)
08:57:30 yes
08:57:31 the idea is a good one
08:57:41 good idea, so we don't need to go to the exact path to run scripts
08:57:55 do we all agree?
08:57:58 I vote +1 :)
08:57:59 yes
08:58:01 +1
08:58:03 +1
08:58:03 we should add the descriptions of the test cases (for the moment stored in the DB, but they shall be available offline)
08:58:04 +1
08:58:20 morgan_orange: I'm working on that as well, for the tiers
08:58:27 we'll have 1 yaml file with all the tests
08:58:41 I think that's a new epic!
08:58:44 functest cli
08:58:46 ok, we will "just" have to be in sync
08:58:49 yes
08:58:57 #action morgan_orange create functest cli epic
08:59:22 any volunteer to make a first study of what we need (wiki) and then implement it?
08:59:43 raghavendrachari: may I action you for that?
08:59:44 btw, for everyone working on something: if you want to add new things in the repo related to an epic, it's good to create a "TASK" in JIRA yourselves and relate it to the existing EPICs we have
08:59:50 yes
08:59:52 so you are also mentioned in the JIRA reports
08:59:54 do it
09:00:02 #action morgan_orange assign raghavendrachari for CLI
09:00:03 don't create SUB-TASKs, since they can't be related to EPICs
09:00:06 but TASKs can
09:00:18 #info I also created versions in JIRA
09:00:28 when creating a JIRA ticket, use the Colorado / Brahmaputra version
09:00:38 ok, it is already time
09:00:38 yes
09:00:46 2 minutes left if you want to share something
09:01:01 any comment/remark/question?
09:01:55 please review the ODL test suite evolution ASAP
09:02:06 as for functest-156, since there is no such thing as flask-restful-swagger, I need to write one myself, is that ok?
09:02:17 well no
09:02:21 https://jira.opnfv.org/browse/FUNCTEST-177
09:02:42 raghavendrachari: I think we can add all of them
09:02:47 with priority to the L3 stuff
09:02:55 openflow and neutron would be nice ones too
09:02:57 ok ..
09:03:09 morgan_orange no?
09:03:11 serena-zte: I saw something I put in the backlog, let me try to find it again. If it does not exist, do not develop it specifically
09:03:55 serena-zte: oops, not in the backlogs..
09:04:19 but swagger-ui cannot be used directly, it needs to be integrated with the web framework
09:05:55 serena-zte: let's continue offline
09:06:02 ok
09:06:37 thank you for attending, have a good week, see you next week
09:06:40 #endmeeting
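[Editor's note: the CLI idea discussed under AoB could be prototyped along these lines. The command names (show testcases, show tools, run <case>, run tier-N, getenv) come from the log above; the argparse layout, the sample data and the dispatch stubs are assumptions, not the CLI that was eventually built.]

```python
# Minimal sketch of the proposed "functest" command-line interface.
import argparse

# Sample data; in the discussion above these would come from the single YAML file.
TESTCASES = {"healthcheck": "tier-0", "tempest_smoke": "tier-1", "rally-smoke": "tier-1"}
TOOLS = ["rally", "tempest", "robot", "openscap"]

def main() -> None:
    parser = argparse.ArgumentParser(prog="functest")
    sub = parser.add_subparsers(dest="command", required=True)

    show = sub.add_parser("show", help="list test cases or tools")
    show.add_argument("what", choices=["testcases", "tools"])

    run = sub.add_parser("run", help="run a test case or a whole tier")
    run.add_argument("target", help="test case name or tier-N")

    sub.add_parser("getenv", help="display the current test environment")

    args = parser.parse_args()
    if args.command == "show":
        items = TESTCASES if args.what == "testcases" else TOOLS
        print("\n".join(items))
    elif args.command == "run":
        # would dispatch to the tier runner / the run_tests.sh equivalent
        print(f"running {args.target} ...")
    elif args.command == "getenv":
        print("INSTALLER_TYPE=?, DEPLOY_SCENARIO=?, ...")

if __name__ == "__main__":
    main()
```

With something like this, "functest show testcases" or "functest run rally-smoke" could be invoked from anywhere, replacing the need to cd to the repo and call run_tests.sh directly.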