16:00:08 #startmeeting OPNFV BGS Arno release readiness - daily check in
16:00:08 Meeting started Wed May 13 16:00:08 2015 UTC. The chair is frankbrockners. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:00:08 Useful Commands: #action #agreed #help #info #idea #link #topic.
16:00:08 The meeting name has been set to 'opnfv_bgs_arno_release_readiness___daily_check_in'
16:00:13 <[1]JonasB> #info Jonas Bjurel
16:00:13 #info Frank Brockners
16:00:20 #info Fatih Degirmenci
16:00:47 #info Peter Bandzi
16:01:05 #info Tim Rozet
16:02:01 #info Jose Lausuch
16:02:35 we're starting to have a quorum
16:03:00 let's get started with brief updates on progress - so that we have input for the TSC meeting tomorrow
16:03:24 let's do the usual order - functest - hardware (if any) - POD1 & POD2
16:03:36 #topic updates on functional testing
16:03:48 Jose, could you give us a brief status?
16:03:55 #info POD1 freshly installed, I'll run tests today and tomorrow
16:04:03 #info the node will be released to BGS on Monday
16:04:50 <[1]JonasB> #info From Monday on we will have nightly re-deploys
16:04:51 #info Morgan Richomme
16:05:09 do we have to repeat what we have said in the other meeting? :)
16:05:46 which other meeting? fine if you have a pointer - so no need to repeat
16:05:47 <[1]JonasB> Yes, I'm afraid the other meeting was only limited to Fuel :-)
16:05:57 #info test status page updated
16:06:05 #link https://wiki.opnfv.org/r1_tempest
16:06:12 #link https://wiki.opnfv.org/r1_rally_bench
16:06:26 oh, thanks, I was about to copy-paste it
16:06:27 will see with fdegir to integrate display of results in CI
16:06:40 #link http://opnfv.orange.org/test/opnfv-nova-13052015_0542.html
16:06:54 awesome tables - kudos!
16:06:59 #link http://opnfv.org/test/ (sic: http://opnfv.orange.org/test/)
16:07:20 do we have something similar for robot - or are those tests "all green"?
16:07:25 #info vPing failed on POD 2 but could be due to a previous test (will check and clean existing networks)
16:07:30 * fdegir morgan_orange: please share the utils you have to generate that report
16:07:35 * fdegir offline
16:07:44 pbandzi: can I clean all the networks? (I do not know if they were all created by me...)
16:08:06 fdegir: reports are automatically generated by Rally
16:08:07 morgan_orange: sure, you can clean them all
16:08:23 morgan_orange: ok, will take a look at them
16:08:53 on my earlier question: will we have a r1_robot wiki?
16:09:14 frankbrockners: yes, we will have one
16:09:29 same for vPing (even if it will be limited)
16:09:48 #info functest doc updated
16:09:58 * ChrisPriceAB arrived late
16:10:05 #info JIRA on config set to resolved
16:11:27 #info there are plans for a "r1_robot" wiki to show results similar to tempest and rally (per pbandzi)
16:11:57 done with functest?
16:12:19 seems so
16:12:25 yep, I think
16:13:27 do we have anything new wrt HW config and setup? From what I know this is not the case, but checking....
16:13:55 #info docs update, formatting and content completion ongoing across teams.
16:14:03 * ChrisPriceAB just saying
16:14:05 does not seem to be the case indeed...
16:14:16 ok - let's do docs then, ChrisPriceAB
16:14:22 #topic updates on docs
16:14:34 Oh, I just did...
16:14:55 anything else but "ongoing"?
16:15:07 #info docs update, formatting and content completion ongoing across teams (per ChrisPriceAB)
16:15:16 #info docs identified and content completion ongoing.
16:15:37 #info focus on terminology and facts in rags
16:15:40 Bus
16:15:51 ... Autocomplete
16:15:52 do we have an "inventory" by now of what should be there - compared to what is there?
16:15:54 hi guys
16:16:13 Hmm, not documented, currently in mail threads
16:16:35 ChrisPriceAB: interested in doing such an inventory?
16:17:02 #info will provide an inventory and JIRA jobs by Monday
16:17:11 thanks Chris
16:17:17 done with docs?
16:17:19 Actions on me for weekend work! Yay
16:17:27 Done
16:17:31 * frankbrockners what weekend?
16:17:44 4-day weekend here
16:18:00 #topic updates on POD1 and POD2 autodeployment
16:18:15 [1]JonasB: any updates on POD1?
16:18:34 <[1]JonasB> #info POD1 autodeploy driven by Jenkins complete
16:18:42 kudos!
16:18:49 <[1]JonasB> #info Healthchecks all show green
16:19:10 <[1]JonasB> #link https://build.opnfv.org/ci/view/genesis/job/genesis-fuel-deploy/7/consoleFull
16:19:28 <[1]JonasB> #info functest will start working on POD1
16:20:10 <[1]JonasB> #info We will not run nightlies on POD1 before the night between the 18th and 19th, to give functest air time
16:20:49 It's mine! :D
16:20:57 <[1]JonasB> #info A patch was submitted to make vPing work for Fuel; one more patch will be needed to avoid manual intervention
16:21:50 * frankbrockners side question: does vPing report back to the Nova console now - or how do we detect "success"?
16:21:57 yes
16:21:59 <[1]JonasB> #info The ODL - OS HA proxy issue is a little bit more open-ended than we first thought / would have hoped. Working on it
16:22:02 with console output
16:22:10 so, no fancy stuff with shutting down anything
16:22:18 thanks jose_lausuch
16:22:33 <[1]JonasB> That's all I have, I think
16:22:44 <[1]JonasB> So one good and one bad thing.
16:22:51 np
16:23:26 [1]JonasB: do we have a better understanding now of the ODL - OS HA issue you see? (per the discussion yesterday). Perhaps we can find some helping hands - in case we have a description of the issue
16:24:39 <[1]JonasB> #info We know what the issue is; it's quite related to Fuel. But the solution is time consuming, and maybe not in the direction we would like to go.
16:25:34 Is this a potential exclude from Arno?
16:25:42 <[1]JonasB> #info We have a prototype doing things in other ways which seems to work, but too much change to dare throwing it in now!
16:26:11 <[1]JonasB> frankbrockners: It might be that we need to push this to SR1
16:26:17 <[1]JonasB> :-(
16:26:42 ok - as long as we document things properly so that expectations are set - I don't see an issue
16:27:14 anything else on POD1/Fuel?
16:27:16 <[1]JonasB> Can we plan for an SR1 late June?
16:27:20 * ChrisPriceAB jira ticket and documented limitations, within reason
16:27:23 <[1]JonasB> No, nothing else
16:27:37 thanks Jonas. Let's deal with SR1 once we have R1 :-)
16:27:58 Let's know Arno tech debt and plan SR then
16:28:02 #info updates on POD2 / Foreman-Quickstack deploy tool
16:28:09 Ack frank
16:29:06 morgan_orange - quick update on ISO deployment experiences?
16:29:08 #info installation of the Foreman ISO on Orange POD1 OK
16:29:15 <[1]JonasB> Well, as for SR1 we need to plan vacations to accommodate SR1
16:29:25 :)
16:29:38 #info BUT installation failed because of network card order
16:30:02 Can we start with nightly deploy on POD2? (pushing, pushing)
16:30:06 #info Tim proposed to use a branch of his GitHub to fix the issue
16:30:37 and test as well
16:31:38 trozet: any news on autodeploy and test plans for POD2?
16:32:56 when I asked this a couple of days ago, I was told that functest is working on POD2
16:33:00 is this still the case?
16:33:08 yes
16:33:17 so still hands off
16:33:18 the last results come from tests on POD 2
16:33:26 #info working on the patch for external network creation
16:33:49 #info hoping to be able to finish testing on the Intel POD and get a Gerrit review tomorrow
16:33:50 sorry folks, I need to leave now. see you
16:34:05 * frankbrockners thanks jose
16:34:35 that's it from me on LF POD2
16:34:45 * frankbrockners thanks tim
16:35:45 Question: autodeploy and test on POD2 would just be a matter of flipping a switch, right? I.e. things are ready to go once functest is done on POD2, correct?
16:36:05 Foreman build and deploy are ready
16:36:16 the standalone functests were run before as well
16:36:32 #info Foreman build and deploy are ready (per fdegir)
16:36:48 #info standalone functests were run before as well (per fdegir)
16:37:06 #info but morgan_orange's latest smoke jobs weren't run yet (functest-opnfv-jump-2, that is)
16:37:06 #info waiting on green light from functest to start autodeploy of POD2
16:37:28 ok go
16:37:33 go?
16:37:49 green light?
16:37:51 you want to redeploy POD 2?
16:38:05 phladky was using it
16:38:13 not just redeploy
16:38:19 but continuously build-deploy-test
16:38:22 every night
16:38:22 can you please check with him
16:38:45 ok
16:39:42 #info fdegir to check with morgan_orange, trozet and phladky before doing anything
16:40:02 #info on POD2 to enable nightly autodeploy
16:40:18 so if I summarize things: in a nutshell we seem to be really close - pending resolution of some tests, vPing issues, etc. Hoping for a green light from functest on Monday for autodeploy on PODs 1 _and_ 2.
16:40:19 ok for me if pbandzi is OK
16:40:29 ok for me too
16:40:29 I can re-run tests on a fresh install to finalize testing
16:40:34 #info I don't know if we want to enable nightly deploy yet, do we? Aren't folks still debugging functest issues in real time during the day?
16:40:50 we know that tests will fail, but we can already test the full sequence build/deploy/test
16:41:02 when is night?
16:41:15 * frankbrockners that is a good one!
16:41:19 around 00:00 UTC
16:41:52 is that workable for functest - to have a fresh deploy every day?
16:41:53 #info Nightly runs are triggered around 00:00 UTC
16:42:24 peter, jose and myself are in Europe, so 00:00 UTC = 22:00 CET
16:42:27 that is ok for me
16:42:34 oops, 2 AM
16:42:37 even better
16:42:44 I usually sleep at 2..
16:42:46 where is phladky located?
16:42:56 Slovakia
16:43:06 ok, then everyone should be sleeping
16:43:26 the ones that have the possibility to work on POD2
16:43:48 so...
do we want to decide to switch to nightly autodeploy runs on POD2 from today onwards?
16:43:58 * fdegir yes please
16:44:23 everyone ok?
16:44:41 trozet, pbandzi, morgan_orange?
16:44:45 ok
16:44:54 ok fdegir, we did not test the full sequence manually, but it is the chaining of the ones we tested
16:45:03 I won't be sleeping
16:45:06 but it's ok with me :)
16:45:19 we never tested the full chain unattended
16:45:27 everything was tested one by one
16:45:38 and we observed
16:45:45 ok, a little step for humanity, a great step for CI
16:45:51 :D
16:45:52 Conclusion on nightly deploy?
16:46:47 sounds like the key folks gave their ok
16:46:48 good to go, I think
16:46:55 I think we have a go for nightly build/deploy/test on POD 2 from tonight
16:47:03 thanks!
16:47:12 * ChrisPriceAB woot, can we info that?
16:47:19 #agree Nightly (00:00 UTC) autodeploys on POD2 from now onwards
16:47:29 #info As a side note, if there is no change in the genesis repo, nightly runs are skipped by default
16:47:33 Not POD1?
16:47:35 #info This can be adjusted later on
16:47:45 jose_lausuch has taken over POD1
16:47:56 I think so, at least leading up to Arno
16:48:00 hands off until he gives the green light
16:48:16 Jose, we are in your hands...
16:48:31 we agreed to switch to autodeploy on POD1 likely by Monday
16:48:42 see the earlier conversation
16:48:42 :) thanks
16:48:48 Jose wants to run manual tests first
16:48:56 I saw, wanted it from you ;)
16:49:00 which does make a lot of sense... - see POD2 experiences
16:49:23 ok... looks like we're done for today
16:49:29 may I take 2 minutes?
16:49:31 I have 2 very short artifact/ISO related info/questions
16:49:41 ChrisPriceAB: do you need any additional info for your TSC meeting tomorrow?
16:49:54 fdegir... shoot
16:50:00 #info Artifact/ISO naming: we currently store ISOs with build numbers: opnfv-23.iso for example.
16:50:03 Excellent guys!!!
I think I have enough, thanks frankbrockners
16:50:17 #info this will be changed to a date-timestamp: opnfv-2015-05-13_02-01-35.iso for example.
16:50:34 second question is
16:50:44 #info Artifact/ISO retention policy: we can perhaps remove all the ISOs but the latest 7(?). (Released ISO(s) will be excluded from automatic removal.)
16:50:58 any input/objection?
16:51:13 fdegir - is there a way to have a stable link to the latest, like opnfv-latest.iso, which always links to the latest ISO build?
16:51:26 Ok, but how to differentiate each deploy option?
16:51:29 we already have that - indirectly
16:51:31 <[1]JonasB> Why date instead of number? A lot harder to communicate/refer to
16:51:47 we store a latest.properties file
16:51:56 which contains info about the latest artifact
16:52:12 including the sha1 that is used for building, link to build log, link to ISO and so on
16:52:37 example: http://artifacts.opnfv.org/genesis/foreman/latest.properties
16:52:42 ok - that works
16:53:01 and +1 for removing old builds
16:53:09 +1
16:53:09 re [1]JonasB: that is due to a change in job structuring
16:53:21 if we create a new job to build, say, fuel
16:53:25 the build number is reset to 1
16:53:52 there are ways to solve it
16:53:58 have to leave
16:54:11 but then it will depend on somewhere to get the number from
16:54:13 * frankbrockners thanks morgan_orange
16:54:33 the date-timestamp is independent from where the build is done and by which job
16:54:47 agree with fdegir
16:54:55 also in the future
16:55:03 a timestamp would allow us to build in multiple places independently
16:55:03 we might have parallel builds done after merge to master
16:55:06 race conditions
16:55:15 and not require sequence number sync
16:55:15 etc etc
16:55:54 and we will keep only the latest 7 builds during the development phase
16:55:58 so there won't be hard
16:56:00 so fdegir has 2 "+1"s for restricting history to 7 builds...
16:56:06 released ones will be stored differently
16:56:10 with tag/label
16:56:18 any additional opinions
16:56:19 ?
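The latest.properties mechanism fdegir describes at 16:51:47-16:52:37 (a KEY=value file pointing at the newest artifact) can be sketched roughly as follows. This is a minimal illustration only: the key names OPNFV_GIT_SHA1 and OPNFV_ARTIFACT_URL are assumptions, and the real file at artifacts.opnfv.org may use different keys.

```shell
#!/bin/sh
# Sketch: read fields out of a latest.properties-style file (KEY=value lines).
# Key names below are illustrative assumptions, not the actual CI keys.

# Print the value of key $2 from properties file $1.
prop() {
    grep "^$2=" "$1" | head -n 1 | cut -d'=' -f2-
}

# Demonstrate on a sample file shaped like the one described above.
f=$(mktemp)
cat > "$f" <<'EOF'
OPNFV_GIT_SHA1=1a2b3c4d5e6f
OPNFV_ARTIFACT_URL=http://artifacts.opnfv.org/genesis/foreman/opnfv-2015-05-13_02-01-35.iso
EOF

echo "ISO: $(prop "$f" OPNFV_ARTIFACT_URL)"
```

A consumer would fetch the real file (e.g. with curl) and extract the artifact URL the same way, which is how a stable "latest" link works without an opnfv-latest.iso symlink.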
16:57:15 fdegir... sounds like we should go ahead with 7 builds...
16:57:30 ok
16:57:32 ... we're reaching the top of the hour...
16:57:36 anything else?
16:57:39 I need a decision regarding build numbering for tonight
16:57:52 I'm ok with UTC timestamp
16:58:02 since the restructured Foreman jobs will be enabled, resetting to 1
16:58:03 [1]JonasB: ?
16:58:07 or a messy workaround
16:58:24 <[1]JonasB> Then we're relying on everything being NTP-synced correctly, and not built the same second, hmm
16:58:48 yes...
16:58:59 let's do the pragmatic thing for now
16:59:17 if we run into issues - we can fix those as they appear.. ok?
16:59:19 this can be changed later if it doesn't work
17:00:14 can we agree on fdegir's pragmatic UTC-based numbering for now?
17:00:34 * frankbrockners need to run... let's close on this..
17:00:46 ok
17:00:50 <[1]JonasB> Well, I guess you're forcing me to
17:00:52 <[1]JonasB> +1
17:00:56 nope
17:01:01 I can fix it in an ugly way
17:01:12 <[1]JonasB> No, let's go with it
17:01:15 ok
17:01:16 thanks
17:01:19 thanks Jonas
17:01:49 #agree ISO builds will use UTC timestamp for numbering purposes - at least for now
17:01:57 looks like we're done...
17:02:05 thanks - and see you tomorrow
17:02:09 <[1]JonasB> Thx
17:02:09 #endmeeting
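The two decisions recorded in the log above - UTC date-timestamp ISO names like opnfv-2015-05-13_02-01-35.iso, and keeping only the latest 7 builds - can be sketched as below. This is a sketch under assumptions, not the actual Jenkins job configuration: it assumes a flat directory of opnfv-*.iso files and does not implement the exclusion of released ISOs, which the meeting says will be stored separately with a tag/label.

```shell
#!/bin/sh
# Sketch of the agreed naming and retention scheme (illustrative only).

# 1) UTC date-timestamp name for a new ISO build, per the #agree above.
#    Timestamped names also sort chronologically, which makes pruning easy.
iso_name="opnfv-$(date -u +%Y-%m-%d_%H-%M-%S).iso"
echo "$iso_name"

# 2) Retention: delete all but the 7 newest opnfv-*.iso files in directory $1.
#    sort -r puts the newest first; tail -n +8 yields everything past the 7th.
prune_isos() {
    ls "$1"/opnfv-*.iso 2>/dev/null | sort -r | tail -n +8 |
    while read -r old; do
        rm -f "$old"
    done
}
```

Because the names encode a UTC instant, builds produced by different jobs or on different machines never collide on a shared sequence counter - the rationale fdegir gives in the log - at the cost of relying on NTP sync, as [1]JonasB notes.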