2017-02-10  CSIT-518: Add testpmd numa awareness  (Tibor Frank, 12 files, -1878/+259)
    Change-Id: I982834de1fbe71cf5303808ea58d4b58e530ffcb
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-02-09  Enable to pass vpp pacakges to VIRL in case of vpp patch  (Jan Gelety, 1 file, -2/+2)
    Change-Id: Ibab23153b698470e640530c44d95ca6f3c4898b2
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-02-09  Upgrade T-rex to newer version  (Peter Mikus, 5 files, -6/+6)
    Change-Id: I8b918a3c1d8109fb64bfdeec8e5c9afe45a86d21
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-02-08  csit rls1701 report nits and updates  (Maciek Konstantynowicz, 15 files, -30/+72)
    - Completed paragraph for each thput and latency graph in report
      describing: graphs title, x-axis, y-axis, legend.
    Change-Id: Ia45e2da2623b909123f6b3d4abe65e6caf72e058
    Signed-off-by: Maciek Konstantynowicz <mkonstan@cisco.com>

2017-02-07  Update of VPP_STABLE_VER  (Jan Gelety, 1 file, -1/+1)
    - use new vpp ref build: 17.04-rc0~200-g460bc63~b1824_amd64
    - build tested by semiweekly job:
      https://jenkins.fd.io/job/csit-vpp-verify-master-semiweekly/1080/
    Change-Id: I5a17095bccf1e8b11dc2a0f52cfd27a8621f6f74
    Signed-off-by: Jan Gelety <jgelety@cisco.com>
2017-02-07  CSIT-517: DPDK initialization and teardown  (Tibor Frank, 4 files, -42/+68)
    Change-Id: Iff42549e3be610c88b7a7d5518ef2cbb88c75ed2
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-02-07  CSIT doc autogen: Add media wiki format  (Tibor Frank, 1 file, -8/+54)
    Change-Id: I4231cf4f056b2370e0a7a7f8a7267859bbeccd6f
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-02-07  Add perf tests for DPDK L2XC  (Tibor Frank, 4 files, -0/+1648)
    - 10ge2p1vic1227-eth-l2xcbase-ndrdisc
    - 10ge2p1x710-eth-l2xcbase-ndrdisc
    - 40ge2p1vic1385-eth-l2xcbase-ndrdisc
    - 40ge2p1x1710-eth-l2xcbase-ndrdisc
    Change-Id: I1bd125c5d384fbece47d885a950226990e801837
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-02-06  Fix: Add apt-get update for report-gen  (Peter Mikus, 1 file, -1/+2)
    Change-Id: Id98c370be0a74b16667dd3b43fc44e36c286fdf0
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-02-06  csit rls1701 report edits  (Maciek Konstantynowicz, 76 files, -915/+1351)
    - simplified section structure for clarity and readability,
    - updated overview sections,
    - moved not rls related content from rls_notes to overview sections,
    - removed section title suffixes: HW, VIRL.
    - completed vpp_unit_tests and vpp_unittest_results sections.
    - updated all documentation sections.
    - updated rls_notes sections for vpp performance and vpp functional.
    Change-Id: Id2c2abbf9d3531ec9f63ecd353f385a0b55ae1ba
    Signed-off-by: Maciek Konstantynowicz <mkonstan@cisco.com>
    Signed-off-by: pmikus <pmikus@cisco.com>
2017-02-03  Revert "Fix: CSIT 1701 report files and script AD1"  (Peter Mikus, 1 file, -1/+1)
    This reverts commit 8e3e798d302bef12d490c2963056bbcceedb8b13.
    Change-Id: I90c40e51b2f0e364ca33fb92fa36f8cae3e5829c
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-02-03  Fix: CSIT 1701 report files and script AD1  (Peter Mikus, 1 file, -1/+1)
    fix libxslt-dev package
    Change-Id: I9ee8f23faa3a30a4e5894045b8375f983d05adc1
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-02-03  CSIT 1701 report files and script AD1  (pmikus, 26 files, -2209/+2504)
    CSIT 1701 report files and script addendum 1.
    Edits to correct and align all Overview sub-sections.
    Updates in Performance CSIT Release Notes - added more NDR and PDR
    performance changes.
    Change-Id: I52b6ee89e9c536fb4ab9d30dc27cca8dbdd88a20
    Signed-off-by: pmikus <pmikus@cisco.com>
    Signed-off-by: Maciek Konstantynowicz <mkonstan@cisco.com>

2017-02-03  Fix: Timeout during VPP installation  (Peter Mikus, 1 file, -2/+2)
    Increase timeout during installation of VPP packages to 60s.
    Change-Id: I9384564a45951bbfb648c99f25d8de70b79ab783
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-02-03  CSIT-516: Add keywords for 2-node topology  (Tibor Frank, 3 files, -23/+156)
    Change-Id: Ib5a1e207f1dec99747329a755c3c365fef4bd64c
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-02-02  Mark ipv4 ipfix test as non-critical  (Matej Klotton, 2 files, -2/+2)
    Change-Id: I8998cbc08979340a680d872fab48bfeca8091365
    Signed-off-by: Matej Klotton <mklotton@cisco.com>

2017-02-01  CSIT 1701 report files and script  (pmikus, 61 files, -0/+16068)
    Add RST source files and script to generate CSIT 1701 report
    Change-Id: I4345564547270ba10c64d6beebf2c2b5a83de459
    Signed-off-by: pmikus <pmikus@cisco.com>
    Signed-off-by: Maciek Konstantynowicz <mkonstan@cisco.com>
2017-01-31  CSIT doc: Add chapter numbering  (Tibor Frank, 5 files, -76/+72)
    Change-Id: I5cdcda0461e6f4bdabd91ea440edf7a71db9afd9
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-30  HC Test: Fix jvpp workaround for 17.04  (selias, 1 file, -2/+2)
    - original commit at https://gerrit.fd.io/r/4683
    Change-Id: I0b2d77f66830fbbc2d27e9015ec67ea9cd9a9885
    Signed-off-by: selias <samelias@cisco.com>

2017-01-30  Add exception processing to output.xml parser  (Tibor Frank, 2 files, -23/+43)
    Change-Id: Ie7d7a004ae540233efcb3c3114d46d40d5d1f07d
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-30  Update of VPP_STABLE_VER  (Jan Gelety, 1 file, -1/+1)
    - use new vpp ref build: 17.04-rc0~175-gd9aad29~b1798_amd64
    - build tested by semiweekly job:
      https://jenkins.fd.io/job/csit-vpp-verify-master-semiweekly/1078/
    Change-Id: Ib51e4fc98edf463ea440dbeb655956382e11b5c6
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-01-27  CSIT-506: HC Test - Update and cleanup all suites  (selias, 25 files, -124/+155)
    - remove EXPECTED_FAILING tag from IPv4 neighbor, IPv6 address, MTU
    - fix bridge domain removal with interfaces assigned, it should fail
    - fix teardown of L2 FIB suite (bridge domain removal, see above)
    - disable vhost-user "server" test cases (VPP bug)
    - fix keyword verifying sub-interface state
    - update ACL test data (yang model changes)
    - remove EXPECTED_FAILING tag from ACL table removal test
    - update Jira IDs and comments in failing Lisp test case
    - remove EXPECTED_FAILING tag from Lisp removal test case
    - use vhost-user "client" instead of "server" in persistence tests
    Change-Id: I32eafb6013b4512090c0d9365e10c61029179d49
    Signed-off-by: selias <samelias@cisco.com>

2017-01-27  HC Test: Workaround for 17.04 jvpp version mismatch  (selias, 5 files, -48/+60)
    - add workaround for mismatched jvpp versions in deb packages
    - disable NSH_SFC madule and test suite, it depends on outdated jvpp
    - cleanup package download script used in hc2vpp-csit-verify job
    Change-Id: I98526baa4de08bbbab2339c9e81f49cd189b57ac
    Signed-off-by: selias <samelias@cisco.com>
2017-01-27  CSIT-507: Add perf tests for Cisco VIC-1385 L2BD  (Miroslav Miklus, 1 file, -0/+348)
    - 40ge2p1vic1385-eth-l2bdbasemaclrn-ndrdisc
    Change-Id: I8e1f49b5d41a81544b8a03dd2d6be98c97b94d28
    Signed-off-by: Miroslav Miklus <mmiklus@cisco.com>

2017-01-27  CSIT-508: Add perf tests for Cisco VIC-1227 L2BD  (Miroslav Miklus, 2 files, -0/+294)
    - 10ge2p1vic1227-eth-l2bdbasemaclrn-ndrdisc
    Change-Id: Ib0e99f8160ca951b58b0244de88fa587e61bd941
    Signed-off-by: Miroslav Miklus <mmiklus@cisco.com>

2017-01-27  Add Centos specific bootstrap files.  (Thomas F Herbert, 8 files, -366/+1215)
    Add download and install script for rpms.
    Add topology virl file for Centos.
    Change VPP repo urls for centos.
    JIRA: CSIT-356
    Change-Id: I3a0a88958a712d1b652f19c76e5e1b019796d0ae
    Signed-off-by: Thomas F Herbert <therbert@redhat.com>

2017-01-26  Fix: Display of PDR latency  (pmikus, 1 file, -3/+7)
    Fix the string in PDR latency output
    Change-Id: I7f244e29015da4e71485e88f2988efa73b6f4853
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-26  CSIT-511: Add script to generate report with TC results  (Tibor Frank, 1 file, -100/+286)
    Change-Id: Iee1c3310e445487bb216c4e9c6a3bc7ee7879788
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-26  CSIT-512: Add x710, xl710 l2bd vhost tests  (Tibor Frank, 3 files, -24/+1389)
    - 10ge2p1x710-eth-l2bdbasemaclrn-eth-2vhost-1vm-ndrdisc
    - 40ge2p1xl710-eth-l2bdbasemaclrn-eth-2vhost-1vm-ndrdisc
    Change-Id: I99a70c8fcdfc0fffda96606033ba8752d6c07952
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-26  csit-func-hc naming change  (selias, 18 files, -86/+86)
    - change test suite filenames
    - add numbering to all test cases
    more details at https://wiki.fd.io/view/CSIT/csit-perf-tc-naming-change
    Change-Id: I58e6c60f750c07e99c6949d8fe2510780fa9007a
    Signed-off-by: selias <samelias@cisco.com>
2017-01-26  CSIT doc gen: remove "package" and change "module" to "suite"  (Tibor Frank, 1 file, -5/+5)
    Change-Id: I36664eb4c1ade7c2c96df457ed939f8681ba2dce
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-25  Centos 7.3 image Version 1.2 Add dhcp client to VIRL base image  (Thomas F Herbert, 4 files, -1/+70)
    Change-Id: Ibedc2002f26824cc763eb2ff62de09d5f262e0c3
    Signed-off-by: Thomas F Herbert <therbert@redhat.com>

2017-01-25  Remove _base from test suite dir names  (Jan Gelety, 22 files, -1/+1)
    Change-Id: I2e495f99a88dedc47f64efcc14722bb629a25f02
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-01-24  Correction of dpdk packages download when vpp deb packages provided  (Jan Gelety, 3 files, -10/+17)
    Change-Id: I7cefd0797103e5062eb48df95ad2e48cdddc19b6
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-01-24  csit-func-tc-naming-change - phase 1  (Jan Gelety, 96 files, -3396/+5495)
    - change of ts directories
    - change of ts file names
    - splitting of former files to more files when suitable
    - more details: https://wiki.fd.io/view/CSIT/csit-perf-tc-naming-change
    Change-Id: Ifda1038f8323735f86c1be7ba7f93e3fda183618
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-01-24  CSIT-509: Add perf tests for vlan + l2 + vhost  (Tibor Frank, 4 files, -3/+1493)
    Change-Id: I65e716b51dd35092c10574ac4580ee4b8bd3b24b
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-23  Update of VPP_STABLE_VER  (Jan Gelety, 5 files, -7/+19)
    - use new vpp ref build: 17.04-rc0~134-g2ce7f98~b1759_amd64
    - build tested by semiweekly job:
      https://jenkins.fd.io/job/csit-vpp-verify-master-semiweekly/1076/
    - use new path for DPDK packages
    Change-Id: I1001ee3a22817f97a60b3a6555e3026d2b153913
    Signed-off-by: Jan Gelety <jgelety@cisco.com>
2017-01-20  Mark VXLAN test as non critical  (Matej Klotton, 1 file, -0/+1)
    Change-Id: I1a788de4106d7e71d8cad6b68759b9a01d21bb4a
    Signed-off-by: Matej Klotton <mklotton@cisco.com>

2017-01-19  Fix: Adjust NDRCHK threshold values  (pmikus, 2 files, -12/+12)
    Adjust threshold values in NDRCHK tests based on latest data.
    Change-Id: I10f39b9a1a071bf0b38c91f904361ad7f54ccc92
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-19  Add unified latency output for PDRDISC and NDRDISC  (pmikus, 1 file, -5/+7)
    Add same latency formatting for PDR as for NDR search results.
    This will help us to parse the data in same way.
    Change-Id: I3be79cc7623f8c9d39fd44babd252ceec58c114b
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-18  Renaming of all perf TCs.  (Tibor Frank, 55 files, -955/+991)
    Change-Id: Id71abf4f52d7fc555ed9c3ec19563d54f5db9ec5
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-17  Update: robot parser scripts  (pmikus, 2 files, -13/+25)
    Update robot parser script due to new naming and TAG structure
    Change-Id: I963de747d2c88dadcb2d3fa48747fb2ef1403294
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-17  Update of VPP_STABLE_VER  (Jan Gelety, 1 file, -1/+1)
    - use new vpp ref build: 17.04-rc0~105-g5a3a6c0~b1730_amd64
    - build tested by semiweekly job:
      https://jenkins.fd.io/job/csit-vpp-verify-master-semiweekly/1074/
    Change-Id: Ied08466e0aafee1913697873596a43d416e2cc65
    Signed-off-by: Jan Gelety <jgelety@cisco.com>

2017-01-17  VIRL test: Replace IP probe for VXLAN test  (Matej Klotton, 3 files, -2/+12)
    Change-Id: Ic16f91beabdc2ac2e19ccc65c04790d36c15d477
    Signed-off-by: Matej Klotton <mklotton@cisco.com>

2017-01-13  VIRL test: VXLAN-L2BD-vhost_user test (CSIT-500)  (Jan Gelety, 3 files, -27/+185)
    - Create test cases to test VXLAN+L2BD+vhost_user IPv4 and IPv6 scenarios
    Change-Id: I46d27f22ec2e2e35fd5067ba7eeda9a2ccff9f84
    Signed-off-by: Jan Gelety <jgelety@cisco.com>
2017-01-13  Tags for tests with IPSec.  (Tibor Frank, 1 file, -0/+24)
    Change-Id: Ic06bd15de44303f99133fa0df83319b5568b64dd
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-12  Update: plot script for new tags  (pmikus, 1 file, -1/+1)
    Update plot script based on the change in TC TAGS
    Change-Id: I9412e38d4306b69c75b16943b64b987c2553c167
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-12  CSIT-501: Perf tests re-tagging  (Tibor Frank, 55 files, -905/+865)
    - Implement tags introduced by https://gerrit.fd.io/r/#/c/3856/
    Change-Id: I59313f43f338e2b4a1ad5caf9521b1a0da84d1d3
    Signed-off-by: Tibor Frank <tifrank@cisco.com>

2017-01-12  Clean up bootstrap-verify-perf file  (pmikus, 1 file, -57/+2)
    Remove obsolete code from performance bootstrap file
    Change-Id: I8486c000ee4049b390017a1bf96459ec9fc828ea
    Signed-off-by: pmikus <pmikus@cisco.com>

2017-01-11  Add the DPDK l2fwd performance test cases.  (Fangyin Hu, 13 files, -33/+1093)
    Change-Id: I996847a4871ed994cd9b5edb459fb079ff39c86d
    Signed-off-by: Fangyin Hu <fangyinx.hu@intel.com>
        }
        "Suite name N": {
            "doc": "Suite N documentation",
            "parent": "Suite 2 parent",
            "level": "Level of the suite in the suite hierarchy"
        }
    }
    "tests": {
        "ID": {
            "name": "Test name",
            "parent": "Name of the parent of the test",
            "doc": "Test documentation"
            "msg": "Test message"
            "tags": ["tag 1", "tag 2", "tag n"],
            "type": "PDR" | "NDR",
            "throughput": {
                "value": int,
                "unit": "pps" | "bps" | "percentage"
            },
            "latency": {
                "direction1": {
                    "100": {
                        "min": int,
                        "avg": int,
                        "max": int
                    },
                    "50": {  # Only for NDR
                        "min": int,
                        "avg": int,
                        "max": int
                    },
                    "10": {  # Only for NDR
                        "min": int,
                        "avg": int,
                        "max": int
                    }
                },
                "direction2": {
                    "100": {
                        "min": int,
                        "avg": int,
                        "max": int
                    },
                    "50": {  # Only for NDR
                        "min": int,
                        "avg": int,
                        "max": int
                    },
                    "10": {  # Only for NDR
                        "min": int,
                        "avg": int,
                        "max": int
                    }
                }
            },
            "lossTolerance": "lossTolerance",  # Only for PDR
            "vat-history": "DUT1 and DUT2 VAT History"
        },
        "show-run": "Show Run"
        },
        "ID" {
            # next test
        }
    }
    }

    Functional tests:

    {
        "metadata": {  # Optional
            "version": "VPP version",
            "job": "Jenkins job name",
            "build": "Information about the build"
        },
        "suites": {
            "Suite name 1": {
                "doc": "Suite 1 documentation",
                "parent": "Suite 1 parent",
                "level": "Level of the suite in the suite hierarchy"
            }
            "Suite name N": {
                "doc": "Suite N documentation",
                "parent": "Suite 2 parent",
                "level": "Level of the suite in the suite hierarchy"
            }
        }
        "tests": {
            "ID": {
                "name": "Test name",
                "parent": "Name of the parent of the test",
                "doc": "Test documentation"
                "msg": "Test message"
                "tags": ["tag 1", "tag 2", "tag n"],
                "vat-history": "DUT1 and DUT2 VAT History"
                "show-run": "Show Run"
                "status": "PASS" | "FAIL"
            },
            "ID" {
                # next test
            }
        }
    }

    .. note:: ID is the lowercase full path to the test.
""" REGEX_RATE = re.compile(r'^[\D\d]*FINAL_RATE:\s(\d+\.\d+)\s(\w+)') REGEX_LAT_NDR = re.compile(r'^[\D\d]*' r'LAT_\d+%NDR:\s\[\'(-?\d+\/-?\d+/-?\d+)\',' r'\s\'(-?\d+/-?\d+/-?\d+)\'\]\s\n' r'LAT_\d+%NDR:\s\[\'(-?\d+/-?\d+/-?\d+)\',' r'\s\'(-?\d+/-?\d+/-?\d+)\'\]\s\n' r'LAT_\d+%NDR:\s\[\'(-?\d+/-?\d+/-?\d+)\',' r'\s\'(-?\d+/-?\d+/-?\d+)\'\]') REGEX_LAT_PDR = re.compile(r'^[\D\d]*' r'LAT_\d+%PDR:\s\[\'(-?\d+/-?\d+/-?\d+)\',' r'\s\'(-?\d+/-?\d+/-?\d+)\'\][\D\d]*') REGEX_TOLERANCE = re.compile(r'^[\D\d]*LOSS_ACCEPTANCE:\s(\d*\.\d*)\s' r'[\D\d]*') REGEX_VERSION = re.compile(r"(return STDOUT Version:\s*)(.*)") REGEX_TCP = re.compile(r'Total\s(rps|cps|throughput):\s([0-9]*).*$') REGEX_MRR = re.compile(r'MaxReceivedRate_Results\s\[pkts/(\d*)sec\]:\s' r'tx\s(\d*),\srx\s(\d*)') def __init__(self, **metadata): """Initialisation. :param metadata: Key-value pairs to be included in "metadata" part of JSON structure. :type metadata: dict """ # Type of message to parse out from the test messages self._msg_type = None # VPP version self._version = None # Number of VAT History messages found: # 0 - no message # 1 - VAT History of DUT1 # 2 - VAT History of DUT2 self._lookup_kw_nr = 0 self._vat_history_lookup_nr = 0 # Number of Show Running messages found # 0 - no message # 1 - Show run message found self._show_run_lookup_nr = 0 # Test ID of currently processed test- the lowercase full path to the # test self._test_ID = None # The main data structure self._data = { "metadata": OrderedDict(), "suites": OrderedDict(), "tests": OrderedDict() } # Save the provided metadata for key, val in metadata.items(): self._data["metadata"][key] = val # Dictionary defining the methods used to parse different types of # messages self.parse_msg = { "setup-version": self._get_version, "teardown-vat-history": self._get_vat_history, "test-show-runtime": self._get_show_run } @property def data(self): """Getter - Data parsed from the XML file. :returns: Data parsed from the XML file. 
        :rtype: dict
        """
        return self._data

    def _get_version(self, msg):
        """Called when extraction of VPP version is required.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
        """

        if msg.message.count("return STDOUT Version:"):
            self._version = str(re.search(self.REGEX_VERSION, msg.message).
                                group(2))
            self._data["metadata"]["version"] = self._version
            self._msg_type = None

            logging.debug("    VPP version: {0}".format(self._version))

    def _get_vat_history(self, msg):
        """Called when extraction of VAT command history is required.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
        """

        if msg.message.count("VAT command history:"):
            self._vat_history_lookup_nr += 1
            if self._vat_history_lookup_nr == 1:
                self._data["tests"][self._test_ID]["vat-history"] = str()
            else:
                self._msg_type = None
            text = re.sub("[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3} "
                          "VAT command history:", "", msg.message, count=1). \
                replace("\n\n", "\n").replace('\n', ' |br| ').\
                replace('\r', '').replace('"', "'")
            self._data["tests"][self._test_ID]["vat-history"] += " |br| "
            self._data["tests"][self._test_ID]["vat-history"] += \
                "**DUT" + str(self._vat_history_lookup_nr) + ":** " + text

    def _get_show_run(self, msg):
        """Called when extraction of VPP operational data (output of CLI
        command Show Runtime) is required.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
""" if msg.message.count("return STDOUT Thread "): self._show_run_lookup_nr += 1 if self._lookup_kw_nr == 1 and self._show_run_lookup_nr == 1: self._data["tests"][self._test_ID]["show-run"] = str() if self._lookup_kw_nr > 1: self._msg_type = None if self._show_run_lookup_nr == 1: text = msg.message.replace("vat# ", "").\ replace("return STDOUT ", "").replace("\n\n", "\n").\ replace('\n', ' |br| ').\ replace('\r', '').replace('"', "'") try: self._data["tests"][self._test_ID]["show-run"] += " |br| " self._data["tests"][self._test_ID]["show-run"] += \ "**DUT" + str(self._lookup_kw_nr) + ":** |br| " + text except KeyError: pass def _get_latency(self, msg, test_type): """Get the latency data from the test message. :param msg: Message to be parsed. :param test_type: Type of the test - NDR or PDR. :type msg: str :type test_type: str :returns: Latencies parsed from the message. :rtype: dict """ if test_type == "NDR": groups = re.search(self.REGEX_LAT_NDR, msg) groups_range = range(1, 7) elif test_type == "PDR": groups = re.search(self.REGEX_LAT_PDR, msg) groups_range = range(1, 3) else: return {} latencies = list() for idx in groups_range: try: lat = [int(item) for item in str(groups.group(idx)).split('/')] except (AttributeError, ValueError): lat = [-1, -1, -1] latencies.append(lat) keys = ("min", "avg", "max") latency = { "direction1": { }, "direction2": { } } latency["direction1"]["100"] = dict(zip(keys, latencies[0])) latency["direction2"]["100"] = dict(zip(keys, latencies[1])) if test_type == "NDR": latency["direction1"]["50"] = dict(zip(keys, latencies[2])) latency["direction2"]["50"] = dict(zip(keys, latencies[3])) latency["direction1"]["10"] = dict(zip(keys, latencies[4])) latency["direction2"]["10"] = dict(zip(keys, latencies[5])) return latency def visit_suite(self, suite): """Implements traversing through the suite and its direct children. :param suite: Suite to process. :type suite: Suite :returns: Nothing. 
""" if self.start_suite(suite) is not False: suite.suites.visit(self) suite.tests.visit(self) self.end_suite(suite) def start_suite(self, suite): """Called when suite starts. :param suite: Suite to process. :type suite: Suite :returns: Nothing. """ try: parent_name = suite.parent.name except AttributeError: return doc_str = suite.doc.replace('"', "'").replace('\n', ' ').\ replace('\r', '').replace('*[', ' |br| *[').replace("*", "**") doc_str = replace(doc_str, ' |br| *[', '*[', maxreplace=1) self._data["suites"][suite.longname.lower().replace('"', "'"). replace(" ", "_")] = { "name": suite.name.lower(), "doc": doc_str, "parent": parent_name, "level": len(suite.longname.split(".")) } suite.keywords.visit(self) def end_suite(self, suite): """Called when suite ends. :param suite: Suite to process. :type suite: Suite :returns: Nothing. """ pass def visit_test(self, test): """Implements traversing through the test. :param test: Test to process. :type test: Test :returns: Nothing. """ if self.start_test(test) is not False: test.keywords.visit(self) self.end_test(test) def start_test(self, test): """Called when test starts. :param test: Test to process. :type test: Test :returns: Nothing. """ tags = [str(tag) for tag in test.tags] test_result = dict() test_result["name"] = test.name.lower() test_result["parent"] = test.parent.name.lower() test_result["tags"] = tags doc_str = test.doc.replace('"', "'").replace('\n', ' '). \ replace('\r', '').replace('[', ' |br| [') test_result["doc"] = replace(doc_str, ' |br| [', '[', maxreplace=1) test_result["msg"] = test.message.replace('\n', ' |br| '). 
            replace('\r', '').replace('"', "'")

        if test.status == "PASS" and ("NDRPDRDISC" in tags or
                                      "TCP" in tags or
                                      "MRR" in tags):
            if "NDRDISC" in tags:
                test_type = "NDR"
            elif "PDRDISC" in tags:
                test_type = "PDR"
            elif "TCP" in tags:
                test_type = "TCP"
            elif "MRR" in tags:
                test_type = "MRR"
            else:
                return

            test_result["type"] = test_type

            if test_type in ("NDR", "PDR"):
                try:
                    rate_value = str(re.search(
                        self.REGEX_RATE, test.message).group(1))
                except AttributeError:
                    rate_value = "-1"
                try:
                    rate_unit = str(re.search(
                        self.REGEX_RATE, test.message).group(2))
                except AttributeError:
                    rate_unit = "-1"

                test_result["throughput"] = dict()
                test_result["throughput"]["value"] = \
                    int(rate_value.split('.')[0])
                test_result["throughput"]["unit"] = rate_unit
                test_result["latency"] = \
                    self._get_latency(test.message, test_type)
                if test_type == "PDR":
                    test_result["lossTolerance"] = str(re.search(
                        self.REGEX_TOLERANCE, test.message).group(1))

            elif test_type in ("TCP", ):
                groups = re.search(self.REGEX_TCP, test.message)
                test_result["result"] = dict()
                test_result["result"]["value"] = int(groups.group(2))
                test_result["result"]["unit"] = groups.group(1)

            elif test_type in ("MRR", ):
                groups = re.search(self.REGEX_MRR, test.message)
                test_result["result"] = dict()
                test_result["result"]["duration"] = int(groups.group(1))
                test_result["result"]["tx"] = int(groups.group(2))
                test_result["result"]["rx"] = int(groups.group(3))
                test_result["result"]["throughput"] = int(
                    test_result["result"]["rx"] /
                    test_result["result"]["duration"])
        else:
            test_result["status"] = test.status

        self._test_ID = test.longname.lower()
        self._data["tests"][self._test_ID] = test_result

    def end_test(self, test):
        """Called when test ends.

        :param test: Test to process.
        :type test: Test
        :returns: Nothing.
        """
        pass

    def visit_keyword(self, keyword):
        """Implements traversing through the keyword and its child keywords.

        :param keyword: Keyword to process.
        :type keyword: Keyword
        :returns: Nothing.
""" if self.start_keyword(keyword) is not False: self.end_keyword(keyword) def start_keyword(self, keyword): """Called when keyword starts. Default implementation does nothing. :param keyword: Keyword to process. :type keyword: Keyword :returns: Nothing. """ try: if keyword.type == "setup": self.visit_setup_kw(keyword) elif keyword.type == "teardown": self._lookup_kw_nr = 0 self.visit_teardown_kw(keyword) else: self._lookup_kw_nr = 0 self.visit_test_kw(keyword) except AttributeError: pass def end_keyword(self, keyword): """Called when keyword ends. Default implementation does nothing. :param keyword: Keyword to process. :type keyword: Keyword :returns: Nothing. """ pass def visit_test_kw(self, test_kw): """Implements traversing through the test keyword and its child keywords. :param test_kw: Keyword to process. :type test_kw: Keyword :returns: Nothing. """ for keyword in test_kw.keywords: if self.start_test_kw(keyword) is not False: self.visit_test_kw(keyword) self.end_test_kw(keyword) def start_test_kw(self, test_kw): """Called when test keyword starts. Default implementation does nothing. :param test_kw: Keyword to process. :type test_kw: Keyword :returns: Nothing. """ if test_kw.name.count("Show Runtime Counters On All Duts"): self._lookup_kw_nr += 1 self._show_run_lookup_nr = 0 self._msg_type = "test-show-runtime" test_kw.messages.visit(self) def end_test_kw(self, test_kw): """Called when keyword ends. Default implementation does nothing. :param test_kw: Keyword to process. :type test_kw: Keyword :returns: Nothing. """ pass def visit_setup_kw(self, setup_kw): """Implements traversing through the teardown keyword and its child keywords. :param setup_kw: Keyword to process. :type setup_kw: Keyword :returns: Nothing. """ for keyword in setup_kw.keywords: if self.start_setup_kw(keyword) is not False: self.visit_setup_kw(keyword) self.end_setup_kw(keyword) def start_setup_kw(self, setup_kw): """Called when teardown keyword starts. 
        Default implementation does nothing.

        :param setup_kw: Keyword to process.
        :type setup_kw: Keyword
        :returns: Nothing.
        """
        if setup_kw.name.count("Show Vpp Version On All Duts") \
                and not self._version:
            self._msg_type = "setup-version"
            setup_kw.messages.visit(self)

    def end_setup_kw(self, setup_kw):
        """Called when keyword ends. Default implementation does nothing.

        :param setup_kw: Keyword to process.
        :type setup_kw: Keyword
        :returns: Nothing.
        """
        pass

    def visit_teardown_kw(self, teardown_kw):
        """Implements traversing through the teardown keyword and its child
        keywords.

        :param teardown_kw: Keyword to process.
        :type teardown_kw: Keyword
        :returns: Nothing.
        """
        for keyword in teardown_kw.keywords:
            if self.start_teardown_kw(keyword) is not False:
                self.visit_teardown_kw(keyword)
                self.end_teardown_kw(keyword)

    def start_teardown_kw(self, teardown_kw):
        """Called when teardown keyword starts. Default implementation does
        nothing.

        :param teardown_kw: Keyword to process.
        :type teardown_kw: Keyword
        :returns: Nothing.
        """
        if teardown_kw.name.count("Show Vat History On All Duts"):
            self._vat_history_lookup_nr = 0
            self._msg_type = "teardown-vat-history"
            teardown_kw.messages.visit(self)

    def end_teardown_kw(self, teardown_kw):
        """Called when keyword ends. Default implementation does nothing.

        :param teardown_kw: Keyword to process.
        :type teardown_kw: Keyword
        :returns: Nothing.
        """
        pass

    def visit_message(self, msg):
        """Implements visiting the message.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
        """
        if self.start_message(msg) is not False:
            self.end_message(msg)

    def start_message(self, msg):
        """Called when message starts. Get required information from messages:
        - VPP version.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
        """
        if self._msg_type:
            self.parse_msg[self._msg_type](msg)

    def end_message(self, msg):
        """Called when message ends. Default implementation does nothing.

        :param msg: Message to process.
        :type msg: Message
        :returns: Nothing.
""" pass class InputData(object): """Input data The data is extracted from output.xml files generated by Jenkins jobs and stored in pandas' DataFrames. The data structure: - job name - build number - metadata - job - build - vpp version - suites - tests - ID: test data (as described in ExecutionChecker documentation) """ def __init__(self, spec): """Initialization. :param spec: Specification. :type spec: Specification """ # Specification: self._cfg = spec # Data store: self._input_data = None @property def data(self): """Getter - Input data. :returns: Input data :rtype: pandas.Series """ return self._input_data def metadata(self, job, build): """Getter - metadata :param job: Job which metadata we want. :param build: Build which metadata we want. :type job: str :type build: str :returns: Metadata :rtype: pandas.Series """ return self.data[job][build]["metadata"] def suites(self, job, build): """Getter - suites :param job: Job which suites we want. :param build: Build which suites we want. :type job: str :type build: str :returns: Suites. :rtype: pandas.Series """ return self.data[job][str(build)]["suites"] def tests(self, job, build): """Getter - tests :param job: Job which tests we want. :param build: Build which tests we want. :type job: str :type build: str :returns: Tests. :rtype: pandas.Series """ return self.data[job][build]["tests"] @staticmethod def _parse_tests(job, build): """Process data from robot output.xml file and return JSON structured data. :param job: The name of job which build output data will be processed. :param build: The build which output data will be processed. :type job: str :type build: dict :returns: JSON data structure. :rtype: dict """ tree = ET.parse(build["file-name"]) root = tree.getroot() generated = root.attrib["generated"] with open(build["file-name"], 'r') as data_file: try: result = ExecutionResult(data_file) except errors.DataError as err: logging.error("Error occurred while parsing output.xml: {0}". 
                              format(err))
                return None
        checker = ExecutionChecker(job=job, build=build, generated=generated)
        result.visit(checker)

        return checker.data

    def read_data(self):
        """Parse input data from input files and store in pandas' Series.
        """

        logging.info("Parsing input files ...")

        job_data = dict()
        for job, builds in self._cfg.builds.items():
            logging.info("  Extracting data from the job '{0}' ...'".
                         format(job))
            builds_data = dict()
            for build in builds:
                if build["status"] == "failed" \
                        or build["status"] == "not found":
                    continue
                logging.info("    Extracting data from the build '{0}'".
                             format(build["build"]))
                logging.info("    Processing the file '{0}'".
                             format(build["file-name"]))
                data = InputData._parse_tests(job, build)
                if data is None:
                    logging.error("Input data file from the job '{job}', "
                                  "build '{build}' is damaged. Skipped.".
                                  format(job=job, build=build["build"]))
                    continue

                build_data = pd.Series({
                    "metadata": pd.Series(data["metadata"].values(),
                                          index=data["metadata"].keys()),
                    "suites": pd.Series(data["suites"].values(),
                                        index=data["suites"].keys()),
                    "tests": pd.Series(data["tests"].values(),
                                       index=data["tests"].keys()),
                })
                builds_data[str(build["build"])] = build_data
                logging.info("    Done.")

            job_data[job] = pd.Series(builds_data.values(),
                                      index=builds_data.keys())
            logging.info("  Done.")

        self._input_data = pd.Series(job_data.values(),
                                     index=job_data.keys())
        logging.info("Done.")

    @staticmethod
    def _end_of_tag(tag_filter, start=0, closer="'"):
        """Return the index of the character in the string which is the end of
        the tag.

        :param tag_filter: The string where the end of tag is being searched.
        :param start: The index where the searching is started.
        :param closer: The character which is the tag closer.
        :type tag_filter: str
        :type start: int
        :type closer: str
        :returns: The index of the tag closer.
        :rtype: int
        """

        try:
            idx_opener = tag_filter.index(closer, start)
            return tag_filter.index(closer, idx_opener + 1)
        except ValueError:
            return None

    @staticmethod
    def _condition(tag_filter):
        """Create a conditional statement from the given tag filter.

        :param tag_filter: Filter based on tags from the element
            specification.
        :type tag_filter: str
        :returns: Conditional statement which can be evaluated.
        :rtype: str
        """

        index = 0
        while True:
            index = InputData._end_of_tag(tag_filter, index)
            if index is None:
                return tag_filter
            index += 1
            tag_filter = tag_filter[:index] + " in tags" + tag_filter[index:]

    def filter_data(self, element, params=None, data_set="tests",
                    continue_on_error=False):
        """Filter required data from the given jobs and builds.

        The output data structure is:

        - job 1
          - build 1
            - test (suite) 1 ID:
              - param 1
              - param 2
              ...
              - param n
            ...
            - test (suite) n ID:
            ...
          ...
          - build n
        ...
        - job n

        :param element: Element which will use the filtered data.
        :param params: Parameters which will be included in the output. If
            None, all parameters are included.
        :param data_set: The set of data to be filtered: tests, suites,
            metadata.
        :param continue_on_error: Continue if there is error while reading the
            data. The Item will be empty then.
        :type element: pandas.Series
        :type params: list
        :type data_set: str
        :type continue_on_error: bool
        :returns: Filtered data.
        :rtype: pandas.Series
        """

        logging.info("    Creating the data set for the {0} '{1}'.".
                     format(element.get("type", ""),
                            element.get("title", "")))

        try:
            if element["filter"] in ("all", "template"):
                cond = "True"
            else:
                cond = InputData._condition(element["filter"])
            logging.debug("    Filter: {0}".format(cond))
        except KeyError:
            logging.error("  No filter defined.")
            return None

        if params is None:
            params = element.get("parameters", None)

        data = pd.Series()
        try:
            for job, builds in element["data"].items():
                data[job] = pd.Series()
                for build in builds:
                    data[job][str(build)] = pd.Series()
                    try:
                        data_iter = self.data[job][str(build)][data_set].\
                            iteritems()
                    except KeyError:
                        if continue_on_error:
                            continue
                        else:
                            return None
                    for test_ID, test_data in data_iter:
                        if eval(cond, {"tags": test_data.get("tags", "")}):
                            data[job][str(build)][test_ID] = pd.Series()
                            if params is None:
                                for param, val in test_data.items():
                                    data[job][str(build)][test_ID][param] = val
                            else:
                                for param in params:
                                    try:
                                        data[job][str(build)][test_ID][param] =\
                                            test_data[param]
                                    except KeyError:
                                        data[job][str(build)][test_ID][param] =\
                                            "No Data"
            return data

        except (KeyError, IndexError, ValueError) as err:
            logging.error("  Missing mandatory parameter in the element "
                          "specification: {0}".format(err))
            return None
        except AttributeError:
            return None
        except SyntaxError:
            logging.error("  The filter '{0}' is not correct. Check if all "
                          "tags are enclosed by apostrophes.".format(cond))
            return None

    @staticmethod
    def merge_data(data):
        """Merge data from more jobs and builds to a simple data structure.

        The output data structure is:

        - test (suite) 1 ID:
          - param 1
          - param 2
          ...
          - param n
        ...
        - test (suite) n ID:
        ...

        :param data: Data to merge.
        :type data: pandas.Series
        :returns: Merged data.
        :rtype: pandas.Series
        """

        logging.info("    Merging data ...")

        merged_data = pd.Series()
        for _, builds in data.iteritems():
            for _, item in builds.iteritems():
                for ID, item_data in item.iteritems():
                    merged_data[ID] = item_data

        return merged_data
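The class-level regular expressions above drive all result extraction. A minimal sketch of what `REGEX_RATE` and `REGEX_MRR` pull out of a test message; the sample message strings here are hypothetical, shaped only by what the patterns expect, not taken from real job output:

```python
import re

# Same patterns as defined on the checker class above.
REGEX_RATE = re.compile(r'^[\D\d]*FINAL_RATE:\s(\d+\.\d+)\s(\w+)')
REGEX_MRR = re.compile(r'MaxReceivedRate_Results\s\[pkts/(\d*)sec\]:\s'
                       r'tx\s(\d*),\srx\s(\d*)')

# Hypothetical messages shaped like the patterns expect.
rate_msg = "Searching... FINAL_RATE: 12345678.5 pps"
mrr_msg = "MaxReceivedRate_Results [pkts/10sec]: tx 1000, rx 990"

rate = REGEX_RATE.search(rate_msg)
print(rate.group(1), rate.group(2))  # 12345678.5 pps

mrr = REGEX_MRR.search(mrr_msg)
print(mrr.group(1), mrr.group(2), mrr.group(3))  # 10 1000 990
```

Group 1 of `REGEX_RATE` is the numeric rate and group 2 its unit, which is exactly how `start_test` builds the `throughput` dict.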
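`_get_vat_history` strips the per-DUT IP prefix and converts newlines to the `|br|` marker used later by the reStructuredText report. A standalone sketch of that cleanup chain, applied to a hypothetical VAT-history message:

```python
import re

# Hypothetical message shaped like what _get_vat_history receives.
msg = '192.168.1.1 VAT command history:\nsw_interface_dump\n\nshow version\n'

# Same cleanup chain as in _get_vat_history: drop the "IP VAT command
# history:" prefix, collapse blank lines, turn newlines into " |br| ".
text = re.sub("[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3} "
              "VAT command history:", "", msg, count=1) \
    .replace("\n\n", "\n").replace('\n', ' |br| ') \
    .replace('\r', '').replace('"', "'")
print(text)  # " |br| sw_interface_dump |br| show version |br| "
```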
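`_get_latency` turns each captured "min/avg/max" string into a per-direction dict, falling back to `[-1, -1, -1]` when a group cannot be parsed. A standalone sketch of that step, using hypothetical latency strings:

```python
# Hypothetical per-direction latency strings as captured by the regexes.
keys = ("min", "avg", "max")
raw = ["10/15/20", "12/18/25"]

latencies = []
for item in raw:
    try:
        lat = [int(part) for part in item.split('/')]
    except ValueError:
        # Sentinel used when the message cannot be parsed.
        lat = [-1, -1, -1]
    latencies.append(lat)

latency = {
    "direction1": {"100": dict(zip(keys, latencies[0]))},
    "direction2": {"100": dict(zip(keys, latencies[1]))},
}
print(latency["direction1"]["100"])  # {'min': 10, 'avg': 15, 'max': 20}
```

For NDR tests the real method fills the extra "50" and "10" load levels the same way, from groups 3-6 of `REGEX_LAT_NDR`.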
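The tag-filter machinery is compact but non-obvious: `_condition` rewrites every quoted tag `'TAG'` into `'TAG' in tags`, producing a string that `filter_data` later `eval()`s against each test's tag list. A self-contained sketch of the same two functions (module-level here, not static methods), with a hypothetical filter expression:

```python
def end_of_tag(tag_filter, start=0, closer="'"):
    """Return the index of the quote closing the next tag, or None."""
    try:
        idx_opener = tag_filter.index(closer, start)
        return tag_filter.index(closer, idx_opener + 1)
    except ValueError:
        return None


def condition(tag_filter):
    """Append " in tags" after every quoted tag so the filter is eval()-able."""
    index = 0
    while True:
        index = end_of_tag(tag_filter, index)
        if index is None:
            return tag_filter
        index += 1
        tag_filter = tag_filter[:index] + " in tags" + tag_filter[index:]


cond = condition("'NDRDISC' and 'L2BDMACLRN'")
print(cond)  # 'NDRDISC' in tags and 'L2BDMACLRN' in tags
print(eval(cond, {"tags": ["NDRDISC", "L2BDMACLRN", "64B"]}))  # True
```

This is also why the `SyntaxError` handler in `filter_data` reminds the user that all tags must be enclosed in apostrophes: an unquoted tag never gets the `in tags` suffix and produces an invalid expression.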
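`merge_data` collapses the job → build → test-ID hierarchy into a single flat test-ID map; later builds simply overwrite earlier entries with the same ID. A plain-dict sketch of that flattening (the real code iterates nested `pandas.Series`; the job names, build numbers, and test IDs below are made up):

```python
# Hypothetical filtered data: job -> build -> test-ID -> test data.
data = {
    "job-1": {
        "100": {"tests.suite.tc01": {"status": "PASS"}},
        "101": {"tests.suite.tc02": {"status": "FAIL"}},
    },
    "job-2": {
        "55": {"tests.suite.tc03": {"status": "PASS"}},
    },
}

# Same triple loop as merge_data, minus pandas: job and build keys are
# discarded, only the test ID survives as the key.
merged = {}
for builds in data.values():
    for item in builds.values():
        for test_id, item_data in item.items():
            merged[test_id] = item_data

print(sorted(merged))  # ['tests.suite.tc01', 'tests.suite.tc02', 'tests.suite.tc03']
```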