.. This work is licensed under a
   Creative Commons Attribution 4.0 International License.

.. _integration-tests:

Tests
=====
.. important::
   Integration is in charge of several types of tests:

   - Use Cases: developed by the use case teams, usually complex, demonstrating
     high value capabilities of ONAP. They may be partially automated and even
     integrated in CD.
   - CSIT Tests: functional tests created by the projects, partially hosted in
     the CSIT repository.
   - Automatic Test Cases: these use cases are usually simpler and aim to
     validate that ONAP is working properly. These tests have been developed
     to validate ONAP as a software solution. In theory all the main functions
     shall be covered by such tests in order to make CI/CD more robust and
     avoid regressions. These tests are usually developed and maintained by
     the Integration team.

Note that the development of the test framework python-onapsdk follows standard
development quality rules and imposes the creation of unit/functional/integration
tests. As an example, python-onapsdk requires a unit test coverage of 98% before
merging a new feature, which is far above the project criteria in SonarCloud
today.
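As an illustration, this kind of coverage gate can be reproduced locally with
pytest and the pytest-cov plugin; the package name and threshold below are only
an example and may differ from the project's actual tox configuration:

.. code-block:: bash

   # Install the test dependencies (assumed names, adjust to the project).
   pip install pytest pytest-cov

   # Run the unit tests and fail if coverage drops below 98%.
   pytest --cov=onapsdk --cov-fail-under=98 tests/
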
Use Cases
---------
The use cases are described in :ref:`Verified Use cases <docs_usecases>`.
CSIT Tests
----------
The CSIT tests are functional tests executed by the projects on a mocked
environment to validate their components.
Historically they were hosted in the CSIT repository.
The Integration team invited the projects to bring such tests back to their
home repositories for 2 main reasons:

- Integration cannot be a bottleneck: a +2/merge from the Integration team was
  needed for each project
- most of the tests were abandoned and not maintained when hosted in a third
  party repository, leading to CI/CD resource waste and misleading test
  reporting

In Guilin, a PoC to help the projects re-insource their functional tests was
initiated.
See `CSIT wiki page <https://wiki.onap.org/display/DW/Maximizing+Benefits+of+CSIT+in+ONAP+Development>`__
for details.
Automatic Tests
---------------
These tests are run on the daily and weekly chains as well as on each gate
(i.e. on each new patchset in OOM, CLAMP or SO). They can be written in any
language (bash, go, python, ...) and leverage any test framework
(Robot Framework, MTS, python-onapsdk).
They are all embedded in `xtesting <https://pypi.org/project/xtesting/>`__
dockers.
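The snippet below is a minimal sketch of how an xtesting based test case can be
launched manually. It assumes xtesting is installed in the current Python
environment and that a test case named ``basic_onboard`` is declared in the
local testcases.yaml; the exact invocation for each category is described in
the corresponding README linked in the subsections below.

.. code-block:: bash

   # Install the xtesting framework (assumed to be sufficient for a local run).
   pip install xtesting

   # Run a single declared test case; adding -r would also push the results
   # to the test database if one is configured.
   run_tests -t basic_onboard
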
.. hint::
   Automatic tests are currently divided into 4 different categories:

   - infrastructure-healthcheck: tests from OOM checking the ONAP namespace,
     certificates...
   - healthcheck: basic tests on the components
   - smoke tests: end to end tests
   - security tests
A dashboard summarizing the status and providing links to the test result pages
or the logs is automatically created at the end of the test execution.
.. figure:: files/tests/test-dashboard.png
All the pages and artifacts are pushed to the LF backend:

- Daily chains: https://logs.onap.org/onap-integration/daily
- Weekly chains: https://logs.onap.org/onap-integration/weekly
- Gating chains: https://logs.onap.org/onap-integration/gating
Infrastructure Healthcheck Tests
................................
.. csv-table:: Infrastructure Healthcheck Tests
   :file: ./files/csv/tests-infrastructure-healthcheck.csv
   :widths: 20,40,20,20
   :delim: ;
   :header-rows: 1
See the `Infrastructure Healthcheck README <https://git.onap.org/integration/xtesting/tree/infra-healthcheck/README.md>`__
to adapt and then run the infrastructure healthcheck tests on your own system.

Please note that the onap-k8s test is run twice in the CD chains: just after the
installation (onap-k8s) and at the end of the test execution (onap-k8s-teardown),
in order to collect the logs of the different components during the test
execution.
.. figure:: files/tests/test-onap-k8s.png
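The onap-k8s test essentially verifies the health of the resources deployed in
the ONAP namespace. A rough manual equivalent, assuming a kubeconfig pointing
at the target cluster and the default ``onap`` namespace, could look like:

.. code-block:: bash

   # List the pods that are not in Running or Completed state in the onap
   # namespace (the namespace name is an assumption, adjust if needed).
   kubectl get pods -n onap --no-headers | grep -Ev 'Running|Completed'

   # Collect the logs of a given component for troubleshooting.
   kubectl logs -n onap <pod-name> --all-containers > pod.log
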
Healthcheck Tests
.................
.. csv-table:: Healthcheck Tests
   :file: ./files/csv/tests-healthcheck.csv
   :widths: 20,40,20,20
   :delim: ;
   :header-rows: 1
See the `Healthcheck README <https://git.onap.org/integration/xtesting/tree/healthcheck/README.md>`__
to adapt and then run the healthcheck tests on your own system.
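One common way to launch the basic health checks manually, assuming OOM has
been cloned on a machine with access to the cluster and that ONAP is deployed
in the ``onap`` namespace (path and namespace are assumptions based on a
default OOM deployment), is the Robot helper script:

.. code-block:: bash

   # Run the Robot "health" test suite against the onap namespace.
   cd oom/kubernetes/robot
   ./ete-k8s.sh onap health
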
Smoke Tests
...........
.. csv-table:: Smoke Tests
   :file: ./files/csv/tests-smoke.csv
   :widths: 20,40,20,20
   :delim: ;
   :header-rows: 1
See the `Python smoke test README <https://git.onap.org/integration/xtesting/tree/smoke-usecases-pythonsdk/README.md>`__
to adapt and run the pythonsdk based smoke tests.

An html page is generated by the pythonsdk-test tests.

.. figure:: files/tests/test-basic-cnf.png

See the `Robot smoke test README <https://git.onap.org/integration/xtesting/tree/smoke-usecases-robot/README.md>`__
to adapt and run the robot based smoke tests.

Standard Robot html pages are generated. See :ref:`Robot page <docs_robot>`.
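As an illustration, the smoke use case dockers follow the usual xtesting
pattern and can be run individually. The image name, tag and test case below
are placeholders to be replaced by the values documented in the READMEs above:

.. code-block:: bash

   # Run one smoke use case container against an existing ONAP deployment.
   # <registry>/<smoke-usecases-image>:<tag> is a placeholder, see the READMEs.
   docker run --rm \
     -v $HOME/.kube/config:/root/.kube/config \
     -v $(pwd)/env:/var/lib/xtesting/conf/env_file \
     <registry>/<smoke-usecases-image>:<tag> run_tests -t basic_vm
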
Security Tests
...............
.. csv-table:: Security Tests
   :file: ./files/csv/tests-security.csv
   :widths: 20,40,20,20
   :delim: ;
   :header-rows: 1
See the `Security test README <https://git.onap.org/integration/xtesting/tree/security/README.md>`__
to adapt and then run the security tests on your own system.

Note that for the security tests, the Integration team follows the `SECCOM
recommendations and applies the waivers granted by SECCOM, when needed, through
xfail lists <https://git.onap.org/integration/seccom/tree/>`__.
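Several of these checks can be reproduced manually with kubectl. For instance,
a quick look at the services exposed outside the cluster, similar in spirit to
the node port related checks, can be obtained as follows (the ``onap``
namespace is an assumption):

.. code-block:: bash

   # List services of type NodePort in the onap namespace, i.e. ports
   # reachable from outside the cluster.
   kubectl get svc -n onap --no-headers | grep NodePort
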
Stability Testing
-----------------
Ensuring the stability of ONAP is one of the missions of the Integration team.
CI chains and stability tests are performed to help stabilise the release.
See :ref:`Integration stability tests <integration-s3p>` for details.