author     Lovett, Trevor <trevor.lovett@att.com>          2019-09-03 10:52:15 -0500
committer  Lovett, Trevor (tl2972) <tl2972@att.com>        2019-09-03 10:52:57 -0500
commit     7766554b3057c8955ec8c0c15e4d9c62a83eb126 (patch)
tree       deb5619344620794cd881bf33ce37398f33293c8 /docs/contributing.rst
parent     97e4fc1a526984c82720a3a6bed5419450fff567 (diff)
[VVP] Update the Contributing.rst and Release Notes
Addresses:
- How to add tests for the validations
- Hardening practices followed by the project
- How to perform quality checks before submission

Issue-ID: VVP-250
Issue-ID: VVP-251
Issue-ID: VVP-253
Change-Id: I8be8626372bd25ef2089491a2f1e4938b21cdcfa
Signed-off-by: Lovett, Trevor <trevor.lovett@att.com>
Diffstat (limited to 'docs/contributing.rst')
-rw-r--r--   docs/contributing.rst   215
1 file changed, 185 insertions(+), 30 deletions(-)
diff --git a/docs/contributing.rst b/docs/contributing.rst
index 2643f7c..8b561af 100644
--- a/docs/contributing.rst
+++ b/docs/contributing.rst
@@ -2,43 +2,107 @@
.. http://creativecommons.org/licenses/by/4.0
.. Copyright 2019 AT&T Intellectual Property. All rights reserved.
-Contributing
-############
+How to Contribute
+#################
+
+Overview
+~~~~~~~~
+
+This section describes how to contribute changes to the project, covering
+both the mechanics of submitting new code and how to adhere to the project's
+code quality and coding practices.
+
+
+Prerequisites
+~~~~~~~~~~~~~
+
+As this project is part of the overall ONAP project, there are some common
+guidelines and activities you will need to follow:
+
+- Acquire a `Linux Foundation ID <https://identity.linuxfoundation.org/>`__
+- Configure your `ONAP Gerrit Account <https://wiki.onap.org/display/DW/Configuring+Gerrit>`__
+ as it is used for all code submissions and reviews
+- Install VVP per the :ref:`installation instructions <vvp-installation>`
+- Most features will require some knowledge of both `OpenStack Heat <https://wiki.openstack.org/wiki/Heat>`__
+ and the `ONAP Heat requirements <https://onap.readthedocs.io/en/latest/submodules/vnfrqts/requirements.git/docs/Chapter5/Heat/index.html>`__
+- Knowledge of writing tests in `pytest <https://pytest.readthedocs.io/>`__
+
+Other useful links:
+
+- All work is documented and tracked via the `VVP Project <https://jira.onap.org/projects/VVP/issues/>`__
+ in the `ONAP JIRA instance <https://jira.onap.org/>`__. Login is via your
+ Linux Foundation ID
+- Proposals for new features, general information about the projects,
+  meeting minutes, and ONAP process information are located on the
+ `ONAP Wiki <https://wiki.onap.org/>`__
+- The VVP project hosts a weekly meeting to plan upcoming work, discuss open
+  issues, and align on priorities. If you intend to contribute to the project,
+  please consider attending. Refer to the `ONAP Calendar <https://wiki.onap.org/pages/viewpage.action?pageId=6587439>`__
+  for scheduling details.
Objective
~~~~~~~~~
-**The objective for the VVP test suite is for each
-test to directly correlate with at least one requirement in the**
-`VNF Requirements <https://onap.readthedocs.io/en/latest/submodules/vnfrqts/requirements.git/docs/index.html>`__
-**project in ONAP. If the test you intend to write doesn't
-have a corresponding requirement in the VNF Requirements project, consider
-making a contribution to that project first.**
+The primary focus of VVP is ensuring that a VNF described using OpenStack
+Heat complies with the ONAP Heat requirements specified in the `VNF Requirements (VNFRQTS) <https://onap.readthedocs.io/en/latest/submodules/vnfrqts/requirements.git/docs/index.html>`__
+project. If a VNF does not comply with these rules, it may fail to be
+modeled in SDC, fail to instantiate, be improperly inventoried in A&AI, or
+fail orchestration.
-Convenience vs Convention
-~~~~~~~~~~~~~~~~~~~~~~~~~
+The project aims to validate every mandatory requirement in the VNF Requirements
+project related to Heat (i.e., all requirements with a **MUST** or **MUST NOT** keyword).
-There are a lot of ways to write tests. Priorities for the VVP test suite are
+Heat templates are validated using tests written in ``pytest``. Each test will
+validate one or more requirements. Typically, we strive to have one test per
+requirement, but there are situations where it is easiest and clearest to
+validate multiple, tightly related requirements with a single test.
- - Accuracy
- - User Comprehension
+Every test **MUST** have a corresponding requirement in the ONAP VNF Requirements
+project. If your contribution is a test and there is not an appropriate
+requirement, then please consider making a contribution to that project first.
-The test suite is often used by people who don't write code, or people
-who aren't devoted to writing python validation tests.
-
-The output of failed validation tests can be difficult to read, so
-keep that in mind when you are deciding whether to create another
-level of abstraction vs having some code duplication or verbose tests.
Writing Tests
#############
+Coding Conventions
+~~~~~~~~~~~~~~~~~~
+
+* Follow `PEP-8 conventions <https://www.python.org/dev/peps/pep-0008/>`__
+ * NOTE: The only variation is that the line-length can be 88
+ characters vs. 80
+* Familiarize yourself with the helpers that exist in the following utility
+  modules and leverage them to avoid duplication:
+
+ - ``ice_validator/tests/helpers.py``
+ - ``ice_validator/tests/structures.py``
+ - ``ice_validator/tests/utils/**``
+
+* Ensure all source files include the standard Apache License 2.0 text and
+ appropriate copyright statement (see other source files for example)
+* All code must pass standard quality checks prior to submission; these can be
+  executed via ``tox`` or by running ``checks.py``
+* When parsing YAML, always use the ``tests/cached_yaml`` module instead of the
+  default ``yaml`` module. This will greatly improve performance due to the
+  large number of YAML files that must be parsed.
+* To keep the number of dependencies down, please favor the Python standard
+  library unless an external library significantly simplifies or reduces the
+  code needed to implement the functionality.
+* For security purposes, the following hardening practices are followed:
+
+  - Avoid usage of ``yaml.load`` and always use ``yaml.safe_load``, as shown
+    in the sketch at the end of this section (note: if you use ``cached_yaml``
+    as instructed above, then this is covered automatically)
+  - Docker containers must not be run as root (see the current Dockerfile for
+    an example)
+  - Inspect and resolve all findings from the bandit scans
+
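+As a minimal sketch of the YAML hardening guideline above (shown here with the
+standard ``yaml`` module for illustration; real tests should use the
+``tests/cached_yaml`` wrapper, which applies safe loading for you):
+
+.. code-block:: python
+
+    import yaml
+
+    def load_template(path):
+        """Parse a YAML file without constructing arbitrary Python objects."""
+        with open(path) as fh:
+            # safe_load rejects the non-standard tags that yaml.load would
+            # otherwise turn into arbitrary Python objects
+            return yaml.safe_load(fh)
+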
File Name
~~~~~~~~~
-Test files are written in python, and should go into the
+Test files are written in Python, and should go into the
``/validation-scripts/ice_validator/tests/`` directory. They should be prefixed
-with ``test_``. If not, ``pytest`` will not discover your test.
+with ``test_``. If not, ``pytest`` will not discover your test. The file name
+should reflect what is being tested.
Test Name
~~~~~~~~~
@@ -56,11 +120,11 @@ For example:
Requirement Decorator
~~~~~~~~~~~~~~~~~~~~~
-Each test function should be decorated with a requirement ID from the
+Each test function must be decorated with a requirement ID from the
VNF Requirements project. The following is required to be imported at
the top of the test file:
-``from .helpers import validates``
+``from tests.helpers import validates``
Then, your test function should be decorated like this:
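+
+For example (``R-123456`` is a placeholder; use the ID of the requirement
+your test actually validates):
+
+.. code-block:: python
+
+    from tests.helpers import validates
+
+    @validates("R-123456")
+    def test_my_new_requirement():
+        ...  # perform the check mandated by the requirement
+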
@@ -75,10 +139,19 @@ report that includes the requirements that were violated. If a test is not
decorated it is unclear what the reason for a failure is, and the
implication is that the test is not needed.
+The validation reports will show the text of the requirement that was violated,
+pulled from the ``heat_requirements.json`` file. This file is published by the
+VNFRQTS project, and VVP maintains a copy of it. Your requirement should be
+present in this file. The ``update_reqs.py`` command can be used to
+re-synchronize the VVP copy with VNFRQTS master.
+
Test Parameters
~~~~~~~~~~~~~~~
-Each test should be parameterized based on what artifact is being validated.
+There are several dynamic fixtures that can be injected into a test based on
+what the test is attempting to validate. Each test should be parameterized based
+on what artifact is being validated.
+
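+As a rough sketch, a test that accepts a per-file parameter is invoked once for
+each matching artifact in the submission. The ``yaml_file`` parameter below is
+used for illustration only; consult ``parameterizers.py`` for the authoritative
+names and behavior:
+
+.. code-block:: python
+
+    import yaml  # illustration only; real tests use tests/cached_yaml
+
+    from tests.helpers import validates
+
+    @validates("R-123456")  # placeholder requirement ID
+    def test_resources_declare_a_type(yaml_file):
+        # invoked once per YAML file in the template set being validated
+        with open(yaml_file) as fh:
+            template = yaml.safe_load(fh)
+        if not isinstance(template, dict):
+            return  # not a Heat template; nothing to check
+        for name, resource in (template.get("resources") or {}).items():
+            assert "type" in resource, "{} does not declare a type".format(name)
+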
Available parameters are enumerated in
``/validation-scripts/ice_validator/tests/parameterizers.py``. Below is a description
of the most commonly used:
@@ -143,8 +216,9 @@ Optional: Pytest Markers and Validation Categories
The VVP test suite has the concept of a ``base`` test. These are used as
sanity tests and are executed before the other tests, and if they fail the
-test suite execution is halted. If you are writing a ``base`` test, mark your
-test like this:
+test suite execution is halted. A test should be annotated with ``base`` if its
+failure is likely to generate many subsequent failures (for example, improperly
+formatted YAML). If you are writing a ``base`` test, mark your test like this:
.. code-block:: python
@@ -155,16 +229,17 @@ test like this:
def test_my_new_requirement():
The VVP test suite also has the concept of a ``category`` to
-define what additional set of optional tests to execute. The way it works
-is by using ``categories`` decorator.
+define an additional set of optional tests to execute when requested by the
+end user. This works by applying the ``categories`` decorator to the
+test.
By default, all ``base`` tests and tests with no category are executed.
If you want an additional category to run, pass the command line argument:
``--category=<category>``
-This will execute all ``base`` tests, non-marked tests,
-and tests marked like the following:
+This will extend the default set of tests to also include tests marked with
+the requested category like the following:
.. code-block:: python
@@ -176,3 +251,83 @@ and tests marked like the following:
This should be used sparingly; in practice, consider reviewing the requirement
with the VNF Requirements team before adding a test to a category.
+
+Testing your Test
+~~~~~~~~~~~~~~~~~
+
+Every Heat validation test must have a unit test that validates the test is
+working as expected. This is handled by creating one or more "fixtures" that
+will exercise the test and validate the expected result.
+
+The fixtures are stored in the ``ice_validator/tests/fixtures`` directory under
+a directory that matches the test file name **exactly**.
+
+For example, if your test is named ``test_neutron_ports.py``, then the test
+fixtures must be in the ``ice_validator/tests/fixtures/test_neutron_ports/``
+directory.
+
+At minimum, each test must have one example of Heat templates/files that
+pass (stored in the ``pass`` subdirectory), and one example that fails
+(stored in the ``fail`` subdirectory). These templates do not need to be
+complete, valid Heat templates; they only need to include the minimum content
+to validate the test.
+
+If you need to test multiple conditions or branches of your test, then you
+can nest other directories under your test's fixture directory. Each nested
+directory must in turn have a ``pass`` and ``fail`` subdirectory.
+
+.. code-block:: text
+
+ ice_validator/
+ |--- tests/
+ |--- fixtures/
+ |--- test_neutron_ports/
+ |--- scenario_one/
+ | |--- pass/
+ | |--- fail/
+ |--- scenario_two/
+ |--- pass/
+ |--- fail/
+
+To execute all tests for the entire suite, issue the following command
+from the ``ice_validator`` directory:
+
+``pytest --self-test``
+
+If you wish to selectively execute your test against one of the fixtures,
+then issue the following command from the ``ice_validator`` directory:
+
+``pytest tests/<test_file>.py --template-directory=tests/fixtures/<test_file>/<scenario>``
+
+If you have contributed code outside of a ``test_*.py`` file, then you should
+create suitable tests for that functionality in the ``app_tests`` directory.
+These tests should be compatible with ``pytest``, but they do not use the
+fixture mechanism described above.
+
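+A minimal sketch of what such a test could look like (the helper function here
+is a hypothetical stand-in; in practice you would import the code you
+contributed):
+
+.. code-block:: python
+
+    # app_tests/test_report_helpers.py -- hypothetical file and helper names
+
+    def normalize_requirement_id(req_id):
+        # stand-in for a real helper that would be imported from the module
+        # under test
+        return req_id.strip().upper()
+
+    def test_normalize_requirement_id_strips_whitespace_and_uppercases():
+        assert normalize_requirement_id("  r-123456 ") == "R-123456"
+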
+Submitting Your Changes For Review
+##################################
+
+Once you have completed your changes and tested that they work as expected, the
+next step is to validate that they are ready for submission. The ``checks.py``
+module in the root directory contains a variety of code quality checks
+that the build server will execute. These can be executed locally using ``tox``
+or by simply running ``checks.py``.
+
+At the time of this writing, the following checks will be performed:
+
+- Executing the full test suite (``app_tests`` and ``--self-test``)
+- flake8 code style validation
+- Ensuring the ``heat_requirements.json`` file is up-to-date with VNFRQTS
+ (run ``update_reqs.py`` if this check fails)
+- Ensuring all mandatory Heat requirements from VNFRQTS have corresponding tests in VVP
+- Security checks via bandit
+
+Once all checks pass, refer to `Pushing Changes Using Git <https://wiki.onap.org/display/DW/Pushing+Changes+Using+Git>`__
+for details on how to submit your change.
+
+Once your change has been submitted, please add at least the following
+individuals as reviewers:
+
+- Steven Stark
+- Trevor Lovett
+- Steven Wright