This page is a working draft of how we will develop and leverage test cases for the Dovetail project.

Dovetail Test Suite Purpose and Goals

The Dovetail test suite is intended to provide a method for validating the interfaces and behaviors of an NFVi platform against the capabilities expected of an OPNFV NFVi instance, and to provide a baseline of functionality that enables VNF portability across different OPNFV NFVi instances. All Dovetail tests will be available in open source and will be developed on readily available open source test frameworks.

Working with the ETSI NFV TST 001 reference: http://www.etsi.org/deliver/etsi_gs/NFV-TST/001_099/001/01.01.01_60/gs_NFV-TST001v010101p.pdf
The Dovetail project will focus on tests validating a scope for a System Under Test (SUT) associated with Chapter 4.9 - NFV Infrastructure + VIM Under Test, as adapted for OPNFV (see figure below). The test suite will also define preconditions and assumptions about the state of any platform under evaluation. The test suite must not require access to OPNFV infrastructure or resources in order to pass.

Test case requirements

The following requirements should be fulfilled for all tests added to the Dovetail test suite:

New test case proposals should complete a Dovetail test case worksheet to ensure that all of these considerations are met before the test case is approved for inclusion in the Dovetail test suite.

Considerations for the Test Strategy Document

There is a draft test strategy document for OPNFV in preparation (owner: Chris Price).


Dovetail Test Suite Structure

A Dovetail test suite should have the following overall components and structure (adapted, with some simplification, from IEEE):

Dovetail Test Suite Naming Convention

This section defines test case naming and structure for Dovetail, including the external-facing naming sequences for compliance and certification test cases.
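As a hypothetical illustration only (the actual convention is still to be agreed by the Dovetail group), a hierarchical scheme such as `dovetail.<test area>.tc<sequence>` could be checked programmatically like this:

```python
import re

# Hypothetical naming scheme: dovetail.<test area>.tc<3-digit sequence>,
# e.g. "dovetail.ipv6.tc001". This pattern is an illustrative assumption,
# not the agreed Dovetail convention.
TEST_CASE_NAME = re.compile(r"^dovetail\.[a-z0-9_]+\.tc\d{3}$")

def is_valid_test_case_name(name: str) -> bool:
    """Return True if the name follows the sketched convention."""
    return TEST_CASE_NAME.fullmatch(name) is not None

print(is_valid_test_case_name("dovetail.ipv6.tc001"))  # expected True
print(is_valid_test_case_name("ipv6-test-1"))          # expected False
```

A machine-checkable convention like this would let the suite reject badly named test cases at registration time rather than at reporting time.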

Dovetail Test Result Compilation, Storage and Authentication

This section covers test execution identification, result evaluation, result storage, and security for Dovetail compliance and certification test cases.
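One possible sketch of such a result record, assuming a simple JSON representation (all field names here are illustrative, not an agreed schema):

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical result record for a single test case execution.
# Field names are illustrative assumptions, not an agreed Dovetail schema.
result = {
    "test_case": "dovetail.example.tc001",  # hypothetical test case name
    "suite_version": "0.1.0",
    "executed_at": datetime.now(timezone.utc).isoformat(),
    "sut_id": "lab-pod-1",                  # identifies the System Under Test
    "outcome": "PASS",
}

# A digest over the canonical (sorted-key) JSON form gives a simple
# integrity check that a stored result has not been altered after execution.
canonical = json.dumps(result, sort_keys=True).encode()
result["digest"] = hashlib.sha256(canonical).hexdigest()
```

Stronger authentication (e.g. signing results with a key held by the test runner) could be layered on the same canonical form.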

Phasing the Dovetail Development Effort

Since not all tests can be developed at once, the following approach is proposed for developing the Dovetail test suites in a structured manner.

Dovetail phase 1

Dovetail should initially set out to provide validation of interfaces and behaviors common to an OPNFV NFVi.  This can be seen as a set of test cases that evaluate whether an NFVi implementation is able to achieve a steady operational state covering the common behaviors expected of an OPNFV NFVi.  In this case the Dovetail tests will focus on a SUT definition of NFVi + VIM as described in Section 4.9 of the ETSI NFV TST 001 specification.

Dovetail phase 2

Dovetail should further establish a set of test suites that validate additional desired OPNFV NFVi behaviors.  This may include, for instance, deployment-specific capabilities for edge or remote installations.  It may also include the validation of functionality that is not yet common to all OPNFV NFVi scenarios.

In phase 2 it may also be possible for Dovetail to provide services such as application test suites that validate the behavior of applications in preparation for deployment on an OPNFV NFVi.  This may result in the definition of new SUT scopes for Dovetail, as described in the ETSI NFV TST 001 specification.

Dovetail phase 3

hongbo: the same as those defined in phase 1 and phase 2

Annotated brainstorming/requirements proposals

Additional Requirements / Assumptions (not yet agreed by Dovetail Group)

These are additional requirements and assumptions that should be considered by the group.  This agreement / issue is being tracked in JIRA, under DOVETAIL-352.  As these are agreed, they should be moved above into the full list.  Once the story is completed, this section can be deleted.

Assumptions

  1. Tests start from (use) an already installed / deployed OPNFV platform. OPNFV deployment/install testing is not a target of the program (that is for CI).
    1. DN: Dovetail should be able to test platforms which are not OPNFV scenarios - we have a requirement that OPNFV scenarios should be able to pass the test suite, which ensures that only features in scope for OPNFV can be included

Requirements

  1. All test cases must be fully documented, in a common format, clearly identifying the test procedure and expected results / metrics to determine a “pass” or “fail” result for the test case.
    1. DN: We currently list a set of things which must be documented for test cases - is this insufficient, in combination with the test strategy document?
      1. lylavoie - No, we need to have the actual list of what things are tested in Dovetail, and how those things are tested.  Otherwise, how can we even begin to know whether the tool tests the things we think it does (i.e., validate the tool)?
      2. DN: I agree, but the scope of the test suite is a topic for the test strategy document, and not part of the decision criteria for a specific, individual test. Test case purpose and pass/fail criteria are part of the documentation requirements.
  2. Tests and tool must support / run on both vanilla OPNFV and commercial OPNFV-based solutions (i.e., the tests and tool cannot use interfaces or hooks that are internal to OPNFV, such as something available only during deployment / install).
    1. DN: Again, there is already a requirement that tests pass on reference OPNFV deployment scenarios
      1. lylavoie: Yes, but it cannot do that by requiring access to something "under the hood"; this might be obvious, but it's an important requirement for Dovetail developers to know.
      2. DN: Good point. We do have that test cases must use public standard interfaces and APIs.
  3. Tests and tool must run independent of installer (Apex, Joid, Compass) and architecture (Intel / ARM).
    1. DN: This is already in the requirements: "Tests must not require a specific NFVi platform composition or installation tool"
  4. Tests and tool must run independent of specific OPNFV components, allowing different components to “swap in”. An example would be using a different storage than Ceph.
    1. DN: This is also covered by the above test requirement
  5. Tool / Tests must be validated for purpose, beyond running on the platform (this may require each test to be run with both an expected positive and negative outcome, to validate the test/tool for that case).
    1. DN: I do not understand what this proposal refers to
      1. lylavoie - The tool and program must be validated.  For example, if a test case's purpose is to verify that a specific API is implemented or functions in a specific way, we need to verify the test tool does actually test that API/function.  Put differently, we need to check that the test tool doesn't falsely pass or falsely fail devices.  This is far beyond a normal CI-type test (i.e., did it compile and pass some unit tests).
      2. DN: This is part of the test strategy document rather than part of the requirements for a specific test
  6. Tests should focus on functionality and not performance.
    1. Performance test output could be built in as “for information only,” but must not carry pass/fail metrics.
    2. DN: This is covered in the CVP already
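Requirement 6 above can be sketched as follows: the verdict is computed only from functional checks, and any performance figures are attached as "for information only" data that never influences pass/fail. All names in this sketch are illustrative assumptions, not an agreed Dovetail interface.

```python
# Sketch of requirement 6: performance figures are recorded "for
# information only" and never influence the pass/fail verdict.
# Function and field names are hypothetical, not an agreed schema.

def evaluate(functional_checks, perf_metrics):
    """Verdict depends only on functional checks; perf data is info-only."""
    verdict = "PASS" if all(functional_checks.values()) else "FAIL"
    return {"verdict": verdict, "info_only": perf_metrics}

report = evaluate(
    functional_checks={"api_reachable": True, "vm_boots": True},
    perf_metrics={"vm_boot_time_s": 41.7},  # recorded but never scored
)
print(report["verdict"])  # expected PASS
```

Keeping the two inputs in separate structures makes it easy to audit that no performance number can leak into the verdict.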

Additional brainstorming

This section is intended to help us discuss and collaboratively edit a set of guiding principles for the dovetail program in OPNFV.
1) Focus on OPNFV role and unique aspects, e.g.
2) Leverage upstream frameworks
3) Deliver clear value: release only tests that add clear value to SUT providers, end-users, and the community
4) Agile development with master and stable releases
5) Establish and maintain a roadmap for the program
6) Move tests upstream as soon as possible