...

  • Test cases should favour implementation of a published standard interface for validation
    • Where a compliance test suite exists for components of the SUT, this test suite should generally be considered as a baseline for Dovetail testing
    • Where no standard is available, provide API support references
    • If a standard exists and is not followed, an exemption is required
  • The following things must be documented for the test case (an illustrative descriptor sketch follows this list):
    • Use case specification
    • Test preconditions
    • Basic test flow execution descriptor
    • Post-conditions and pass/fail criteria
  • The following things may be documented for the test case:
    • Parameter border test cases descriptions
    • Fault/Error test case descriptions
  • Test cases must pass on OPNFV reference deployments
    • Tests must not require a specific NFVi platform composition or installation tool
    • Tests must not require unmerged patches to the relevant upstream projects
    • Tests must not require features or code which are out of scope for the latest release of the OPNFV project
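
To make the required documentation items concrete, the sketch below shows what a test case descriptor covering these fields might look like. It is illustrative only: the field names and values are assumptions made for this example, not the actual Dovetail descriptor schema.

    # Hypothetical test case descriptor; field names and values are illustrative only.
    example_testcase = {
        "name": "example.compute.server_basic_ops",  # made-up identifier
        "use_case": "Launch and terminate a server through the standard compute API",
        "standard_interface": "OpenStack Compute API (upstream compliance suite used as baseline where one exists)",
        "preconditions": [
            "An already deployed OPNFV-based platform is reachable",
            "A bootable image and a suitable flavor are available",
        ],
        "basic_flow": [
            "Authenticate against the identity endpoint",
            "Boot a server from the image",
            "Wait for the server to reach ACTIVE state",
            "Delete the server",
        ],
        "postconditions": ["No residual server resources remain on the SUT"],
        "pass_criteria": "Every flow step completes without error within the configured timeout",
        # Optional documentation items:
        "border_cases": ["Boot with the smallest and largest allowed flavors"],
        "fault_cases": ["Booting from a non-existent image returns a clear error"],
    }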

...

Dovetail

...

These are additional requirements and assumptions that should be considered by the group. This agreement / issue is being tracked in JIRA under DOVETAIL-352. As these are agreed, they should be moved above into the full list. Once the story is completed, this section can be deleted.

Assumptions

  1. Tests start from (use) an already installed / deployed OPNFV platform. OPNFV deployment/install testing is not a target of the program (that is for CI).
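
As a minimal illustration of this assumption, a test run could begin with a simple pre-check that the already deployed platform is reachable, rather than attempting any deployment itself. The snippet below is a sketch only; the use of the OS_AUTH_URL environment variable and a plain HTTP probe are assumptions made for this example, not Dovetail behaviour.

    import os
    import urllib.error
    import urllib.request

    def sut_is_reachable(timeout=10):
        """Return True if the (already deployed) identity endpoint answers at all."""
        auth_url = os.environ.get("OS_AUTH_URL")  # standard OpenStack credential variable
        if not auth_url:
            return False
        try:
            urllib.request.urlopen(auth_url, timeout=timeout)
        except urllib.error.HTTPError:
            return True   # the endpoint answered, even if with an HTTP error status
        except (urllib.error.URLError, OSError):
            return False  # nothing is listening; deploying the platform is out of scope here
        return True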

Requirements

  1. All test cases must be fully documented, in a common format, clearly identifying the test procedure and expected results / metrics to determine a “pass” or “fail” result for the test case.
  2. Tests and the tool must support and run on both vanilla OPNFV and commercial OPNFV-based solutions (i.e. the tests and tool cannot use interfaces or hooks that are internal to OPNFV, such as anything available only during deployment / installation).
  3. Tests and the tool must run independently of the installer (Apex, Joid, Compass) and architecture (Intel / ARM).
  4. Tests and the tool must run independently of specific OPNFV components, allowing different components to be swapped in. An example would be using a storage backend other than Ceph.
  5. The tool / tests must be validated for purpose, beyond merely running on the platform (this may require each test to be run with both an expected positive and an expected negative outcome, to validate the test/tool for that case); a sketch of such positive/negative validation follows this list.
  6. Tests should focus on functionality and not performance.
    1. Performance test output could be built in as “for information only,” but must not carry pass/fail metrics.
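
The sketch below illustrates requirement 5, together with the informational-only treatment of performance data in 6.1. The run_test_case callable, the SUT handles, and the result attributes are hypothetical names introduced for this example; they do not correspond to existing Dovetail interfaces.

    # Hypothetical self-validation harness for a single test case.
    def validate_test_for_purpose(run_test_case, known_good_sut, known_bad_sut):
        """Check that a test passes on a known-good SUT and fails on a known-bad one."""
        positive = run_test_case(known_good_sut)   # expected outcome: pass
        negative = run_test_case(known_bad_sut)    # expected outcome: fail
        if not positive.passed:
            raise AssertionError("test did not pass on a known-good SUT")
        if negative.passed:
            raise AssertionError("test failed to detect a known-bad SUT")
        # Performance figures, if any were collected, are reported for information
        # only and never feed into the pass/fail verdict.
        return {"informational_metrics": getattr(positive, "metrics", {})}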

Dovetail Test Suite Structure

A Dovetail test suite should have the following overall components and structure (adapted, with some simplification, from IEEE):

...

Hongbo: the same as those defined in Phase 1 and Phase 2

Annotated brainstorming/requirements proposals

Additional Requirements / Assumptions (not yet agreed by Dovetail Group)

These are additional requirements and assumptions that should be considered by the group. This agreement / issue is being tracked in JIRA under DOVETAIL-352. As these are agreed, they should be moved above into the full list. Once the story is completed, this section can be deleted.

Assumptions

  1. Tests start from (use) an already installed / deployed OPNFV platform. OPNFV deployment/install testing is not a target of the program (that is for CI).
    1. DN: Dovetail should be able to test platforms which are not OPNFV scenarios - we have a requirement that OPNFV scenarios should be able to pass the test suite, which ensures that only features in scope for OPNFV can be included

Requirements

  1. All test cases must be fully documented, in a common format, clearly identifying the test procedure and expected results / metrics to determine a “pass” or “fail” result for the test case.

    ...

    1. DN: We currently list a set of things which must be documented for test cases - is this insufficient, in combination with the test strategy document?
  2. Tests and the tool must support and run on both vanilla OPNFV and commercial OPNFV-based solutions (i.e. the tests and tool cannot use interfaces or hooks that are internal to OPNFV, such as anything available only during deployment / installation).
    1. DN: Again, there is already a requirement that tests pass on reference OPNFV deployment scenarios
  3. Tests and the tool must run independently of the installer (Apex, Joid, Compass) and architecture (Intel / ARM).
    1. DN: This is already in the requirements: "Tests must not require a specific NFVi platform composition or installation tool"
  4. Tests and the tool must run independently of specific OPNFV components, allowing different components to be swapped in. An example would be using a storage backend other than Ceph.
    1. DN: This is also covered by the above test requirement
  5. The tool / tests must be validated for purpose, beyond merely running on the platform (this may require each test to be run with both an expected positive and an expected negative outcome, to validate the test/tool for that case).
    1. DN: I do not understand what this proposal refers to
  6. Tests should focus on functionality and not performance.
    1. Performance test output could be built in as “for information only,” but must not carry pass/fail metrics.
    2. DN: This is covered in the CVP already

Additional brainstorming

This section is intended to help us discuss and collaboratively edit a set of guiding principles for the Dovetail program in OPNFV.

...

  • The basic assumption is that the scope of compliance testing falls within the included features of a given OPNFV release (as its upper bound)
  • DN: Additional clarification added that OPNFV is the upper bound: "Tests must not require features or code which are out of scope for the latest release of the OPNFV project"

2) Leverage upstream frameworks

...

  • Work proactively to further develop those frameworks to meet OPNFV needs.
  • DN: Requirements modified to clarify: "Where a compliance test suite exists for components of the SUT, this test suite should generally be considered as a baseline for Dovetail testing" - there was a feeling that we could not accept upstream frameworks wholesale and without question, but the goal is to ensure that OPNFV-compliant solutions are also compliant with component certification requirements

3) Deliver clear value: release only tests that add clear value to SUT providers, end-users, and the community

  • DN: Not easily actionable for test case requirements

4) Agile development with master and stable releases

  • Fast rollout with a minimum viable baseline, and continuous agile extension for new tests
  • Support SUT assessment against master and stable releases.
  • DN: This is a process issue, not a test case requirement

5) Establish and maintain a roadmap for the program

...

  • well-established features, i.e. features that have become broadly supported by current OPNFV distros)
  • DN: This is a process issue, not a requirement for an individual test case

6) Move tests upstream ASAP

...

  • One more point: upstream testing is sometimes developer-oriented and not always focused on compliance. In such cases, we need a way of either enhancing the upstream test suite or coming up with a methodology in Dovetail.
  • DN: Requirement added to underline that upstream test suites are a baseline, but I do not think that we can specify in test case requirements that the test has been proposed to the relevant upstream