SWE.4 Software Unit Verification
Process ID SWE.4
Process Name Software Unit Verification
Process Purpose The purpose of the Software Unit Verification Process is to verify software units to provide evidence for compliance of the software units with the software detailed design and with the non-functional software requirements.
Process Outcomes

As a result of successful implementation of this process:

  1. a software unit verification strategy including regression strategy is developed to verify the software units;

  2. criteria for software unit verification are developed according to the software unit verification strategy that are suitable to provide evidence for compliance of the software units with the software detailed design and with the non-functional software requirements;

  3. software units are verified according to the software unit verification strategy and the defined criteria for software unit verification and the results are recorded;

  4. consistency and bidirectional traceability are established between software units, criteria for verification and verification results; and

  5. results of the unit verification are summarized and communicated to all affected parties.

Base Practices

SWE.4.BP1: Develop software unit verification strategy including regression strategy. Develop a strategy for verification of the software units including regression strategy for re-verification if a software unit is changed. The verification strategy shall define how to provide evidence for compliance of the software units with the software detailed design and with the non-functional requirements. [Outcome 1]

NOTE 1: Possible techniques for unit verification include static/dynamic analysis, code reviews, unit testing, etc.

SWE.4.BP2: Develop criteria for unit verification. Develop criteria for unit verification that are suitable to provide evidence for compliance of the software units, and their interactions within the component, with the software detailed design and with the non-functional requirements according to the verification strategy. For unit testing, criteria shall be defined in a unit test specification. [Outcome 2]

NOTE 2: Possible criteria for unit verification include unit test cases, unit test data, static verification, coverage goals and coding standards such as the MISRA rules.

NOTE 3: The unit test specification may be implemented e.g. as a script in an automated test bench.
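To illustrate NOTE 3, a unit test specification could be realized as a script in which each test case is a data record linked to the detailed design. The following sketch is illustrative only: the unit under test (saturate) and the design reference IDs (DD-*) are hypothetical examples, not part of SWE.4.

```python
# Illustrative sketch: a unit test specification as a script (per NOTE 3).
# The unit and all IDs are hypothetical.

def saturate(value, lower, upper):
    """Example software unit: clamp value into [lower, upper]."""
    return max(lower, min(value, upper))

# Unit test cases as data; each case traces to a detailed-design item.
UNIT_TEST_SPEC = [
    {"id": "UT-001", "design_ref": "DD-017", "args": (5, 0, 10), "expected": 5},
    {"id": "UT-002", "design_ref": "DD-017", "args": (-3, 0, 10), "expected": 0},
    {"id": "UT-003", "design_ref": "DD-017", "args": (42, 0, 10), "expected": 10},
]

def run_spec(spec):
    """Execute the specification and return a pass/fail verdict per test case."""
    return {tc["id"]: saturate(*tc["args"]) == tc["expected"] for tc in spec}
```

Keeping test cases as data rather than hard-coded assertions makes it straightforward to record which design item each verdict traces to, which BP5 requires later.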

SWE.4.BP3: Perform static verification of software units. Verify software units for correctness using the defined criteria for verification. Record the results of the static verification. [Outcome 3]

NOTE 4: Static verification may include static analysis, code reviews, checks against coding standards and guidelines, and other techniques.

NOTE 5: See SUP.9 for handling of non-conformances.
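As one narrow illustration of the checks NOTE 4 mentions, a coding-guideline rule can be expressed as a pattern applied line by line, with each finding recorded against a rule ID. This is a sketch only; real projects use qualified static-analysis tools, and the rule IDs (RULE-1, RULE-2) are hypothetical.

```python
import re

# Illustrative sketch: a minimal static check against coding-rule patterns.
# Rule IDs are hypothetical; real checks use qualified tools (see NOTE 4).
RULES = [
    ("RULE-1", re.compile(r"\bgoto\b"), "use of 'goto' is prohibited"),
    ("RULE-2", re.compile(r".{81,}"), "line exceeds 80 characters"),
]

def static_check(source):
    """Return recorded findings as (line_no, rule_id, message) tuples."""
    findings = []
    for line_no, line in enumerate(source.splitlines(), start=1):
        for rule_id, pattern, message in RULES:
            if pattern.search(line):
                findings.append((line_no, rule_id, message))
    return findings
```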

SWE.4.BP4: Test software units. Test software units using the unit test specification according to the software unit verification strategy. Record the test results and logs. [Outcome 3]

NOTE 6: See SUP.9 for handling of non-conformances.

SWE.4.BP5: Establish bidirectional traceability. Establish bidirectional traceability between software units and static verification results. Establish bidirectional traceability between the software detailed design and the unit test specification. Establish bidirectional traceability between the unit test specification and unit test results. [Outcome 4]

NOTE 7: Bidirectional traceability supports coverage, consistency and impact analysis.
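The traceability links of BP5 can be kept as simple data, with the backward direction derived from the forward one so the two can never diverge. The sketch below assumes hypothetical design-item and test-case IDs (DD-*, UT-*) and shows how such links support the coverage analysis NOTE 7 mentions.

```python
# Illustrative sketch: bidirectional traceability links as data (per NOTE 7).
# All IDs are hypothetical.
design_to_tests = {
    "DD-017": ["UT-001", "UT-002"],
    "DD-018": ["UT-003"],
}

# Derive the backward direction from the forward links so they stay consistent.
test_to_design = {
    test_id: design_id
    for design_id, tests in design_to_tests.items()
    for test_id in tests
}

def uncovered(design_ids):
    """Coverage analysis: design items with no linked unit test case."""
    return [d for d in design_ids if not design_to_tests.get(d)]
```

An impact analysis works the same way in reverse: a changed design item is looked up in design_to_tests to select the test cases to re-run under the regression strategy.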

SWE.4.BP6: Ensure consistency. Ensure consistency between the software detailed design and the unit test specification. [Outcome 4]

NOTE 8: Consistency is supported by bidirectional traceability and can be demonstrated by review records.

SWE.4.BP7: Summarize and communicate results. Summarize the unit test results and static verification results and communicate them to all affected parties. [Outcome 5]

NOTE 9: Providing all necessary information from the test case execution in a summary enables other parties to judge the consequences.
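The summary of BP7 can be as simple as condensing per-test-case verdicts into totals that affected parties can act on. The result records below are hypothetical examples.

```python
# Illustrative sketch: summarizing verification results for communication
# to affected parties (Outcome 5). The verdict records are hypothetical.

def summarize(results):
    """Condense per-test-case verdicts into totals for a summary report."""
    total = len(results)
    passed = sum(1 for verdict in results.values() if verdict == "pass")
    return {"total": total, "passed": passed, "failed": total - passed}

report = summarize({"UT-001": "pass", "UT-002": "pass", "UT-003": "fail"})
# e.g. {'total': 3, 'passed': 2, 'failed': 1}
```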

Output Work Products

08-50 Test specification [Outcome 2]

08-52 Test plan [Outcome 1]

13-04 Communication record [Outcome 5]

13-19 Review record [Outcome 3, 4]

13-22 Traceability record [Outcome 4]

13-25 Verification results [Outcome 3, 5]

13-50 Test result [Outcome 3, 5]

15-01 Analysis report [Outcome 3]

NOTE: For software and system test documentation, the IEEE Standard 829-2008 might be used.

 

Process capability levels and process attributes

Process capability level 0: Incomplete process

Process capability level 1: Performed process

Process capability level 2: Managed process

Process capability level 3: Established process

Process capability level 4: Predictable process

Process capability level 5: Innovating process