SWE.5 Software Integration and Integration Test
Process Name: Software Integration and Integration Test
Process Purpose: The purpose of the Software Integration and Integration Test Process is to integrate the software units into larger software items up to a complete integrated software consistent with the software architectural design and to ensure that the software items are tested to provide evidence for compliance of the integrated software items with the software architectural design, including the interfaces between the software units and between the software items.
As a result of successful implementation of this process:
1) a software integration strategy consistent with the project plan and the release plan is developed to integrate the software items;
2) a software integration test strategy including a regression test strategy is developed to test the integrated software items;
3) a specification for software integration test according to the software integration test strategy is developed that is suitable to provide evidence for compliance of the integrated software items with the software architectural design;
4) software units and software items are integrated up to a complete integrated software according to the software integration strategy;
5) test cases included in the software integration test specification are selected according to the software integration test strategy and the release plan;
6) integrated software items are tested using the selected test cases and the results of software integration test are recorded;
7) consistency and bidirectional traceability are established between the elements of the software architectural design and the test cases included in the software integration test specification, and between test cases and test results; and
8) results of the software integration test are summarized and communicated to all affected parties.
Base Practices:
SWE.5.BP1: Develop software integration strategy. Develop a strategy for integrating software items consistent with the project plan and release plan. Identify software items based on the software architectural design and define a sequence for integrating them. [Outcome 1]
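One common way to define an integration sequence from the architectural design is a bottom-up ordering of the software items by their dependencies. A minimal sketch (the item names and dependency graph are illustrative assumptions, not taken from the standard):

```python
from graphlib import TopologicalSorter

# Illustrative dependency graph: each software item maps to the
# items it depends on, as identified in the architectural design.
dependencies = {
    "diagnostics": {"com_stack", "nvm"},
    "com_stack": {"os_services"},
    "nvm": {"os_services"},
    "os_services": set(),
}

# Bottom-up integration sequence: every item appears only after
# the items it depends on have already been integrated.
integration_sequence = list(TopologicalSorter(dependencies).static_order())
print(integration_sequence)
```

Other sequences (top-down, feature-driven) are equally valid; the strategy only has to be consistent with the project plan and release plan.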
SWE.5.BP2: Develop software integration test strategy including regression test strategy. Develop a strategy for testing the integrated software items following the integration strategy. This includes a regression test strategy for re-testing integrated software items if a software item is changed. [Outcome 2]
SWE.5.BP3: Develop specification for software integration test. Develop the test specification for software integration test including the test cases according to the software integration test strategy for each integrated software item. The test specification shall be suitable to provide evidence of compliance of the integrated software items with the software architectural design. [Outcome 3]
NOTE 1: Compliance to the architectural design means that the specified integration tests are suitable to prove that the interfaces between the software units and between the software items fulfill the specification given by the software architectural design.
NOTE 2: The software integration test cases may focus on
• the correct dataflow between software items
• the timeliness and timing dependencies of dataflow between software items
• the correct interpretation of data by all software items using an interface
• the dynamic interaction between software items
• the compliance to resource consumption objectives of interfaces
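A dataflow-focused integration test case from the first bullet can be sketched as follows. The sender/receiver items and the signal bus are hypothetical stand-ins for two integrated software items sharing an interface, not part of the standard:

```python
import time

# Hypothetical software items under integration: a sender publishes
# a signal onto a shared bus, a receiver consumes it.
class SpeedSender:
    def __init__(self, bus):
        self.bus = bus

    def publish(self, kmh):
        # Each message carries signal name, value, and a timestamp
        # (the timestamp would support timing-related test cases).
        self.bus.append(("vehicle_speed", kmh, time.monotonic()))

class SpeedReceiver:
    def __init__(self, bus):
        self.bus = bus

    def latest(self):
        name, value, ts = self.bus[-1]
        return value

# Integration test case: correct dataflow across the interface.
bus = []
sender, receiver = SpeedSender(bus), SpeedReceiver(bus)
sender.publish(50)
assert receiver.latest() == 50  # value crosses the interface unchanged
```

Timing-dependency cases from the second bullet would additionally assert on the recorded timestamps (e.g. maximum latency between publish and consumption).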
SWE.5.BP4: Integrate software units and software items. Integrate the software units into software items, and the software items into the integrated software, according to the software integration strategy. [Outcome 4]
SWE.5.BP5: Select test cases. Select test cases from the software integration test specification. The selection of test cases shall have sufficient coverage according to the software integration test strategy and the release plan. [Outcome 5]
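Test-case selection against a coverage criterion can be sketched as a filter over the test specification. The catalogue, interface names, and required-coverage set below are illustrative assumptions; in practice they come from the test specification, the test strategy, and the release plan:

```python
# Illustrative test-case catalogue: each case covers one interface
# of the software architectural design.
test_cases = {
    "TC-01": "com_stack<->diagnostics",
    "TC-02": "nvm<->diagnostics",
    "TC-03": "os_services<->com_stack",
    "TC-04": "com_stack<->diagnostics",
}

# Interfaces the integration test strategy and release plan require
# to be covered in this test run (assumed input).
required = {"com_stack<->diagnostics", "nvm<->diagnostics"}

selected = [tc for tc, iface in test_cases.items() if iface in required]
covered = {test_cases[tc] for tc in selected}
assert required <= covered  # selection has sufficient coverage
print(selected)
```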
SWE.5.BP6: Perform software integration test. Perform the software integration test using the selected test cases. Record the integration test results and logs. [Outcome 6]
NOTE 4: See SUP.9 for handling of non-conformances.
NOTE 5: The software integration test may be supported by using hardware debug interfaces or simulation environments (e.g. Software-in-the-Loop-Simulation).
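Recording results and logs during execution can be sketched as below. The `run_case` stub stands in for driving the integrated software on the test bench; case names and verdicts are illustrative:

```python
import datetime

def run_case(name):
    # Stand-in for executing one test case against the integrated
    # software (e.g. on a debug interface or in a SiL environment).
    return "passed"

test_log = []
results = {}
for case in ["TC-01", "TC-02"]:
    verdict = run_case(case)
    results[case] = verdict  # recorded test result per case
    test_log.append(f"{datetime.date.today()} {case}: {verdict}")

print(results)
```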
SWE.5.BP7: Establish bidirectional traceability. Establish bidirectional traceability between elements of the software architectural design and test cases included in the software integration test specification. Establish bidirectional traceability between test cases included in the software integration test specification and software integration test results. [Outcome 7]
NOTE 6: Bidirectional traceability supports coverage, consistency and impact analysis.
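A tooling sketch of the two traceability checks: architecture elements forward to test cases, and test cases back to elements and recorded results. The element and case identifiers are illustrative assumptions:

```python
# Illustrative traceability data: architecture elements linked to
# test cases, and test cases linked to recorded results.
arch_to_tests = {
    "IF-ComStack-Diag": {"TC-01", "TC-04"},
    "IF-Nvm-Diag": {"TC-02"},
}
tests_to_results = {"TC-01": "passed", "TC-02": "failed", "TC-04": "passed"}

# Forward direction: every architecture element traces to at least
# one test case in the integration test specification.
untested_elements = [e for e, tcs in arch_to_tests.items() if not tcs]

# Backward direction: every test case traces back to an element and
# has a recorded test result.
all_linked = set().union(*arch_to_tests.values())
orphan_results = set(tests_to_results) - all_linked
missing_results = all_linked - set(tests_to_results)

assert not untested_elements and not orphan_results and not missing_results
```

Any non-empty set here is a coverage or consistency gap for the impact analysis.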
SWE.5.BP8: Ensure consistency. Ensure consistency between elements of the software architectural design and test cases included in the software integration test specification. [Outcome 7]
NOTE 7: Consistency is supported by bidirectional traceability and can be demonstrated by review records.
SWE.5.BP9: Summarize and communicate results. Summarize the software integration test results and communicate them to all affected parties. [Outcome 8]
NOTE 8: Providing all necessary information from the test case execution in a summary enables other parties to judge the consequences.
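A summary that lets affected parties judge the consequences can be as simple as verdict totals plus the failing cases. A minimal sketch over assumed recorded results:

```python
from collections import Counter

# Illustrative recorded results from the integration test run.
results = {"TC-01": "passed", "TC-02": "failed", "TC-04": "passed"}

# Totals per verdict plus the failing cases: the information other
# parties need to judge the consequences of this run.
summary = dict(Counter(results.values()))
failed = sorted(tc for tc, v in results.items() if v == "failed")
report = f"Integration test run: {summary}, failing cases: {failed}"
print(report)
```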
Output Work Products:
01-03 Software item [Outcome 4]
01-50 Integrated software [Outcome 4]
08-50 Test specification [Outcome 3, 5]
08-52 Test plan [Outcome 1, 2]
13-04 Communication record [Outcome 8]
13-19 Review record [Outcome 7]
13-22 Traceability record [Outcome 7]
13-50 Test result [Outcome 6, 8]
17-02 Build List [Outcome 4, 7]
Process capability levels and process attributes
Process capability Level 1: Performed process
Process capability Level 2: Managed process
Process capability Level 3: Established process
Process capability Level 4: Predictable process
Process capability Level 5: Innovating process