Linked Knowledge Nuggets: "Systems Engineering - Why the heck do we need this?"
Author: Alexander Feulner
Increasing complexity, shortened development times, distributed development, heightened cost pressure, technological change, and organizational transformation are just a few of the challenges facing development organizations today.
If these statements seem unfamiliar to you, we congratulate you warmly. For all those who are confronted with these challenges on a daily basis, we present the methodology of ‘Systems Engineering’ in our webinar and answer the question: "Why the heck do we need this?". This webinar is tailored not only to system developers but also to professionals in various domains, including software development, hardware development, testing, project management, and more.
Webinar recording and slides
"Test Management"
Author: Process Fellows
Test Management ensures that testing activities are strategically planned, monitored, and evaluated across all development phases. From unit tests to system-level integration, this cross-cutting discipline defines methods, tools, documents, and roles to ensure traceable and efficient verification and validation.
PF_Testmanagement_Extract.pdf: Short Overview of Test Management (related to all Automotive SPICE® verification processes)
"Verification level vs. verification timepoint"
Author: Process Fellows
The execution of a verification measure is not necessarily tied to the verification timepoint.
A verification measure from SWE.6 may, for example, be carried out as part of SW component and integration verification if that setup or environment is better suited to it.
It nevertheless remains a verification of the SW requirements.
The decisive factor is what the verification measure is derived from.
However, it is important to ensure that this verification measure is included in the report and in the summary of the SW verification.
Pay attention to the sequences and dependencies to be followed.
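The distinction above can be sketched in code: each verification measure records both the level it is derived from and the environment it is actually executed in, and reporting groups by derivation level. All identifiers and field names below are illustrative assumptions, not part of any tool or standard API.

```python
from dataclasses import dataclass

@dataclass
class VerificationMeasure:
    name: str
    derived_from: str   # level the measure is derived from, e.g. "SWE.6"
    executed_in: str    # environment/timepoint it was run in, e.g. "SWE.5"
    verdict: str        # "pass" / "fail"

measures = [
    VerificationMeasure("TC-001", "SWE.6", "SWE.6", "pass"),
    # Executed early, in the component/integration environment,
    # but still derived from (and reported against) the SW requirements:
    VerificationMeasure("TC-002", "SWE.6", "SWE.5", "pass"),
]

def report(level: str, measures):
    """Summarize by derivation level, not by execution timepoint."""
    return [m.name for m in measures if m.derived_from == level]

print(report("SWE.6", measures))  # both measures belong to the SWE.6 summary
```

The point of the sketch: `TC-002` appears in the SWE.6 report even though it was executed in a different environment, because reporting keys on `derived_from`.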
# PROCESS PURPOSE
The purpose is to integrate the system and to ensure that the integrated system is consistent with its provisions and compliant with the system requirements.
# PROCESS OUTCOMES
O1
Verification measures for system integration and for system verification are specified.
Note: A verification measure can be a test case, a measurement, a calculation, a simulation, a review, or an analysis. In particular domains certain verification measures may not be applicable, e.g., software units generally cannot be verified by means of calculations or analyses.
Note: Verification is confirmation, through the provision of objective evidence, that an element fulfils the specified requirements.
O2
System elements are integrated up to a complete integrated system, the integration is verified with specified verification measures, and the verification results are recorded.
Note: System elements can be logical and structural objects at the architectural and design level, physical representations of these objects, or a combination, e.g., peripherals, sensors, actuators, mechanical parts, software executables. System elements can be further decomposed into more fine-grained system elements across appropriate hierarchical levels.
O3
The integrated system is verified with specified verification measures, and the results of system verification are recorded.
O4
Horizontal traceability is established on all levels
# BASE PRACTICES
BP1
Specify and perform verification measures for integration. (O1, O2)
Specify and perform verification measures for the integration and record the verification results including pass/fail status. Perform integration of the system elements until the system is fully integrated.
Note 1: Examples of preconditions for starting integration can be successful system element verification, or the qualification of pre-existing system elements.
Linked Knowledge Nuggets: "Archiving test results"
Author: Process Fellows
Don’t lose your evidence. From a testing perspective, SUP.8.BP1 in combination with SUP.8.BP8 expects structured storage of test logs, verdicts, anomalies, and configuration information. This is not just a formality: it enables you to reproduce details about a specific system version later on.
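A minimal sketch of such a structured archive record, assuming JSON storage; every field name and identifier below is invented for illustration, not prescribed by SUP.8.

```python
import hashlib
import json

# One test result record, bundling verdict, anomalies, and the
# configuration it was produced against (all values hypothetical):
result = {
    "test_case": "SYS-TC-042",
    "verdict": "fail",
    "anomalies": ["DEF-1203"],
    "executed_at": "2024-05-17T09:30:00Z",
    "configuration": {
        "software_baseline": "SW_R3.1.0",
        "hardware_sample": "B2",
        "test_bench": "HIL-04",
    },
}

record = json.dumps(result, sort_keys=True)
# A content hash makes later mix-ups or silent edits detectable.
digest = hashlib.sha256(record.encode()).hexdigest()
# Naming the file after test case + baseline keeps versions reproducible.
archive_name = (
    f"{result['test_case']}_{result['configuration']['software_baseline']}.json"
)
print(archive_name)  # SYS-TC-042_SW_R3.1.0.json
```

Keying the archive by configuration baseline is what later lets you answer "what exactly was tested on version X?".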
BP2
Specify and perform system verification measures for the system. (O1, O3)
Specify and perform the verification measures suitable to provide evidence of compliance of the integrated system with the system requirements. Record the verification results including pass/fail status.
BP3
Establish horizontal traceability. (O4)
Ensure horizontal traceability from system requirements and system architecture to the corresponding verification measures and results.
Note 2: Horizontal traceability supports consistency, impact analysis, and verification coverage demonstration for a respective V-model level.
Linked Knowledge Nuggets: "Consistency vs. Traceability – What’s the Difference?"
Author: Process Fellows
Consistency ensures that related content doesn’t contradict itself – e.g., requirements align with architecture and test. Traceability, in contrast, is about links: can you follow a requirement through to implementation and verification? Both are needed – consistency builds trust, traceability enables control. Typically, traceability strongly supports consistency review.
"The true benefit of traceability"
Author: Process Fellows
Creating traceability is sometimes seen as an additional expense whose benefits go unrecognized.
Traceability should be set up at the same time as the derived elements are created: both work products are open in front of us, and creating the trace often takes only a few moments.
Added after the fact, the effort increases noticeably and the risk of gaps is high.
If traceability is complete and consistent, discovering dependencies is unbeatably fast and reliable compared to searching for them at a later stage, possibly under time pressure.
It also enables proof of complete coverage of the derived elements and allows a complete consistency check.
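A toy sketch of what complete traceability buys you: with requirement-to-measure links in hand, coverage gaps and orphaned measures fall out of a few lines of set logic. The identifiers and the dictionary layout are assumptions for illustration only.

```python
# Hypothetical traceability data: requirement -> linked verification measures
traces = {
    "SYS-REQ-001": ["SYS-TC-001", "SYS-TC-002"],
    "SYS-REQ-002": ["SYS-TC-003"],
    "SYS-REQ-003": [],  # gap: no verification measure derived yet
}
all_measures = ["SYS-TC-001", "SYS-TC-002", "SYS-TC-003", "SYS-TC-099"]

def coverage_gaps(traces):
    """Requirements without any linked verification measure (coverage proof)."""
    return sorted(req for req, tcs in traces.items() if not tcs)

def orphan_measures(traces, all_measures):
    """Measures not traced back to any requirement (consistency check)."""
    linked = {tc for tcs in traces.values() for tc in tcs}
    return sorted(set(all_measures) - linked)

print(coverage_gaps(traces))                  # ['SYS-REQ-003']
print(orphan_measures(traces, all_measures))  # ['SYS-TC-099']
```

Both checks are instantaneous once the links exist; reconstructing the same answers without traceability means searching every work product by hand.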
# OUTPUT INFORMATION ITEMS
13-51
Consistency Evidence (O4)
Demonstrates bidirectional traceability between artifacts or information in artifacts, throughout all phases of the life cycle, by e.g.,
tool links
hyperlinks
editorial references
naming conventions
Evidence that the content of the referenced or mapped information coheres semantically along the traceability chain, e.g., by
performing pair working or group work
performing checks by peers, e.g., spot checks
maintaining revision histories in documents
providing change commenting (via e.g., meta-information) of database or repository entries
Note: This evidence can be accompanied by e.g., Definition of Done (DoD) approaches.
Used by these processes:
CSGE Cybersecurity Goal Elicitation
SWIV Software Integration and Verification
SYIV System Integration and Verification
11-06
Integrated System (O2)
Integrated product
Application parameter files (being a technical implementation solution for configurability-oriented requirements)
Note: An application parameter is a software variable containing data that can be changed at the system or software level, influencing the system’s or software’s behavior and properties. It is expressed both as a specification (variable name, domain value range, technical data type, default value, physical unit if applicable, and the corresponding memory map) and as the actual quantitative value it receives by means of data application. Application parameters are not requirements.
All configured elements for the product release (a release being a physical product delivered to a customer, including a defined set of functionalities and properties) are included
Used by these processes:
SYIV System Integration and Verification
08-60
Verification Measure (O1)
A verification measure can be a test case, a measurement (“the activity to find the size, quantity or degree of something”), a calculation, a simulation, a review, an optical inspection, or an analysis
The specification of a verification measure includes
pass/fail criteria (test completion and ending criteria)
a definition of entry and exit criteria for the verification measure, and abort and re-start criteria
Techniques (e.g., black-box and/or white-box testing, equivalence classes and boundary values, fault injection for Functional Safety, penetration testing for Cybersecurity, back-to-back testing for model-based development, ICT)
Necessary verification environment and infrastructure
Necessary sequence or ordering
Used by these processes:
CSVV Cybersecurity Verification and Validation
SWIV Software Integration and Verification
SYIV System Integration and Verification
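The specification fields listed above could be captured in a simple record type; the class and field names below are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class VerificationMeasureSpec:
    identifier: str
    kind: str                         # test case, measurement, calculation, ...
    pass_fail_criteria: str           # test completion and ending criteria
    entry_criteria: list = field(default_factory=list)
    exit_criteria: list = field(default_factory=list)
    abort_restart_criteria: str = ""
    technique: str = ""               # e.g. boundary values, fault injection
    environment: str = ""             # e.g. HIL bench, vehicle, simulation
    depends_on: list = field(default_factory=list)  # required ordering

# Example instance with invented values:
spec = VerificationMeasureSpec(
    identifier="SYS-TC-007",
    kind="test case",
    pass_fail_criteria="measured latency <= 50 ms",
    entry_criteria=["integrated system build available"],
    technique="boundary values",
    environment="HIL bench",
    depends_on=["SYS-TC-006"],  # must run after SYS-TC-006
)
print(spec.identifier, spec.depends_on)
```

Holding ordering (`depends_on`) and entry/exit criteria in the spec itself is what lets a test manager schedule execution mechanically rather than by tribal knowledge.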
08-58
Verification Measure Selection Set (O2, O3)
Include criteria for re-verification in the case of changes (regression).
Identification of verification measures, also for regression testing
Used by these processes:
CSVV Cybersecurity Verification and Validation
SWIV Software Integration and Verification
SYIV System Integration and Verification
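A regression criterion of the kind mentioned above can be sketched as set logic over a mapping from measures to the system elements they touch; the mapping and all names are hypothetical.

```python
# Hypothetical map: verification measure -> system elements it exercises
measure_touches = {
    "SYS-TC-001": {"BrakeController"},
    "SYS-TC-002": {"BrakeController", "CANGateway"},
    "SYS-TC-003": {"DoorModule"},
}

def regression_set(measure_touches, changed_elements):
    """Select every measure touching a changed element for re-verification."""
    return sorted(
        m for m, elems in measure_touches.items() if elems & changed_elements
    )

print(regression_set(measure_touches, {"CANGateway"}))       # ['SYS-TC-002']
print(regression_set(measure_touches, {"BrakeController"}))  # ['SYS-TC-001', 'SYS-TC-002']
```

The selection set is then the output of a rule, which makes the regression criterion itself reviewable instead of ad hoc.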
15-52
Verification Results (O2, O3)
Verification data and logs
Verification measures passed
Verification measures not passed
Verification measures not executed
Information about the test execution (date, tester name, etc.)
Abstraction or summary of verification results
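The passed / not-passed / not-executed breakdown above is straightforward to compute from raw result records; the verdict strings and identifiers below are illustrative assumptions.

```python
from collections import Counter

# Hypothetical raw verification results as they might appear in logs:
results = [
    ("SYS-TC-001", "passed"),
    ("SYS-TC-002", "failed"),
    ("SYS-TC-003", "passed"),
    ("SYS-TC-004", "not executed"),
]

# Tally verdicts for the summary/abstraction of verification results.
summary = Counter(verdict for _, verdict in results)
executed = summary["passed"] + summary["failed"]
print(dict(summary))
print(f"pass rate over executed measures: {summary['passed'] / executed:.0%}")
```

Keeping "not executed" as its own category (rather than folding it into failures) is what makes the summary honest about coverage, not just about quality.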