Efficient testing ensures traceability and verification requirements (Part 1)

Integrating automotive electronics hardware and software testing can make development more streamlined and less expensive. Traceability and verification are effectively contractual requirements for automotive electronics suppliers. As these demands become more frequent, vendors are increasingly aware that requirements-based testing is often a key element of software development success.


As a contractual deliverable, or more generally as a work product, the task of demonstrating traceability generates a Test Verification Matrix (TVM), a product that is laborious to produce and consumes valuable resources diverted from more productive activities.

The true importance of the TVM emerges only when teams try to maintain it through the testing, integration, and deployment phases of the project. When a defect occurs, the inherent shortcomings of the TVM, and of the manual processing it represents, are exposed as defects of their own. Most of these shortcomings trace back to requirements management: requirements validation, allocation, and correct implementation. In fact, records show that up to 70% of such defects are classified as requirements-management related!
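To make the idea concrete, here is a minimal sketch of the essential content of a TVM represented as data rather than as a hand-maintained document, so that gaps can be found automatically. The requirement and test-case IDs are illustrative, not from any real project.

```python
# Minimal traceability matrix: requirement IDs mapped to the test
# cases that verify them (IDs are illustrative placeholders).
tvm = {
    "REQ-001": ["TC-010", "TC-011"],
    "REQ-002": ["TC-012"],
    "REQ-003": [],            # no verifying test yet
}

def unverified(matrix):
    """Return requirements that no test case currently verifies."""
    return sorted(req for req, tests in matrix.items() if not tests)

print(unverified(tvm))  # → ['REQ-003']
```

Kept as a queryable structure, the matrix can be re-checked on every build instead of being reconciled by hand at delivery time.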

The next challenge is to produce a requirements traceability solution that works for development and test teams within their existing tools and program environments. Currently, most LDRA customers use database or flat-document tools to define and maintain system, or high-level, requirements.

Late mapping

Some customers map these high-level requirements to the top-level design; some even map them to the as-built design and source code. At a minimum, customers must map requirements to the test cases that verify those requirements. However, when users wait until test execution to establish requirements traceability, the likelihood of mapping errors is very high, especially in system testing.

The reason for this late requirements mapping is operational constraints: the requirements database sits in the project management office, while the test environment sits on development engineers' workstations, on target systems in the lab, or at a remote subcontractor performing the tests. At a minimum, these operational constraints dictate that some level of integration between the requirements database and the test environment is required before an automated solution can be introduced.

A more efficient approach is to map requirements at least to the as-built design and, in detail, to the embedded source code. Mapping against the built system is part of the test-readiness (test preparation) process. The test preparer determines the appropriate relationships between requirements and code; one by-product of this check is the elimination of obsolete (unused) code from the source. It can also be argued that code which does not work, or which cannot execute under any combination of test data, should be corrected or removed before testing begins.
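One simple way to map requirements to source code, sketched below under a hypothetical convention (a "# Satisfies: REQ-nnn" comment before each function; the file contents and names are invented for illustration), is to scan the source and flag functions with no requirement tag, which are candidates for obsolete code or missing traceability:

```python
import re

# Hypothetical tagging convention: each function is preceded by a
# "# Satisfies: REQ-nnn" comment linking it to a requirement.
SOURCE = '''\
# Satisfies: REQ-001
def read_sensor():
    pass

def legacy_filter():
    pass
'''

def untagged_functions(text):
    """List functions not preceded by a requirement tag."""
    untagged, tagged = [], False
    for line in text.splitlines():
        if re.match(r"#\s*Satisfies:", line):
            tagged = True
        m = re.match(r"def\s+(\w+)", line)
        if m:
            if not tagged:
                untagged.append(m.group(1))
            tagged = False   # each tag covers only the next function
    return untagged

print(untagged_functions(SOURCE))  # → ['legacy_filter']
```

A report like this gives the test preparer a starting list for the "eliminate obsolete code" check described above.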

The best traceability solution begins with mapping system requirements to the highest-level design, which works well when a design modeling tool is used (this option is described in the LDRA white paper "LDRA Tool Suite/Telelogic I-logix Rhapsody Integration").

Prototyping

Existing low-level and derived requirements force traceability further into the as-built design. The development team defines these requirements while detailing the system requirements (or prototyping) and defining a workable, testable system configuration. This evolutionary model is most prominent in embedded software development, where target constraints and hardware requirements must also be considered.

The prevalence and context of low-level requirements present another major challenge for requirements traceability. These requirements do not address system or customer needs directly; they address "how" the software works, whereas customer requirements define what the system should "do". As a result, low-level and derived requirements are often disconnected from system requirements, which raises yet another data management requirement.

A key aspect of managing, tracking, and verifying low-level requirements is how these requirements are divided between development engineers and test engineers. Development engineers must fully understand the interface specifications of the code they will implement and of the routines that code will call. These specifications must be explicitly linked to the relevant high-level requirements so that development engineers correctly understand the context of the implementation. With the right information, development engineers can design for testability and plan for features that must be exercised at multiple test levels.

Critical software has many applications in the automotive industry and in other commercial and government sectors worldwide, spanning safety-critical, mission-critical, and business-critical systems. Below is a list of commonly used applications of this kind.

[Figure: examples of safety-, mission-, and business-critical applications]

If "consumer-critical" applications are considered as well, the range is even wider, including ATMs and game consoles (especially when users are spending their own money). Most of these applications are developed for industries and government organizations that define and publish their own software development and testing standards. The following are representative of such standards:

MISRA: Development Guidelines for Vehicle Based Software, Section 3.6, "Testing"

IEEE 1012: Standard for Software Verification and Validation

IEEE 829: Standard for Software Test Documentation

IEC 61508: Functional safety of electrical/electronic/programmable safety related systems

FDA: General Principles of Software Validation, Section 5.2.5, "Testing by the Software Developer"

EN 50128: Railway applications, "Software for railway control and protection systems"

RTCA DO-178B: Software Considerations in Airborne Systems and Equipment Certification, Section 6.x, "Software Verification Process"

Def Stan 00-55: Requirements for Safety Related Software in Defence Equipment (Part 2), Section 5, "Testing and Integration"

Common to all these standards is the execution of requirements-based tests. The most notable among them is the airborne-systems standard, DO-178B. This standard defines two primary requirements-based test activities: functional, or black-box, testing (below), and structural coverage, or white-box, testing.

[Figure: functional (black-box) testing, with inputs and expected outputs derived from the requirements]

Functional testing requires the development engineer or test engineer to use the software requirements to determine the behavior of the code under test. More specifically, the engineer must define the inputs and conditions, together with the outputs and expected results, in order to develop a test specification. That specification may take the form of one or more test cases that fully exercise the requirement.
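As a minimal sketch, suppose a hypothetical requirement REQ-SPD-01 (invented here for illustration) states: "The overspeed warning shall be active when vehicle speed exceeds 120 km/h." A requirements-based test specification derives inputs and expected results from that text alone, not from the code's internals:

```python
def overspeed_warning(speed_kmh):
    """Unit under test: a trivial stand-in for the real logic."""
    return speed_kmh > 120

# Test specification derived from REQ-SPD-01 (hypothetical requirement):
# inputs chosen around the requirement's boundary, expected results
# taken from the requirement text ("exceeds" = strictly greater).
test_cases = [
    (119, False),   # below threshold
    (120, False),   # at threshold
    (121, True),    # just above threshold
]

for speed, expected in test_cases:
    assert overspeed_warning(speed) == expected, speed
print("REQ-SPD-01: all test cases pass")
```

Note that the boundary cases (120 vs. 121) come straight from the wording of the requirement; this is what keeps a black-box test traceable to it.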

Structural coverage, or white-box testing, helps verify the completeness of black-box testing. Structural testing also helps determine the correctness of the as-built design; for example, if all required software functions have been exercised but uncovered code remains, the extra code is a problem, as is the predictability of its execution time.
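The idea behind structural coverage can be sketched in a few lines using Python's sys.settrace hook (real coverage tools instrument object code or use compiler support; the function and inputs below are invented for illustration). The functional tests exercise only one branch, and the trace reveals the line that never ran:

```python
import sys

def classify(temp):
    if temp > 100:
        return "over"      # never reached by the tests below
    return "normal"

executed = set()

def tracer(frame, event, arg):
    # Record executed line offsets within classify() only.
    if event == "line" and frame.f_code is classify.__code__:
        executed.add(frame.f_lineno - frame.f_code.co_firstlineno)
    return tracer

sys.settrace(tracer)
classify(20)   # "functional tests" exercise only the normal path
classify(50)
sys.settrace(None)

# Offset 2 (the 'return "over"' line) is absent: uncovered code.
print(sorted(executed))  # → [1, 3]
```

Uncovered code found this way is exactly the signal the paragraph above describes: either the test set is incomplete, or the extra code should not be in the build.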

Part 2 of this article will discuss the role of the Capability Maturity Model Integration (CMMI) standard in improving the software development process, leading to tools that map test information to requirements.
