Virtual testing with model-based design

by Brett Murphy, The MathWorks, TechOnline India - August 17, 2009

Virtual testing, based on system simulation and Model-Based Design, takes the traditional "test-at-the-end" system development process (represented in the V diagram) and makes it more iterative. Virtual testing shortens both the design cycle and the final physical testing phase.

The Problems: System Complexity and Late-Stage Error Detection
Complexity in embedded software development is driving the cost of test and verification to as much as 70% of overall development costs. Industrial automation, automotive, and aerospace engineers conduct exhaustive design and code reviews and build increasingly complex test systems to confirm that the software in embedded processors meets high-integrity standards and design requirements. And as verification consumes more time, engineers have fewer opportunities to innovate and create product differentiation through design optimization.

Many organizations are finding that most errors they uncover in test and integration were introduced at the beginning of the design process while interpreting system requirements. Engineers now face the challenge of checking for errors and "cleaning them out" closer to the beginning of the development process, when they are cheaper to fix.

Finally, as development teams grow and become geographically dispersed, they are seeking better ways to collaborate.

The Solution: Test Early
Embedded software errors can be cut substantially by doing more complete design verification at or near the beginning of the design process via systematic system simulation, or virtual testing. "Virtual" in this case denotes simulation, but with no hardware involved—just software and simulation engines. "Systematic" means building tests based on requirements and then executing those tests against the system design.

Two critical concepts drive virtual testing process improvements:
  • An executable system specification
  • Requirements-based tests
The executable system specification is a model that you can simulate. It includes your system design as well as environment models: models of the important elements of the physical world that your embedded system interacts with. The system model needs to include subsystem domains such as controls, mechanical components, electrical components, and hydraulics. These models should be as detailed as needed to capture their system-level functional behavior.
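
To make the first concept concrete, here is a minimal sketch, in plain Python, of what an executable specification can look like: a hypothetical proportional-integral (PI) speed controller closed around a first-order motor model standing in for the physical environment. The dynamics, gains, and names are illustrative assumptions rather than any particular product design, and in practice a team would build such a model in a dedicated modeling and simulation environment rather than by hand.

    # Sketch of an executable specification: a control design plus an
    # environment (plant) model in one simulatable description. All values
    # below are illustrative assumptions.

    def plant_step(speed, voltage, dt, gain=2.0, tau=0.5):
        # First-order motor model: d(speed)/dt = (gain * voltage - speed) / tau
        return speed + dt * (gain * voltage - speed) / tau

    def simulate(setpoint, duration=5.0, dt=0.01, kp=1.5, ki=2.0, gain=2.0, tau=0.5):
        # Close the loop: a PI controller drives the plant toward the setpoint.
        speed, integral, trace = 0.0, 0.0, []
        for _ in range(int(duration / dt)):
            error = setpoint - speed
            integral += error * dt
            voltage = kp * error + ki * integral   # PI control law
            speed = plant_step(speed, voltage, dt, gain=gain, tau=tau)
            trace.append(speed)
        return trace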

Requirements-based tests are formulated from functional requirements that can be expressed as tests; essentially, "Given a particular input, here is the output I expect." At a minimum, you need to have a simulation input signal or sequence for each test, as well as output captured from the simulation, to compare with the expected output. You should also build a complete set of tests that fully exercise the requirements.
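
Continuing the sketch above, a requirements-based test could look like the following. The requirement wording, the 2% tolerance, and the 3-second deadline are invented for illustration; the pattern is what matters: apply a defined input, capture the simulated output, and compare it with the expected behavior.

    # Example requirement (hypothetical): "Given a step setpoint of 100, the
    # output shall settle within 2% of the setpoint within 3 seconds and
    # remain there." Reuses simulate() from the sketch above.

    def test_step_response_settles():
        dt = 0.01
        trace = simulate(setpoint=100.0, duration=5.0, dt=dt)
        after_deadline = trace[int(3.0 / dt):]     # samples after the 3 s deadline
        worst_error = max(abs(s - 100.0) for s in after_deadline)
        assert worst_error <= 2.0, f"settling requirement violated by {worst_error:.2f}"

    test_step_response_settles()
    print("step-response requirement satisfied in simulation")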

To understand how this works, consider the classic design process V diagram (Figure 1). With virtual testing, you would follow a second V early in the process, starting partway down the left leg and then up to the right. The design flow moves to a virtual test rather than down the left leg to the original apex (the point of implementation) and then up the right leg (the hardware side) to physical testing.


Figure 1: The V diagram, which illustrates the traditional embedded system development process. The right side represents physical test and verification, which occur late in the process when problems are more costly to fix.

Figure 1 illustrates a traditional design flow, in which engineering teams move from written requirements to design, manual coding and, finally, system integration and test. Ensuring requirements are valid at the start is a challenge, verifying the design against the requirements is difficult, and implementing the code is a manual, error-prone process.
Model-Based Design enables verification as a parallel activity that occurs throughout the development process, as shown in Figure 2. Virtual testing, indicated with green arrows, enables rapid iterations between requirements and the design. The ability to test and verify at any step of the development process means you can find errors at the point at which they are introduced. The team can thus iterate, fix, and verify the design and implementation faster.


Figure 2: Model-Based Design methods for virtual test (green arrows). The model includes multiple feedback loops in the beginning of the design process (left leg of V diagram) for earlier verification and validation to find errors sooner when they cost less to remedy.

Model-Based Design
As embedded software grows in functionality and complexity, engineering teams are moving beyond traditional code development techniques based on text editors and debuggers. They're shifting to a design process centered on models, using modeling, simulation, and code generation tools on the desktop. This model-centric development approach is called Model-Based Design.

Using models and simulation at the core of the development process provides insight into the dynamics and algorithmic aspects of the system. In addition, the models can be used to generate code automatically, to develop and reuse tests throughout the process, and to communicate designs across geographically dispersed teams.

Embedded system developers in industrial control, automotive, and aerospace are adopting Model-Based Design to improve their development processes and manage costs, while maintaining quality. They use models and simulation to increase efficiency with complex system designs, and automatic code generation from models to streamline implementation.

Designing with Models
While simulation alone will not find every error, it is a huge step forward, and it can begin almost as soon as you create a model. Iterating in a modeling environment is fast and easy.

A whole-system model allows the development team to check whether the design will work, whether the requirements make sense, and whether the design meets those requirements. In a traditional process, such results were not available until the team reached the right side of the V and tested physical components and systems.

Modeling individual components is incredibly useful and may be necessary to complete a complex design, but it's helpful to first model the system or environment your component will operate in. By modeling the whole system in a single environment, you can quickly see how the functionality of your component will interact with other components and how the integrated components will behave in the deployed system or environment. You can also find missing requirements for your individual component or others. With a system model to return to as you iterate one component, you can assess how design iterations will affect system functionality.
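
As a rough illustration of that workflow, the sketch below reuses the system model and requirements check from the earlier examples while only one component, the hypothetical controller, changes between design iterations; the gain sets are arbitrary stand-ins for successive designs.

    # Iterate one component (the controller gains) against the fixed system
    # model and re-evaluate the same system-level requirement each time.
    candidate_gains = [(0.5, 0.5), (1.5, 2.0), (4.0, 6.0)]   # hypothetical (kp, ki) iterations

    dt = 0.01
    for kp, ki in candidate_gains:
        trace = simulate(setpoint=100.0, duration=5.0, dt=dt, kp=kp, ki=ki)
        worst_error = max(abs(s - 100.0) for s in trace[int(3.0 / dt):])
        verdict = "meets" if worst_error <= 2.0 else "fails"
        print(f"kp={kp}, ki={ki}: worst error after 3 s = {worst_error:.2f} ({verdict} requirement)")

In this toy run, the low-gain iteration misses the settling requirement while the other two meet it, which is exactly the kind of system-level feedback a component-only check would not provide.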

By automatically generating code from the model, you can test design trade-offs and iterate faster than with hand coding.

Testing Models
Developing tests in parallel to the design process enables early detection of potential problems and significantly reduces the cost and time for fixing them. By thinking about testing while developing the model, you can design better for "testability," thereby ensuring the design can be fully tested.

Almost every testing scenario involves variability, such as in inputs, plant parameters, and environmental factors. Time and expense often limit how much variability you can incorporate. However, by testing in a simulation environment, you can simulate different test cases much faster, even in parallel if the processing power is available. In addition, you can safely investigate system limits and "destructive" tests in a virtual environment. Simulation lets you test conditions that aren't feasible or practical on actual hardware. Exploring the entire parameter space in simulation can also narrow down the critical tests that must run in real time.
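
As a sketch of that idea, the loop below sweeps a small grid of hypothetical plant parameters (motor gain and time constant), reruns the same requirements check from the earlier examples for each combination, and collects the combinations that fail; those are natural candidates for the critical tests that must still run in real time. Each grid point is an independent simulation, so the sweep could also be distributed across a process pool when processing power is available.

    from itertools import product

    def meets_requirement(gain, tau, dt=0.01):
        # Rerun the settling requirement for one plant-parameter combination.
        trace = simulate(setpoint=100.0, duration=5.0, dt=dt, gain=gain, tau=tau)
        return all(abs(s - 100.0) <= 2.0 for s in trace[int(3.0 / dt):])

    # Hypothetical tolerance ranges for the motor gain and time constant.
    failures = [(g, t)
                for g, t in product([1.0, 2.0, 3.0], [0.25, 0.5, 1.0])
                if not meets_requirement(g, t)]
    print("parameter combinations to prioritize for real-time testing:", failures)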

Conclusion
One of the primary benefits of Model-Based Design is the opportunity to conduct verification and validation in parallel with all other development steps, especially early in the development process. Developing tests along with models, and reusing model tests on code and hardware, significantly reduces the risk of uncovering errors late in the process and jeopardizing quality or delivery goals.

About the Author
Brett Murphy is the manager of product marketing for verification, validation, and test at The MathWorks.
