Early verification cuts design time and cost in algorithm-intensive systems

by Ken Karnofsky, senior strategist for signal processing applications, The MathWorks, TechOnline India - January 22, 2010

In this article, The MathWorks explains three practical approaches to early verification that reduce verification time and improve design performance.

Verification of algorithm-intensive systems is a long, costly process. Studies show that the majority of flaws in embedded systems are introduced at the specification stage but are not detected until late in the development process. These flaws are the dominant cause of project delays and a major contributor to engineering costs. For algorithm-intensive systems, including those with communications, audio, video, imaging, and navigation functions, these delays and costs are growing rapidly as system complexity increases.

It doesn't have to be this way. Many designers of algorithm-intensive systems already have the tools they need to get verification under control. Engineers can use these same tools to build system models that help them find and correct problems earlier in the development process. This not only reduces verification time but also improves the performance of their designs. In this article, we'll explain three practical approaches to early verification that make this possible.

First, let's examine why the current algorithm verification process is inefficient and error-prone. In a typical workflow, designs start with algorithm developers, who pass the design to hardware and software teams using specification documents (Figure 1).

Figure 1. In a typical flow, designs start with algorithm developers, who pass the design to implementation teams using static specification documents.

Each team typically crafts its own test procedures to determine that the implementation is functionally correct. These test procedures are often constructed in an ad-hoc fashion, and they rely on the engineer's interpretation of the specifications. This is a problem because hardware and software engineers often lack the application domain knowledge or tools to correctly interpret and implement the specifications. Conversely, algorithm developers may lack the tools and expertise to ensure that they have identified all of the "real-world" requirements. They often discover late in the development process that algorithms don't work as intended in the target environment.

Compounding this inefficiency is the use of separate tools and workflows for software, digital, and RF/analog hardware components, which inhibits cross-domain verification of system behavior. This can lead to unexpected hardware and software interactions. As a result, system verification does not occur until the end of the workflow, at the system integration phase, when design changes are most difficult and expensive to make.

Because most errors are introduced at the specification stage, conventional hardware, software, and ESL tools cannot solve these problems: they assume that the requirements have been adequately and accurately captured in the specification. The verification problem must instead be tackled at the beginning of the process, by connecting early-stage tools with the downstream workflows.

Early verification with Model-Based Design
Model-Based Design offers a better approach. It provides a set of tools for algorithm design, system simulation and prototyping, and rigorous analysis. Using these tools, the algorithms and tests are designed as part of a system model. This system model serves as the basis of an executable specification that all design teams can use as a design reference and test bench. This approach gives all the design teams, including the algorithm, system architecture, and component development teams, a shared, unambiguous understanding of the design requirements (Figure 2).

Figure 2. The Model-Based Design workflow enables early verification.

Using Model-Based Design to verify system and component behavior has several advantages:

- Design and integration problems can be found early through simulation, when they are still easy to correct.
- Tests can be developed concurrently with the design to ensure that the executable specification satisfies requirements.
- Reusing models as test benches for component implementation eliminates manual test creation and reduces interpretation errors (a minimal sketch of this idea follows this list).
- Designers can quickly evaluate tradeoffs, component interactions, and system-level metrics.
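
To make the test-bench-reuse idea concrete, here is a minimal sketch in Python (standing in for whichever modeling environment a team actually uses) of an executable specification: a golden reference algorithm packaged with a reusable test bench. The FIR filter example, function names, and tolerance are illustrative assumptions, not a specific MathWorks API.

import numpy as np

def reference_fir(x, coeffs):
    # Golden reference: direct-form FIR filter in floating point,
    # zero initial conditions, truncated to the input length.
    return np.convolve(x, coeffs)[:len(x)]

def test_bench(implementation, coeffs, n_samples=1024, tolerance=1e-6, seed=0):
    # Reusable test bench: generate the stimulus once, run it through a
    # candidate implementation, and compare against the reference model.
    rng = np.random.default_rng(seed)
    stimulus = rng.standard_normal(n_samples)
    expected = reference_fir(stimulus, coeffs)
    actual = implementation(stimulus, coeffs)
    worst_error = np.max(np.abs(np.asarray(actual) - expected))
    assert worst_error <= tolerance, f"deviation {worst_error:.3g} exceeds tolerance"
    return worst_error

# Sanity check: the reference passes its own test bench trivially.
test_bench(reference_fir, coeffs=np.array([0.25, 0.5, 0.25]))

Because the stimulus and the pass/fail criterion live alongside the algorithm, downstream teams inherit them unchanged instead of re-deriving tests from a written document.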

These early verification capabilities address the largest source of product delays by enabling engineers to find and correct flaws at the specification stage. Independent research has shown that this can cut product development time and costs in half, and can lead to superior product designs.

Automated verification and test bench reuse
Getting started with Model-Based Design is surprisingly straightforward. As a first step, the algorithm designers and implementation teams can work together to automate existing test procedures with the tools they already use. For the algorithm designers, this automation amounts to a change in thinking more than a change in workload, because they already develop test benches to check their own work. In the traditional design flow, these test benches remain within the algorithm design group. With Model-Based Design, this work can be reused rather than lost in the handoff to the implementation team.

The reuse is achieved by automation interfaces in the algorithm and system modeling tools that enable co-simulation with widely used hardware simulators and embedded software tools. This co-simulation replaces manual and script-based comparison techniques that would otherwise be needed to verify that C code, HDL, and analog circuit implementations meet system-level metrics. Test bench reuse significantly reduces verification effort and allows each team to more efficiently use existing tools and workflows.
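
As a sketch of what the reused test bench might look like on the implementation side, the fragment below compares output captured from a co-simulation run against the golden reference. The plain-text vector files, Q1.15 fixed-point format, and one-LSB-per-tap tolerance are assumptions for illustration.

import numpy as np

def check_captured_output(stimulus_file, captured_file, coeffs, frac_bits=15):
    # Reuses reference_fir() from the executable-specification sketch above.
    stimulus = np.loadtxt(stimulus_file)        # stimulus the simulator was fed
    captured = np.loadtxt(captured_file)        # integer words from co-simulation
    expected = reference_fir(stimulus, coeffs)
    actual = captured / float(1 << frac_bits)   # Q1.15 words back to real values
    tolerance = len(coeffs) / (1 << frac_bits)  # one LSB of rounding per tap
    worst = np.max(np.abs(actual - expected))
    print(f"worst-case error {worst:.3g}, tolerance {tolerance:.3g}")
    return worst <= tolerance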

Multidomain Modeling
Another problem with the traditional design process is that the software, digital, and analog hardware teams use disparate tools and workflows, which inhibit cross-domain design and verification and leave errors undiscovered until the system integration phase. This problem can be addressed by pushing verification up to a higher level in the design flow. Model-Based Design supports this aspect of early verification through "virtual integration": simulating algorithms, software, digital hardware, and analog hardware together in one environment. This multidomain modeling approach allows designers to evaluate design tradeoffs, component interactions, and system-level metrics early in the design process.

Multidomain modeling and simulation brings together discrete-time modeling for digital components and continuous-time modeling for analog components. It can also incorporate timing and control logic, finite state machines, event-driven simulation, and fixed-point simulation. Designers can start with an abstract model to capture behavior and validate requirements. As work progresses, they elaborate the model until they achieve timing- and bit-true accuracy.
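
A toy example of combining the two time domains, with every component value and the proportional control law chosen purely for illustration: a discrete-time digital controller sampled every 100 microseconds drives a continuous-time analog RC stage that is integrated at a much finer fixed step.

import numpy as np

# Analog side: first-order RC low-pass, dv/dt = (drive - v) / (R*C).
R, C = 1e3, 1e-6        # 1 kOhm, 1 uF: 1 ms time constant
Ts = 1e-4               # digital sample period (100 us)
dt = 1e-6               # analog integration step, much finer than the time constant
steps_per_sample = int(round(Ts / dt))

setpoint = 1.0
v = 0.0                 # continuous state: capacitor voltage
log = []

for k in range(200):    # 200 digital samples = 20 ms of system time
    # Discrete-time side: proportional controller, output held (zero-order
    # hold) between samples and limited to a 0..5 V actuator range.
    drive = float(np.clip(2.0 * (setpoint - v), 0.0, 5.0))
    # Continuous-time side: forward-Euler integration up to the next sample.
    for _ in range(steps_per_sample):
        v += dt * (drive - v) / (R * C)
    log.append(v)

# The proportional-only loop settles with a residual error, exactly the
# kind of system-level behavior that early simulation exposes.
print(f"voltage after 20 ms: {log[-1]:.3f} V (setpoint {setpoint} V)")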

By using these models for virtual integration, engineers can see how component design decisions affect overall system behavior without becoming experts in different domains or tools. As a result, they can more quickly find solutions that satisfy or exceed requirements, and can address problems that typically aren't found until late in the process, at the system integration stage.

This approach offers immediate benefits in fast-moving markets like wireless communications. A wireless system designer is concerned not just with the baseband algorithms but also with the RF chain, receiver synchronization, integration with higher layers of the stack, network latency and throughput, and so on.

For example, a common challenge is the use of lower-performance power amplifiers to reduce base station cost and power consumption. These amplifiers require the use of digital predistortion (DPD), a digital signal processing (DSP) technique that compensates for device nonlinearities. Multidomain models enable the DSP engineers to verify that these algorithms work properly by using a model of the relevant RF impairments.
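
As a hedged illustration of the idea, the sketch below pairs a memoryless third-order power-amplifier model with a polynomial predistorter and measures the third-harmonic level of a single-tone test before and after predistortion. The cubic coefficient, tone setup, and function names are assumptions, far simpler than a production PA model with memory effects.

import numpy as np

def pa(x, a3=0.1):
    # Memoryless power-amplifier model with third-order compression
    # (a real PA model would also include memory effects).
    return x - a3 * x**3

def dpd(x, a3=0.1):
    # Polynomial predistorter: pre-expand the signal so the PA's
    # compression cancels to first order.
    return x + a3 * x**3

n = 4096
t = np.arange(n) / n
x = 0.5 * np.sin(2 * np.pi * 50 * t)   # single tone, exactly 50 cycles

def third_harmonic_dbc(y):
    # Third-harmonic level relative to the fundamental, in dBc.
    s = np.abs(np.fft.rfft(y))
    return 20 * np.log10(s[150] / s[50])

print(f"3rd harmonic without DPD: {third_harmonic_dbc(pa(x)):.1f} dBc")
print(f"3rd harmonic with DPD:    {third_harmonic_dbc(pa(dpd(x))):.1f} dBc")

Rapid prototyping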
A third verification challenge occurs when algorithms don't work as intended in the target environment. In the traditional workflow, algorithm problems may not be discovered until the end of the design process, which can force extensive rework of the entire design.

With Model-Based Design, the same tools used for algorithm development and system simulation can be used to rapidly prototype designs on DSP and FPGA hardware, without low-level programming. This early verification technique allows designers to quickly prove the viability of new ideas and analyze performance under real-world conditions.
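
For flavor, here is a hypothetical sketch of streaming the same test vectors to a hardware prototype. The UDP transport, board address, and 16-bit Q1.15 wire format are invented for illustration and do not represent any particular vendor's interface.

import socket
import numpy as np

BOARD_ADDR = ("192.168.1.10", 5005)   # hypothetical prototype board

def run_on_prototype(stimulus, frac_bits=15):
    # Quantize to the assumed little-endian Q1.15 wire format, send one
    # block over UDP, and capture the prototype's response.
    words = np.round(np.clip(stimulus, -1.0, 0.999) * (1 << frac_bits))
    payload = words.astype("<i2").tobytes()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    try:
        sock.sendto(payload, BOARD_ADDR)
        reply, _ = sock.recvfrom(len(payload))
    finally:
        sock.close()
    return np.frombuffer(reply, dtype="<i2") / float(1 << frac_bits)

The same stimulus and tolerance used in simulation can then be applied to the hardware response, closing the design-test loop.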

With rapid prototyping, design-test cycles that took weeks can be completed in less than a day. This capability is particularly valuable for engineers who have new, untested design ideas that they need to verify quickly and thoroughly.

Quantifying the Results
Leading communications, electronics, and semiconductor companies have used early verification with Model-Based Design to gain a competitive advantage by cutting development costs. As illustrated in Figure 3, independent studies have shown that companies that adopt this approach can cut development costs in half. These results were measured before interfaces to HDL simulators existed; since those interfaces were introduced, customers have achieved even greater results. For example, an international communications and information technology company reports an 85 percent reduction in functional verification time.

Figure 3. Design costs before and after using Model-Based Design.
Source: Return on Investment in Simulink for Electronic System Design, © 2005 International Business Strategies

Perhaps even more important is the ability to create better product designs and to evolve existing ones: robust system models make it easier to develop derivative designs or to adapt quickly to new requirements.

Companies that adopt early verification techniques find that these techniques improve communication and collaboration among distributed, multidisciplinary teams. Smaller teams also report significant time and cost savings, even when they adopt only one of these techniques.

Leading communications, electronics, and semiconductor companies have used all of these early verification techniques to gain a competitive advantage, reducing their test and verification costs while strengthening their ability to develop and ship innovative new products faster.
