Verification coverage measures up to higher level

By Richard Goering

11/28/07

Many chip design and verification teams use verification coverage metrics today, but the information gleaned from these metrics is often of limited value. A broad rethinking of verification coverage is underway that promises to make this data far more valuable, greatly easing the functional verification challenge as a result.

This rethinking is occurring on several fronts. One is the Accellera Unified Coverage Interoperability Standard (UCIS), which is developing an API that can unify coverage data from different tools and technologies, including simulation and formal verification. Such a standard could help engineers make much more informed decisions about which verification strategies to employ where.

Second, commercial offerings are challenging existing notions of verification coverage. For example, startup Certess Inc. is extending the concept of coverage to bug propagation and detection, while Breker Systems promises 100 percent “scenario coverage” from its graphical test generation tool. Through an unannounced acquisition of startup Lighthouse Design Automation, Mentor Graphics Corp. has picked up a tool that promises high coverage through intelligent testbench generation. Stealth-mode startup Nusym Technology appears to be considering a similar capability.

Finally, there’s a renewed push to use coverage metrics within the context of an overall methodology driven by a verification plan, rather than as an ad-hoc exercise carried out in shotgun fashion. Vendors are proposing tools and offering resources that can help users develop a metric-driven methodology.

The problem with verification coverage today is not the lack of data – quite the contrary. “Many of the problems we are facing with coverage these days is that we have too much data and too much information, and therefore a lot of effort and expertise is needed to extract the important things that are hidden in the data,” said Avi Ziv, research staff member for simulation-based verification technologies at IBM (Haifa, Israel). “As a result, more advanced analysis techniques are needed to handle this data.”

David Lacey, verification scientist at Hewlett-Packard (Richardson, Tex.), noted in a presentation at the 2007 Design Automation Conference that a recent 3-chip verification project resulted in 815,000 functional coverage points. “We have to find effective ways to organize the data,” he said. Lacey noted that verification coverage is “still fairly young in terms of the maturity of its technology, and has a long ways to go in terms of what it can ultimately become.”

Verification coverage numbers are sometimes meaningless. Olivier Haller, design verification team leader at STMicroelectronics’ Functional Verification Group (Grenoble, France), noted that if you don’t check whether bugs propagate to outputs, “you can end up with 100 percent coverage and verify absolutely nothing. We have found many examples where that was the case.”
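Haller’s point is easy to reproduce in a few lines. The following hypothetical SystemVerilog sketch (not from the article; the module and signal names are invented) shows a 2:1 mux with an inverted select. A test that simply toggles the select executes every line and both branches, yet if the testbench never compares the output against a reference, the bug goes undetected:

```systemverilog
// Hypothetical mux with a polarity bug: the spec says y = b when sel == 1.
// Toggling 'sel' yields 100 percent line and branch coverage, but unless
// the testbench actually checks 'y', the bug never causes a failure.
module mux2 (input logic a, b, sel, output logic y);
  always_comb begin
    if (!sel)       // BUG: select polarity inverted
      y = b;
    else
      y = a;
  end
endmodule
```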

Incomplete metrics

According to a recent paper by verification consultant Brian Bailey, coverage metrics serve two primary roles: to provide an indication of the completeness of the verification task, and to help identify weaknesses in the verification strategy. As Bailey noted, the coverage metrics in use today can only identify that the verification task is not complete, not when it is complete.

The most familiar metric, code coverage, measures whether lines of RTL code were executed by the simulator. The metric may include line, path, branch, or expression coverage. It’s an easy metric to collect, but doesn’t tell you whether the functionality of the code is right or wrong.
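The distinctions among these sub-metrics matter. As a hedged illustration (module and signal names invented), two stimulus vectors can satisfy branch coverage of the fragment below while leaving expression (condition) coverage incomplete, because one operand of the condition never controls the outcome:

```systemverilog
module arb (input logic a, b, output logic grant);
  // With (a=1,b=0) and (a=0,b=0), both branches of the 'if' execute --
  // full branch coverage -- but 'b' never independently makes the
  // condition true, so expression/condition coverage remains incomplete.
  always_comb begin
    if (a || b)
      grant = 1'b1;
    else
      grant = 1'b0;
  end
endmodule
```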

Functional coverage, often used with constrained-random test generation, measures the execution of specified scenarios. Users insert coverage points that check various conditions, such as whether all required data packets have been sent. Functional coverage can identify missing functionality, but it’s time-consuming to implement and has some simulation overhead.
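In SystemVerilog, such coverage points are typically written as covergroups. The minimal sketch below, assuming a hypothetical packet class (the fields and bin choices are illustrative, not from the article), records which packet kinds and lengths the constrained-random tests actually produced; unhit bins flag missing scenarios:

```systemverilog
package pkt_cov_pkg;
  class packet;
    rand bit [1:0] kind;    // e.g., DATA, ACK, NAK, MGMT
    rand bit [3:0] length;
  endclass

  // Sampled once per packet sent; the coverage report later shows which
  // kinds, lengths, and kind/length combinations were never generated.
  covergroup packet_cg with function sample (packet p);
    kind_cp    : coverpoint p.kind;
    length_cp  : coverpoint p.length { bins short_pkt = {[0:7]};
                                       bins long_pkt  = {[8:15]}; }
    kind_x_len : cross kind_cp, length_cp;
  endgroup
endpackage
```

Instantiated once (packet_cg cg = new();), the group is sampled with cg.sample(p) each time a packet is sent.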

Assertion coverage is a vaguely defined term that could mean a check to see whether an assertion was activated, a count of the number of assertions that were proven, a measurement of assertion “density” in the code, or a check that an enabling condition of an assertion actually occurred. Other possible coverage metrics include software code coverage, error message logs, and revision control information.

In formal verification, where proofs are exhaustive by definition, the notion of coverage metrics is less clear. Rajeev Ranjan, CTO at Jasper Design Automation, noted that formal verification users are generally concerned with whether properties have passed or failed, and why failures have occurred. They also use formal analysis to check whether coverage points were sensitized during the analysis.

Ranjan noted that some formal tools let users create “cover pragmas” that ensure that some enabling condition of a property actually occurs. For example, a property might state that if a request occurs, an acknowledge must follow within 10 cycles. A cover pragma could verify that a request does in fact occur at some point in time. Ranjan calls this practice “sanity checking.”
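Ranjan’s req/ack example maps naturally onto SystemVerilog assertions. In this sketch (the clock, reset, and signal names are assumed), the assert states the requirement and the companion cover performs the sanity check:

```systemverilog
module handshake_check (input logic clk, rst, req, ack);
  // The 'assert' states the requirement; the 'cover' confirms the
  // enabling condition is reachable, so the check cannot pass vacuously.
  property req_then_ack;
    @(posedge clk) disable iff (rst)
      req |-> ##[1:10] ack;   // ack must follow a request within 10 cycles
  endproperty

  a_req_ack  : assert property (req_then_ack);
  c_req_seen : cover  property (@(posedge clk) req);  // does req ever occur?
endmodule
```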

Code coverage and functional coverage metrics may not have direct correspondences in the formal world. But according to Ranjan, formal analysis can establish that certain coverage targets are not reachable for any legal input sequence, potentially eliminating unnecessary simulation. Further, formal analysis can create scenarios that will automatically fulfill specific coverage targets.
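As a hedged sketch of the unreachability idea (signals invented for illustration): if the environment is constrained so that requests never arrive during a flush, a formal tool can prove the cover target below unreachable for any legal input sequence, and the team can retire it rather than spend simulation cycles chasing it:

```systemverilog
module flush_check (input logic clk, flush, req);
  // Environment constraint: requests never arrive during a flush.
  m_no_req_in_flush : assume property (@(posedge clk) flush |-> !req);
  // Given the assumption, this cover target is provably unreachable for
  // any legal input sequence, so it can be dropped from the simulation plan.
  c_flush_req : cover property (@(posedge clk) flush && req);
endmodule
```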

It is this interplay between formal analysis and verification coverage that helped spark the coverage interoperability work that led to the Accellera UCIS effort. Ranjan said that Jasper received numerous requests from customers who wanted to combine results from different verification tools and technologies, including simulation and formal verification, “to have an overall picture of where they were in the verification roadmap.” With every tool using its own data format, that just wasn’t possible. So Jasper helped launch the Coverage Interoperability Forum (CIF), an ad-hoc effort that led to the formation of the Accellera UCIS committee in December 2006.

“The effort from UCIS will enable the creation of a database whereby information from different technologies will be put in, and then EDA tools will help users analyze and keep track of a verification plan, what coverage objectives they have in mind, and what they have achieved,” Ranjan said.

A user-directed effort

Today, Accellera UCIS is a user-driven effort with two major goals. One is to define a standard API to which tools will write coverage data, so that users can extract that data and merge it into a single database. The other is to define a standard taxonomy of terms for verification coverage. There are some 75 participants, including ARM, Cadence Design Systems, Certess, Denali, Freescale, IBM, Infineon, Intel, Jasper, Mentor Graphics, Nokia, Novas, OneSpin Solutions, SpringSoft, STMicroelectronics, Sun, Synopsys, and Texas Instruments.

Many different kinds of verification tools collect coverage data, noted Shrenik Mehta, Accellera chair. “We need some kind of accounting standard for measuring coverage,” he said. “It needs to interoperate with various people’s tools so that, when we say something is covered, it’s covered across various vendors’ tools.” With a common understanding of coverage, Mehta said, engineering teams can more intelligently partition verification tasks.

The Accellera UCIS effort seeks to unify coverage data from different types of tools. Source: Accellera

“We want to develop a standard API that will make it easier to merge data into one database,” said Faisal Haque, Accellera UCIS chairman. “Right now, there’s no easy way to get coverage data out of tools. You have to go to each vendor and figure out what their format is, extract it, and convert it to your database format.”
