@head: Verification Needs To Be Reined In
@text:Design teams have been grasping for the reins to control chip verification. Most of them have migrated to automated, higher-abstraction-level testbench environments like Specman/e, Vera, and SystemVerilog. In addition, many are considering assertions--some with an eye toward formal verification. If they’re lucky, the teams are treading water on schedules and chip quality. Most of them aren’t so fortunate, however. And chip size and complexity will only continue their exponential growth.
Perhaps it’s finally time for chip design to take a page from quality-management wisdom. Chip design today has a lot in common with 1970s-era automobile manufacturing in the U.S. Automotive quality was pursued by inspecting the finished product; the philosophy was that better quality could be achieved with more and better inspection. While the auto industry and the rest of the world have since developed a passion for designing quality in, the chip industry has kept its focus on verification. It’s time for chip design to develop a passion for designing quality in.
How Do You Design Quality Into Chips?
The root cause of verification’s growing burden is increasing chip complexity. The only sustainable way to deal with that complexity is to raise the level of abstraction and let automation handle the lower-level design details. That approach can improve quality while reducing both time and cost.
What if complex, enterprise-class software were written in assembly language? It certainly could be done. But imagine having to isolate every bug while reworking the design at that level to fix it, keep up with specification changes, or address performance issues. Software moved to fourth-generation languages (4GLs) in order to manage complexity. Hardware, meanwhile, has been using Verilog and VHDL for chip design for almost 15 years. Using Intel processors as a measure, device complexity has increased roughly 500X over that same period, while the design tools haven’t changed dramatically.
It’s tempting to push the software analogy too far. Yet hardware has fundamental differences that affect the level of abstraction that can be chosen. First, hardware is far more sensitive than software to quality of results (QoR), including latency, area, timing, and power. Second, because errors in hardware are literally etched into silicon, chip teams must be 100% certain that the design they tape out is the one they verified.
An ideal level of abstraction for hardware will account for these differences while improving quality in the areas most prone to error. To optimize QoR, engineers must retain control over the micro-architecture. The generated register-transfer-level (RTL) code must be transparent and predictable in order to support RTL debug, RTL-to-source correlation, and engineering change orders (ECOs). Finally, the language cannot have narrow applicability (say, only to computation units or processor pipelines); otherwise, the portions of the design still written in RTL will drive the schedule and determine quality.
Previous high-level design attempts fell short in one or more of these areas. Newer approaches to electronic-system-level (ESL) synthesis are more formal, focusing on new semantics for managing what makes hardware design hard: complex concurrency with shared resources. At this higher level of abstraction, designers end up with significantly fewer lines of code. They can see the forest for the trees, which makes changes easier, faster, larger in scope, and more likely to be correct. They also can verify at this level with transactional tests, which more accurately reflect real usage. And they can produce correct designs by letting the tools automatically generate all of the mechanical detail required at the lower levels.
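As a rough illustration of what such rule-based semantics look like, here is a minimal sketch in Bluespec SystemVerilog, used only as one example of this style of notation; the module and names are hypothetical rather than drawn from any particular design. Two atomic rules contend for a single shared FIFO, and the compiler, rather than hand-written RTL, derives the handshaking and scheduling that keep the accesses consistent.

import FIFO::*;

// Two atomic rules contend for one shared FIFO. The compiler derives the
// ready/enable handshaking, schedules the rules, and reports any conflicts,
// instead of leaving the arbitration to hand-written RTL.
module mkSharedQueueSketch (Empty);
   FIFO#(Bit#(32)) shared <- mkFIFO;   // the shared resource
   Reg#(Bit#(32))  a      <- mkReg(0);
   Reg#(Bit#(32))  b      <- mkReg(0);

   rule produceA;   // fires atomically when the FIFO can accept data
      shared.enq(a);
      a <= a + 1;
   endrule

   rule produceB;   // contends with produceA for the same FIFO
      shared.enq(b);
      b <= b + 2;
   endrule
endmodule

Written as RTL, the same behavior would require explicit arbitration, enable logic, and full/empty checks, each of which is a place for a bug to enter before verification even begins.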
Higher quality can significantly rein in verification. The earlier that bugs are found in the design process, the lower the cost. At this year’s DesignCon, MIPS CEO John Bourgoin outlined how finding bugs at the front of the process is least expensive. Costs increase by 10X at each design stage--from model testing to component test, system test, and finally to the field. What if the majority of bugs that currently enter the verification process were no longer there? What would be the savings in no longer having to find, debug, and fix them? What would happen to the likelihood of a bug escape?
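To put rough, purely illustrative numbers on that escalation (the dollar figures are assumptions, not from Bourgoin’s talk): a bug that costs $1,000 to fix at model test would cost on the order of $10,000 at component test, $100,000 at system test, and $1,000,000 once it reaches the field.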
ESL synthesis’ high level of abstraction also includes powerful interface semantics that guarantee correct connectivity and protocol behavior. What if unit-level testing efforts guaranteed that blocks would work together in a larger system wherever they were used?
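To make the interface point concrete, here is another minimal sketch in the same notation, again with hypothetical names: a producer’s Get interface is connected to a consumer’s Put interface, and the tool generates the handshaking that enforces the protocol, so a block verified against these interfaces carries the same guarantees wherever it is instantiated.

import FIFO::*;
import GetPut::*;
import Connectable::*;

// mkConnection wires a Get interface to a Put interface and generates the
// ready/enable handshaking, so connectivity and protocol behavior are
// enforced by construction rather than checked only in a testbench.
module mkConnectSketch (Empty);
   FIFO#(Bit#(8)) producerQ <- mkFIFO;
   FIFO#(Bit#(8)) consumerQ <- mkFIFO;

   Get#(Bit#(8)) producer = toGet(producerQ);   // data leaving the producer
   Put#(Bit#(8)) consumer = toPut(consumerQ);   // data entering the consumer

   mkConnection(producer, consumer);
endmodule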
There’s no way to ensure that a design has been 100% verified, nor any way to know that a design is sufficiently verified for tapeout. Verification never guarantees the absence of bugs; it only finds what it can. Shouldn’t the goal be better quality rather than more verification? With significantly fewer bugs entering verification, less time is spent finding, debugging, and fixing them. Most importantly, the chance of a bug escape is reduced. When quality is designed in, the designer can choose whether to tape out sooner or to spend the extra time and resources on further verification.
Shiv Tasker is Chief Executive Officer at Bluespec. Previously, he was the President and CEO of Phase Forward and Senior Vice President for Worldwide Sales, Consulting Services, and Corporate Marketing at Viewlogic Systems Inc. Tasker also spent eight years at Cadence Design Systems in a variety of marketing and general management roles. He holds an MBA from the University of Texas at Arlington and a BS in Statistics and Economics from the University of Bombay. Tasker was recently honored by Mass Hi-Tech as an All-Star.