Reduction of Redundancy in Directed Random Verification through Checkpointing

Jesse Craig
University of Vermont

Abstract

The growing function and gate counts of modern digital semiconductor designs have made verification the dominant cost of their development and implementation. Design teams rarely have the resources to exhaustively verify their designs, leading to a growing number of bugs not found during verification. This thesis studies a popular technique for verifying digital designs, directed random verification, and the diminishing returns often cited as the technique's greatest challenge. The thesis shows that these diminishing returns are directly related to the redundancy inherent in directed random verification and explores a method of reducing that redundancy. The optimization method creates checkpoints at user-selected, randomly controlled decision points and exhaustively explores the outcomes of those decision points. The method is facilitated by a new software system, the Reduction of Redundancy (RoR) Verification System, introduced in this thesis. Using this software system, several verification environments were created to understand the method's effects on verification performance and the challenges of implementing the method in a verification environment. In each case the optimization method increased the performance of the verification environment with only minimal effort by the verification engineer.

A Brief Explanation

Functional verification is composed of four major steps: generating stimuli for a design, simulating the design's reaction to those stimuli, checking the reaction against an expected result, and recording which coverage goals the stimuli tested. Coverage goals act as a quantitative embodiment of the verification engineer's testing requirements: they enumerate the functions, combinations of functions, and parameters to those functions that need testing. Discrete events, such as executing a design feature in a specific way, are referred to as standard coverage goals; permutations of these discrete events form a second class of coverage goals referred to as cross-coverage goals.

Directed random verification is a refinement of functional verification that uses a design-specific random test generator to create stimuli, as opposed to techniques that rely on manual methods or that create design-independent random stimuli. A directed random verification environment is composed of five major components, corresponding to the four steps of functional verification: a stimulus generator to create random stimuli, a monitor to observe the stimuli generated, a simulation of the design under test to produce the design's reaction, a checker to compare that reaction to an expected result, and a coverage monitor to track which functions of the design have been tested.

This research focuses on the stimulus generator, as it is the component that controls which functions of the design are tested and how. The stimulus generator is essentially a program that randomly enumerates permutations from a finite set of actions, where each action generates some form of useful stimulus. Together these permutations form a permutation tree.
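As a concrete illustration, here is a minimal Python sketch of such a generator (not the thesis's tooling; the operations and decision points are invented for the example). Each call makes a random root-to-leaf walk through a small permutation tree, and counting the paths over many runs exposes the redundancy:

```python
import random
from collections import Counter

def generate_testcase(rng):
    """One testcase: a random root-to-leaf walk through the permutation
    tree formed by the generator's randomly-controlled decisions."""
    path = [rng.choice(["read", "write"])]                 # decision 1: operation
    if path[0] == "write":
        path.append(rng.choice(["posted", "non-posted"]))  # decision 2: write style
    path.append(rng.choice(["aligned", "unaligned"]))      # decision 3: alignment
    return tuple(path)

rng = random.Random(0)  # fixed seed so the run is repeatable
counts = Counter(generate_testcase(rng) for _ in range(1000))
for path, n in counts.most_common():
    print(f"{n:4d} x {'/'.join(path)}")
# Popular paths are regenerated hundreds of times while adding no new
# coverage -- the redundancy behind the diminishing returns noted above.
```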
Figure 1. Demonstration of redundancy reduction versus traditional methods


Each vertex of this tree is a randomly-controlled condition with one outgoing edge per outcome, and the edges represent the generation of various stimuli for testing the design's functions. Each testcase generated by the stimulus generator is a traversal from the tree's root to some randomly chosen leaf.

The demonstration in Figure 1 helps explain how this works. A ball (shown in yellow) is dropped onto a series of pegs (shown in black); each time the ball hits a peg, it falls to either the left or the right, chosen at random. The ball acts as an analog for a directed random verification testcase, with each peg acting as a randomly-controlled conditional. As the ball falls it leaves a blue line indicating the path it has taken, and as more balls follow a path the line becomes darker. After running the demonstration for a while, one sees that certain portions of the permutation tree are executed more than others: the conditions near the top of the triangle are explored more than those at the bottom, and some areas of the triangle are never explored at all. Ideally, the entire triangle would be explored exactly once, as this uses the minimal amount of resources while still testing all the functions of the design under test.

The Reduction of Redundancy (RoR) technique achieves this goal by using checkpointing to explore the permutation tree efficiently. When the RoR method is applied to a randomly-controlled condition, that condition is no longer stochastic: the conditional statement is transformed so that it explores all of its outcomes. During the execution of the condition, a checkpoint of the verification environment is created, and that checkpoint is then restored once for each possible outcome. The restored verification environments are exact replicas of the original, executing from the point where the checkpoint was created; the only difference is that each replica explores a unique outcome of the checkpointed conditional statement. By selecting the "RoR" option in the Figure 1 demonstration and clearing the demonstration's display, one sees how RoR explores the entire triangle evenly, with no redundant execution. The cloning of the balls is equivalent to the replication of the verification environments, with one ball created per outcome.
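The sketch below shows the checkpoint-and-replicate idea in miniature, assuming a Unix process fork as a stand-in for the RoR system's checkpoint/restore facility (the decision points and testcase fields are invented for the example):

```python
import os

def ror_choice(outcomes):
    """An RoR-style decision point: instead of picking one outcome at
    random, replicate the running environment once per outcome.
    Here os.fork() stands in for a checkpoint/restore (Unix only)."""
    for outcome in outcomes[:-1]:
        if os.fork() == 0:      # replica: explore this outcome
            return outcome
    return outcomes[-1]         # original: explore the remaining outcome

if __name__ == "__main__":
    # Two decision points -> four processes, one per root-to-leaf path,
    # each path executed exactly once (the cloned balls in Figure 1).
    op = ror_choice(["read", "write"])
    align = ror_choice(["aligned", "unaligned"])
    print(f"pid {os.getpid()}: testcase = {op}/{align}")
    try:
        while True:             # reap any replicas this process created
            os.wait()
    except ChildProcessError:
        pass
```

The materials below explain this work in greater detail and show the results of applying RoR to real-world verification challenges.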

Master's Thesis Documents

J. Craig. Reduction of Redundancy in Directed Random Verification through Checkpointing.
M.S. Thesis, University of Vermont, Burlington, VT, April 2006.

Thesis Defense Slides

Thesis Proposal
