Keywords: analog, AMS, verification
Editorial Feature Tab: Analog – Simulation and Verification
@head: Debugging Analog Designs Demands More than Waveform Analysis
@deck: The analysis, verification, and debugging of today’s analog simulations require tools that work with all of the industry-standard simulators, formats, and environments.
@text: The analog and mixed-signal (analog-digital) content of today’s complex electronic systems is increasing dramatically. Unfortunately, the tools that are used to analyze and debug these designs aren’t keeping pace with the increasingly sophisticated requirements of design engineers. The EDA vendors that specialize in analog simulation have tended to focus the bulk of their engineering resources on improving the accuracy, capacity, and performance of the simulation engines themselves. Each of these simulators comes equipped with some level of waveform-viewing capability. These visualization utilities are invariably limited, however, in their ability to perform sophisticated analysis of the simulation results and use this analysis to debug the design.
In the not-so-distant past, a similar situation occurred in the digital arena. The EDA vendors fielding digital-simulation solutions focused on the capacity and performance of the simulation engines, but they provided relatively rudimentary waveform-display utilities with limited analysis and debug capabilities.
To address this problem in the digital domain, one EDA vendor decided to focus on visualizing, analyzing, and debugging the results from simulation. This company, Novas, doesn’t provide a digital-simulation solution per se. Instead, it provides tools that have become de-facto industry standards with respect to the visualization, analysis, and debugging of the digital portions of the design. (Because a large proportion of today’s digital designs have some level of analog content, Novas also provides adequate visualization capabilities for analog signals. But it doesn’t offer sophisticated analysis and debug capabilities for these signals.)
Although this article addresses analog-design debugging and results analysis, the Novas solutions are important for three key reasons:
- These digital solutions work seamlessly with all industry-standard digital simulators and digital-simulation-result file formats.
- The Novas solutions facilitate cross-probing between the various design views, which greatly eases the task of traversing and debugging the design.
- The company has defined a fast-signal-database (FSDB) format, which can be used to store huge amounts of data. It also optimizes the loading and accessing of simulation results.
In many respects, the analog-simulation industry finds itself in the same predicament that digital design teams faced prior to the introduction of the Novas solutions. In fact, the situation that’s now faced by analog and mixed-signal (A/MS) designers is worse than it was for digital design. After all, A/MS design teams almost invariably make use of multiple analog simulators. (Digital design teams typically standardize on a single simulator that’s used throughout the design process.)
A/MS designers use multiple simulators for several reasons. In some cases, different simulation engines and algorithms may be more appropriate to the simulation of specific analog functions. Alternatively, it may be that one simulator is used to perform fast (but less accurate) simulations. A slower (but more accurate) simulator may subsequently be used to perform the more rigorous sign-off design verification. In the case of mission- and/or safety-critical designs, two (or more) different simulators may be required to provide a high level of confidence in the results.
Whatever the reasons, A/MS design teams are often working with multiple simulators--each of which has a different user interface for the display and analysis of the simulation results. Furthermore, each simulator’s visualization and analysis utilities typically provide only a limited subset of the capabilities that are required by the different design engineers. The EDA industry now needs one vendor to focus on the analysis, verification, and debugging of the results from analog simulation. As with digital, this company doesn’t need to provide an analog-simulation solution per se. Instead, the company should focus its resources on providing a suite of tools that works with all of the industry-standard analog simulators and simulation environments.
Pre-Simulation Analysis and Debug
Numerous types of capabilities should be provided by such a vendor-neutral analog and mixed-signal analysis, verification, and debugging (AVAD) solution. For example, a large proportion of analog designers use schematic capture to create the initial design view. Different schematic-capture tools are typically associated with specific analog-simulation environments. For this reason, an AVAD tool suite should work seamlessly with all of the industry-standard schematic applications. Such applications include those from Cadence, Mentor, and Silicon Canvas in the US; Jedat (which used to be known as Seiko Epson) in Japan, etc.
This level of integration should extend to cross-probing. For instance, selecting a signal in the schematic should automatically locate and present the corresponding signal in the simulation-results waveform display. Having said this, design analysis and debug activities should commence long before a simulation run is actually performed.
Often, there’s a misconception among design engineers that the schematic-capture tool will perform extensive electrical and functional checking of the design prior to netlist generation and simulation. But this isn’t necessarily the case. Furthermore, some expert users prefer to work with hand-coded netlists. Third-party analog intellectual property (IP) is typically presented in the form of netlists.
There are several considerations here. First, the AVAD tools should comprehensively screen the design for syntax, parameter, and connectivity issues. This screening is particularly important when one realizes that different simulators may treat specific cases differently. In the case of a floating-gate terminal on a MOS transistor, for example, some simulators will leave it “as-is.” Yet others will tie the terminal to ground via an internal low conductance. Different results will then be generated from the different simulators, making it important that users be made aware of this potential problem.
For another very simple example, an annotation of 1e-3F in the netlist may be understood to refer to units of femto-Farads or Farads, depending on the specific simulator that’s used. It’s therefore important for the AVAD suite to check all of the simulation setup parameters and options and to alert the user to any potential problems. In the case of parameterized netlists, attributes like the lengths and widths of transistor channels will be passed into the cell. As a result, the AVAD suite must track these parameters all the way down into the cells and verify that everything is as it should be. For example, the parameters should be appropriate and not result in negative resistance or capacitance values.
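As a purely illustrative sketch of this kind of screening, the following Python snippet scans a flat, SPICE-style netlist for two of the issues just mentioned: a lone trailing "F" on a capacitor value (femto or Farads?) and negative resistance or capacitance values. The regular expressions and the example netlist lines are invented for illustration; a production checker would need to handle the full syntax rules of each target simulator.

    import re

    # Numeric part of a SPICE value, e.g. "1e-3" in "1e-3F" or "-10" in "-10k".
    NUM_RE = re.compile(r"[-+]?[\d.]+(?:[eE][-+]?\d+)?")
    # A value whose only suffix is a lone "F": femto (a scale factor) or Farads (a unit)?
    AMBIG_F_RE = re.compile(r"[-+]?[\d.]+(?:[eE][-+]?\d+)?f$", re.IGNORECASE)

    def screen_netlist(lines):
        """Flag ambiguous 'F' suffixes on capacitors and negative R/C values."""
        warnings = []
        for lineno, line in enumerate(lines, 1):
            tokens = line.split()
            if len(tokens) < 4 or tokens[0][0].upper() not in "RC":
                continue                                   # screen only R and C elements here
            name, value_token = tokens[0], tokens[3]
            if name[0].upper() == "C" and AMBIG_F_RE.match(value_token):
                warnings.append(f"line {lineno}: {name}={value_token}: a lone trailing 'F' may be "
                                "read as femto or as Farads, depending on the simulator")
            number = NUM_RE.match(value_token)
            if number and float(number.group()) < 0:
                warnings.append(f"line {lineno}: {name} has a negative value ({value_token})")
        return warnings

    for warning in screen_netlist(["C1 out 0 1e-3F", "R2 in out -10k"]):
        print(warning)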
Because a large analog circuit can take days to simulate, one must ensure that everything is set up correctly before the simulation is launched. Thus, one capability that an AVAD environment should have is the ability to locate any stimulus--such as signal generators--embedded in the SPICE netlist. This stimulus should be displayed in the form of graphical waveforms.
The AVAD suite also should provide stimulus-generation capabilities in the form of a piece-wise-linear (PWL) editor. This editor will allow the design engineers to draw and save waveforms. The engineers should be able to apply to these stimulus waveforms any of the processing and analysis functions that they can apply to the post-simulation results waveforms. As a result, the engineers will be able to quickly verify that the stimulus is correct before launching into a time-consuming simulation.
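To make the idea concrete, here is a minimal Python sketch that builds a trapezoidal stimulus as a list of breakpoints and formats it as a standard SPICE PWL voltage source. The function names and pulse parameters are illustrative only; a real PWL editor would, of course, be graphical.

    def trapezoid(t_delay, t_rise, t_width, t_fall, v_low, v_high):
        """Return (time, voltage) breakpoints for a single trapezoidal pulse."""
        return [(0.0, v_low),
                (t_delay, v_low),
                (t_delay + t_rise, v_high),
                (t_delay + t_rise + t_width, v_high),
                (t_delay + t_rise + t_width + t_fall, v_low)]

    def to_pwl_source(name, node_pos, node_neg, points):
        """Format breakpoints as a SPICE piece-wise-linear voltage source."""
        pairs = " ".join(f"{t:.6g} {v:.6g}" for t, v in points)
        return f"V{name} {node_pos} {node_neg} PWL({pairs})"

    # A 3.3-V pulse: 1-ns delay, 100-ps edges, 5-ns width.
    points = trapezoid(1e-9, 100e-12, 5e-9, 100e-12, 0.0, 3.3)
    print(to_pwl_source("stim", "in", "0", points))
    # -> Vstim in 0 PWL(0 0 1e-09 0 1.1e-09 3.3 6.1e-09 3.3 6.2e-09 0)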
Post-Simulation Analysis and Debug
Following simulation, the AVAD suite must be capable of quickly accessing, processing, and storing very large results files that can be gigabytes in size. It should be able to import results from multiple simulation engines and environments. Results also should be exportable in industry-neutral formats, such as comma-separated-value (CSV) files that can be read into applications like Excel, or in PWL format, because it’s common to use the (possibly processed) results from one simulation as the stimulus for another simulation.
In addition to providing sophisticated waveform visualization and manipulation capabilities, the AVAD environment should provide data-processing functions. Such functions include the following: math (minimum, maximum, sine, cos, tan, etc.); waveform transforms (derivatives, Fast Fourier Transform, Inverse Fourier Transform, etc.); measurement (delay, average, RMS, etc.); and radio-frequency (RF) functions, such as S-parameter analysis, for wireless applications. When appropriate, these functions should support a "waveform-in/waveform-out" usage model, thereby allowing them to be cascaded to construct more complex functions.
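The "waveform-in/waveform-out" idea can be illustrated with a short Python/NumPy sketch, assuming waveforms are stored as uniformly sampled (time, value) arrays. The specific functions shown (derivative, RMS, and spectrum) are merely examples of how such building blocks cascade.

    import numpy as np

    # A "waveform" here is simply a pair of equal-length arrays: (time, value).
    def derivative(wave):
        t, v = wave
        return t, np.gradient(v, t)                 # waveform in, waveform out

    def rms(wave):
        _, v = wave
        return float(np.sqrt(np.mean(v ** 2)))      # scalar measurement

    def spectrum(wave):
        t, v = wave
        dt = t[1] - t[0]                            # assumes uniform sampling
        freqs = np.fft.rfftfreq(len(v), dt)
        return freqs, np.abs(np.fft.rfft(v)) / len(v)

    # Build a 1-kHz sine and cascade the functions: spectrum of its derivative.
    t = np.linspace(0.0, 10e-3, 10001)
    wave = (t, np.sin(2 * np.pi * 1e3 * t))
    freqs, mag = spectrum(derivative(wave))
    print(f"RMS of input: {rms(wave):.3f}, spectral peak at {freqs[np.argmax(mag)]:.0f} Hz")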
The waveform-analysis environment should support the use of unlimited cursors in both the horizontal and vertical directions. In addition to reporting instantaneous values, it should be possible to use pairs of cursors to report averages and delta values. It also should be possible to specify jump conditions for a cursor, such as the crossing of two waveforms. The engineer should then be able to repeatedly jump forward/backward to the next/previous occurrence of the condition.
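A cursor "jump to crossing" reduces to finding the points where two waveforms intersect and then stepping through that list. The following sketch, using synthetic data and illustrative names, shows one way to locate the crossings with linear interpolation.

    import numpy as np

    def crossing_times(t, a, b):
        """Return interpolated times where waveform a crosses waveform b."""
        diff = a - b
        before = np.where(np.diff(np.sign(diff)) != 0)[0]   # sample indices just before each crossing
        times = []
        for i in before:
            # Linear interpolation between the samples that bracket the crossing.
            frac = diff[i] / (diff[i] - diff[i + 1])
            times.append(t[i] + frac * (t[i + 1] - t[i]))
        return np.array(times)

    t = np.linspace(0.0, 1e-3, 2001)
    a = np.sin(2 * np.pi * 5e3 * t)
    b = np.full_like(t, 0.5)                 # a constant 0.5-V level as the second "waveform"
    jumps = crossing_times(t, a, b)
    print(f"{len(jumps)} crossings; first at {jumps[0] * 1e6:.1f} us")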
In general, the most frequently used measurements should be interactive, minimizing the number of mouse clicks and thereby increasing productivity and reducing user fatigue. For example, a frequency-versus-time measurement traditionally requires that users measure each cycle period, take the inverse to get frequency, and then plot the calculated frequency against time. This sort of time-wasting activity is best left to automation.
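Automating that particular measurement is straightforward in principle, as the following sketch suggests: locate successive rising threshold crossings, take the per-cycle period, and invert it. The chirped test signal and the function names are illustrative.

    import numpy as np

    def frequency_vs_time(t, v, threshold=0.0):
        """Per-cycle frequency from successive rising crossings of `threshold`."""
        above = v > threshold
        rising = np.where(~above[:-1] & above[1:])[0]       # samples just before each rising edge
        # Interpolate each crossing time, then invert the per-cycle period.
        cross = t[rising] + (threshold - v[rising]) / (v[rising + 1] - v[rising]) * (t[rising + 1] - t[rising])
        periods = np.diff(cross)
        return cross[1:], 1.0 / periods                     # (time at end of each cycle, frequency)

    # A tone that chirps from 1 kHz to 2 kHz over 10 ms.
    t = np.linspace(0.0, 10e-3, 100001)
    v = np.sin(2 * np.pi * (1e3 * t + 0.5 * 1e5 * t ** 2))  # instantaneous f = 1e3 + 1e5*t
    times, freqs = frequency_vs_time(t, v)
    print(f"frequency near t=0: {freqs[0]:.0f} Hz, near t=10 ms: {freqs[-1]:.0f} Hz")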
In addition to cursors providing instantaneous values, the environment should support the ability to perform interactive measurements of rise and fall times, pulse widths, frequency, duty cycle, and so forth. In the case of frequency measurements, the tools used to perform these measurements should automatically locate the proper edges. In addition, it should be possible for the user to lock/anchor these tools on time values or threshold levels (absolute values or as a percentage of the signal’s full swing).
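As one small example of a threshold-based measurement, the sketch below computes a 10%-to-90% rise time, with the thresholds expressed as a percentage of the signal's full swing. The single exponential edge used as test data is, again, purely illustrative.

    import numpy as np

    def rise_time(t, v, low_pct=0.10, high_pct=0.90):
        """Time for v to go from low_pct to high_pct of its full swing (single rising edge)."""
        v_min, v_max = v.min(), v.max()
        lo = v_min + low_pct * (v_max - v_min)
        hi = v_min + high_pct * (v_max - v_min)
        t_lo = np.interp(lo, v, t)     # assumes v rises monotonically over the edge
        t_hi = np.interp(hi, v, t)
        return t_hi - t_lo

    # A single RC-style rising edge with tau = 1 ns.
    t = np.linspace(0.0, 10e-9, 1001)
    v = 3.3 * (1.0 - np.exp(-t / 1e-9))
    print(f"10%-90% rise time: {rise_time(t, v) * 1e9:.2f} ns")   # ~2.2 * tau for an RC edge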
Because a large proportion of today’s analog designs have some level of digital content, the AVAD environment also should provide adequate capabilities for digital signals. In many cases, it’s necessary to perform digital-to-analog (D/A) or analog-to-digital (A/D) transformation functions on signals. The environment should therefore be capable of taking an analog signal and converting it into a digital equivalent. In such a case, the digital version may be a simple 0 or 1 relative to a specified threshold. Or it may be translated into a vector, in which case the user should have the ability to specify the number of bits, quantization levels, and sample rates. Furthermore, it should be possible to display the resulting logic values in binary, octal, decimal, hexadecimal, or ASCII.
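A rough Python sketch of such an analog-to-digital transformation appears below. It samples an analog waveform at a user-specified rate, quantizes it into an N-bit code, and also shows the simple 1-bit threshold case; the voltage range, bit width, and sample rate are arbitrary values chosen for the example.

    import numpy as np

    def quantize(t, v, n_bits=4, sample_rate=1e6, v_min=0.0, v_max=3.3):
        """Sample an analog waveform and quantize it into 2**n_bits discrete codes."""
        t_sample = np.arange(t[0], t[-1], 1.0 / sample_rate)
        v_sample = np.interp(t_sample, t, v)
        levels = 2 ** n_bits
        codes = np.clip(np.round((v_sample - v_min) / (v_max - v_min) * (levels - 1)),
                        0, levels - 1).astype(int)
        return t_sample, codes

    t = np.linspace(0.0, 1e-3, 10001)
    v = 1.65 + 1.65 * np.sin(2 * np.pi * 2e3 * t)          # a 0-to-3.3-V, 2-kHz sine
    t_s, codes = quantize(t, v, n_bits=4, sample_rate=100e3)
    print("first samples in hex:", " ".join(f"{int(c):X}" for c in codes[:8]))
    bits = (np.interp(t_s, t, v) > 1.65).astype(int)       # the simple 1-bit threshold case
    print("as a 1-bit stream:", "".join(str(b) for b in bits[:16].tolist()))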
With regard to the analysis and debug of repetitive clock and/or data signals over a large number of cycles, the environment should support the concept of eye diagrams. These diagrams can be used to visualize and analyze timing, jitter, signal degradation, and the effects of noise. Such diagrams are of particular interest for functions like phase-locked loops (PLLs), delay-locked loops (DLLs), high-speed serial interconnects (PCI Express, Rocket I/O, etc.), and double-data-rate (DDR) memory interfaces. The user should be able to select a signal and request that the system automatically create an eye diagram from it. These diagrams should support the analysis of clock-period, cycle-to-cycle, and clock-to-data jitter. In addition, they should automatically perform the eight-point eye-diagram measurements required by DDR-memory applications.
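At its core, building an eye diagram means folding a long waveform into overlaid unit-interval segments, as the simplified Python sketch below shows. It assumes the unit interval is already known; a real tool would recover the clock from the data, and the random bit stream here merely stands in for simulation output.

    import numpy as np

    def fold_into_eye(t, v, unit_interval, samples_per_ui=50):
        """Fold a long waveform into overlaid unit-interval segments (the 'eye')."""
        n_ui = int(round((t[-1] - t[0]) / unit_interval))
        ui_time = np.linspace(0.0, unit_interval, samples_per_ui)
        segments = []
        for k in range(n_ui):
            t0 = t[0] + k * unit_interval
            segments.append(np.interp(t0 + ui_time, t, v))   # resample one UI onto a common grid
        return ui_time, np.array(segments)                   # each row is one trace of the eye

    # A 1-Gb/s random bit stream with softened edges, as a crude stand-in for simulation output.
    ui = 1e-9
    t = np.linspace(0.0, 200 * ui, 200 * 50)
    bits = np.random.randint(0, 2, 200)
    ideal = np.repeat(bits.astype(float), 50)
    v = np.convolve(ideal, np.ones(10) / 10, mode="same")    # smooth the edges slightly
    ui_time, eye = fold_into_eye(t, v, ui)
    print(f"eye built from {eye.shape[0]} unit intervals; plot each row versus ui_time to view it")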
All of these aspects are just a starting point. After all, the AVAD environment also needs to be able to analyze and display the results from statistical simulations. Following a multi-thousand-run Monte Carlo simulation, for example, it should be possible to instruct the system to determine the variation in the width of a specified pulse, measured at the 3-V level, and present the results as a histogram. Similarly, it should be possible to create parametric plots resulting from multi-variable sweep analysis. Such plots will facilitate the visualization and analysis of the effects of a measured response versus one or more sweeping conditions (see the Figure).
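The sketch below illustrates that kind of statistical post-processing: it measures the width of each run's pulse at the 3-V level and histograms the results. The synthetic trapezoidal pulses stand in for real Monte Carlo output, and all of the parameter values are invented for the example.

    import numpy as np

    def width_at_level(t, v, level):
        """Width of the first pulse, measured at `level`, with linear edge interpolation."""
        above = v > level
        rise = np.where(~above[:-1] & above[1:])[0][0]        # sample just before the rising crossing
        fall = np.where(above[:-1] & ~above[1:])[0][0]        # sample just before the falling crossing
        t_rise = t[rise] + (level - v[rise]) / (v[rise + 1] - v[rise]) * (t[rise + 1] - t[rise])
        t_fall = t[fall] + (level - v[fall]) / (v[fall + 1] - v[fall]) * (t[fall + 1] - t[fall])
        return t_fall - t_rise

    # Synthetic "Monte Carlo" results: 2000 trapezoidal pulses whose width and edges vary run to run.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 20e-9, 2001)
    widths = []
    for _ in range(2000):
        w = rng.normal(5e-9, 0.3e-9)                           # nominal 5-ns width at the top
        edge = rng.normal(0.5e-9, 0.05e-9)
        v = np.interp(t, [0, 2e-9, 2e-9 + edge, 2e-9 + edge + w, 2e-9 + 2 * edge + w, 20e-9],
                      [0, 0, 5.0, 5.0, 0, 0])
        widths.append(width_at_level(t, v, 3.0))               # width at the 3-V level
    counts, edges = np.histogram(np.array(widths) * 1e9, bins=30)
    peak = np.argmax(counts)
    print("width at 3 V: mean %.2f ns, sigma %.3f ns, histogram peak bin %.2f-%.2f ns"
          % (np.mean(widths) * 1e9, np.std(widths) * 1e9, edges[peak], edges[peak + 1]))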
As a final example of the capabilities that are required of a de-facto industry-standard AVAD environment, consider the requirements of post-layout verification. Following parasitic extraction, detailed-standard-parasitic-format (DSPF) netlists can contain extremely large numbers of randomly generated net names. Those names correspond to the parasitic resistance and capacitance elements associated with the different net segments. As a result of their presence in the netlists, crosstalk and ground-bounce issues become hard to diagnose. Often, such issues go undetected during post-layout verification. The randomly generated net names in the simulation results make it extremely difficult to locate the original, ideal net from which they were derived. Thus, the environment should support the ability to cross-probe from the simulation waveform back to the original ideal net in the schematic.
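As a purely illustrative aid, the sketch below maps a parasitic subnode name back to its parent net by assuming one common DSPF convention: appending ":<index>" to the original net name. Extractors differ in their naming schemes, so this is an assumption rather than a universal rule, and the example names are invented.

    import re

    # Assumes the common DSPF convention where parasitic subnodes append ":<index>"
    # to the original net name (e.g., "X1/sum_out:17"); extractors differ, so treat
    # this mapping as illustrative rather than universal.
    SUBNODE_RE = re.compile(r"^(?P<net>.+?):\d+$")

    def ideal_net(parasitic_name):
        """Map a parasitic subnode name from the simulation results back to its ideal net."""
        match = SUBNODE_RE.match(parasitic_name)
        return match.group("net") if match else parasitic_name

    for name in ["X1/sum_out:17", "vdd:3", "clk"]:
        print(f"{name:15s} -> {ideal_net(name)}")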
Real-World Analysis and Debug
An all-encompassing AVAD solution also should support the ability to load data from oscilloscopes into the environment for the following reasons:
- The software environment will support a much wider range of analysis capabilities than what’s offered by the oscilloscope.
- The expensive oscilloscope hardware is freed up and made available to other users.
- The real-world data can then be processed as required (for example, smoothing any sensor-related noise) and compared to the simulation results.
- If oscilloscope results are brought into the software environment, exactly the same algorithms will be used on both measured and simulated data.
- Subsequently, the (possibly processed) signals from the oscilloscope can be exported in PWL format. Real-world measured data can then be used as sources in downstream simulations. For example, users could capture waveforms from real-world noisy power supplies and convert them to PWL format. They could then use them as the power-supply source in simulations, which would allow them to check the robustness of the design and its immunity to noise. (A sketch of this import, smooth, and export flow follows the list.)
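Here is a minimal Python sketch of that flow under some stated assumptions: the oscilloscope exports a simple two-column time/voltage CSV file (real instruments vary), the noise is knocked down with a basic moving average, and the output is written as one time/value pair per line, which is one common PWL file layout (the target simulator's expected format should be checked). The file names are hypothetical.

    import csv
    import numpy as np

    def read_scope_csv(path):
        """Read a two-column time,voltage CSV as exported by many oscilloscopes (formats vary)."""
        t, v = [], []
        with open(path, newline="") as f:
            for row in csv.reader(f):
                try:
                    t.append(float(row[0]))
                    v.append(float(row[1]))
                except (ValueError, IndexError):
                    continue                       # skip header or malformed lines
        return np.array(t), np.array(v)

    def smooth(v, window=15):
        """Simple moving-average smoothing to knock down sensor/probe noise."""
        kernel = np.ones(window) / window
        return np.convolve(v, kernel, mode="same")

    def write_pwl(path, t, v):
        """Export the (possibly processed) waveform as 'time value' pairs, one per line."""
        with open(path, "w") as f:
            for ti, vi in zip(t, v):
                f.write(f"{ti:.9g} {vi:.6g}\n")

    t, v = read_scope_csv("noisy_supply.csv")      # hypothetical capture of a noisy power supply
    write_pwl("noisy_supply.pwl", t, smooth(v))    # ready to drive a supply source in simulation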
The AVAD environment also should provide the ability to read in the I/O-buffer-information-specification (IBIS) models; to display and compare multiple IBIS models; and to compare these models with the results from simulations and/or captured real-world data (from oscilloscopes).
Customization and Automation
It simply isn’t possible to create an "off-the-shelf" AVAD environment that can completely satisfy every user’s unique requirements. Instead, it should be possible to customize and extend the environment. For this reason, users should be able to create scripts in commonly used scripting languages, such as Tcl and Perl. They should then be able to launch these scripts from within the environment’s graphical user interface (GUI) or in batch mode.
In the case of the GUI, it should be possible to add custom menus to facilitate the launching of scripts that need to be run repeatedly and/or by other users. For example, the engineer should be able to create a script to automatically detect, locate, and report voltage glitches that are greater than a specified voltage amplitude and/or narrower than a specified duration at a defined threshold. This script should be associated with a custom menu item.
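The article envisions such scripts being written in Tcl or Perl and attached to a custom menu; the detection logic itself is sketched below in Python for illustration. It flags excursions above a defined threshold that exceed a minimum amplitude yet are narrower than a maximum width. All thresholds, names, and the synthetic trace are invented for the example.

    import numpy as np

    def find_glitches(t, v, threshold, min_amplitude, max_width):
        """Report excursions above `threshold` that exceed `min_amplitude` yet are
        narrower than `max_width` -- i.e., likely glitches rather than real pulses."""
        above = v > threshold
        starts = np.where(~above[:-1] & above[1:])[0] + 1
        ends = np.where(above[:-1] & ~above[1:])[0] + 1
        glitches = []
        for s in starts:
            later_ends = ends[ends > s]
            if len(later_ends) == 0:
                break
            e = later_ends[0]
            width = t[e] - t[s]
            peak = v[s:e].max()
            if peak - threshold >= min_amplitude and width <= max_width:
                glitches.append((t[s], width, peak))
        return glitches

    # Synthetic trace: a clean 10-ns pulse plus a 200-ps spike that should be flagged.
    t = np.linspace(0.0, 50e-9, 50001)
    v = np.where((t > 10e-9) & (t < 20e-9), 3.3, 0.0)
    v += np.where(np.abs(t - 35e-9) < 100e-12, 2.5, 0.0)
    for start, width, peak in find_glitches(t, v, threshold=1.0, min_amplitude=0.5, max_width=1e-9):
        print(f"glitch at {start * 1e9:.2f} ns, width {width * 1e12:.0f} ps, peak {peak:.2f} V")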
It’s very important for this environment to be format-independent. In other words, it should be possible to create a single set of scripts that work with the results from all industry-standard analog simulators and simulation environments.
The analog and mixed-signal content of today’s complex electronic systems is increasing dramatically. But the tools that are used to analyze and debug these designs aren’t keeping pace with the increasingly sophisticated requirements of design engineers. The EDA industry needs one vendor to focus on the analysis, verification, and debugging of results from analog simulation. This vendor should provide a suite of tools that work with all of the industry-standard analog simulators, simulation file formats, and simulation environments.
These tools should be capable of being tightly and seamlessly integrated into the host environment. In addition, they should support a complete range of analysis capabilities including transient analysis, frequency analysis, statistical analysis, and more. Furthermore, these tools should be capable of importing, analyzing, and processing real-world data from oscilloscopes, for example, and exporting these (and simulated) results in piece-wise-linear (PWL) format. The results can then be used as the stimulus for downstream simulations. Now that ICs are including more and more analog and mixed-signal content, only tools that can accomplish these tasks will allow design productivity to keep pace with design complexity.