0805FR-Collaboration.doc

Keywords: point solutions, second-tier, standard, tool flow

Editorial Feature Tabs: Digital – Methods and Tools (EDA)

@head:Second-Tier EDA Vendors Must Collaborate to Survive

@deck:With today’s mature chip-design flow, innovative point solutions must support multiple vendor formats and strive to work within established standards.

@text:A subtle yet fundamental shift has occurred in the EDA-tool community. Startup and second-tier companies no longer compete directly with the first-tier giants (i.e., Synopsys, Cadence, Magma, and Mentor). Gone are the days when companies like Magma rose to prominence by challenging the established EDA vendors. Today’s second-tier firms can no longer survive by competing with the big boys. Instead, they must work with the chip-development flow created by those first-tier vendors, offering collaborative products that support many different vendors’ formats--tools that remain agnostic to the first-tier programs.

This tool-agnostic approach is quite a change from the late ’90s, when the second-tier startup companies aligned themselves with one of the first-tier players. These startups effectively positioned their products to be in direct competition with the other major EDA-tool vendors.

What factors have caused this shift to a more collaborative model among the second-tier companies? What new challenges must now be faced? This report answers these questions by covering a mix of second-tier companies that follow the collaborative, “tool-agnostic” approach.

SoC Design Solved

Over the last several years, the first-tier EDA companies have been working hard to integrate their tool suites fully into a core design flow. The Magma Blast Fusion suite, for example, integrates synthesis, change-management ECOs, place and route, and extraction for validation in one closed loop. Similarly, Synopsys’ IC Compiler and Cadence’s Nano-Encounter provide the same closed-loop design environment.
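
To make the notion of a “closed loop” concrete, the following Python sketch mimics such a flow at a purely conceptual level: synthesize once, then iterate place and route, extraction, and timing checks, applying an ECO whenever timing is missed. Every function here is a mock stand-in invented for illustration; the real suites perform these steps internally on actual design data.

```python
# Conceptual sketch of a closed-loop implementation flow: synthesize once,
# then iterate place and route, extraction, and timing checks, applying an
# ECO whenever timing is missed.  Every function is a mock stand-in.
from dataclasses import dataclass


@dataclass
class TimingReport:
    worst_slack_ps: float  # negative slack means the design misses timing


def run_synthesis(rtl: str) -> str:
    return f"netlist({rtl})"  # mock: RTL to gate-level netlist


def place_and_route(netlist: str, eco_count: int) -> str:
    return f"layout({netlist}, ecos={eco_count})"  # mock physical implementation


def extract_parasitics(layout: str) -> str:
    return f"spef({layout})"  # mock RC extraction used for validation


def check_timing(layout: str, parasitics: str, eco_count: int) -> TimingReport:
    # Mock model: each ECO recovers 30 ps of slack on a 50-ps violation.
    return TimingReport(worst_slack_ps=-50.0 + 30.0 * eco_count)


def close_the_loop(rtl: str, max_ecos: int = 5) -> str:
    netlist = run_synthesis(rtl)
    for eco_count in range(max_ecos):
        layout = place_and_route(netlist, eco_count)
        spef = extract_parasitics(layout)
        report = check_timing(layout, spef, eco_count)
        if report.worst_slack_ps >= 0:
            return layout  # timing met: the loop is closed
    raise RuntimeError("timing did not converge within the ECO budget")


print(close_the_loop("alu.v"))
```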

In other words, the high-level, core design flow for a typical vanilla-type chip has been “solved”--at least in terms of timing criticality and functionality. As Pallab Chatterjee, President of SiliconMap, LLC, observes, this core design flow uses an ARM library, because ARM and Artisan own 75% of the semiconductor world’s IP. It also uses Virage-compiled memory, since Virage holds 90% of that memory market. Chatterjee poses this question: How can a second-tier EDA startup differentiate itself if a typical chip is developed around a standard--e.g., 802.11g--using a Synopsys flow with ARM IP and Virage memory?

The answer is straightforward. Second-tier companies must offer differentiated point solutions that support the de facto development flow. These point solutions usually focus on performance optimization, improved design productivity, or alternative design exploration in ways that aren’t covered by the first-tier players. But the change for second-tier vendors from a competitive to a collaborative relationship with the first-tier giants is something new.

For the first time in about six years, Chatterjee observes, the second-tier players aren’t aligning themselves with one vendor. Instead, they’re offering tools that are partner-agnostic. For instance, companies like Prolific, Zenasis, OEA, Silvaco, and others are focusing on the performance optimization of different aspects of the established tool flow.

With the establishment of a de facto development flow for creating complex ASIC and even FPGA chips, anyone who wishes to enter the EDA market must support that flow. Of course, end users have additional reasons to continue with the established flows. One of the most compelling reasons is that chip companies have already invested large sums of money in both EDA tools and training.

Yet as Knowlent’s CEO, Sandipan Bhanot, notes, time investments are an equally important incentive to stay within established tool methodologies. Semiconductor companies have spent many man-months--even man-years--of effort ensuring that all of the tools in their design flows actually work together. With such a large commitment of money and time, is it any wonder that most chip-development companies are reluctant to add new tools to their flows?

But such companies do invest in point tools in two instances: when a tool fits easily into their design flow and when certain design constraints simply mandate that an optimized point solution be used. Regarding the former, Kumar Venkatramani, Executive Consultant for Softjin Pvt. Ltd., says, “To take a customer’s viewpoint, there are very few points in time in the design cycle when a customer is willing to consider a new tool suite. However, if a point tool is available that can help them get their job done faster, cheaper, and better, they will consider doing this at other points in the life of the product.”

Certain key areas in the design of today’s complex chips may require tools that provide optimal solutions. Rich Faris, Director of Marketing for Real Intent, Inc., observes that design teams cannot compromise in critical performance areas like power or die area. In these cases, they choose to supplement the first-tier vendors’ proprietary flows with best-in-class solutions from smaller EDA firms that focus on their area of expertise.

This shift is a subtle but important one, as it moves toward agnostic tools that complement the design flow of the first-tier companies. Such a shift has largely been made possible by the standardization of interfaces. Startups that align themselves with the Cadence design flow must support the OpenAccess (OA) database. Similarly, Chatterjee notes that Synopsys’ client base is building everything that’s socketed into Milkyway through either the Scheme application programming interface (API) or the C API. The back end of the chip-development process is being driven by Mentor’s Calibre program--in particular, the Standard Verification Rule Format (SVRF) rule-writing language.
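
The practical upshot of these standard interfaces is that a point tool can be written against a thin, database-agnostic layer and then socketed into whichever flow the customer runs. The Python sketch below illustrates that pattern only; the backend classes, method names, and returned data are hypothetical placeholders, not the actual OpenAccess or Milkyway APIs.

```python
# Illustrative sketch of a database-agnostic point tool.  The backend
# classes, method names, and returned data are hypothetical placeholders;
# a real tool would call the actual OpenAccess or Milkyway interfaces.
from abc import ABC, abstractmethod


class DesignDatabase(ABC):
    """The minimal interface this point tool needs, independent of the flow."""

    @abstractmethod
    def cell_names(self) -> list[str]: ...

    @abstractmethod
    def net_count(self, cell: str) -> int: ...


class OpenAccessBackend(DesignDatabase):
    def __init__(self, lib_path: str):
        self.lib_path = lib_path  # a real backend would open an OA library here

    def cell_names(self) -> list[str]:
        return ["alu_core", "fifo_ctrl"]  # placeholder data

    def net_count(self, cell: str) -> int:
        return 1200  # placeholder data


class MilkywayBackend(DesignDatabase):
    def __init__(self, lib_path: str):
        self.lib_path = lib_path  # a real backend would go through the C or Scheme API

    def cell_names(self) -> list[str]:
        return ["alu_core", "fifo_ctrl"]  # placeholder data

    def net_count(self, cell: str) -> int:
        return 1200  # placeholder data


def report_design_size(db: DesignDatabase) -> None:
    """The tool's own logic never cares which first-tier flow owns the data."""
    for cell in db.cell_names():
        print(f"{cell}: {db.net_count(cell)} nets")


report_design_size(OpenAccessBackend("./libs/chip_top"))
```

In this arrangement, only the thin backend adapter changes when the tool moves from one first-tier flow to another; the point tool’s core algorithms stay untouched.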

Standards Enable

Working within the established chip-development tool flow is critical for second-tier companies. In fact, it is a requirement for doing business, notes Stan Krolikoski, CEO of ChipVision. “It’s difficult enough for customers to set up a flow that works and to maintain that flow,” observes Krolikoski. Thus, all new tools need to hook into the flow that already exists.

That “hook” is achieved through standard interfaces. Such standards are the glue that allows all EDA solutions to interoperate. Many experienced designers can remember how standardizing the format of their data files led to the sharing of design data among many different programs. Similarly, standardizing design databases will help chip designers assemble the most appropriate tools for their projects. Examples of such standard databases include OpenAccess and Synopsys’ Milkyway. Common databases will allow enormous portability of designs.

Until recently, however, EDA companies kept their chip-development flows closed. Integration into such a proprietary flow was only possible when allowed by the flow vendor. But this situation is changing, thanks to the emergence of common databases like Cadence’s OpenAccess, states Jeff Lewis, President of CiraNova, Inc. He explains that OpenAccess supports plug-and-play tools, allowing products from many vendors to interoperate as well as any proprietary flow does.

Standards are good for competition because users aren’t locked into one solution, notes John Emmit, Business Development Manager for Carbon Design Systems. He elaborates that the customer’s ongoing investment in EDA design-verification tools and training is significant. “They can’t justify the time and expense to integrate new tools that don’t easily fit into their flow. And customers must have the flexibility to add leading-edge point tools to their flow in order to achieve and maintain a competitive advantage.”

Standards are the reason that point solutions can fit into the existing development flow. But exceptions are sometimes necessary. As Jake Karrfalt, CEO of Alternative Systems Concepts, cautions, it is important to start by supporting standards. Yet second-tier vendors must also provide switches that can “divert to vendor-dependent (proprietary) exceptions.”
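
Here is a minimal sketch of that “switch” idea, assuming a hypothetical command-line tool: the default path emits a standard SDC-style constraint, and a proprietary dialect is produced only when the user explicitly asks for it. The --vendor flag, vendor names, and dialect strings are invented purely for illustration.

```python
# Sketch of the "switch" idea: emit standard-conforming output by default
# and divert to a vendor-dependent dialect only when explicitly requested.
# The --vendor flag, vendor names, and proprietary dialect strings are
# invented purely for illustration; only the default line is standard SDC.
import argparse
from typing import Optional


def emit_clock_constraint(period_ns: float, vendor: Optional[str]) -> str:
    standard = f"create_clock -period {period_ns} [get_ports clk]"  # standard SDC
    if vendor is None:
        return standard  # default path: stay on the standard
    # Vendor-dependent exception path, kept behind an explicit switch.
    exceptions = {
        "vendor_a": standard + "  ;# plus a proprietary annotation",
        "vendor_b": f"set_clock period={period_ns} port=clk",  # invented dialect
    }
    return exceptions.get(vendor, standard)  # unknown vendors fall back to the standard


parser = argparse.ArgumentParser(description="constraint writer with a vendor switch")
parser.add_argument("--vendor", default=None, help="divert to a proprietary dialect")
args = parser.parse_args()
print(emit_clock_constraint(10.0, args.vendor))
```

Running the sketch with no arguments prints the standard constraint; adding --vendor vendor_b prints the invented proprietary form.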

Standards Challenge

Standards are at once an enabler and a source of headaches for most second-tier companies. They enable interoperability, but they do so at a high cost. Steve Sapiro, VP of Marketing and Chief Reality Officer for Stelar Tools, Inc., explains that his company has to design its tool from scratch to fit into any scenario and work with any data format. The tool also has to be flexible enough that, as things change (which they will), it still fits comfortably into the customer’s design and verification flow. “It’s a full-time job to make sure that we work with all the tools that the customer wants to use,” remarks Sapiro.

Rob Roy, VP of Marketing and Business Development for Zenasis Technologies, agrees that standards-based point solutions must overcome many challenges. “As tool providers, we have to keep track of changes in tools in the traditional design flows,” notes Roy. Any changes in the standards or traditional design flows mean accompanying changes to all appropriate APIs. Custom design flows for specific vendors must be maintained as well. A more subtle cost involves bringing the established vendors’ tools in-house for testing and flow integration. Roy states, “Many times, this becomes a difficult process since the existing vendors impose stringent requirements to have their tools available for this purpose.”

Meeting the specifications of a standard is often not enough, however. Sometimes, the greater challenge comes from a lack of standards. Gigascale’s Adam Traidman explains that even common file and model formats within EDA, such as Synopsys Liberty and Cadence LEF, are riddled with interoperability issues. He notes, “The existence of over a half-dozen mainstream ‘versions’ of these formats and a lack of communication among EDA and IP vendors create a quagmire” for designers.

Traidman says that one reason things have turned out this way is that these original “standards” were often the product of a single company’s engineering efforts and product vision. By virtue of their single-minded creation, they naturally don’t meet the needs of the general industry. They also are “old,” often dating back more than 10 years. Although versioning has added some features, these formats are largely becoming outdated. They lack much of the metadata required to use IP efficiently and to interoperate between tool flows.
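
In practice, point-tool developers often cope with this quagmire by reading defensively: extract only the constructs the tool actually needs and tolerate everything it does not recognize. The sketch below applies that idea to a deliberately simplified Liberty-like fragment; real Liberty files are far richer, and the scanning approach shown is illustrative rather than a production parser.

```python
# Defensive reading of a "standard" format with many dialects: scan only for
# the constructs this tool needs (cell names and areas) and silently skip
# anything unrecognized.  The fragment below is a simplified illustration.
import re

LIBERTY_FRAGMENT = """
library (example_lib) {
  delay_model : table_lookup;
  cell (INV_X1) { area : 1.064; }
  cell (NAND2_X1) { area : 1.596; vendor_specific_attr : 42; }
}
"""

CELL_RE = re.compile(r"cell\s*\(\s*(\w+)\s*\)\s*\{([^}]*)\}", re.DOTALL)
AREA_RE = re.compile(r"area\s*:\s*([\d.]+)")


def cell_areas(liberty_text: str) -> dict:
    """Return {cell_name: area}, ignoring attributes the tool does not know."""
    areas = {}
    for name, body in CELL_RE.findall(liberty_text):
        match = AREA_RE.search(body)
        if match:  # tolerate cells that omit the attribute entirely
            areas[name] = float(match.group(1))
    return areas


print(cell_areas(LIBERTY_FRAGMENT))  # {'INV_X1': 1.064, 'NAND2_X1': 1.596}
```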

Standards--or a lack thereof--represent critical technical challenges for second-tier vendors. The business side of the equation has its own obstacles--the most critical of which is profitability. William Ruby, VP of Marketing for Golden Gate, states, “In order to be commercially successful, tools that complement the traditional flow must provide significant additional value. The customer must be able to justify the new tool based on what he or she perceives is the return on investment (ROI). So the tools that complement traditional design flows must provide a good ROI to be adopted.”

The need to provide collaborative, tool-agnostic products has become a technical and business requirement for most second-tier EDA vendors. But the trend toward collaborative tools isn’t limited to EDA companies. Chip and IP suppliers like QuickLogic have recently shifted from “features and functions” to “platforms and solutions.” Today, such shifts to a platform or system level are said to be in support of the ecosystem. But no matter the vernacular of the day, second-tier EDA and hardware solutions must work together to survive.