======
FPGA Developer
July 6, 2006
Development Tools
======
Welcome to the July 2006 edition of FPGA Developer. We complement Chip Design magazine by providing the latest FPGA and Structured ASIC news, opinions from industry experts, and timely technology articles. See below for subscribe and unsubscribe options.
This Month's Table of Contents:
1. Editor’s Note – The Grasshopper's Tale
2. Ease of Design Complements FPGA Performance Goals
3. Advances in Compiler Technology & Tools for FPGAs
4. Space Flight FPGAs Save Time and Mitigate Risks
5. PCI Express PHY Development Kit Simplifies Migration
6. 65nm FPGA Logic Design Solution
7. IP Core Portfolio Expands Support
8. FPGA-based High-Performance Computing for IBM BladeCenter Applications
9. Research Agreement Concentrates on FPGA-Based Reconfigurable Computing and DSP
10. In-Depth Coverage Links
For Better Designs, Add SPICE to Taste
RTL++ and the Return of the Tall Thin Designer
11. New Book
Real World FPGA Design with Verilog
12. Happenings
**************
1. Editor’s Note
**************
The Grasshopper's Tale
By Jim Kobylecky, Editor
<INSERT 0606_FPGAD_EN1.JPG>
I've always admired (and envied) colleagues who can stay focused – the ants, people who decide on a goal and keep going, no matter what the obstacle. That's not me, I'm afraid; obstacles drive me grasshopper-crazy. I'm always looking for another path, and that often requires a new tool or a new way of looking at the problem. That's why I enjoyed this issue's focus on "Development Tools." In our Viewpoints section, you'll find Mark Goosman concluding our tool miniseries and suggesting that designers need to look at more than just performance specifications. He argues that it's a balance of factors (and a balance of tools) that determines design success. Meanwhile, Neil Harold, in his Viewpoint, thumps away at the design barriers that have been holding back the use of FPGAs. He details new and developing tools that are already breaking through. Our focus on tools and solutions repeats in the In Depth selections and the Book Review. You'll even catch a glimpse in our regular survey of the News.
Tools are a necessary part of any solution, but sometimes a solution also calls for creative thinking – and the inclination to jump sideways as well as forward. It's the key defense mechanism of us frustration-averse types.
As I've said, I greatly respect my more focused co-workers. I think, through history, they've been the ones who have achieved the Really Big Things, and I'm happy to applaud them. Sometimes, however, you really can't get there from here, or maybe you can, but not in a practical way. Human-powered flight comes to mind. There's more to being Superman than the color of the sun.
Other times we do solve the problems and develop the means, only to discover that our target has disappeared. This can be as simple as a closed market window or as world-challenging as a paradigm shift.
Take the giant electromagnet built by the Federal Telegraph Company to achieve radio transmission across the Pacific. A necessary step for the radio technology of its day, it had been rendered obsolete by the vacuum tube. Eighty-five tons of engineering dreams and man-hours, condemned to rust away in a Palo Alto storage dump.
That's where Ernest Lawrence, the father of American experimental physics, and Stanley Livingston found it in 1931. That rusty magnet, something they could not have afforded to buy or build, became the heart of their early cyclotrons. It became a cornerstone of early atomic physics, but only because it had already failed in its intended role.
I don't, of course, hope that any of my failed projects will build the next atomic bomb, but that’s not the point. What is important is that in science nothing is wasted. You never know when a failed experiment will provide the key to something else. We’re always learning something, and our experience fails us only if we let it. It's what we tell our kids at Little League and soccer games, that winning isn't everything, but it's just as true at work. How we learn to handle barriers, and how we search them for new opportunities, will always shape our future, personal and professional. (And if sharing what I've learned saves the ant in the next cubicle a few hours or months, then great! I've helped the whole corporate ecosystem succeed.)
********************
2. Viewpoint – Exclusive
********************
Editor's Note: This is the final installment of a three-part mini-series on crucial FPGA development topics.
Ease of Design Complements FPGA Performance Goals
By Mark Goosman, Product Marketing Manager, Xilinx
<INSERT 0706_FPGAD_VP1.tif>
A quick scan of the headlines in industry publications suggests that the lion’s share of FPGA vendors’ focus is on performance. But saying the emphasis is on speed tells only part of the story. Independent surveys and feedback from customers show that the number-one concern of designers is timing closure. If performance targets weren’t so aggressive, this would be an easier task, but these gains in performance are a response to market demand. Performance, however, can’t be the only goal, pursued at the expense of power, productivity, functionality, and repeatability.
The less flashy part of FPGA vendors’ goal is to provide an environment that delivers ease-of-design. This means allowing the designer to achieve uncompromising design goals in an environment that reduces complex design hurdles. Ease-of-design means integrating the individual tools, whether they are supplied by the silicon vendor or by an EDA partner. It means providing tools that help designers understand power issues early in the design process and stay within their power budget. It also means making the most efficient use of hardware resources.
Not so long ago, FPGAs were used primarily in high-end, low-volume equipment, which constituted a fraction of the total marketplace. This was due in part to the relatively high cost of FPGAs (compared to custom devices) and to the limitations of FPGA design tools. Experience has shown that serving the remainder of the marketplace requires a combination of leading-edge silicon architecture, tight integration with third-party EDA products, and a robust design environment delivering fast time-to-market and functionality that addresses specific user demands.
The balancing act is to provide a solution that delivers the highest level of functionality, the lowest power, the least risk, and the greatest ease of use at the lowest cost. Oh, and it has to be fast. The part of this equation that concerns the majority of designers most, and that is easiest to quantify, is performance, but performance alone doesn’t satisfy all of these concerns.
New and growing demands drive a great deal of research and development for FPGA vendors. One example is the emergence of what has been termed “triple play.” Triple-play service, or digital convergence, is a marketing term for the provisioning of three services (voice, video, and data) over a single broadband (IP) connection. These market demands for new applications bring growing pressure not only for performance but also for quality of results in other areas such as power, cost, and rapid development.
<INSERT 0706_FPGAD_VP1_Fig1.JPG>
Delivering a solution to the challenges presented by triple play involves much more than just performance. A combination of silicon, design tools, and IP is needed to address the problems associated with jitter, packet loss, power, and scalability in addition to performance. New innovations, combined with existing technologies, deliver the high performance, low power, low system cost, and maximum productivity needed for the high-density, high-performance system-on-chip (SoC) designs at the heart of the core infrastructure applications that enable triple-play digital convergence.
Ease-of-design also means reduced risk, especially late in the design cycle. Repeatability is closely associated with performance, but for the designer struggling with changes in a design, and with the impact those changes create late in the schedule, it can be a concern all its own. Through internal development and work with partners, FPGA vendors are working to facilitate a more hierarchical design methodology. Floorplanning, once the bane of FPGA designers, has matured to become an integral part of many design flows. With the ability not only to improve results for the current design but also to facilitate team-based design and IP reuse, designers can realize a long list of benefits.
<INSERT 0706_FPGAD_VP1_Fig2.JPG>
Recent years have seen greater demand for partial reconfiguration: the ability to replace a subset of the logic in an FPGA while the remainder of the device continues to operate. This functionality is a vital component of software-defined radio (SDR) and other applications. While these applications have their own performance demands, a complete development environment for this complex design flow is essential. An environment that allows the designer to easily manage the multitude of combinations of static and dynamic blocks of logic can better assure design success. Partial reconfiguration also lets applications improve power and reduce board space, in addition to reducing project costs, by replacing multiple devices with a single FPGA.
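To make this concrete, here is a minimal host-side sketch, in C, of how an SDR application might swap decoder modules in and out of a single dynamic region while the static logic keeps running. The function and file names (load_partial_bitstream, the *.bit files) are hypothetical stand-ins for whatever vendor-specific mechanism actually delivers a partial bitstream to the device; this is not any particular vendor's API.

#include <stdio.h>

/* Hypothetical sketch: one static region (bus interface, radio front end)
 * plus one dynamic region that can hold any of three waveform decoders.
 * Each decoder is implemented against the same static design, yielding
 * one partial bitstream per module. */

typedef enum { WAVEFORM_FM, WAVEFORM_QPSK, WAVEFORM_OFDM } waveform_t;

/* Stand-in for a vendor-specific call that streams a partial bitstream
 * into the FPGA's configuration port while the rest of the device runs. */
static int load_partial_bitstream(const char *path)
{
    printf("configuring dynamic region from %s\n", path);
    return 0; /* 0 = success */
}

static int select_waveform(waveform_t w)
{
    static const char *bitstreams[] = {
        "decoder_fm_partial.bit",
        "decoder_qpsk_partial.bit",
        "decoder_ofdm_partial.bit",
    };
    return load_partial_bitstream(bitstreams[w]);
}

int main(void)
{
    select_waveform(WAVEFORM_FM);   /* start by decoding FM */
    select_waveform(WAVEFORM_OFDM); /* later, swap in the OFDM decoder;
                                       the static logic is never disturbed */
    return 0;
}

One device thus plays the role of three, which is where the power, board-space, and cost savings come from.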
There is a great deal of focus on areas that sit far from the spotlight of performance. Providing users with solutions for hurdles like application-specific IP, source code version control, rapid design verification, hierarchical design, structured messages, and others may not make front-page headlines, but it is important in improving a designer’s productivity and increasing the quality of their designs.
As more and more attention is placed on achieving the industry’s fastest performance, a lot of less glamorous work is being done to improve all areas of the design environment. In balancing the demands of high performance, low power, ease of use, and low cost, performance will continue to enjoy the spotlight; however, all designers will benefit from an environment that delivers true ease of design.
Comments about this article? Share your thoughts by writing our editorial director.
********************
3. Viewpoint – Exclusive
********************
Advances in Compiler Technology & Tools for FPGAs
By Neil Harold, Systems Technical Specialist, Nallatech
<INSERT 0706_FPGAD_VP2.JPG>
FPGAs are one of technology’s “hot topics,” widely accepted in the market as a viable alternative to traditional processing technologies. They are used in virtually every industry as embedded logic devices, in most cases replacing ASICs or low-end processor support elements. Advocates of FPGAs suggest that they are capable of going beyond this, to become the processing heart of systems. To do this they will, in most cases, have to supplant the ubiquitous microprocessor. This presupposes that they are seen as a good technical and, more importantly, usable alternative to other processing technologies.
The histories of the microprocessor and the FPGA have some important similarities. The microprocessor was from the same “Von Neumann family” as the existing minis and mainframes, which meant that established tools and techniques could be easily implemented on the new technology. Similarly, FPGAs have benefited from tools and techniques developed for general synchronous logic. Microprocessors, being cheap and easy to use, were adopted into low-end markets, and the resultant funding fuelled marked performance improvements. FPGAs offer lower cost at small volumes, and their ability to be reconfigured has let them take the place of many large-scale logic designs in systems, causing a similar “disruptive technology” cycle in which market revenue funds performance improvements.
What about the claim that FPGAs are capable of being the heart of a system? Here the competition is traditional processing architectures, and that is a far more difficult nut to crack. The languages accepted for synchronous logic design are not what the typical software programmer would recognize. Their pedigree as hardware design tools is obvious: they generally operate at a level that allows the designer maximum control over the detailed timing and structure of the output. This comes at the expense of verbosity and a lack of high-level structure. The cost is acceptable when designing, for example, a memory controller. It is less acceptable when designing a complex, data-dependent signal processing function.
Recognition that FPGAs are capable of complex functions (if only the tools to describe those functions were available) has, over the last five or so years, generated a number of initiatives to provide high-level, software-like design tools for FPGAs. As the FPGA is not in the Von Neumann processor family line, this is a more difficult task than it might at first appear. Essentially, FPGAs have an additional degree of freedom. In traditional software, the “compiler” takes a high-level description of the task and generates a series of instructions that cause a predefined machine to perform the described task. In the FPGA case, there is no predefined machine. Hence, a “compiler” must take a description of a task and first design a machine that can perform it. That machine may perform the task directly in dedicated logic, by following a series of instructions, or by a combination of the two.
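The difference is easy to see in even a trivial piece of C. The filter below is ordinary, runnable C; the comments sketch, under the assumptions above, how the two kinds of compiler would treat the same source. Nothing here is specific to any particular C-to-gates tool.

#include <stdio.h>

#define TAPS 4

/* A 4-tap FIR filter in plain C. A conventional compiler turns the loop
 * into a sequence of instructions for a predefined machine: the CPU
 * performs one multiply-accumulate at a time on its fixed ALU.
 *
 * An FPGA "compiler" has no predefined machine to target. From the same
 * description it must design one: it might fully unroll the loop into
 * four parallel multipliers feeding an adder tree, or pipeline it so a
 * new sample is accepted every clock cycle. */
static int fir(const int coeff[TAPS], const int sample[TAPS])
{
    int acc = 0;
    for (int i = 0; i < TAPS; i++)
        acc += coeff[i] * sample[i];
    return acc;
}

int main(void)
{
    const int coeff[TAPS]  = { 1, 2, 2, 1 };
    const int sample[TAPS] = { 3, 0, -1, 4 };
    printf("%d\n", fir(coeff, sample)); /* 1*3 + 2*0 + 2*(-1) + 1*4 = 5 */
    return 0;
}

The same source thus describes two very different machines; deciding which machine to build is the burden the tools surveyed below try to carry.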
As in the microprocessor case, budget may decide tool complexity and effectiveness for FPGAs, but we must also have access to a multitude of tool types, depending on the specific task being undertaken. This enforced flexibility creates three fundamental issues. First, we must have a sufficient diversity of tools to cover all required tasks. Second, even if we can access enough tools, multiplicity creates multiple costs, both ongoing (e.g., licensing) and startup (e.g., training). Third, multiple tools must somehow come together to form the homogeneous whole that is the final system.
Therefore, the question arises: do we today have the necessary diversity of tools to make FPGA-based systems successful? Looking around, there is a wide variety of tools and design methods available. A less-than-exhaustive list would include:
- Multiple C compilers such as ImpulseC, MitrionC and HandelC, each with their own specific strengths and key application areas.
- Various other high level language compilers such as JHDL, Forge, Confluence or Accelchip.
- Graphical design tools including Nallatech’s DIMEtalk, Xilinx System Generator, Starbridge VIVA and Annapolis CoreFire.
- More traditional VHDL and Verilog based solutions.
- IP libraries covering everything from communications to complex mathematical functions.
- Simulators and verifiers at all levels from simple logic up to complex system level.
This list is growing daily and probably at a faster rate than ever before as more organizations become involved.
Cost is still a significant issue. Here the hardware/ASIC roots of many FPGA solutions are showing. The purchase and licensing costs of ASIC development tools are traditionally many orders of magnitude higher than those of software development tools. Whilst the costs of FPGA-based tools have undeniably fallen in the last two to three years, they are still typically thousands or tens of thousands of dollars, versus tens or hundreds of dollars for microprocessor solutions. In terms of staff training and availability, some vendors have tried to leverage existing skills by adopting familiar standards such as the C language (though the C is not always entirely standard) or the Matlab environment already familiar to microprocessor users. Others have decided that the technology is sufficiently different to require a fundamentally different design approach. Time will tell whether this can be sustained in a market where there is a shortage of skilled staff capable of, and willing to, re-train in a non-standard technique.
The matter of bringing these techniques together is far less advanced. Tool vendors sell a single tool (in general) and hence have little or no incentive to include other tools in a design flow. The admission that more than one tool type may be required weakens any marketing effort in favour of any single tool. For the near term, there is unlikely to be a “one size fits all” FPGA design tool.
If the lack of a single solution is the downside, the upside is that there seems to be no shortage of tools being developed to fill the various niches. As these tools mature (and, hopefully, fall in cost), FPGAs will become practical alternatives for more and more systems applications, in addition to their traditional role as system support logic. Their specialist capabilities are hugely valuable where they are needed, but they are not always needed. Jet engines are far more efficient by almost any measure than the simple piston engine. That makes them indispensable in the aircraft industry, but few of us drive to work in a jet-powered car. Similarly, FPGAs will increasingly be used in the systems that place the highest demands on efficiency and raw power.