Chipsets:
A Breakdown of the Evolution, Types and Functions of the Boss of the Motherboard
CS 350-2: Computer Organization and Architecture
Spring 2004
Authors:
Andrew Kennedy
Adrian Romano
Pat Robertson
Table of Contents
Introduction: What is a Chipset?
Evolution of Intel Chipsets
Triton Series
440LX-450NX
800 Series
Chipset Types
Intel Based Systems
AMD Based Systems
Functions of the Chipset
Processor Support
Cache Support
Memory Support
Peripheral and I/O Bus Support
Embedded Features
Summary
Appendix

Introduction: What is a Chipset?
In today’s personal computer systems, the motherboard is the central site of the computer’s logic circuitry. Without a motherboard, a computer is nothing more than a collection of independent, unconnected hardware devices with no means of coordinating with one another. The main function of the motherboard, then, is to provide a platform on which all of the major devices of the computer can connect and communicate with one another. The motherboard also holds the most important microchip in the entire machine: the CPU. So what does this have to do with the chipset? The answer is rather simple. The chipset defines the motherboard’s capabilities, and by extension the entire system’s capabilities; every major component in the computer, including the CPU itself, relies on the functional capabilities of the chipset. Just as the computer is useless without the motherboard, the motherboard is useless without a chipset, hence our nickname for the chipset: the “boss of the motherboard.”
The chipset is designed around the specifications of a given CPU, so the CPU is designed before its corresponding chipset(s). This is why many chipset manufacturers have begun to design and manufacture CPUs, and vice versa. Take Intel, one of the leading manufacturers in today’s market: Intel is a leader because it has successfully designed and manufactured both CPUs and chipsets (among other devices). If the CPU’s functionality is dictated by the chipset, then the chipset must encode a great deal of information about how the computer is meant to work. What is truly remarkable about the chipset, however, is that it not only defines the computer system but also controls a large number of tasks itself.
Evolution of Intel Chipsets
Though there are many chipset manufacturers in today’s market (SiS, VIA and Opti, to name a few), the chipsets that Intel produces are the most widely used in modern computer systems. While each individual company has made its own mark on the evolution of chipsets, we can best understand the origins of this component by analyzing the development of one particular manufacturer’s products. For this reason we will focus solely on the evolution of Intel’s chipsets, from the 1995 Triton series through the release of the 845.
Triton Series
Early 1995 saw the release of the 82430FX, also known as the Triton 430FX, the first in Intel’s Triton series. The major technology advances incorporated into the 430FX included the PCI 2.0 specification, EDO memory configurations of up to 128MB, pipelined burst cache, and synchronous cache technologies. The chipset was lacking in some areas, though: it did not support several newer technologies then in development, including SDRAM, USB and Concurrent PCI. This eventually led to the replacement of the 430FX in 1996, a little more than a year after its launch, by a pair of more advanced, higher-performance chipsets: the 430HX and the 430VX (PC Tech Guide, 2003).
The 430VX chipset was designed more for the home user, while the 430HX was developed more for business. The 430VX addressed the immediate shortcomings of the 430FX by adding the Universal Serial Bus (USB) and Concurrent PCI standards. The lack of Concurrent PCI was a serious problem in the 430FX because whenever a bus master, such as a network card or disk controller, tried to transfer data over the bus, the bus locked up in order to have a clear path to memory. This was inefficient because it interrupted other processes that were transferring data, and the full 100MBps of the bus was not being utilized. With Concurrent PCI, if a bus master was idle, the chipset took control of the PCI bus and gave other processes access. It used a timeshare method and increased data transfer rates by 15% over the 430FX. Because its focus was the home user, the 430VX was designed to speed up multimedia and office applications by supporting SDRAM, which was optimized for intensive multimedia processing. An advantage of SDRAM, besides the performance gains, was that it did not need to be installed in pairs. As for the 430HX, the more business-oriented chipset, its upgraded features improved or enabled networking, video conferencing and MPEG video playback. It also supported multiple processors, was optimized for 32-bit operation, and worked with up to 512MB of memory. It did not support SDRAM, but it did support error correction (ECC) when 32-bit parity SIMMs were used. The main difference between the 430HX and 430VX, however, lay in the packaging. The 430VX consisted of four separate chips built using traditional plastic quad flat packaging. The 430HX included just two chips: the 82439HX System Controller (SC), which managed the host and PCI buses, and the 82371SB PIIX3, which handled the ISA bus and all the ports. The PIIX3 provided two buffered serial ports, an error-correcting Enhanced Parallel Port, a PS/2 mouse port and keyboard controller, a USB connector, and an infrared port (PC Tech Guide, 2003).
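To make the timeshare idea above more concrete, the following is a minimal, purely illustrative Python sketch of a round-robin arbiter that skips idle bus masters rather than letting any one of them lock the bus. The names (ToyMaster, ToyArbiter, has_pending_transfer) are invented for this sketch and do not correspond to anything in the actual PCI specification or the Intel chipsets discussed here.

    # Toy round-robin bus arbiter: an idle master never stalls the others.
    from collections import deque

    class ToyMaster:
        def __init__(self, name, transfers):
            self.name = name
            self.transfers = transfers          # number of pending transfers

        def has_pending_transfer(self):
            return self.transfers > 0

        def do_transfer(self):
            self.transfers -= 1
            return self.name + " transferred one block"

    class ToyArbiter:
        def __init__(self, masters):
            self.queue = deque(masters)

        def grant(self):
            # Rotate through the masters, skipping idle ones instead of
            # locking the bus while waiting for them.
            for _ in range(len(self.queue)):
                master = self.queue[0]
                self.queue.rotate(-1)
                if master.has_pending_transfer():
                    return master
            return None                         # nobody needs the bus

    # Usage: a disk controller and a network card share the bus.
    arbiter = ToyArbiter([ToyMaster("disk", 2), ToyMaster("net", 1)])
    granted = arbiter.grant()
    while granted is not None:
        print(granted.do_transfer())
        granted = arbiter.grant()

The only point of the sketch is the scheduling behaviour: an idle master never blocks the bus for the others, which is the efficiency gain that Concurrent PCI provided over the 430FX’s lock-the-bus approach.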
Following the 430VX and 430HX came the more advanced Triton 430TX. It still incorporated many features of its predecessors, such as Concurrent PCI, USB support, aggressive EDO RAM timings and SDRAM support, but it was optimized for the newly developed MMX processors and intended for use in both desktops and laptops. The architecture followed the two-chip model of the 430HX, with the 82439TX System Controller (MTXC) and the 82371AB PCI/ISA IDE Xcelerator (PIIX4); Figure 1 in the Appendix shows a diagram of this architecture. Other advances included Dynamic Power Management Architecture (DPMA), which reduced the power the system consumed and offered intelligent power-saving features such as suspending the RAM and disk. Data transfer from the hard disk was also increased to 33MBps by way of the Ultra DMA protocol (PC Tech Guide, 2003).
440LX-450NX
Next we come to the 440LX (not the Triton 440LX, because the name Triton was dropped). It was designed specifically for the recently released Pentium II processor and incorporated many of the same features as the 430TX, including SDRAM and Ultra DMA support. The main hurdle the 440LX conquered was the bottleneck between the graphics controller, system memory and the CPU. To solve this, the Accelerated Graphics Port, or AGP, was developed. AGP was a fast, dedicated bus that gave the graphics controller a direct path to system memory, which aided fast, high-quality 3D graphics. Another feature was the Advanced Configuration and Power Interface (ACPI). This provided quick power up and down, remote start-up over a LAN for network management, and included temperature and fan speed sensors (PC Tech Guide, 2003).
Following the release of the Pentium II, the Celeron processor was developed as a lower-cost alternative. The 440EX was developed around it and offered the same features as the 440LX at a smaller price tag (PC Tech Guide, 2003). By 1998, CPU speeds had become so fast that the system bus could not keep up. Other chipset manufacturers addressed this first, bringing faster bus speeds to Socket 7 motherboards, and Intel followed suit in April of 1998 with the 440BX chipset. The 440BX boasted a 100MHz system bus and matching SDRAM speeds, and it was backwards compatible so that older 66MHz-bus Pentium II processors could be used as well. QuadPort Acceleration (QPA) was also added to the 440BX, along with support for dual processors and AGP 2x. QPA enhanced system performance by combining improved bus arbitration, deeper buffers, open-page memory architecture, and ECC memory control (PC Tech Guide, 2003). The 440BX was later used with Pentium III processors as well (Intel, 2004).
Not long after the 440BX was released, a new processor was developed: the Pentium II Xeon. With it came the 440GX chipset (see Figure 2 in the Appendix for a diagram). The 440GX with the Xeon processor was geared toward workstations and servers. It included most of the features of the 440BX, such as an AGP 2x expansion slot, dual CPU support and a maximum of 2GB of memory. It also had a backside bus linking the L2 cache and the CPU over a dedicated bus line, which allowed the L2 cache to run at the same speed as the CPU core (PC Tech Guide, 2003).
At the same time, the 450NX was released. This was designed specifically for servers and incorporated a 64-bit PCI bus. This enabled a second PCI bridge chip to be added to the motherboard, which was capable of supporting six 32-bit slots, three 64-bit slots or a combination. The reason for this feature was to support devices such as network and RAID cards. The 450NX also supported 1-to-4-way processor operation, up to 8GB of memory and 4-way memory interleaving, which provided up to 1GBps of memory bandwidth (PC Tech Guide, 2003).
800 Series
Now we come to the period of the much more advanced 800 series chipsets. The first in this series, the 810 chipset, had three variants: the 810, the 810-DC100, and the 810E. The 810 and the 810-DC100 were released in the summer of 1999 and were identical except that the 810-DC100 provided a 100MHz processor bus and 4MB of on-board graphics memory, while the 810 ran only at 66MHz and had no dedicated graphics memory. The 810E was released soon after, in the fall of 1999; its advantage was that it could run processor bus speeds of 66MHz, 100MHz, or 133MHz (PC Tech Guide, 2003).
In November of 1999, the true power of the Pentium III’s 133MHz system bus was shown with the development of the 820 chipset. The project, originally slated to ship alongside the Pentium III in the spring of 1999, hit a roadblock with the production of Direct Rambus DRAM (DRDRAM), a key component of the 133MHz platform strategy. There were many advantages to RDRAM, one of which was a memory bandwidth of 1.6GBps, twice the peak memory bandwidth of 100MHz SDRAM systems. The 820 also added AGP 4x technology, which allowed graphics controllers to access main memory at more than 1GBps, twice that of previous AGP platforms; this significantly improved graphics and multimedia performance. The 820 differed from earlier chipsets in that it employed a three-chip hub architecture: a Memory Controller Hub, an I/O Controller Hub, and a Firmware Hub. A diagram of this chipset can be found in Figure 3 of the Appendix. The Memory Controller Hub (MCH) linked the CPU, memory and AGP, and supported up to 1GB of memory via a single channel of RDRAM using 64, 128 and 256Mbit technology. The I/O Controller Hub linked the I/O devices to main memory, which increased bandwidth and significantly reduced arbitration overhead. The Firmware Hub stored the system and video BIOS and included the first hardware-based random number generator (RNG) for the PC platform; the Intel RNG used thermal noise to generate truly random numbers, enabling stronger encryption. Shortly thereafter, Intel realized the price of RDRAM was not going to decrease any time soon, so they developed a Memory Translator Hub (MTH). The MTH sat between the MCH and the RDRAM slots and translated the Rambus memory protocol used by RDRAM into the parallel protocol required by SDRAM, allowing the 820 to use lower-priced memory (PC Tech Guide, 2003). A few months after the MTH was added to the chipset, a bug was discovered that caused systems to reboot intermittently or hang during operation (Intel, 2001). To correct the problem, the MTH could not simply be removed; affected consumers received a whole new motherboard plus the RDRAM needed to make it work (PC Tech Guide, 2003).
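The bandwidth figures above follow from simple peak-bandwidth arithmetic (bus width in bytes times transfer rate). The sketch below is only a back-of-the-envelope check, assuming the commonly quoted bus widths and clock rates for these parts rather than figures taken from the cited sources.

    # Peak bandwidth = bus width (bytes) x transfers per second.
    def peak_bw_mb_per_s(bus_bits, transfers_per_sec):
        """Return peak bandwidth in MB/s."""
        return (bus_bits / 8) * transfers_per_sec / 1e6

    pc100_sdram = peak_bw_mb_per_s(64, 100e6)     # 64-bit module at 100MHz    -> ~800 MB/s
    pc800_rdram = peak_bw_mb_per_s(16, 800e6)     # 16-bit channel at 800MT/s  -> ~1600 MB/s
    agp_4x      = peak_bw_mb_per_s(32, 4 * 66e6)  # 32-bit AGP, 4 transfers per 66MHz clock -> ~1067 MB/s

    print(pc100_sdram, pc800_rdram, agp_4x)

On these assumptions, single-channel RDRAM works out to roughly twice the PC100 SDRAM figure, and AGP 4x to roughly twice AGP 2x, matching the doubling claims quoted above.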
To avoid another fiasco like the 820, Intel dropped RDRAM and went to PC133 SDRAM in the middle of 2000 with two chipsets: the 815 and 815E. Both utilized the Graphics and Memory Controller Hub (GMCH), which supported both PC133 and PC100 SDRAM. They provided on-board graphics through a 230MHz RAMDAC, which converted the data in the frame buffer into the RGB signal required by the monitor and provided limited 3D acceleration. This left open the option of using the on-board graphics for lower-cost systems, or an external graphics card via AGP 4x or 2x for advanced systems. In addition, the 815E featured a new I/O Controller Hub (ICH2), which gave greater system performance and increased flexibility. The ICH2 included an additional USB controller, a Local Area Network (LAN) interface, dual Ultra ATA/100 controllers, and six-channel audio capabilities. By integrating the Ethernet controller directly into the chipset, it became easier for computer manufacturers to implement low-cost network connections in PCs, and the six-channel audio provided full surround sound for the Dolby Digital audio formats found on DVDs (Intel, 2004).
In 2001 the Pentium 4 was released, and the 850 chipset was developed along with it. The 850 extended the hub architecture of its predecessors with the new 82850 Memory Controller Hub (MCH). Among the 850’s features were a 400MHz front-side bus, dual RDRAM memory channels, 1.5V AGP 4x, two USB controllers (four ports), and dual Ultra ATA/100 controllers. A variant, the 850E, was released in the fall of 2002, roughly 18 months after the original 850. The 850E was otherwise the same but supported Hyper-Threading technology, had a 533MHz system bus and worked with PC1066 memory. Hyper-Threading allowed a single physical processor to be treated as two logical processors (PC Tech Guide, 2003).
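As a small, hedged illustration of what “two logical processors” means in practice, the sketch below simply asks a modern operating system how many logical and physical processors it sees. It assumes the third-party psutil package is installed and is not tied to any particular chipset.

    # On a Hyper-Threading machine the OS reports more logical processors
    # than physical cores. Requires the third-party psutil package.
    import os
    import psutil

    logical = os.cpu_count()                     # logical processors visible to the OS
    physical = psutil.cpu_count(logical=False)   # physical cores
    print(physical, "physical core(s) exposed as", logical, "logical processor(s)")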
This brings us to the final chipset in our survey, the 845. The 845 addressed the same complaint the 820 had drawn: no support for SDRAM. Ever since the Pentium 4 came out, Intel had only provided chipsets that used RDRAM. SiS and VIA both released Pentium 4 chipsets that utilized SDRAM, and Intel soon followed suit. The 845, released in the summer of 2001, supported PC133 SDRAM. Even though it was significantly cheaper to run a machine with SDRAM, it was far less efficient: the PC133 SDRAM bus offered only about a third of the bandwidth of the Pentium 4 system bus.
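That “about a third” can be checked with the same kind of peak-bandwidth arithmetic used earlier; the figures below assume a 64-bit PC133 SDRAM bus and the Pentium 4’s quad-pumped 400MT/s, 64-bit front-side bus.

    # Peak bandwidth comparison: PC133 SDRAM vs. the Pentium 4 front-side bus.
    pc133_sdram = (64 / 8) * 133e6 / 1e6     # ~1064 MB/s
    p4_fsb      = (64 / 8) * 400e6 / 1e6     # 400MT/s quad-pumped bus -> ~3200 MB/s
    print(p4_fsb / pc133_sdram)              # roughly 3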
The 845 could have supported the faster DDR SDRAM, but Intel’s contract with Rambus did not allow this until the start of the following year. So, at the beginning of 2002, a new variation of the 845 was released: the 845D. The 845D provided a memory controller that supported PC1600 and PC2100 SDRAM (DDR200 and DDR266, respectively) in addition to PC133 SDRAM. With the release of USB 2.0 came three more chipsets: the 845G, the 845E, and the 845GL. The 845G incorporated a new generation of integrated graphics called “Intel Extreme Graphics” and targeted the high-volume business and consumer desktop markets. The 845E worked with discrete graphics components, and the 845GL was designed for Celeron processor-based systems.
Finally, we have the 845GE. It was designed to support Hyper-Threading technology and was released at about the same time as the 850E. It also supported a 266MHz version of Intel's Extreme Graphics core, a system bus speed of either 400 or 533MHz, and DDR333 main memory (PC Tech Guide, 2003).
The 845 chipsets by no means marked the end of the chipset’s evolution. In fact, since their release, a variety of still more advanced chipsets have appeared, such as the E7205, the 875P, and the 865P, 865PE (Figure 4 in the Appendix) and 865G. However, we will now move on to discuss the most prevalent chipset models that one would find in today’s market.
Chipset Types
Chipsets are often based upon certain CPUs and are designed to work with certain I/O protocols. One chipset may be designed for an AMD Athlon CPU with a SCSI interface, while another may be designed for an Intel Pentium 4 CPU using IDE. Since chipsets are designed with a specific CPU in mind, manufacturers of CPUs often also build their own chipsets. Intel has been offering its own chipsets since it began offering the Pentium line of processors. AMD began producing its own chipsets around 1997, but still relies heavily on other companies to manufacture chipsets (Newsom, 1997). The manufacturing of chipsets is not limited to processor manufacturers; other companies have been very successful at producing chipsets as well. For example, graphics card producers such as Nvidia and ATI, and other companies such as VIA, have been fairly competitive in the chipset market. Due to the large selection of chipset manufacturers, the number of different chipset models on the market has expanded significantly, which makes finding a chipset with appropriate features for your computer a daunting task. For this reason we will now outline some of the most prevalent manufacturers and models on the market today.