Matrox G200

The G200 is a 2D, 3D, and video accelerator chip for personal computers designed by Matrox. It was released in 1998.

Matrox Millennium G200 AGP with 8MB SGRAM (1998)

History

Matrox had been known for years as a significant player in the high-end 2D graphics accelerator market. Its cards were excellent Windows accelerators, and some later models such as the Millennium and Mystique excelled under MS-DOS as well. Matrox stepped into 3D in 1994 with the Impression Plus, one of the first 3D accelerator boards, but that card could only accelerate a very limited feature set (no texture mapping) and was primarily targeted at CAD applications.

Matrox, seeing the slow but steady growth of interest in 3D graphics on PCs driven by new cards from NVIDIA, Rendition, and ATI, began experimenting with 3D acceleration more aggressively and produced the Mystique. The Mystique was its most feature-rich 3D accelerator in 1997, but it still lacked key features, including bilinear filtering. Then, in early 1998, Matrox teamed up with PowerVR to produce an add-in 3D board, the Matrox m3D, using the PowerVR PCX2 chipset. This was one of the very few times Matrox outsourced its graphics processor, and it served as a stop-gap until the G200 project was ready.

Overview

With the G200, Matrox aimed to combine its past products' competent 2D and video acceleration with a full-featured 3D accelerator. The G200 chip was used on several boards, most notably the Millennium G200 and the Mystique G200. The Millennium G200 received newer SGRAM memory and a faster RAMDAC, while the Mystique G200 was cheaper, equipped with slower SDRAM memory, but gained a TV-out port. Most G200 boards shipped with 8 MB of RAM and were expandable to 16 MB with an add-on module. The cards also had ports for special add-on boards, such as the Rainbow Runner, which could add various functionality.

G200 was Matrox's first fully AGP-compliant graphics processor. While the earlier Millennium II had been adapted to AGP, it did not support the full AGP feature set. G200 takes advantage of DIME (Direct Memory Execute) to speed texture transfers to and from main system RAM. This allows G200 to use system RAM as texture storage if the card's local RAM is of insufficient size for the task at hand. G200 was one of the first cards to support this feature.

The chip is a 128-bit core containing dual 64-bit buses in what Matrox calls a "DualBus" organization. Each bus is unidirectional and is designed to speed data transfer to and from the functional units within the chip. By doubling the internal data path with two separate buses instead of a single wider bus, Matrox reduced data-transfer latencies through better overall bus efficiency.[1] The memory interface was 64-bit.

The G200 supported full 32-bit color depth rendering, which substantially improved image quality by eliminating the dithering artifacts caused by the then-more-typical 16-bit color depth. Matrox called this technology Vibrant Color Quality (VCQ). The chip also supported trilinear mip-map filtering and anti-aliasing (though the latter was rarely used), and could render 3D at all resolutions supported in 2D. Architecturally, the 3D pipeline was laid out as a single pixel pipeline with a single texture management unit. The core contained a RISC processor called the "WARP core", which implemented the triangle setup engine in microcode.
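To see why 16-bit rendering produces the banding and dithering artifacts that 32-bit color avoids, consider a minimal Python sketch of RGB565 quantization, the common 16-bit pixel format of the era (this is a general illustration of color quantization, not Matrox-specific code):

```python
def to_rgb565(r, g, b):
    """Quantize an 8-bit-per-channel color to 16-bit RGB565 precision,
    then expand each channel back to 8 bits as a display would."""
    r5, g6, b5 = r >> 3, g >> 2, b >> 3   # keep only 5/6/5 high bits
    return (r5 << 3 | r5 >> 2,            # replicate high bits to fill
            g6 << 2 | g6 >> 4,            # the dropped low-order bits
            b5 << 3 | b5 >> 2)

# Neighboring 24-bit shades collapse to the same 16-bit value, which is
# what produces visible banding in smooth gradients.
assert to_rgb565(100, 100, 100) == to_rgb565(103, 100, 100)
```

Because up to eight distinct red or blue shades map to one 16-bit value, smooth gradients turn into visible steps; dithering trades that banding for noise, while 32-bit rendering (as on the G200) keeps the full 8 bits per channel and avoids both artifacts.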

G200 was Matrox's first graphics processor to require added cooling in the form of a heatsink.

Performance

In 2D, the G200 was excellent in speed and delivered Matrox's renowned analog signal quality, besting the older Millennium II in almost every area except extremely high resolutions. In 3D, it scored similarly to, but generally behind, a single Voodoo2 in Direct3D, and was slower than the NVIDIA RIVA TNT and S3 Savage3D; however, it was not far behind and was certainly competitive.[2][3] The G200's 3D image quality was considered among the best thanks to its 32-bit color depth support (assuming driver bugs weren't a problem).

The G200's biggest problem was its OpenGL support. Throughout most of its life, the G200 had to get by in popular games such as Quake II with a slow wrapper driver, a layer that translated OpenGL calls into Direct3D. This hurt the G200's performance dramatically in those games and caused considerable controversy over Matrox's continuing delays and promises.[4] In fact, it would not be until well into the life of the G200's successor, the G400, that the OpenGL driver finally became mature and fast.

Early drivers had problems with Direct3D as well. In Unreal, for example, ground textures were distorted by a bug in the board's subpixel-accuracy function, and mip-mapping issues caused flickering textures. As the drivers matured, these problems disappeared.

2000s, 2010s and 2020s

The Matrox G200 series, especially the G200e, remains a popular choice for server motherboard manufacturers, such as Dell with its PowerEdge series, due to its robustness, low power consumption, and the limited feature set needed for basic VGA display.[5]

G200A & G250

Around 1999, Matrox introduced a newer version of the G200, called the G200A, built on a 250 nm manufacturing process instead of the G200's original 350 nm. This allowed Matrox to produce more graphics processors per wafer and reduced the chip's heat output, so the G200A needed no heatsink while operating at G200 clock speeds. The final revision of the G200A was named G250, featuring somewhat higher core and memory clock speeds. It also used the same 6 ns SGRAM chips as the G400, though only two of them, whereas the G200 and G200A used four 7 ns chips. The G250 was offered only to OEMs, with Hewlett-Packard perhaps being the only buyer.[6][7]

Models

Board Name | Core | Process | Core (MHz) | Memory (MHz) | Pipe Config | T&L? | Memory Interface | Notes
Millennium G200 | Eclipse | 350 nm | 84–90 | 112–120 | 1×1 | No | 64-bit | SGRAM. "SD" model uses SDRAM. "LE" max 8 MB SDRAM. 250 MHz RAMDAC. AGP/PCI
Mystique G200 | Eclipse | 350 nm | 84 | 112 | 1×1 | No | 64-bit | SDRAM. 230 MHz RAMDAC. TV out. AGP
Marvel G200 | Eclipse | 350 nm | 84 | 112 | 1×1 | No | 64-bit | SDRAM. 230 MHz RAMDAC. TV in & out. Breakout box for extra I/O. AGP/PCI
G200 MMS | Eclipse | 350 nm | | | 1×1 | No | 64-bit | Quad-GPU graphics card for 4-monitor support. Some have TV input. PCI
Millennium G200A | Calao | 250 nm | 84 | 112 | 1×1 | No | 64-bit | Die-shrink G200. "LE" max 8 MB SDRAM. 250 MHz RAMDAC. No heatsink. Power consumption 4 W. AGP/PCI
Millennium G250 | Calao | 250 nm | 96 | 128 | 1×1 | No | 64-bit | Overclocked G200A. OEM-only

References

  1. Anand Lal Shimpi, "Matrox Millennium G200", AnandTech, 10 August 1998.
  2. Thomas Pabst, "New 3D Chips - Banshee, G200, RIVA TNT And Savage3D" (Forsaken benchmark results), Tom's Hardware, 18 August 1998.
  3. "Matrox G200 - First PreView", iXBT.
  4. "ICD driver G200", Hardware Upgrade.
  5. Dell, Matrox Graphics driver.
  6. "G200 core", MURC forum, 5 July 2000.
  7. "G250?", MURC forum, 11 August 2000.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.