
Video Card Standards

Through the course of PC development, a variety of different and competing video display
standards have come and gone. Since the Video Electronics Standards Association (VESA) came out
with an agreed-on SVGA specification, there has not been much volatility in this area. This is good
news for consumers, because you no longer have to worry about buying hardware to match a
particular video standard. Before SVGA, though, there were several, and some of them are outlined below.
MDA (Hercules)
Monochrome Display Adapter, or MDA, came out in 1981 and is the oldest display standard for the
PC. It was the original display adapter on the IBM PC. Technically, it was a character-mapped
system, meaning it could only display a set of 256 characters in fixed positions on the screen. It
was text-only, with no color. Text was arranged in 80 columns and 25 rows, and every character was
the same size. Because it offered no pixel-by-pixel control, it could not display graphics at all. It
was ideal for simple, graphics-free DOS applications such as word processing. As a plus, IBM
included an integrated printer port on the card, thereby saving another slot. (The later Hercules
card was an MDA-compatible adapter that added a monochrome graphics mode.)
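To make the character-mapped idea concrete, here is a minimal sketch of how such a display is addressed: each screen cell is two bytes, a character code and an attribute, laid out row by row. On a real MDA this buffer lived in video memory at segment 0xB000; the plain array below is a stand-in so the example runs anywhere.

```c
#include <stdio.h>

/* Model of a character-mapped display like the MDA: 80x25 cells,
 * each cell a character byte plus an attribute byte. On real
 * hardware this buffer lived at segment 0xB000; here it is an
 * ordinary array so the sketch runs anywhere. */
#define COLS 80
#define ROWS 25

static unsigned char vram[ROWS * COLS * 2]; /* char, attr, char, attr, ... */

static void put_char(int row, int col, char ch, unsigned char attr)
{
    int offset = (row * COLS + col) * 2; /* cell address arithmetic */
    vram[offset]     = (unsigned char)ch;
    vram[offset + 1] = attr; /* e.g. 0x07 = normal, 0x70 = reverse video */
}

int main(void)
{
    const char *msg = "HELLO";
    int i;
    for (i = 0; msg[i] != '\0'; i++)
        put_char(0, i, msg[i], 0x07); /* top-left corner, normal attribute */
    printf("cell (0,0) holds '%c' with attribute 0x%02X\n", vram[0], vram[1]);
    return 0;
}
```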
CGA
A few months after the release of the MDA, the CGA adapter came out. It worked with an RGB
monitor and used a bit-mapped approach, meaning it was capable of the pixel-by-pixel control
needed for graphics. It offered a palette of 16 colors, 4 at a time, on a 320 x 200 display. The pixels
were quite large and the resolution was low, but it could do graphics. CGA also offered a
high-resolution mode of 640 x 200, but in that mode it could only do two colors. Despite its
limitations, this card remained very common for quite a while. It had a couple of annoyances,
flicker and snow; by snow, I mean random dots that would sometimes appear on the screen.
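The trade-off between the two graphics modes follows directly from the card's 16 KB of video memory, as this back-of-the-envelope check shows (an illustrative calculation, not CGA-specific code):

```c
#include <stdio.h>

/* Back-of-the-envelope check on CGA's two graphics modes: both are
 * shaped by the card's 16 KB of video memory. */
static long framebuffer_bytes(long width, long height, long bits_per_pixel)
{
    return width * height * bits_per_pixel / 8;
}

int main(void)
{
    /* 4 colors need 2 bits per pixel; 2 colors need 1 bit. */
    printf("320x200 @ 2 bpp = %ld bytes\n", framebuffer_bytes(320, 200, 2));
    printf("640x200 @ 1 bpp = %ld bytes\n", framebuffer_bytes(640, 200, 1));
    /* Both print 16000 -- each mode just fits the 16 KB on the card. */
    return 0;
}
```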
EGA
After a few years, the limitations of CGA grew quite apparent to users. The Enhanced Graphics
Adapter was next in line, standing between CGA and the good old VGA cards. It was introduced in
1984 and continued until 1987, when the first IBM PS/2 systems were brought to market. It was a
nice graphics card for the time, but it couldn't deliver the vast array of colors we all like today, and
so it has largely been forgotten. It could produce 64 colors, but displayed only 16 of them at a time
when used with an EGA monitor. It had a high-resolution 640 x 350 mode and a monochrome mode,
and was compatible with all previous monitors, including CGA and monochrome.
One new feature of the EGA adapter was expandable memory. The EGA card came standard with
only 64K of memory. With a memory expansion card, you got an extra 64K, for a total of 128K.
Then, with the addition of a special IBM memory module kit, you could add another 128K, for a
total of 256K of graphics memory. Fortunately, most aftermarket EGA cards came equipped with
the full 256K from the start.
PGA
In 1984, IBM introduced the Professional Graphics Array, or PGA. The name gives away its
intended audience: priced at almost $5,000, this system was aimed at serious scientific and
engineering applications. With a built-in 8088 processor, it could perform 3D manipulation and
animation at up to 60 frames per second, in 256 colors at 640 x 480 resolution. On top of the price,
the system took up a total of three motherboard slots. The cost kept it from ever catching on with
the general public, and it was later dropped. It would be quite a find to come across one of these
today.
MCGA
MultiColor Graphics Array, released in 1987, marked a switch-over in video standards that would
eventually lead to the VGA and SVGA we use today. It was integrated into the motherboards of the
old PS/2 Models 25 and 30. When coupled with a proper IBM display, it supported all CGA modes,
but it was not compatible with previous monitors, because MCGA used analog video signals rather
than the TTL signals of the older standards. TTL stands for transistor-transistor logic, and is
basically a form of logic that uses voltage levels in the transistor to represent 0 or 1, on or off.
TTL-based video signals did not allow for variability in the colors, because on or off were the only
two choices on any given signal; with the common four-line arrangement (red, green, blue, and
intensity), that meant only 2^4 = 16 fixed colors. With analog signals, many more color variations
were possible, and the MCGA interface could generate up to 256 colors. With the switch from TTL
to analog, MCGA also heralded the change from the 9-pin connector to the full 15-pin connector
we are used to today.
8514/A
8514/A is a standard produced by IBM in 1987 to work with its MCA bus. It worked well,
producing high resolutions on interlaced monitors. A later adaptation allowed fast refresh rates on
noninterlaced monitors, producing high-quality, flicker-free images. 8514/A works quite differently
from VGA, although both use the same kind of monitor. On an 8514/A, the computer tells the
video card what to do and the video card figures out how to do it. For example, the computer says
"Draw a circle" and the card works out the details. These are higher-level commands, quite
different from the pixel-by-pixel instructions that must be calculated by the CPU with standard
VGA cards.
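The difference between the two models can be sketched in a few lines of C. Both functions below are hypothetical stand-ins, not the real 8514/A command set or VGA register interface; the point is only where the per-pixel work happens.

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical stand-in for plotting one pixel on a VGA-style card. */
static void set_pixel(int x, int y) { printf("pixel %d,%d\n", x, y); }

/* VGA model: the CPU computes every pixel of the circle itself. */
static void cpu_draw_circle(int cx, int cy, int r)
{
    double a;
    for (a = 0.0; a < 2.0 * 3.14159265; a += 0.01)
        set_pixel(cx + (int)(r * cos(a)), cy + (int)(r * sin(a)));
}

/* 8514/A model: one high-level command goes to the card, whose own
 * processor works out which pixels to touch. */
static void accel_draw_circle(int cx, int cy, int r)
{
    printf("command: DRAW_CIRCLE %d %d %d\n", cx, cy, r);
}

int main(void)
{
    cpu_draw_circle(100, 100, 5);   /* hundreds of CPU-side calls */
    accel_draw_circle(100, 100, 5); /* a single command */
    return 0;
}
```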
This standard was ahead of its time. 8514/A cards were much faster than VGA cards and often
provided higher-quality images. Nevertheless, lack of support dealt the standard an early death.
IBM discontinued the format in favor of the more advanced XGA, which was released in 1990 and
basically allowed more simultaneous colors. XGA went on to become the standard for Micro
Channel PC platforms.
VGA
VGA stands for Video Graphics Array. It was introduced by IBM on April 2, 1987, the same day
the company introduced the MCGA and 8514/A adapters. Although all three were advances for
their time, only VGA became popular. VGA, in ever more advanced forms, went on to become the
standard for desktop video, leaving both MCGA and 8514/A forgotten, despite the fact that VGA is
strikingly similar to MCGA.
IBM PS/2 systems contained the VGA circuitry on a single VLSI chip integrated onto the
motherboard. So that users could get the new standard on earlier systems, IBM developed the
PS/2 Display Adapter, or the VGA video card. This card contained all the circuitry needed to
produce VGA graphics, and like any expansion card, it plugged into a slot on the motherboard, via
an 8-bit interface. In light of later advances, IBM discontinued this basic VGA card, although
many third-party cards were available for the PC. Today, a plain VGA card is not much used, and
usually serves as a "spare".
VGA offered clean images at higher resolutions. The standard VGA could produce as many as 256
colors at a time from a palette of 262,144 colors (the palette size follows from the VGA DAC's
6 bits per color channel: 64 x 64 x 64 = 262,144). The original VGA, though, had to drop to
320 x 200 resolution to display that many colors; at the standard 640 x 480 resolution, it was only
capable of 16 colors at a time. VGA also extends into the monochrome world. It uses color
summing to translate color graphics into graphics using 64 different shades of grey, which, in
effect, simulates color on a monochrome monitor. VGA requires a VGA monitor, or one capable of
accepting the analog output of a VGA card.
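Color summing is just a weighted average of the three 6-bit DAC values. The sketch below uses the 30/59/11 luminance weights commonly cited for the VGA BIOS; treat the exact weights as an assumption, but the mechanism is as shown.

```c
#include <stdio.h>

/* Sketch of VGA-style color summing: a weighted sum of the 6-bit
 * red/green/blue DAC values (each 0..63) collapses a color into one
 * of 64 gray levels for a monochrome monitor. The 30/59/11 weights
 * are the luminance weights commonly cited for the VGA BIOS. */
static int color_sum(int r, int g, int b)
{
    return (30 * r + 59 * g + 11 * b) / 100; /* result still in 0..63 */
}

int main(void)
{
    printf("bright red   (63, 0, 0) -> gray %d\n", color_sum(63, 0, 0));
    printf("bright green ( 0,63, 0) -> gray %d\n", color_sum(0, 63, 0));
    printf("white        (63,63,63) -> gray %d\n", color_sum(63, 63, 63));
    return 0;
}
```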
SVGA
Super VGA is really a broad category of video standards; it encompasses everything from the
initial days of SVGA all the way to the video cards of today. The broad range of cards and
capabilities under this umbrella led to the use of video drivers, something that was not needed
before. These drivers come on a diskette or CD with the video card and, once installed, act as a
kind of translator that tells your operating system and software how to use the specific video card
you have. Only with correctly installed drivers can your computer run your video card as intended,
at the full capabilities it can deliver.
SVGA is much more advanced than VGA. In most cases, a single SVGA card can produce millions
of colors at a choice of resolutions, but the exact abilities depend on the card and the manufacturer.
Since SVGA is a loose term coined by several companies, there is no single actual standard behind it.
To create some order out of the chaos of SVGA, the Video Electronics Standards Association
(VESA) introduced an SVGA standard. This standard does not dictate how capabilities are
implemented; instead, it defines a standard interface called the VESA BIOS Extension (VBE). This
gave programmers one common interface to write for, instead of having to tailor their programs to
work with several different SVGA cards. All SVGA cards in use today comply with the VESA
standard. At first, the VESA SVGA standard was criticized, and manufacturers were slow to
integrate it; initially, they distributed the extension as a program to be loaded each and every time
you booted the computer. Eventually, though, manufacturers built the extension into their SVGA
BIOS.
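To see what "one common interface" means in practice, here is a minimal sketch of a DOS program setting a display mode through the VBE rather than through any card-specific registers. It assumes a 16-bit real-mode compiler such as Turbo C for the int86() call; mode 0x101 (640 x 480 in 256 colors) is one of the standard VESA mode numbers.

```c
#include <dos.h>

/* Sketch of a VBE call under DOS: int86() issues the INT 10h video
 * interrupt, and the VESA BIOS Extension handles function 0x4F02
 * regardless of which SVGA card is installed. */
int vbe_set_mode(unsigned mode)
{
    union REGS r;
    r.x.ax = 0x4F02;         /* VBE function 02h: set video mode */
    r.x.bx = mode;           /* e.g. 0x101 = 640x480, 256 colors */
    int86(0x10, &r, &r);
    return r.x.ax == 0x004F; /* 0x004F in AX means supported and OK */
}

int main(void)
{
    return vbe_set_mode(0x101) ? 0 : 1; /* switch modes if supported */
}
```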
Conclusion
That pretty much brings us up to date, from the early days of strict video standards to today's loose
SVGA category we are all so used to. Many other proprietary video standards came out along the
way that are not covered here; I have stuck to the most common ones, but if you need information
on the less common standards, I encourage you to search the internet further.
