Talk:Enhanced Graphics Adapter
From Wikipedia, the free encyclopedia
The page previously said "Introduced in 1984 by Microsoft [sic]." I'm assuming this was just a mistake, and have changed it to IBM. I don't recall Microsoft having anything to do with display hardware (although they've had more hardware products than many people think).
If I'm wrong about this, please revert the change and leave a note here. Dpbsmith 15:26, 10 Jan 2004 (UTC)
- I figured this was a good topic for Google Groups. Joel talks about his woes with his IBM card in December, 1984 in the first message mentioning either "Enhanced Graphics Adapter" or "EGA." He discusses required BIOS dates of October 1982. Further details in this thread discuss an alternate card, the Princeton SR-12, in addition to some workaround information for the memory problem Joel encountered. Based on those early mentions, I'd say IBM did it. -- Ke4roh 15:56, 10 Jan 2004 (UTC)
Currently the article states "EGA also included full 16-colour versions of the CGA 640×200 and 320×200 graphics modes; only the 16 CGA/RGBI colours are available in these modes." However, I'm quite sure a pseudo-64-colour mode was also supported at 320×200. Only one game used this to its advantage: Ivan "Ironman" Stewart's Super Off Road, which called the mode "EGA64".
- Oooh! Pseudo-64-colour mode? Any idea how this was achieved? The EGA has registers, I believe, that allow very fast palette switching. These could perhaps have been used to switch the palette during display time such that several palettes were shown within a single frame, resulting in the EGA's full 64-colour palette being displayed. Perhaps this was the trick? Fast palette switching has often been used to this effect - think 'copper' bars. -- Funkymonkey
- It'd be nice if someone could root out a screenshot or any other info on this. All I've found with a quick Google is an abandonware version for download (which I *may* try out, still having an AT with an old ISA VGA adaptor squirreled away somewhere deep in a cupboard - it may show what's going on if it has any command-line switches that force the display mode)... Somewhere out on the interweb must be a working EGA PC and a copy of the game to figure out what's going on here. The best I can come up with are several possibilities:
1. It uses fancy hardware tricks (as found on enough other computers) to force the hardware to change the palette against spec in 320x200 mode, and maybe also performs palette switching (though even the VGA screenshots show a game screen remarkably similar to my Atari version, which is a solid 16-colour (from 512) game).
2. It does the LucasArts trick of using 640x200 mode and dithering alternate pixels (ick).
3. Colour flickering? (As seen on Sega GG/MD.)
4. As the graphics used in the game are quite small and limited-motion, it may just have used 640x350 as-is, with higher-res trucks and high-res or pixel-doubled backgrounds, and varied the 16-colour palette within spec (or used extra tricks).
5. It set 640x350 and displayed all graphics doubled for an effective 320x175 resolution, with 16 custom colours and again maybe tricks.
6. It set a custom mode, using 640x350 as a basis but reducing vertical resolution to 240 or less and doubling horizontally (I've seen demos that supposedly run on stock EGA hardware and can give up to 640x700 in 64 colours, so reducing the resolution instead probably wouldn't have been too difficult - or alternately increasing it to 400 and then doubling everything).
So yeah. Complicated. Needs actual game experience to figure out. 82.46.180.56 (talk) 18:10, 30 March 2008 (UTC)
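For reference, a minimal sketch of where the EGA's 64 colours come from, assuming the usual description of the palette registers: each 6-bit value has primary R/G/B bits at two-thirds intensity and secondary r/g/b bits at one-third intensity. The 8-bit-per-channel scaling is my own approximation of the analog levels, not anything from the card's documentation:

```python
# Sketch: decoding a 6-bit EGA palette value into an RGB triple.
# Bits 0-2 are the primary B, G, R channels (2/3 intensity each);
# bits 3-5 are the secondary b, g, r channels (1/3 intensity each).
# 4 levels per channel gives 4^3 = 64 possible colours.

def ega_to_rgb(value):
    """Convert a 6-bit EGA palette value to an 8-bit-per-channel RGB triple."""
    def channel(hi, lo):
        # primary bit contributes 2/3, secondary bit 1/3 of full intensity
        return (2 * hi + lo) * 255 // 3
    r = channel((value >> 2) & 1, (value >> 5) & 1)
    g = channel((value >> 1) & 1, (value >> 4) & 1)
    b = channel(value & 1, (value >> 3) & 1)
    return (r, g, b)

# All 64 palette entries:
palette = [ega_to_rgb(v) for v in range(64)]
```

As a sanity check, value 0x14 (the default palette's brown entry) decodes to (170, 85, 0), i.e. full-strength red with half-strength green.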
- Not sure if this is the same thing or not, but I remember some of the later LucasArts adventure games (Monkey Island 2, Fate of Atlantis) claimed to require VGA, but did actually run on EGA cards: the way it worked was to run in 640x200 mode, with each 320x200 logical pixel using two actual EGA pixels, i.e. in effect a rather crude kind of dithering to give the impression of displaying more than 16 colors. It was kind of hideous, though, and pretty slow at times, which is probably why they never documented the feature! 81.86.133.45 (talk) 21:36, 10 February 2008 (UTC)
- It wasn't undocumented. Indiana Jones and the Fate of Atlantis allowed you to enter a switch to choose the display mode, and there was an option for EGA. It produced a 320x200 image that was stretched to 640x200. It was hideously ugly, and it was obvious that each pixel was vertically doubled. DOSGuy (talk) 21:46, 10 February 2008 (UTC)
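The pixel-pairing idea described above can be sketched like this: in 640x200 mode each 320-wide logical pixel covers two hardware pixels, so a target colour can be approximated by the pair of 16-colour entries whose average is closest to it. The palette triples and the squared-distance metric below are illustrative assumptions, not LucasArts' actual tables:

```python
# Sketch of the EGA "dithering" trick: approximate an arbitrary RGB
# colour with a pair of fixed 16-colour palette entries, one per
# hardware pixel of a doubled 320x200 logical pixel.

from itertools import combinations_with_replacement

# The 16 standard CGA/RGBI colours as 8-bit RGB triples.
CGA16 = [
    (0, 0, 0), (0, 0, 170), (0, 170, 0), (0, 170, 170),
    (170, 0, 0), (170, 0, 170), (170, 85, 0), (170, 170, 170),
    (85, 85, 85), (85, 85, 255), (85, 255, 85), (85, 255, 255),
    (255, 85, 85), (255, 85, 255), (255, 255, 85), (255, 255, 255),
]

def best_pair(target):
    """Return the pair of palette indices whose 50/50 mix best matches target."""
    def dist(c1, c2):
        return sum((a - b) ** 2 for a, b in zip(c1, c2))
    return min(
        combinations_with_replacement(range(16), 2),
        key=lambda p: dist(
            tuple((CGA16[p[0]][i] + CGA16[p[1]][i]) // 2 for i in range(3)),
            target,
        ),
    )
```

A colour already in the palette simply maps to a pair of identical indices; anything else becomes a two-index checkerboard, which is why the result looked so grainy.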
Should we say that the EGA is the one that introduced the "character generator" for text mode? That is, the ability to modify text-mode character fonts instead of using only the hard-coded BIOS ones... -- FourBlades 20:11, 12 August 2006 (UTC)
"EGA can drive an MDA monitor by a special setting of switches on the board; only 640×350 high-res is available in this mode." EGA can also drive a CGA monitor by doing nothing special, just by avoiding the 640×350 modes. The most you could do was 640×200 at 16 colours, as seen in Thexder II. Most Sierra games used 320×200 at 16 colours because of that.
[edit] MC6845
Some sources claim that the EGA doesn't include an MC6845 controller (this article and the book "Programmer's Problem Solver for the IBM PC, XT & AT" by Robert Jourdain). Can anyone confirm that? --Anton Khorev 11:14, 24 October 2006 (UTC)
- The 640x350 mode uses 28000 bytes of contiguous memory, but the MC6845 can only address 16384 bytes, so clearly it can't be used in that mode. Of course this doesn't completely rule out the possibility that EGA cards had an MC6845 for other modes, but that doesn't seem very likely. --Derlay 23:40, 16 June 2007 (UTC)
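Those figures can be checked with a little arithmetic: in the planar modes each of the four bit planes stores one bit per pixel, and the MC6845's refresh address counter is 14 bits wide (16384 addresses). A quick check:

```python
# Verify the byte counts quoted above: bytes per bit plane at
# 1 bit per pixel, against the MC6845's 14-bit address range.

def bytes_per_plane(width, height):
    return width * height // 8

mc6845_limit = 2 ** 14  # 16384 addressable bytes

print(bytes_per_plane(640, 350))  # 28000 -- exceeds the limit
print(bytes_per_plane(640, 200))  # 16000 -- fits within it
```

Which is consistent with the point above: the 200-line modes would still be addressable by an MC6845, but 640x350 would not.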
[edit] The Palette
Does anyone know the exact reason why custom palettes were available in hi-res (640x350) mode, but not in the 320x200 and 640x200 modes, at least not without some rumored heavy tweaking as mentioned above? I find it interesting (and a bit strange) that IBM waited until the VGA card to introduce a 320x200 16-color mode with user-defined palettes... 80.178.137.247 10:10, 28 May 2007 (UTC)
- Maybe they already had a future hardware roadmap that included introducing VGA and XGA once the hardware was mature and economically viable, and considered the upgrade from 320/640x200 in 16 fixed colours to VGA's 320/640x200/240 in 256 (or a variable 16) colours, plus all of VGA's other official and tweaked modes, more of a selling point than an upgrade from 320/640x200 in 16 variable colours (full 64 with tricks) and 640x350 to 640x480 would have been... The desire to ditch the ugly EGA low-res modes for VGA must have been more of a (lucrative) upgrade driving force than swapping passable Master System-ish graphics for half-decent Amiga-ish ones would have been.
- Or possibly they just rushed it, as seemed to be the case with the original CGA (how else do you explain those horrendous fixed 4-colour palettes, when making them user-definable was demonstrably quite easy?), and made it work at the minimum level needed to be a worthwhile upgrade at all levels, as well as providing the 640x350 resolution... which itself wasn't available in full colour unless you got the memory upgrade (64 KB = monochrome, or maybe 4 colours/4 greys if such a thing was ever offered; only 128 KB gave full colour, and 256 KB smooth (page-flipping) display of it). 82.46.180.56 (talk) 18:00, 30 March 2008 (UTC)
- Good points! Perhaps another explanation lies in the fact that "EGA can also drive a CGA monitor by doing nothing special, just avoiding 640x350 modes" (as mentioned elsewhere in this talk page).
- That is, maybe they wanted to provide some form of backward compatibility (and a cheaper upgrade option) for CGA owners - for the two resolutions that are available on a CGA monitor, EGA sticks to the RGBI signals that the CGA monitor can handle, so if you replaced only the card, the 320x200 and 640x200 modes would still work on your CGA monitor, but with the enhanced memory allowing all 16 colors to be displayed simultaneously.
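A rough sketch of the correspondence being described, assuming the commonly documented default palette behaviour (including the special case for colour 6, brown): the CGA colour number's R/G/B bits map onto the EGA's primary (two-thirds-intensity) channels, and the intensity bit turns on all three secondary (one-third-intensity) channels at once.

```python
# Sketch: map a 4-bit CGA/RGBI colour number (0-15) to the assumed
# equivalent 6-bit EGA "rgbRGB" palette value. The brown special case
# mirrors what CGA monitors do in hardware for colour 6.

def rgbi_to_ega(i):
    """Map a CGA colour number 0-15 to a 6-bit EGA palette value."""
    r, g, b = (i >> 2) & 1, (i >> 1) & 1, i & 1
    intensity = (i >> 3) & 1
    value = (r << 2) | (g << 1) | b
    if intensity:
        value |= 0b111000  # intensity turns on all secondary channels
    if i == 6:
        value = 0x14  # brown: green shown at half strength, not two-thirds
    return value
```

So in the 200-line modes only these 16 values are reachable, which matches the point above: a CGA monitor on the same card would render them identically via its RGBI inputs.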
- Of course, that raises the question of why *separate* low-res modes (utilizing the full "rgbRGB" colour range) were not added... I had an EGA for a while as a kid, and I remember being annoyed that many 16-colour games looked much better on VGA with their custom palettes, while on EGA they were (seemingly) needlessly "crippled" - your "they just rushed it" theory seems like a good answer to that one. ;)
- 77.125.144.117 (talk) 18:23, 5 May 2008 (UTC)