A few posts back, I discussed some of the notable features of the Monochrome Display Adapter and Hercules Graphics Card. In this post, I intend to talk about monochrome graphics on other adapters of the time, namely CGA (and its derivatives Tandy and PCjr.), EGA and VGA. All these adapters could support monochrome monitors, although they were at their best when connected to color displays.
CGA, Tandy & PCjr.
At first, if you wanted to connect a CGA card to a monochrome monitor, you typically used a monochrome composite monitor, connected via the RCA jack on the CGA card. If you were really lacking for a monitor, a black-and-white TV would do in a pinch, but you would probably have to run the composite output of the CGA card through an RF converter box, and the resulting graphics would be nowhere near as sharp as with a straight composite connection. Let's start with our test images:
[Image: Space Quest II - 320x200x4c "RGB" Mode (Mode 4)]
[Image: Space Quest II - 640x200xMono "Composite Color" Mode (Mode 6 with Composite Color bit set)]
Monochrome computer monitors typically offered a single high-contrast color: white on black, green on black or amber on black. Text was sharp at 40 columns and tolerable at 80 columns, especially when the screen size was 12" or less. Since composite monitors (in North America) refreshed at 60Hz, there was less need for the long-persistence phosphors (and resulting ghosting) seen on the 50Hz MDA monitor.
The Compaq Portable used a 9" green composite screen. The IBM PC Portable used a 9" amber monochrome composite screen connected to a CGA card. Even though the IBM machine is close to a clone of the Compaq machine (which in turn is a clone of the IBM PC), their monitors are quite different. The IBM monitor is a bog-standard monochrome composite screen and connects to an internal version of the RCA jack on its CGA card. For this reason, you will see jailbars on the screen, just as you would on a color screen, because of the NTSC color decoding.
The screen on the Compaq is quite interesting. The Compaq's video adapter supports both MDA and CGA graphics, and in the 80x25 text mode it acts like an MDA card, not a CGA card. Snow be gone! The display is a "dual-sync" display apparently capable of accepting both 15kHz and 18kHz horizontal scan frequency signals. Otherwise the card acts like a CGA card. The monitor's input circuitry accepts the TTL signals from the CGA card and converts them to grayscale. The resulting image is much sharper than on the IBM card (especially in 640x200 graphics) and will not show jailbars.
Early IBM CGA cards had a very basic monochrome capability through the composite connector. Essentially, the eight low-intensity colors would transform into just black, gray and white. The high-intensity colors showed the same pattern, only slightly brighter. Later CGA cards took hue into account when converting RGB colors into luminances, so that there would be a unique shade for each of the 16 colors. Here is an approximation of how the test images would look with the IBM PC Portable's amber monochrome screen and the early and late CGA cards.
[Image: IBM PC Portable with Early CGA Card, 3 Intensities Available]
[Image: IBM PC Portable with Late CGA Card, 4 Intensities Available (note extra detail on the broom)]
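The two conversions can be modeled as a small lookup. This is an editor's approximation, not IBM's actual circuit: the exact luma weights of late CGA cards are not documented in this post, so NTSC-style 30/59/11 weights are assumed.

```python
# Approximate CGA composite monochrome output (editor's sketch).
# Indexes 0-7 are the low-intensity RGBI colors; bit 3 is intensity.

CGA_RGB = {
    0: (0, 0, 0), 1: (0, 0, 1), 2: (0, 1, 0), 3: (0, 1, 1),
    4: (1, 0, 0), 5: (1, 0, 1), 6: (1, 1, 0), 7: (1, 1, 1),
}

def early_cga_shade(index):
    """Early CGA: colors collapse to black, gray or white; the
    intensity bit just brightens the result."""
    base = {0: 0.0, 7: 1.0}.get(index & 7, 0.5)
    return base * 0.5 + 0.5 if index & 8 else base

def late_cga_shade(index):
    """Late CGA: hue-weighted luma gives a distinct shade per color
    (NTSC-style 30/59/11 weights assumed)."""
    r, g, b = CGA_RGB[index & 7]
    luma = 0.30 * r + 0.59 * g + 0.11 * b
    return luma * 0.5 + 0.5 if index & 8 else luma
```

With this model the eight low-intensity colors collapse to three shades on an early card but remain eight distinct shades on a late one, matching the two screenshots above.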
CGA Mode 6 was a 640x200 monochrome mode, where the active pixel could be a color, but the background color would always be black. Any of the 16 CGA colors could be used, but typically the foreground would be white/light gray or high-intensity white. The IBM PC BIOS did not support selecting the foreground color in this mode; to do so you had to write the Color Select register directly. The mode was not often used because most CGA games preferred the more colorful, if lower-resolution, Modes 4 & 5. Often, if you saw Mode 6 graphics, they were intended for color composite display. The Wizardry games intentionally support this mode, as do other games like SimCity, which benefits more from high resolution than from colorful graphics.
When the Color Composite bit was set in the Mode Control register, the mode would display artifact colors on a color composite monitor. On an RGB or monochrome composite monitor, the Color Composite bit made no difference to the graphics on a real CGA card.
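At the register level, both tricks come down to two bytes. The sketch below composes them (bit layout per the IBM CGA technical reference; the byte values are believed to match what the BIOS programs for mode 6, and the functions just return the byte one would OUT to the port rather than touching hardware):

```python
# CGA register bytes for Mode 6 tricks (editor's sketch).

MODE_CONTROL_PORT = 0x3D8   # CGA Mode Control register
COLOR_SELECT_PORT = 0x3D9   # CGA Color Select register

GRAPHICS = 0x02   # graphics rather than text
BW       = 0x04   # suppress the NTSC color burst on composite
VIDEO_ON = 0x08   # enable video output
HIRES    = 0x10   # 640x200 graphics

def mode6_control_byte(composite_color=False):
    """Mode Control byte for mode 6. Clearing the B/W bit is what
    the text above calls setting the "Composite Color" bit."""
    value = GRAPHICS | VIDEO_ON | HIRES
    if not composite_color:
        value |= BW
    return value

def mode6_foreground_byte(color):
    """Color Select byte: the low 4 bits pick the mode 6 foreground
    color, which the IBM BIOS never exposes."""
    return color & 0x0F
```

So a yellow-on-black Mode 6 screen is one write of `0x0E` to port 3D9h after the mode is set.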
CGA Mode 5 was a 320x200 color mode with a palette of cyan, red and white (intensity selectable) on an RGBI monitor. On a composite monitor, whether color or monochrome, it displayed 2-3 (early CGA) or 4 (late CGA) shades of intensity and no color. Mode 5 was typically offered as an option in games that also supported Mode 4 color graphics, Mode 6 color composite graphics, or Tandy/EGA/MCGA/VGA.
Eventually, there would be monochrome TTL monitors that would accept a CGA signal. In this case, the monitor converts RGBI into intensities of one color. Early LCDs were also monochrome devices, and brightness levels for each individual pixel were usually limited to on and off, although the IBM PC Convertible did advertise three levels of brightness for the CGA 4-color graphics modes. The PC Convertible's LCD had a native resolution of 640x200 and stretched 320x200 graphics modes horizontally by two. The aspect ratio would seem wrong except in applications intended for the Convertible's screen (typically from IBM). Here are approximations of how the test graphics would look on the Convertible's LCD screen:
[Image: 640x200xMono Color Composite Mode (Mode 6 with Composite Color bit set)]
[Image: 320x200x4 RGB Mode (Mode 4)]
Tandy and PCjr graphics always displayed as 16 intensities of one color on a monochrome composite monitor. It looks something like this:
EGA
IBM's EGA card could be connected to a color RGBI TTL monitor like the 5153 and support 8x8 text (Modes 0/1 and 2/3) and the color 200-line modes (Modes 4, 5, 6, D & E). It could also be connected to a high-resolution 6-bit RGB TTL monitor like the 5154 and support 8x14 text plus the color 200-line and 350-line graphics modes (Mode 10). Finally, it could be connected to a monochrome TTL monitor like the 5151 and support 9x14 text (Mode 7) just like an MDA card, plus a 640x350 monochrome graphics mode (Mode F). An MDA-type monitor must be connected to use Mode F, which was similar to Hercules graphics in that it allowed each pixel to be turned on or off. It did not work the same way as Hercules, and far fewer games supported it. While Hercules graphics were 720x348 pixels, usually only 640 horizontal pixels were actively used, since that made it much easier to convert from 320-pixel graphics modes, so Hercules did not often have a real resolution advantage. However, since IBM derived the graphics mode from the text mode, each individual pixel in Mode F can be made to blink or display at intensified brightness. I do not know of any game that supported this functionality. Mode F is the only graphics mode supported when the EGA is connected to an MDA monitor.
[Image: Microsoft Flight Simulator 3 - 640x350x16c Mode 10]
[Image: Microsoft Flight Simulator 3 - 640x350xMono Mode 0F]
In order to use an MDA monitor with an EGA card, the DIP switches or jumpers on the card must be set correctly: switch 3 must be ON, switches 2 & 4 must be OFF. The EGA card, when connected to an MDA monitor, can be used alongside a CGA card or a secondary EGA card connected to a color monitor.
Even though an EGA connected to an MDA monitor looks at first glance identical to the MDA's text mode, unusual text character attributes reveal differences from a true MDA. This can be seen in 101 Monochrome Mazes, a game IBM released strictly for the MDA. When you run this game on an EGA or VGA card, you will not see exactly the same "graphics", because the game uses text attributes that fall outside the 16 attribute combinations the EGA and VGA treat as valid MDA attributes. Compare the following:
[Image: 101 Monochrome Mazes MDA]
[Image: 101 Monochrome Mazes EGA/VGA]
MCGA/VGA
When IBM released its PS/2 computers, it advertised monochrome analog high scan-rate monitors like the 8503 alongside more expensive RGB color analog high scan-rate (VGA) monitors like the 8512 or 8513. Systems with built-in monitors like the PS/2 Model 25 could be ordered with a monochrome or color monitor. The price difference was several hundred dollars at the beginning, so monochrome monitors were much more attractive to certain business and educational purchasers.
IBM's early VGA monitors used ID pins in the HD-15 VGA monitor connector to inform the card whether a monochrome or a color monitor was attached. Later, more ID pins would be defined to identify the specifications of other IBM monitors, and later still these pins would be repurposed to send the monitor's capabilities to the PC. A real IBM monochrome monitor and IBM VGA implementation would carry pixel information on the green pin only.
VGA used an 18-bit color palette, with 6 bits each for red, green and blue. Each pixel on the screen could be set to any of 256 colors from the 18-bit palette. When the monitor informs the card that it is a monochrome monitor, the VGA BIOS will convert the 18-bit color values into 6-bit grayscale values by color summing the R, G and B values in a 30/59/11 ratio. In this manner, 64 intensities of the monitor's color could be displayed using the 256-color graphics mode 13h. However, if a program wrote the palette registers directly, and most did, then only the green channel's information would reach a monochrome display. In other modes, including the 4-color and 16-color legacy CGA and EGA modes, the BIOS would set up the palette registers and convert them to grayscale unless a program wrote to the palette registers directly.
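The BIOS color summing, and the information loss when a program bypasses it, can be modeled in a few lines. The 30/59/11 ratio is from the text above; the exact rounding behavior is an editor's assumption.

```python
def gray_sum(r, g, b):
    """6-bit gray the VGA BIOS would program for an 18-bit DAC entry
    on a mono monitor, using the 30/59/11 color-summing ratio."""
    return min(63, (30 * r + 59 * g + 11 * b + 50) // 100)

def green_only(r, g, b):
    """What a mono monitor shows when a program writes the DAC
    directly: only the green channel reaches the display."""
    return g

bright_red = (63, 0, 0)
print(gray_sum(*bright_red))    # a visible mid gray
print(green_only(*bright_red))  # black: the red vanishes entirely
```

This is why direct palette writes look so much worse on a monochrome display: anything drawn in pure red or blue simply disappears.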
[Image: Conquests of the Longbow - 320x200x256c Mode 13 Color]
[Image: Conquests of the Longbow - 320x200x"256c" Mode 13 Grayscale]
[Image: Conquests of the Longbow - 320x200x256c Mode 13 Color as Displayed on a Monochrome Display]
By the late 80s and early 90s, monochrome plasma displays were common on laptops, and this design allowed them to give some definition to what would otherwise have been color graphics. The technology was somewhat limited, however: the gas plasma display in the IBM PS/2 Model P70 (a luggable) could only manage 16 shades of its orange color. In 320x200 modes, IBM would selectively dither 2x2 blocks of pixels (remember that the display stretches 320x200 to 640x400) to simulate the 64 intensities that should be available.
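IBM's exact dither pattern is not documented here, but an ordered 2x2 dither gives the idea: two adjacent panel shades mixed in one of four proportions recover the missing levels. A sketch, with an assumed Bayer threshold matrix:

```python
# Simulate 64 gray levels on a 16-shade panel with a 2x2 ordered
# dither (editor's reconstruction; IBM's actual pattern may differ).

BAYER_2X2 = [[0, 2],
             [3, 1]]   # classic 2x2 Bayer thresholds

def dither_block(level):
    """Map a 0..63 level onto a 2x2 block of 0..15 panel shades
    whose average approximates level / 4."""
    base, frac = divmod(level, 4)   # 4 dither sub-steps per shade
    return [[min(15, base + (1 if BAYER_2X2[y][x] < frac else 0))
             for x in range(2)] for y in range(2)]
```

Level 6, for example, sits between panel shades 1 and 2, so the block comes out as two cells of each and averages to 1.5.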
Some VGA games, including many of Sierra's SCI engine adventure games, had a driver which Sierra claimed would allow for up to 256 shades of "gray". This was nonsense; the hardware was limited to 64 shades of gray by its 6 bits per color channel, whether it drove a monochrome or a color VGA monitor. This driver would show the same graphics whether used on a color or a monochrome VGA monitor. However, the color graphics driver wrote palette values directly to the VGA palette registers, so color information would be lost because only the green channel was represented. Sierra could have ensured a better grayscale representation of the color graphics by writing the palette through the BIOS, which forces the VGA card to send color-summed monochrome values to the monitor. This works whether the display is a color monitor or a monochrome monitor.
Sierra's driver did a very good job of preserving the detail of the color graphics for those forced to use monochrome screens. Pinball Fantasies also supported a "mono" graphics display setting, using up to 64 grayscale shades in the Mode X modes it supports, 320x240 and 360x350.
[Image: Pinball Fantasies - 320x240x256c Mode X Color]
[Image: Pinball Fantasies - 320x240x"256c" Mode X Grayscale (note the loss of detail on the table art)]
Monochrome (and even color CGA and EGA) was always a joke for us coming from the 8-bit Commodore 64/128 and the 16-bit Amigas...
At the time most PCs had awful, ugly CGA screens with "beeps", we played 16-color games on the C64 and edited 4096-color bitmaps in HAM mode on the Amiga. :)
IMHO, the PC only made itself a nice gaming platform in the age of 486s and color SVGA displays. Games like DOOM and Wing Commander made the PC a decent gaming platform...
FC
Monochrome is bad for most games, sure (though there were some that excelled on e.g. the original Mac), but it has its benefits in usually offering higher resolution and less eyestrain, at least when done right (CGA doesn't count).
EG, MDA / Hercules on an Amber monitor was actually quite soothing to look at (when I put together a trashpicked-parts bedroom PC in the 90s to do schoolwork on, I'd sometimes leave the Windows clock fullscreened as a nightlight) despite the 50Hz refresh thanks to the long persistence, and gave you something not far removed from VGA resolution (certainly, more than EGA, in the more useful of the two directions at the time), with the lack of colour not really being an issue if you're just typing up essays and maybe including a couple of simple lineart diagrams or Excel charts (well OK, Pirate Copy Of MS Works spreadsheet charts) that are going to be printed on an old Deskjet (or even dot-matrix) that can only do black and white anyway.
Similarly the original Mac, and the hi-rez mode of the Atari ST (pretty much MCGA-mono, but using 400/70 mode not 480/60, so could give a low-flicker screen with medium persistence phosphor and not suffer much ghosting or bloom), and various Tandys and other clones which used a 640x400 non-interlace mode.
The other benefit being the lack of an aperture grille for aiming the beams at the coloured phosphor stripes, so it could be a lot sharper and easier to read without having to be huge and/or over engineered, and therefore quite a bit cheaper even so.
The C64 never had anything like it (heck, a CGA-like 80-column mode, but with monochrome-only text, needed add-ons), and the Amiga was somewhat slow to catch up, not offering anything similar until ECS came along, and even then the results were typically a strangely flickery mess as for some reason the emphasis still seemed to be on attempting to wow people with hi-rez colour even though monochrome or maybe limited greyscale with a better refresh rate would have been better (and VGA already equalled or exceeded it). And the top clock rate for building modes never got any higher than that of VGA (or equalled that of the ST, though it did beat the Mac and Tandy) for ECS and AGA.
And of course even Commodore had a stab with the god-beast that was the A2024. Something whose mode of operation I've somewhat got a taste of right now as I'm having to run my secondary monitor off an old, catastrophically slow (but good enough for text/webpages and static imagery) USB2 VGA adaptor. Extra high resolution, very monochrome (only 4 grey levels, so equal to the MDA/EGA-mono/Herc text), and damned expensive both because of the industrial grade workstation monitor it was based on (completely leapfrogging the in-between stage most affordable solutions settled for) and the onboard high-speed memory-dense framebuffer it had to use because the computer could only throw out image data at about 10 to 15fps equivalent due to the resolution, and a direct output solution would be horrifically flickery even on something with similar phosphor persistence to a radar scope, or a Tektronix vector CAD terminal.
The technology of the time wasn't really quite up to providing the ideal triangle of good gaming graphics AND good hi-rez productivity modes AND a reasonable price (though Atari at least had a bash at that). To do it right you had to choose. Commodore chose the gaming side, IBM chose the business side, and both kind of sucked at the other function. That doesn't intrinsically make one better than the other, it just depends on what you happen to value more in your machine.
(agh, i've hit the length limit again)
(2/2)
And IBM tried to get into that low cost half-decent-AV sphere with the PCjr ... it could have been alright if they hadn't absolutely bungled certain key elements (like the keyboard, and sideways compatibility with the full PC), as Tandy later proved. Like vaguely in the realm of the C64, or at least its other rivals.
Commodore et al tried the reverse and had better success... by building PC clones. Because it turns out that's actually not too difficult due to the more straightforward design (it just demands a lesser amount of somewhat better quality components is all). Even Atari and Amstrad did (and Sinclair at least rebadged Schneiders). So their home machines didn't even really need to cover those bases.
Or in other words, you're kinda comparing a Commodore PC to a Commodore 64 or Amiga, rather than everything IBM did to everything Commodore did ;)
And if they'd figured some way to make 15kHz EGA mode also use a 64-colour palette, that would have been pretty much the equal of the C64, or in fact better (much of the way to, say, the Master System or NES). The simultaneous colours in either case are equal to the C64's entire palette, just the fixed RGBI are somewhat different to the distinctive pastel tones of the breadbin; with 6-bit colour they could have offered a fair recreation of them, and certainly done better than, say, the Amstrad CPC. Just a shame hardly anyone ever used the 640x200 16-colour mode, possibly to keep compatibility with 64KB EGAs (that could have shown those screens but not updated them so smoothly, due to only being able to hold one page at a time, vs two for 320x200), as Lucasarts and a few others later demonstrated it could be used to simulate a somewhat higher colour depth quite nicely and without losing so much speed. That plus the high rez mode's 6-bit palette could have looked quite good. And the EGA already incorporated quite a lot of the acceleration features better known from VGA, which would have helped make things move around with reasonable smoothness when held against the 8-bit competition, and indeed perhaps even the ST.
It might have been a harder battle against the Amiga (which had more colour and better hardware acceleration), but it wouldn't have been quite so embarrassing as the real thing. But still, 320x200 with 16 colours of some kind and fairly smooth motion (especially scrolling) on what might well be a 4.7MHz 8088 underneath, pre-dating the Amiga by 4 to 5 years, ain't so bad. Especially as it could then turn around and throw out a 640x350 non-interlaced mode (remember all those Amiga flickerfixers because of how migraine-inducing laced mode was?), or even higher rez with an S-EGA and multisync, in the same colour depth (or at least 4c) with similar speed on the same monitor.
A straight still-graphics comparison between the two isn't entirely fair, especially if you're picking CGA instead of EGA (which was introduced between the C64 and Amiga, and about the same time as the Mac whilst actually having better capabilities than it).
I'm not even really an early PC fanboy. I was an ST kid, and simply getting lazy conversions from the system with that distinct 16-colour primary-based palette (and, if it was PCjr or early Tandy, half the proper horizontal resolution and maybe even half the vertical) annoyed the crap out of me. But I've played enough old PC games since and, well... when used properly, it ain't so bad. CGA will always be a crawling horror, being born of a mix of deadline crunch and no budget to make custom ICs so having to compromise extremely hard on what it could do, but EGA, eh, it's a fair effort. If only they'd pushed to a 400-line mode and analogue output, but that's what VGA then provided.
+1. I mostly agree although I had a soft spot for the Tandy's PCjr-esque modes and sound chip.
I was indeed fortunate enough to have first played SQ2 on an Acer 710 w/ a three-way adapter (my favorite of course). This adapter, as you likely know, could toggle between CGA, Hercules, or Plantronics. I find that Hercules best embodied SQ1 & SQ2 in amber monochrome (as per the Acer's preference) compared to standard CGA. I have noticed that the Acer preferred Test Drive in standard CGA, conversely. I am quite proud that this machine was so discerning as to choose the superior monochrome mode depending on the application, seeing as this machine was brushed off as pedestrian at the time.
ReplyDeleteThe game "Armor Alley" support EGA mode 0F.
ReplyDeleteI WOULD LIKE TO KNOW IF IT IS POSSIBLE TO CHANGE THE COLORS ON A NEWER FLAT
SCREEN COMPUTER MONITOR TO AMBER LETTERS ON A BLACK BACKGROUND???
IF ANYONE CAN FIND THE ANSWER COULD YOU PLEASE TEXT AN ANSWER TO
lesmartinjr70@gmail.com
Go into the Windows control panel and play with the colour schemes. You can probably figure something out that'll work. Maybe start with one of the predefined high contrast schemes as a base. Or if you're using Win10 and later, activate the "nightlight" function but set it to run 24/7 and turn the slider pretty much to maximum.
Monitors themselves often have similar colour settings in their menus, though the amount of influence you can wield varies quite a bit. For some it's just a mild tint, on others you can end up blocking or whiting-out a particular channel entirely. Similarly some video drivers include settings tools that offer a degree of control over the same things, which you could combine with the monitor (or use in place of monitor controls if you're on a laptop).
These however don't tend to allow you to invert the colours, only how much the individual R/G/B channels are attenuated or boosted. That's more a thing for the Windows control panel, or maybe the software you're using. EG, I don't know if you can still do it, but Word used to have a sub-setting in its "Wordperfect compatibility mode" options that would turn everything to white text on a blue background regardless of the actual page or text colour, sort of like using the DOS version of WP (or Edit, or pre-graphical Word), which I found rather more soothing on the eyes when using an old laptop or CRT (sure, it's still blue, but it wasn't super bright, and far better than black on bright white). If you were to filter that using the monitor/driver settings, or even just putting a piece of orange tinted plastic film over the screen (which could be the easiest way to achieve the desired effect...), the overall result could well be orange lettering on a black background... Orange x White = Orange, Orange x Blue = Black...
Beyond that, there's probably specific visual accessibility tools that would give you what you want, but I'm not really familiar with any of them.
I have a PC clone with CGA/MDA on the motherboard and only a DB-9 pin connector.
I would like to connect this computer to a composite video 1v peak-to-peak CCTV monitor. I think there is a SIMPLE lead I can make up. I know that I do NOT need a scan converter, just combine R-G-B to become mono/white to display on a simple MONO monitor.
Can anyone give me the right answer or sell me a lead?
Stephen in UK.
https://www.facebook.com/mister35mm
TTL if I remember rightly is 5V output...
So, get a CGA colour cable, cut the end off. Ditto whatever plugs into your CCTV monitor (I'm assuming separate sync here; if not, that's another conversation and essentially you're going to need some kind of modulator or at least sync combiner). Syncs and ground run to each other. On the video side... use whatever resistor will drop 5V to 0.25V (damned if I know, there's probably a website that has a suitable calculator on it, maybe a couple hundred ohms? if you have a multimeter you might just be able to trial and error it) on each of the four video lines (RGB *and* Intensity), plus a diode afterwards facing outwards on each line so you don't go blowing up the motherboard, then just gang those all together onto the monitor's video input line. Wrap the whole sorry affair in heatshrink and duct tape and off you go.
That'll give whatever kind of result IBM originally managed, I would expect. Very limited and muddled range of grey levels output with a lot of duplication (as, e.g., dark red, dark green, dark blue, and dark grey would all have the same output intensity, as would each of bright red, bright green, bright blue, brown, dark cyan, dark magenta... etc), but it would be something and you'd at least be able to tell black from white (or light grey/yellow/brown), even if the intermediate CGA colours would merge into each other (light/dark cyan and magenta, and red/green, red/cyan, all have the same number of bits set active).
If you want to be more sophisticated then each line should have a different peak voltage, which all together add up to 1.0V. I've no idea on the exact figures or the necessary resistances, but I'd suggest green as the highest peak value, then red, followed by blue... intensity fitting somewhere in the middle of those, or maybe even being the lowest output overall (maybe between red and blue?). Might have to play with it and see what looks most logical, especially when putting black, dark grey, light grey, and white next to each other.
Don't know if there's any monitor-type sensing within the standard, but if the CCTV is standard 15kHz, you certainly want to make sure that the board isn't accidentally told that it has an MDA-compatible screen connected instead.
Might also be a need to actually have "black" raised a little above 0V for some displays, and blanked to actual zero either outside of the true active area, or during the specific blank/sync times. The engineering of THAT is very much left as an exercise for the reader.
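Putting a number on the dropping resistor mentioned above: assuming the monitor presents a standard 75 ohm termination (and ignoring the diode's forward drop and source impedance), it's a plain voltage divider. An editor's back-of-envelope:

```python
def series_resistor(v_in, v_out, r_load=75.0):
    """Series R so v_in drops to v_out across the monitor's r_load
    termination: v_out = v_in * r_load / (R + r_load)."""
    return r_load * (v_in / v_out - 1)

# 5V TTL down to 0.25V into 75 ohms: nearer 1.4k ohms than
# "a couple hundred" (assumptions above noted).
print(round(series_resistor(5.0, 0.25)))
```

With four lines ganged in parallel the effective source impedance changes too, so in practice trial and error with a multimeter, as suggested, is still sensible.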
Alternatively you can probably just find 15khz RGBI to composite converters on eBay, along with a suitable RCA to BNC plug converter, and they'll do the job just fine so long as you either don't mind whatever slight interference they add due to the colourburst (same as watching colour broadcasts on a B/W TV), or it can be set to monochrome output. Better yet, an RGBI to S-Video converter (or even a janky composite-to-SVideo adaptor) should give you pure monochrome output on one line and the colour component on the other, which you can then just leave disconnected from the screen.
Turn your monitor into a monochrome one of yesteryear:
https://hackaday.io/project/166041-monochromevga
Actually this answers some of the other questions, both my own and that of others who already commented.
The typical colour mixing formula is along the lines of 0.2126*R + 0.7152*G + 0.0722*B, or in other words 71.5% of the original Green, 21.3% of the original Red, and just 7.2% of the original Blue. This is essentially a recreation of the colour sensitivity of the eye, where blue just looks darker than red, and red darker than green, most of the time, and certainly at the same physical intensity.
With the resistor values *for VGA* being two 35ohm in parallel for red, two 9.1ohm for Green, and two 100ohm for blue.
That sort of thing shouldn't have been too difficult for IBM to achieve on the VGA, for example, as you could just switch the DAC output from going straight to the output port (which it would also do for "pure green" monochrome), to routing through a small set of pretty rudimentary discrete components (three better-chosen single resistors and three diodes) before being dumped back on the green output line if wanting to convert unspecialised colour software for mono output.
Unfortunately that doesn't directly help for converting RGBI to mono composite (not least because VGA is already 1V or maybe even 0.7V p2p, not 5V), but it's a start.
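Those parallel-pair resistor values do land close to the luma weights quoted above: when the three lines are summed through resistors, each channel's weight is just its conductance's share of the total. A quick check (editor's sketch):

```python
def channel_weights(r_red, r_green, r_blue):
    """Relative weight of each channel when the three lines are
    summed through these resistors: proportional to 1/R."""
    g = [1.0 / r for r in (r_red, r_green, r_blue)]
    total = sum(g)
    return [round(x / total, 3) for x in g]

# Two 35 ohm in parallel = 17.5, two 9.1 = 4.55, two 100 = 50.
print(channel_weights(17.5, 4.55, 50.0))
# In the neighborhood of the 0.213 / 0.715 / 0.072 formula.
```

The implied red weight comes out a touch low (about 0.19 vs 0.213), but within what the surrounding analog tolerances would swallow anyway.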
(odd memory that it also somehow triggered; Windows would essentially be running as colour converted to greyscale on a mono VGA if you wanted anything better than pure black and white anyway, so there'd have to be some kind of differential mixing to avoid things looking absolutely unusable... any time I've seen a "VGA mono" driver for Windows 3.x or 9x, it's actually been a synonym for "MCGA hi-rez". IE, it only offers 1-bit black and white, no greyscales at all. Any mono system, laptop LCD or desktop/POS CRT, which has actual pure greys rather than dithering, turned out to be using the default 16-colour mode if I was able to check it... and I think in one case, 256-colour... and no, not 256-grey...)
Two small mistakes: "which a palette of cyan, red and white" --> with a palette...
"for those forced to use color" --> for those forced to use mono
The question is, how does the VGA "sum" the three channels onto green for mono output? If you want vaguely real-seeming greyscale (and moreover, any kind of definition between the three primary colours, and between the three pure secondaries - pretty crucial for any software that uses a legacy RGBI colour mode for example... and any of them to look reasonably bright...), the simple method of dividing each in three equally just doesn't work. And even then, you'd still have 192 levels all up, just with many duplicates.
Typically for monochrome TV and "greyscale" filters in image editors etc, there's a certain formula used to balance the three channels with their relative perceptual intensities. Green gets the lion's share, blue very little, and red somewhere in between. So a red flag flying over a background of a blue sky and grassy fields will actually show up, though the sky may actually come over darker than the ground unless it's quite pale cyan rather than deep blue.
If that's what IBM used to produce the greyscale output, then it's entirely possible that the "256 greyscale" claim is valid. Sierra would have just adjusted the RGB colours in the palette such that the summed values gave a reasonably smooth progression beyond the (already pretty good) 64 normally available, likely with green as the main driver and varying amounts of red and blue to nudge the overall value up a little (the simplest though not entirely precise way would be just to have pure green as the XXXX-XX00 value, then a matching intensity of blue added for XXXX-XX01, swapped for red to give XX10, and then both for XX11). Or perhaps they just forced colour-summed output of the regular graphics instead of accepting some BIOS-driven switching to green-only...? (in one case, if you could tap off the pre-sum signals to a secondary, colour monitor, the output would look a complete mess on the chromatic front, though still have a discernable dark to light progression; in the other, it'd just look normal).
Similar methods have been used for other instances of higher colour depth simulation with greyscales after all. I've used it for tuning wallpapers on an old monochrome laptop (where the colour to greyscale conversion was extremely weird and meant pre-converting to a certain number of greys then replacing the palette outright), Atari did something similar for 256-grey mode on the TT (natively only having a 4096 colour palette; two colour channels bridged with one attenuated to 1/16th intensity gave twice the bit depth), and if a little chromatic artefacting is acceptable, it can be used on colour screens to extend the effective palette depth in much the same way (either incrementing each channel in turn to quadruple it, or boosting green then red+blue alternately to double it), so long as the existing colour range isn't too restricted. It's basically how 16-bit 565 mode works (though that alternately increments green, then all three channels, because of green's greater depth), can give a similar illusion in 15-bit 555, and works reasonably well down to 9-bit (I've used it on an original Atari ST to convert greyscale pictures, without the benefit of the STe's 12-bit palette matching up more neatly with the 16 colour registers), with a bit of user preference exercisable over the "tone" of the picture in terms of the order in which the levels change (and if they ping-pong instead of being truly alternate). No reason why it couldn't also work even better with 18-bit. And in all of those, if you use a mono monitor (especially composite) or zero out the colour saturation on a TV or colour 15khz, it pretty much just looks like a smoother greyscale.
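Whether the "256 shades" idea can be partly rescued by such palette tricks comes down to arithmetic: the 30/59/11 weighted sums of unequal 6-bit channels are far finer-grained than 64 steps, but once the card quantizes back to a 6-bit DAC value, only 64 levels survive. A quick count (editor's sketch, assuming the 30/59/11 summing from the article):

```python
# Gray levels reachable through 30/59/11 color summing of 6-bit
# channels, before and after quantizing to the 6-bit DAC.

def gray_levels():
    sums = {30 * r + 59 * g + 11 * b
            for r in range(64) for g in range(64) for b in range(64)}
    return len(sums), len({s // 100 for s in sums})

fine, coarse = gray_levels()
# 'fine' runs well into the thousands; 'coarse' is exactly 64.
```

So the intermediate sums really are finer than 64 steps, which supports the temporal/dither speculation above as the only route to showing them, while also confirming the article's point that a static frame can never exceed 64 shades.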
Plus there were cards which seemed to genuinely produce 16 greyscales even on an IBM MDA monitor. A lot of those monitors possibly have a relatively analogue "video" line alongside a more digital "intensity" one, so if you can engineer a way to give eight distinct voltage levels on the former, plus on/off on the latter, you get 16. It might actually be some kind of PWM "cheat", where the pixel clock's analogue (sine-wave) oscillator is used to modulate the output to a certain amplitude (that's what an oscilloscope trace of it suggests), so the beam only turns on (or is lifted from "dim" to "bright") for the fraction of each pixel width where the voltage exceeds the transistor's trigger level; the inherent smoothing of the downstream circuitry, and the not-insignificant width of the beam spot itself (both of which give a slightly blurred edge to each regular pixel), then convert that into a steadier greyscale instead. So there are all kinds of ways the original standards can be cheated and worked around.
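That PWM guess can be modelled roughly. This is entirely speculative, matching the oscilloscope interpretation above rather than any documented circuit: sample one pixel period of a sine at a chosen amplitude, and count the fraction of the period it spends above a fixed trigger threshold.

```python
# Toy model of the suggested PWM cheat: the beam is "on" only while the
# amplitude-scaled pixel-rate sine exceeds the trigger threshold, so the
# average brightness over the pixel scales with the chosen amplitude.
# All values (threshold, step count) are illustrative assumptions.

import math

def pixel_brightness(amplitude, threshold=0.1, steps=1000):
    """Fraction of one pixel period the drive exceeds the trigger level."""
    on = sum(1 for t in range(steps)
             if amplitude * math.sin(2 * math.pi * t / steps) > threshold)
    return on / steps

# nine amplitude settings -> nine monotonically increasing grey levels
levels = [pixel_brightness(a / 8) for a in range(9)]
```

Downstream smoothing and the beam-spot width would then blur each pixel's on/off burst into the steady grey this average predicts.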
(Like, an alternative might be that they used multiple buffers and swapped between them at 60 or 70Hz, each with a slightly different image in, or each assigning different 6-bit grey levels to the colour indices so the underlying data renders differently. The frame-sequential colour-interlacing effect of that, especially on a longer-persistence monochrome screen, would produce the impression of a greater number of grey levels, without anywhere near as much shimmer as conventional all-or-nothing, half-line-shifted interlace. Again, this is a technique that's been used in many other areas, including commercial LCD panels and LCD/DLP projectors (a lot of them are only 6 bits per channel, or sometimes 7, and use per-pixel temporal modulation to simulate an extra bit or two), sometimes more noticeably on actual laptop LCDs (particularly dual-scan monochrome), and on older computers with trick code giving extra colours. I don't know if it otherwise happened much in the PC scene, but certainly Atari, Amstrad and Sinclair coders, and maybe C64 too, have taken advantage of that particular persistence of vision, though it tended to work rather better on 15kHz CRTs than on any other type of display: partly the slightly longer-persistence phosphor, and partly that they effectively scanned each active line twice as often as usual while leaving the other half unscanned, so there was still some afterglow left to build on each time. E.g. Photochrome, which could pull the equivalent of about 14.5 bits of colour from the 12-bit STe, and quite a lot of colours per line compared to the usual 16, or even the 48 of more mundane single-buffer beam-racing palette-swap techniques.)
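The frame-alternating idea reduces to simple arithmetic. A minimal sketch, assuming 64 hardware levels and that phosphor persistence plus the eye average two alternated frames (not any documented driver, just the principle):

```python
# Temporal mixing sketch: a colour index shows level A on even frames and
# level B on odd ones; persistence averages them to (A + B) / 2, so pairs
# one step apart yield half-step greys between the hardware levels.

LEVELS = 64  # assumed 6-bit DAC depth

def frame_pair(target):
    """Split a half-step target (0..126) into the two per-frame levels."""
    return target // 2, (target + 1) // 2

perceived = [sum(frame_pair(t)) / 2 for t in range(2 * LEVELS - 1)]
print(len(perceived))  # 127 evenly spaced perceived levels from 64 real ones
```

Since the two frames never differ by more than one hardware step, the flicker component is tiny, which is the advantage over all-or-nothing interlace tricks.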
It's not something that's really easy to do as a live, calculated thing, especially in 3D or the like. But if you're working within a still fairly limited colourspace and can precalculate all the values you need (the two palettes for extending the greyscale range, plus the encoded bitmaps for all the on-screen elements, or the exact colour sequencing for a photoreal static image on a machine not made for that), it can work quite well, and may not even take up any significant extra CPU time if all you're doing is reloading perhaps 192 palette registers in each VBlank, essentially as a straight blit.
Shades, too, of the "14-bit" sample replay mode on the Amiga (combining two 8-bit audio channels into one, with one at full volume and the other at minimum, i.e. 1/64th amplitude, summed together very simply in the analogue domain), or of modern Viterbi-algorithm-based ways to improve the effective dynamic range of samples hacked out of something clearly never designed to produce them, at least not with any quality, such as the straight OPL2 of an Adlib, or the trusty old YM2149/AY8910 (for which the replay routine does a shitload of precalc to figure out the order in which to arrange each update of the individual channel registers, which are then summed to produce the final output voltage).
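The Amiga trick is easy to sketch: Paula's volume register runs 0..64, so a second channel at volume 1 contributes 1/64th amplitude, and the analogue sum of the two reconstructs a wider sample. A minimal split-and-recombine model (the helper names are mine, and the unsigned handling of the quiet channel is simplified):

```python
# "14-bit" Amiga replay sketch: the loud channel (volume 64) carries the
# top 8 bits of a signed 14-bit sample, the quiet channel (volume 1, i.e.
# 1/64th amplitude) carries the remaining 6 bits; the analogue sum of the
# two outputs reconstructs the original value.

def split_14bit(sample):
    """sample: signed 14-bit, -8192..8191 -> (loud byte, quiet 6 bits)."""
    hi = sample >> 6             # arithmetic shift: signed top 8 bits
    lo = sample - (hi << 6)      # residue 0..63 for the quiet channel
    return hi, lo

def analogue_sum(hi, lo):
    """What the mixed output resolves to: loud at 64x, quiet at 1x."""
    return hi * 64 + lo

print(analogue_sum(*split_14bit(8191)))  # round-trips to 8191
```

The same "one coarse source plus one attenuated fine source" shape is exactly the TT greyscale bridge and the palette nudges above, just in the audio domain.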
Combine that with the nearly 192 levels you can get from summing three lots of 64, plus some interframe palette switching, and there's no real reason to doubt that they could have achieved it. Vintage computing is all about the hacks. Even red/cyan CGA mode counts as that, really. It was only meant to be a greyscale mode, but somewhere they messed up and only managed to make it sort-of work on composite (and not at all on RGB), so they just kept the unsuccessful result undocumented, then tweaked it a little for composite when the tech improved just enough that the equivalent of an extra IC or three could be squeezed onto the board. Probably something to do with how the existing palettes tend to only really define the R+G bits for three of the colours, with B+I defined by the mode itself (plus one freely defined colour); red/cyan is a weird flip-flop in that case, and certainly not the true greyscale of defining R+G+B as one bit and I as the other. Which is a bit of a shame, because it probably would have been the most commonly used mode, even on RGB, if it had been possible to implement, and would have made compatibility between colour and mono rather easier...