Sunday, October 10, 2021

Apple II Composite Artifact Color - NTSC, PAL and Filters

The Apple II computer is unique: not only was it the first home computer released to the mass market, it was the first home computer to support color graphics, all the way back in 1977.  It worked by exploiting quirks in the NTSC color system, called artifact color, which TVs were attempting to suppress.  The design of the Apple II was so solid that its color works rather well on almost anything that can accept a composite signal, even today.  But the color method used did not translate to PAL countries, and later improvements to color filtering could modify the colors shown.  In this article, let's take a deep dive into how artifact color works on the Apple II, how it was adapted for systems where artifact color could not exist, and how artifacts can change according to the display technology inside a display.

How Artifact Color Works

Artifact color is based on a quirk of the NTSC method of decoding color.  A color NTSC signal is made up of three components: brightness/luma, color/chroma and sync.  Brightness and sync were present when the NTSC defined the black and white television standard in 1941.  Color was added by the NTSC in 1953 in a way that was backwards compatible with the black and white standard.

In color NTSC, a 3.58MHz color subcarrier sine wave is transmitted alongside a bandwidth-limited luma signal (generally limited to 4.2MHz due to over the air broadcast constraints).  While the luma signal is an analog waveform of varying voltage levels corresponding to the brightness of the image as captured, the color or hue of the chroma signal is determined by the phase of the 3.58MHz sine wave relative to a "color burst" reference phase, and the saturation of the color is determined by the amplitude of the sine wave.

NTSC defined a system of 525 lines displayed at 60Hz, with the odd lines of an image displayed in the first field of 262.5 lines and the even lines displayed in the second field of 262.5 lines.  This interlaced display would alternate continually between odd and even fields, 60 times per second (Hz) for monochrome NTSC and 59.94Hz for color NTSC.

TV Line Signal

In color NTSC, each line begins with a "back porch", and during this back porch a color burst signal of approximately 8 cycles of a 3.58MHz sine wave with a phase of 180 degrees is transmitted.  (Phase here is the angular offset within a cycle of the sine wave, if you remember your basic trigonometry.)  This signals the TV that color information will be displayed on this line.  Then, as the line travels across the screen, the sine wave phase shifts relative to the burst to define the color that the display should show at that point on the line.  The phase shifts can be plotted on a color wheel according to the phase angle, so 180 degrees is something of a greenish-yellow, 270 is close to cyan, 360 is blue and 90 is light red.  (Some diagrams, like the one above, refer to the color burst as phase 0.)
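To make the phase-to-color relationship concrete, here is a minimal numerical sketch of how a quadrature decoder recovers saturation and phase from a chroma sine wave.  The function and variable names are my own for illustration; a real TV does this with analog circuitry, not code.

```python
import math

F_SC = 315e6 / 88  # NTSC color subcarrier, ~3.579545MHz

def demodulate(samples, sample_rate):
    """Recover (saturation, phase in degrees) from a chroma waveform by
    multiplying it against two reference waves 90 degrees apart and
    averaging (quadrature demodulation)."""
    i_sum = q_sum = 0.0
    for k, s in enumerate(samples):
        t = k / sample_rate
        i_sum += 2 * s * math.sin(2 * math.pi * F_SC * t)
        q_sum += 2 * s * math.cos(2 * math.pi * F_SC * t)
    i, q = i_sum / len(samples), q_sum / len(samples)
    return math.hypot(i, q), math.degrees(math.atan2(q, i)) % 360

# A chroma wave at 45 degrees relative to the reference, at half amplitude:
rate = F_SC * 32                      # oversample at 32 samples per cycle
wave = [0.5 * math.sin(2 * math.pi * F_SC * (k / rate) + math.radians(45))
        for k in range(32 * 100)]     # 100 whole subcarrier cycles
sat, phase = demodulate(wave, rate)   # recovers ~0.5 and ~45 degrees
```

Shift the 45 to 225 and the recovered phase follows; turning that phase angle into a hue on the color wheel is all a TV's chroma demodulator is doing.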

The color NTSC standard defines each line as having 227.5 color clocks or phase shifts per line, and this includes portions of the screen which are not visible and have no corresponding luma signal.  You might think the resulting color would have poor resolution, but the human eye is much more sensitive to changes in brightness than to changes in color.  The luma signal has greater definition and, because color is being overlaid onto luma, can give greater apparent color detail than the limited phase shifting of the color signal can on its own.  Also, a consumer-grade standard definition CRT television resolves only about 250 horizontal TV lines, so the actual horizontal resolution of a CRT is not that detailed.

NTSC Color Wheel Phases

When the device generating a signal, whether it is a TV camera, a VCR, a DVD player or a game console, is manipulating the phase of the color signal directly, we call this "direct NTSC color."  The phase shifting in this case is not only intentional but directly caused by the source device.  But one property of NTSC is that when the luma signal contains frequencies at or near the chroma subcarrier frequency, the NTSC decoding process can erroneously decode that high frequency luma as chroma.  Moreover, repeating patterns tend to display as solid color.  This is the key to how artifact colors can be generated.

Color on NTSC Apple II Systems

The Apple II series consists of six computers, all of which can display NTSC composite artifact color: Apple II, Apple II+, Apple IIe, Apple IIc, Apple IIgs & Apple IIc+.  The 8-bit NTSC Apple IIs are the only computers in the series which rely exclusively on artifact color; they cannot generate direct color.  In other words, they have no control over the phase of the color signal.  The Apple II supports three graphics modes: Graphics Mode, High Resolution Graphics Mode and, in the later systems, Double High Resolution Graphics Mode.  The High Resolution Graphics Mode is the easiest to explain and the mode most often used to produce color on an Apple II, so let us start there.

Apple II Color Reference to Pixel Clock Comparison

High Resolution Graphics Mode displays 280 pixels x 192 lines.  Pixels can be individually turned on or off in this mode, but they are either white or black.  Each memory byte defines seven individual pixels in the lower 7 bits of an 8-bit memory location.  As the pixel clock is twice the NTSC color frequency, one of two artifact colors can be selected from a pair of on and off pixels.  The colors produced were green and purple when the Apple II was first released.

Two Hi-Res Pixels and the Color Cycle, half-pixel periods shown 
Why green and purple?  Well, because the luma signal frequency is 2x the chroma signal frequency, the conditions are ideal for artifact color.  As the pixel is white, it does not carry color information, but the NTSC color reference is still lying underneath it.  When a pixel is displayed at a particular point on a line, the decoder treats the NTSC color wheel as a continually revolving circle and assigns the color whose phase the wheel has reached at that moment.  So when the Apple II displays a purple pixel, the color wheel happens to be at phase 45 degrees when that pixel is being displayed, and similarly at phase 225 when a green pixel is being displayed.  Alternating patterns of on and off pixels produce (mostly) solid lines of purple or green because the composite video signal does not turn off the intensity signal fast enough to stop the color generation for the next pixel period, so the result is a blur of color that can vary a little in intensity.  White pixels could be displayed by turning on two adjacent pixels, and black pixels by leaving two adjacent pixels off.

Very soon after the Apple II's release, it was discovered that with a little extra circuitry, one could use the eighth bit of a memory location to delay the phase by 90 degrees and produce two more colors.  These new colors were orange (phase 135) and blue (phase 315).  Due to the nature of this improvement (really a hack), the system cannot mix blue and orange with green and purple within the same data byte.  You can see how the color signal delay plays out: a purple pixel is offset from a green pixel vertically above or below it by one pixel clock to the left, as is a blue pixel from an orange pixel.  Purple pixels are offset half a pixel clock from blue pixels, and green pixels half a pixel clock from orange pixels.  In the revision 1 mainboards, this color delay was implemented without having to hack into the board, and with the plummeting cost of memory (High Res graphics officially requires 16KiB), usage of the High-Res graphics mode became the norm by 1980.
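The pixel-pattern rules above can be sketched as a toy decoder for a single hi-res byte.  This is a simplification, not a schematic-level model: it ignores pixel adjacency across byte boundaries and the half-pixel shift of the delayed palette, and the even/odd column-to-color convention is an assumption (some references count columns the other way around).

```python
# Toy decoder for one Apple II hi-res byte: bits 0-6 are pixels (bit 0 is
# drawn first/leftmost), bit 7 selects the 90-degree phase-delayed palette.

def hires_byte_colors(byte, start_column=0):
    palette = ("purple", "green") if byte & 0x80 == 0 else ("blue", "orange")
    bits = [(byte >> i) & 1 for i in range(7)]   # bit 0 drawn first
    colors = []
    for i, bit in enumerate(bits):
        if not bit:
            colors.append("black")
        elif (i > 0 and bits[i - 1]) or (i < 6 and bits[i + 1]):
            colors.append("white")   # adjacent on pixels merge into white
        else:
            # isolated on pixel: artifact color depends on column parity,
            # i.e. where the pixel falls within the color cycle
            colors.append(palette[(start_column + i) % 2])
    return colors
```

Feeding it 0x55 (alternating on/off pixels, high bit clear) yields a run of purple, while 0xD5 (the same pixels with the high bit set) yields blue, matching the four-color scheme described above.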

Artifact color only works well if the pixel clock is an integer multiple of the chroma frequency and the number of color clocks per line is a whole number.  Canonical NTSC uses 227.5 color cycles per line precisely to reduce the visibility of artifact color.  The Apple II uses 228 color cycles per line (160 visible), which a TV can easily tolerate, to ensure that the colors remain the same line after line in the same pixel column.  Relatedly, the Apple II outputs 456 monochrome pixels (displayed and non-displayed) per line.
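The effect of choosing 228 whole color cycles per line instead of NTSC's 227.5 can be shown with a couple of lines of arithmetic (an idealized sketch; line numbering and starting phase are assumptions for illustration):

```python
def subcarrier_phase_at_line_start(cycles_per_line, line):
    """Fraction of a color cycle elapsed at the start of a given line."""
    return (cycles_per_line * line) % 1.0

# Canonical NTSC: 227.5 cycles/line means the subcarrier phase flips 180
# degrees on every line, so any luma-into-chroma artifact alternates and
# tends to average away to the eye.
ntsc_phases = [subcarrier_phase_at_line_start(227.5, n) for n in range(4)]

# Apple II: 228 cycles/line means a given pixel column sits at the same
# subcarrier phase on every line, so artifact colors stack up vertically
# into stable columns of color.
apple_phases = [subcarrier_phase_at_line_start(228, n) for n in range(4)]
```

With 227.5 cycles the phases alternate 0.0, 0.5, 0.0, 0.5; with 228 they are 0.0 on every line, which is exactly the stability the Apple II needs.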

Graphics Mode Color

(Low-Resolution) Graphics Mode is similar to (40-column) Text Mode in that the screen is arrayed in cells of 7 pixels by 8 lines, but the top 4 lines and the bottom 4 lines of each cell can be different colors, and any of the 15 artifact colors the Apple II can produce can be used for each half cell.  The resulting resolution is 40x48, with each byte of graphics memory identifying the color of the top and bottom half of a cell.

Graphics Mode Monochrome

In this mode, each 4-bit value selects a particular square wave pattern, generated by the Apple II's circuitry, for the color signal, which corresponds to one of sixteen patterns of pixels for a 7 pixel by 4 line block.  You can see the patterns generated in the monochrome screen capture above.  There are only 15 colors most of the time because the two gray colors which can be produced are usually identical, although some displays may differentiate them a little.  The blue, violet, orange and green colors are shared between the High Resolution Graphics Mode and Graphics Mode.
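As a hedged sketch of the nibble-to-pattern idea (the exact bit ordering and any per-column rotation are hardware details I am glossing over here):

```python
# Each low-res nibble repeats as a 4-bit pattern, four bits per color cycle.
# The duty cycle (how many of the 4 bits are on) sets the brightness, and
# the position of the on bits within the cycle sets the hue.

def lores_waveform(nibble, cycles=2):
    """Repeat the nibble's bits to form the square-wave color signal."""
    bits = [(nibble >> i) & 1 for i in range(4)]
    return bits * cycles

def duty_cycle(nibble):
    """Fraction of each color cycle the signal is high (rough brightness)."""
    return bin(nibble & 0xF).count("1") / 4

# The two grays (patterns 0101 and 1010) share a 50% duty cycle and differ
# only by a half-cycle phase shift, which most displays cannot distinguish.
```

This also shows why all-ones is white and all-zeros is black: a signal that is always high or always low has no 3.58MHz component for the decoder to mistake for chroma.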

Double High Resolution Graphics Mode displays 560x192 pixels.  In double high resolution mode, each half of a byte defines the color of a group of four contiguous horizontal monochrome pixels, and all the colors which the Apple II can display in the lower resolution Graphics Mode are available.  Essentially, Double High Resolution Graphics is what you would get if you had finer control of the pixels in Graphics Mode, but it requires significantly more memory than Graphics Mode, more than what was affordable in 1977.  Double High Resolution Graphics requires an Apple IIe with an Extended 80-column Text Card installed and a revision B or later motherboard, or a IIc, IIc+ or IIgs.  The Extended 80-column Text Card also unofficially permits a Double Low Resolution Graphics Mode with an 80x48 resolution.

Background and border colors on an Apple II are always black with the exception of an Apple IIgs simulating the 8-bit graphics modes.  The Apple IIgs shows a black background in graphics modes but can show a blue border.  Text modes are white on blue on the IIgs instead of white on black.  

The color burst signal is disabled in text modes (40-column or 80-column by 24 rows) for all but the first revision of the original Apple II, so text in these modes will appear in monochrome without color fringing.  However, text characters displayed in Mixed Text/Graphics Modes, either with High Resolution Graphics or Low Resolution Graphics (taking up the bottom 32 lines/4 rows), will show color fringes on the text on a color monitor or TV.  Any screenshot showing no color fringing on the text in these modes is an emulator feature and not authentic to original hardware (at least not on a composite monitor).  

The genius of this artifact color system is "free color" on standard displays of the day.  Instead of implementing palette registers and lookup tables for color values or delay lines where colors can be chopped up into discrete phases, the Apple II relies solely on the color burst and data values underlying the pixels to produce color and makes the TV or monitor do the rest.  Colors are stable (by NTSC standards) and can be placed almost anywhere on the screen you want.  In other words, a program can draw an orange line diagonally across the screen and the programmer can be assured that the line will appear as some form of orange on any color monitor or TV set.

Color on Apple II PAL Systems

PAL color was developed around a decade after NTSC color and has certain advantages over NTSC.  The most important is that the phase of part of the color signal is reversed every line, and the comparison of the phases between two lines tends to eliminate color errors from phase misalignment except in extreme cases.  For the same reason, high frequency luma is not going to be misinterpreted as chroma (and PAL has a higher luma bandwidth of around 5MHz).  Artifact color therefore cannot be generated on a color PAL display as it can on a color NTSC display; artifact NTSC colors must be converted into direct PAL colors.
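To see why PAL's line-by-line phase reversal suppresses hue errors, here is a toy complex-number sketch.  It is idealized (a real PAL decoder averages with a delay line or relies on the eye to average), and the function names are mine:

```python
import cmath
import math

def received(chroma, phase_error_deg, v_inverted):
    """Model one line: optional V-component inversion at the transmitter,
    a constant phase error in transit, then re-inversion at the decoder."""
    c = chroma.conjugate() if v_inverted else chroma     # PAL V switch
    c *= cmath.exp(1j * math.radians(phase_error_deg))   # transmission error
    return c.conjugate() if v_inverted else c            # decoder undoes switch

true_chroma = cmath.rect(0.3, math.radians(60))  # some hue at 60 degrees
err = 15                                         # degrees of phase distortion

line_n  = received(true_chroma, err, v_inverted=False)  # rotated +15 degrees
line_n1 = received(true_chroma, err, v_inverted=True)   # rotated -15 degrees
avg = (line_n + line_n1) / 2                            # errors cancel
```

Averaging the two lines restores the original 60-degree hue exactly; the phase error survives only as a slight loss of saturation, which is far less visible than a hue shift.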

While there were other computer systems that supported artifact color and were released in PAL countries or had very similar PAL machines (the Tandy Color Computer and Dragon 32/64 lines, the Atari 8-bit line, the Atari 7800, IBM CGA clones), none bothered to translate NTSC artifact color into PAL direct color.  These PAL machines show monochrome graphics when their high resolution modes are used to display what would be NTSC artifact color.


There were PAL Apple IIs starting with the Europlus (derived from the II+) and continuing to PAL and "NTSC International" versions of the IIe.  Apple had supported PAL frequencies since the Rev. 1 motherboards, which came rather early in the Apple II's lifecycle.  The Europlus did not output color on its own; it was strictly monochrome 50Hz out of the box.  The Europlus could be fitted with the PAL Color Encoder Card in Slot 7 to translate the pixel patterns of the Apple II graphics modes into the colors that would have been displayed on an NTSC display.  The PAL Color Encoder Card would tap into the video signal, convert every pixel into a 4-bit value, assign the correct color and generate a PAL-compatible 4.43MHz color signal running at 50Hz with colors assigned according to the pixel patterns.  The resulting display may not have been perfect, but it worked.

Apple IIe PAL Motherboard

Color on the Apple IIe was a little weird.  The Apple IIe PAL motherboard implemented the PAL Color Encoder Card's functionality on the motherboard, so you could get PAL color with a normal PAL color TV or monitor.  In fact, it may even have an improved signal compared to the Europlus II and its PAL Color Encoder Card.

The "NTSC International" IIe motherboard could output color, but it uses a color clock frequency of 3.56MHz and a refresh rate of 50Hz.  It was released in the Platinum IIe style with the numeric keypad in Australia and New Zealand and without the keypad in Europe.  Only Apple's own monitors supported a 3.56MHz color signal at 50Hz.  Regular color NTSC monitors did not like the 50Hz refresh and may not have found the slightly-off color frequency to be a valid color signal.  PAL monitors would not recognize the 3.56MHz color signal, it being far off the standard 4.43MHz.

With both the PAL and International NTSC IIe motherboards, the Auxiliary Slot was moved from the regular NTSC boards to align with Slot 3.  The Extended 80-column memory cards for the PAL and International IIe motherboards plug into both the Auxiliary Slot and Slot 3, requiring a single signal from Slot 3.  

Apple IIe NTSC International Motherboard

There was a PAL //c variant released.  The PAL //c could have color added to it by plugging the PAL Modulator/Adapter into the port the LCD used.  The LCD port carried the appropriate video-related signals from which the video could be reconstructed.  The PAL //c also supported, out of the box, the color monitors sold by Apple, like the Apple IIe NTSC International motherboards did.

Perhaps the best explanation for the reason for the NTSC International machines was that the method of conversion to PAL invariably added dot-crawl visual artifacts.  The NTSC International systems, when paired with the 220/240V Apple Color Monitor IIe, do not exhibit dot crawl issues.

Apple IIgs systems sold outside NTSC countries had a setting to change between 50Hz and 60Hz but with a few exceptions, their composite video output NTSC color only.  There may have been some systems sold for schools in Australia which had a daughterboard mounted on the video chip to permit PAL color output.  

PAL Apple IIs are not clocked lower than NTSC Apple IIs; all Apple IIs run the CPU at 1.02MHz.  This means that programs do not run slower due to a slower clock speed, the way a PAL C64 runs slower than an NTSC C64 (0.985MHz vs. 1.02MHz).  Nor do music and sound effects run at a slower tempo or lower pitch, because the Apple II has no hardware timers available; all music is timed by CPU clock cycles.  Games often do not appear to run slower either.  The video hardware of the Apple II has no vertical blanking interrupt or even a way to poll for vertical blank until the IIc, so it was up to the programmer to keep the graphics in sync with the display.  As very few Apple II games update the display anywhere near 60 times per second, graphics should fit comfortably within 50 frames per second, but tearing may be present on moving objects.

Filtering and Artifact Color

Inside every color TV is a demodulating circuit that separates the luma signal from the chroma signal.  However, any signal separation is imperfect and will lead to issues.  S-Video's chief innovation over composite video was to keep the luma and chroma signals separate at all times so they could not interfere with each other.  S-Video eliminates artifact color because the mixing of chroma and luma and the resulting quirks of separation are no longer present.  Various filtering methods are used with composite video to try to approach the result of S-Video, but they cannot achieve perfect separation.

The most common method used to separate the signals during the Apple II's lifetime (1977-1993) was a notch filter combined with a bandpass filter.  The bandpass filter looks only at the area around the 3.58MHz range to capture the chroma signal, while the notch filter, a kind of band stop filter, attempts to capture everything but the signal in the 3.58MHz area (0-4.2MHz), preserving more luma information than the crude low-pass/high-pass method.

The notch filter was about as sophisticated a filtering method as encountered in most composite color computer monitors and TV sets up to the late 1980s.  The Apple ColorMonitor IIe a.k.a the AppleColor Composite Monitor IIe and the AppleColor Composite Monitor IIc as well as the Commodore 1084s use notch filters.  

The next innovation in luma/chroma composite signal separation was the delay line.  With the two-line delay system, the monitor stores one line of the signal and combines it with the next line as it arrives to extract the luma and chroma.  Because two vertically adjacent lines tend not to have wildly varying colors, when the mostly inverted phases of the color carriers of the two lines are combined they will reduce or cancel each other out, leaving much less affected luma than by just using a notch filter.
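The delay-line separation can be sketched in a few lines.  The sample values are hypothetical, and a real filter works on the analog or digitized composite waveform, but the sum/difference idea is the core of it:

```python
# Two-line (1H delay) comb filter: on a standards-compliant NTSC signal the
# chroma phase inverts between adjacent lines, so summing two lines cancels
# chroma (leaving luma) and differencing cancels luma (leaving chroma).

def comb_separate(prev_line, curr_line):
    luma   = [(a + b) / 2 for a, b in zip(prev_line, curr_line)]
    chroma = [(b - a) / 2 for a, b in zip(prev_line, curr_line)]
    return luma, chroma

# Toy example: constant 0.5 luma with a +/-0.2 chroma ripple riding on it,
# phase-flipped on the next line as NTSC specifies.
line_a = [0.5 + 0.2, 0.5 - 0.2] * 4
line_b = [0.5 - 0.2, 0.5 + 0.2] * 4
luma, chroma = comb_separate(line_a, line_b)
```

The Apple II's signal lacks the line-to-line phase inversion this sum/difference relies on, which is exactly why such filters end up blending the Apple II's alternating color lines instead of separating them cleanly.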

The two-line delay system was an improvement, but it was not perfect.  High frequency horizontal color detail can bleed into the luma channel, leading to "dot crawl".  Additionally, sharp changes in luma can lead to rainbow artifacts and color bleed.  To reduce these unsightly artifacts, a more sophisticated two-dimensional (2D) three-line adaptive system was invented to vary the color comparisons over three lines as they are being drawn and best reduce dot crawl.  Finally, three-dimensional (3D) motion adaptive comb filters compare the lines of the previous frame with those of the current frame to virtually eliminate dot crawl.

Comb filters work by assuming the signal follows the NTSC rules.  Of course, the Apple II signal does not follow the NTSC rules in several respects.  First, the Apple II video signal is progressive rather than interlaced.  Second, the Apple II does not invert the chroma phase on every other line; the phase starts at the same point on every line.  This tends to make comb filters behave a little strangely on the Apple II.  Because comb filters combine the color signal of every pair of lines, they tend to cause a loss of vertical detail.  On the Apple II, this loss can result in the blending of alternating single-color lines.

Karateka with Notch Filter

Consider this screenshot of Karateka.  The ground is being drawn, in canonical Apple II colors, as alternating blue and orange lines.  The raw black and white image (quite commonly seen on the green screens of the day) shows a checkerboard pattern.

Karateka in Monochrome (raw pixel)

Now consider the pattern through a typical two-line comb filter :

Karateka with Delay Line Comb Filter

As you can see, the lines have been blended together into a gray.  Perhaps the crappiest vintage color TVs might blend these lines together into a solid gray as well, but such a TV would make for a crappy computer monitor.  As blue and orange rely on pixel patterns which are offset from each other by one pixel on vertically adjacent lines, combining them could give a white signal, but because luma is off over 50% of the time along the line, you get gray instead.  The same thing would happen for alternating green and purple lines.  However, you can get different colors by mixing colors which are not 180 degrees out of phase with each other.

Finally, let's consider how a 3D adaptive motion comb filter would handle this image :

Karateka (Side B) with 3D Motion Adaptive Comb Filter

As you might note, the sophisticated comb filter has done its job too well here, eliminating virtually all the color, which it sees in this instance as a rainbow artifact to be canceled out.  In fact, we have come full circle to reproduce, as well as a capture card can, the original dots making up the video signal!  The unevenly sized pixels and the resulting scaling artifacts are a symptom of composite/s-video capture cards which only sample composite video at 13.5MHz.  As the pixel clock is 7.16MHz, you would need a sample rate of 14.318MHz to fully capture all the pixels without scaling artifacts.
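A quick back-of-the-envelope calculation shows why 13.5MHz sampling makes Apple II pixels come out uneven (a sketch; the capture card's actual resampling filter is more complicated than this):

```python
import math

SUBCARRIER = 315e6 / 88        # NTSC color subcarrier, ~3.58MHz
PIXEL_CLOCK = 2 * SUBCARRIER   # Apple II hi-res pixel clock, ~7.16MHz
CAPTURE_RATE = 13.5e6          # common ITU-R BT.601 capture sampling rate

samples_per_pixel = CAPTURE_RATE / PIXEL_CLOCK   # ~1.886, not an integer

# Count how many 13.5MHz samples land inside each of the first 12 pixels.
widths = []
for p in range(12):
    start = math.ceil(p * samples_per_pixel)
    end = math.ceil((p + 1) * samples_per_pixel)
    widths.append(end - start)
# Most pixels span 2 samples but some span only 1, hence the uneven look.
# Sampling at 14.318MHz (4x the subcarrier) gives exactly 2 samples/pixel.
```

Because 1.886 is not a whole number, pixel boundaries drift relative to the sample grid, so some captured pixels are two samples wide and others only one.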

If you look at the back of the box in which Karateka was sold, you will see a gray color to the ground, suggesting that comb filtering was not unknown in the 1980s.  But this is not the only example of probable reliance on comb filtering.  Consider the title screen of The Bard's Tale:

The Bard's Tale - Notch Filter

As you can see, the hat of the bard and that of one of the listeners have alternating bands of color: orange/purple, green/blue and purple/blue.  But consider how a comb filter blends these color bands:
 
The Bard's Tale - Delay Line Comb Filter

Now the Bard's hat is a solid pink and the listener's hat has a green color that is different from the normal green (see the Bard's pants) and a purplish-blue ribbon that is neither the normal purple nor the normal blue hue.  If you used a display with a motion adaptive 3D comb filter, you would see dots instead of lines or solid color, something that is definitely not ideal.

A Quick Note about Tandy Color Computer Artifact Color

The NTSC Tandy Color Computer 1 and 2 produce artifact color almost identically to the High-Resolution Graphics Mode of the Apple IIs.  The differences are that there are only 256 horizontal displayed pixels on the CoCos vs. 280 pixels on the Apple IIs (both by 192 lines).  Only the blue/orange artifact color combination is available for white pixels; there is no bit that induces a half-pixel phase delay to change blue/orange pixels into purple/green pixels.  The CoCo permits green pixels to be used instead of white pixels, which generate greenish-brown artifact colors, but that is the limit of the CoCo's capability to generate direct color in a mode which can produce artifact color.  Finally, the phase offset between luma and chroma is in one of two random positions when the CoCo is booted, so the first color pixel on a line could be blue or orange.  Programs would tell the user to press the reset button, which would cycle to the next phase offset and reverse the blue/orange colors, if the color at boot was incorrect.  The CoCos had many lower resolution modes which used direct color and had more colors available, but artifact color did see use in over 100 games.  The CoCo 3 is backwards compatible with the CoCo 1 & 2 modes, adds its own higher resolution, more colorful graphics, and supports RGB video as well as composite video.  One game (Where in the World is Carmen Sandiego?) may support artifact color using those more advanced graphics.

Sunday, October 3, 2021

The EverDrive GG X7 - The Only Game Gear Flash Cart You'll Ever Need?

The EverDrive GG X7, courtesy of Krikzz

Around two years ago, I bought a Game Gear and a TV Tuner off eBay.  The Game Gear was sold as non-working, and after some time I fixed it by recapping the unit.  That was not a fun process.  However, once I did so I had a fully working Game Gear with almost no games to play on it :(  I am a big fan of krikzz products, but for the longest time the only product he sold for the Game Gear was the flash memory based EverDrive GG.  Last year he finally released the RAM-based EverDrive GG X7, and I recently purchased one and will give you my thoughts about it here.

Friday, September 3, 2021

Lag Testing on a Budget

Keeping input and display latency to a minimum is very important when playing any kind of vintage video game which relies to some extent on reflexes.  There are some methods which can test display lag of any display, like the Time Sleuth or the Leo Bodnar Display Lag Testers.  Other methods may require running the same software on two consoles at the same time or connecting one console to two displays via splitters and adapters.  Testing controller latency often requires wiring up an LED or shooting video of a screen and button pressing at a very high frame rate.  These methods tend to be expensive, but what if we consider an approach that is likely to be inexpensive and perhaps cost you nothing?

Is there a Doctor in the Game Console? - The Venus Turbo Doctor 6M

Taiwan may or may not have been the birthplace of commercial video game piracy, but it certainly has a strong claim to have been its nursery.  When video games skyrocketed in popularity in Southeastern Asia with the Famicom, it seemed as though the entire island of Taiwan wanted to cash in on the efforts of the Japanese.  Taiwan was the first source of unlicensed Famicom clones and pirate cartridges.  But cartridges were expensive to make, even for Taiwanese fabs, and the larger games were not very profitable to clone.  Then Nintendo handed the pirates a gift, the Famicom Disk System, and as it turned out the gift kept on giving.  While copying FDS games was child's play for the organized pirates, they saw in the FDS an opportunity to go beyond pirating games originally released on disk.  They created RAM cartridges, hardware devices that worked with the Famicom and the Disk System to permit cartridge games copied to disk to work.  In this blog entry, I will describe my personal experiences with one such device, the Venus Turbo Game Doctor 6M.

Wednesday, September 1, 2021

Nintendo, Sega and the World Outside Japan and North America - Accommodating Non-English Speakers

Early on, most video games did not need to be translated because the amount of text used in these games was very limited.  Some games, like RPGs, were an exception, but by and large most games from the pre-crash era used English when they needed to convey information in written form.  Even in games made by Japanese companies, unless the game was a traditionally Japanese game like Go, Mahjong or Shogi, English was the norm for the simple text messages.

When console games became large enough to hold a significant amount of text and able to tell a story, most or all of a game developed in Japan would tend to use Japanese text.  When these games were released in North America, the Japanese text would be translated into English, generally with some simplification for 8-bit and 16-bit console titles.  But when tongues other than English had to be accommodated, things got interesting.

Monday, August 23, 2021

Custom Game Boy Design - Revitalizing Broken or Hard to See Portable Systems

It is a fact that the original Game Boy, its four widely available successors and its contemporary competitors had many excellent games but some truly awful screens by modern standards.  Handheld screen technology has advanced extraordinarily far since the rose-tinted glass days of 1989.  Today modding kits are available to fix or "upgrade" these machines with replacement screens, so let me discuss my own experiences with one.  

Friday, August 13, 2021

Digital Joysticks and the Apple II

The Apple II had thousands of games released during its long life-span, and from its first game, Breakout (later known as "Brick Out" and "Little Brick Out"), many of them used analog controllers like paddles and joysticks.  Other home computers and consoles used digital joysticks, which were often better for single-screen games than analog devices.  During the Apple II's commercial life, there were a few attempts to bring digital joystick support to the computer.  Since it became a retro-computing machine, there have been a few more homebrew hardware efforts to bring digital input to older games.  This article will give an overview of attempts both old and new.

Monday, August 2, 2021

Bad Keyboard Switch Designs & Quirky Keys of the Past

Having good switches in today's keyboards is taken for granted.  Except for laptops, any computer can be blessed with a keyboard with mechanical switches or good rubber dome switches.  In the old days of computing this was not always the case.  There were many fine keyswitch designs back in the day: IBM buckling spring, Alps switches, leaf spring switches, hall effect, beam spring, magnetic reed.  But this blog post is not about them.  The 70s and 80s also had many bad keyswitch designs, so let's identify some of them and where they reared their ugly heads.

Additionally, some keyboards had keys which functioned unusually compared to the keyboards of today.  We'll take a look at some of those as well.