Classic color NTSC uses a frame rate of 59.94 Hz. However, classic video game consoles and home computers never adhered strictly to the NTSC standard. Here are the exact frame rates I have been able to find:
NES & SNES: 60.0988
GB, GBC & GBA: 59.7275
SGB: 61.1679
SGB2: 59.7275
Apple II, Atari 2600, Colecovision, IBM CGA, PCjr., Tandy 1000, EGA @ 200, MSX, SMS & Genesis: 59.9227
Commodore 64: 59.826
Hercules Graphics: 50.050048
IBM VGA: 70.086303
IBM VGA 640x480: 59.940475
Gamecube & Wii: 60.00222p/59.88814i
How do you calculate the exact frame rate? You take the rate at which the device sends pixels to the TV each second (its pixel clock) and divide it by the number of pixels per line and then by the number of lines per frame.
Let's start with true NTSC. The colorburst frequency is 3,579,545 Hz; that is the number of times the color signal can change in a given second. True NTSC outputs 262.5 lines per field, alternating odd and even fields. The half line tells the TV tube whether to draw an odd field or an even field. Each line of an NTSC display has 227.5 color clocks, a figure chosen to minimize interference with the luminance and audio signals. The frame rate is 3,579,545 / 227.5 / 262.5 = 59.94 Hz. The line rate is 59.94 x 262.5 = 15,734.26 Hz.
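As a minimal Python sketch of that calculation (the constants come straight from the paragraph above):

# True NTSC rates, derived from the colorburst frequency.
COLORBURST_HZ = 3_579_545        # NTSC color subcarrier
COLOR_CLOCKS_PER_LINE = 227.5    # chosen to minimize luma/audio interference
LINES_PER_FIELD = 262.5          # the half line alternates odd/even fields

field_rate = COLORBURST_HZ / COLOR_CLOCKS_PER_LINE / LINES_PER_FIELD
line_rate = COLORBURST_HZ / COLOR_CLOCKS_PER_LINE

print(f"{field_rate:.2f} Hz")    # 59.94 Hz
print(f"{line_rate:.2f} Hz")     # 15734.26 Hz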
For consoles, the calculation is almost identical, but the figures are not. First, we need to know the total number of lines the device outputs every frame. Classic consoles and computer adapters typically output 262 or 263 lines to avoid telling the TV to interlace. No classic NTSC device used that many lines as part of its active video; typically only 192, 200, 224 or 240 lines were used. The Atari 2600 is unique in that, unlike other devices, it does not have modes with a set number of lines. It is up to the programmer to tell its graphics chip, the TIA, when to send a vertical sync signal to bring the TV beam back to the top of the screen. Although 262 lines was the standard, games more often than not output some other number of lines. Some games output so many lines that some modern TVs get confused and think they are receiving a PAL signal! Here is a list of games and the lines they output:
http://www.digitpress.com/library/techdocs/vcs_scanlines.htm
Next we get to the trickier part: figuring out how many color clocks and how many pixels are sent on each line. Typically, consoles use 228 color clocks per line, and the console sends pixels at some multiple of the color clock rate. The Atari 2600 sends one pixel per clock, but only 160 of those 228 pixels are active. The Apple IIe and IBM CGA can send one, two or four pixels per clock. The Sega and Nintendo 8-bit consoles usually send 1.5 pixels per clock; their graphics chips run at 5,369,317.5 Hz, 1.5x the colorburst frequency.
The most common line count for Atari 2600 games is 262. This gives us the following calculation for a game's frame rate: 3,579,545 / 228 / 262 = 59.9227 Hz
The Sega 8-bit console outputs 342 pixels over the same 228 color clocks. This gives us the following formula: 5,369,317.5 / 342 / 262 = 59.9227 Hz
The Apple II high-resolution and IBM CGA 320x200 modes output 456 pixels per line at 7,159,090 Hz; the calculation remains the same. In the Apple IIe/IIc double high-resolution mode or the IBM CGA 640x200 mode, both the pixel rate and the number of pixels are doubled again, so the refresh rate remains the same: 59.9227 Hz.
Let us consider the C64. The newer NTSC C64 VIC-II outputs 8 pixels per cycle and uses 65 cycles per line; in other words, it outputs 520 pixels per line. It does this at 8,181,817 Hz. It displays 263 lines per frame, which gives a frame rate of 59.826 Hz.
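To make the pattern concrete, here is a minimal Python sketch that runs the same division for the machines above, using the figures from the text (the labels are mine):

# frame rate = pixel clock / pixels per line / lines per frame
machines = {
    # name: (pixel clock in Hz, pixels per line, lines per frame)
    "Atari 2600":     (3_579_545.0, 228, 262),  # 1 pixel per color clock
    "SMS / Genesis":  (5_369_317.5, 342, 262),  # 1.5 pixels per color clock
    "Apple II / CGA": (7_159_090.0, 456, 262),  # 2 pixels per color clock
    "C64 (NTSC)":     (8_181_817.0, 520, 263),  # 8 pixels x 65 cycles
}

for name, (clock, pixels, lines) in machines.items():
    print(f"{name}: {clock / pixels / lines:.4f} Hz")
# The first three all land on ~59.9227 Hz; the C64 lands on ~59.8261 Hz.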
The NES and the SNES act similarly to the SMS and Genesis, but with two important differences. First, while they output 1.5 pixels per color clock, they output 341 pixels per line, not 342. 341 / 1.5 = 227.33 color clocks. As a result, there is a staircasing effect every three scanlines for colored objects, because each line is offset a third of a pixel from the previous line. Black and white pixels are unaffected, since they don't care about NTSC color. Second, the number of pixels output changes by 1 on the last scanline of every frame: on odd frames the last scanline has 341 pixels, and on even frames it has 340 pixels. Nintendo did this to reduce composite color artifacts, but on non-CRTs it makes for a very gritty display.
Because of the 1-pixel difference, here we calculate using the total number of pixels per frame. That is (341 x 261) + 340.5 = 89,341.5. The formula becomes 5,369,317.5 / 89,341.5 = 60.0988 Hz.
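A minimal sketch of that averaging (figures from the paragraph above):

PIXEL_CLOCK = 5_369_317.5              # NES/SNES dot clock, 1.5x colorburst

# 261 full lines of 341 pixels, plus the last line averaged over
# its odd-frame (341) and even-frame (340) lengths
pixels_per_frame = 341 * 261 + 340.5   # = 89,341.5

print(f"{PIXEL_CLOCK / pixels_per_frame:.4f} Hz")   # 60.0988 Hz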
The Game Boy doesn't care about NTSC color, so its calculation is done differently. The Game Boy runs at 4,194,304 Hz. It draws a frame in 70,224 clock cycles (not machine cycles!). 4,194,304 / 70,224 = 59.7275 Hz. The Game Boy Color runs twice as fast and draws a frame in twice as many cycles; ditto for the Game Boy Advance, except that it runs four times as fast as the Game Boy and takes four times as many cycles to draw a frame. The Super Game Boy uses the SNES master clock (21,477,273 Hz) divided by 5 to run at 4,295,454.6 Hz. Because the Super Game Boy still takes 70,224 cycles to draw a frame, it draws 61.1679 frames per second. The Super Game Boy 2 does not use the SNES as a clock source; it carries its own 4,194,304 Hz Game Boy clock in the cartridge, so games run at the real Game Boy rate of 59.7275 Hz rather than being accelerated like the original SGB.
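Here is a small sketch of the whole Game Boy family, assuming the clocks and cycle counts given above:

CYCLES_PER_FRAME = 70_224            # base Game Boy cycles per frame

# name: (clock in Hz, cycles-per-frame multiplier)
variants = {
    "GB":   (4_194_304,       1),
    "GBC":  (8_388_608,       2),    # 2x clock, 2x cycles
    "GBA":  (16_777_216,      4),    # 4x clock, 4x cycles
    "SGB":  (21_477_273 / 5,  1),    # SNES master clock / 5
    "SGB2": (4_194_304,       1),    # its own Game Boy crystal
}

for name, (clock, mult) in variants.items():
    print(f"{name}: {clock / (CYCLES_PER_FRAME * mult):.4f} Hz")
# GB, GBC, GBA and SGB2 all give 59.7275 Hz; the SGB gives 61.1679 Hz.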
It's interesting that devices which replicate old systems, such as the RetroUSB AVS, use different frame rates to avoid issues with modern HD screens: https://www.youtube.com/watch?v=uJkBw7rYr2o
I have a question, maybe you know something about it. If some of the game consoles you mentioned that can be hooked up to an NTSC TV have slightly off framerates, how is it possible that there are no visible frame drops or double frames?
SD CRTs were far more tolerant of non-standard and out of spec signals than anything else. This is because they were intended to receive analog broadcast signals, which could deform quite a bit from a TV tower to your antenna. When it comes to the number of lines to draw, they do what they are told. If the signal is wildly out of spec, then you will encounter rolling. Older CRTs had a vertical hold dial that could fix most of these issues. The vertical hold dial on many TVs can essentially dial the vertical refresh down to PAL 50fps.
So you are saying the CRT TVs are syncing to the video signal? I thought they were somehow hard-wired to the AC power frequency, e.g. 60 Hz. Thus, my question.
Yes, there are special sync pulses in a PAL or NTSC signal.
There is an hsync pulse at every scanline, and a vsync pulse for every field.
(There are a few other pulses/signals in there, e.g. to (re)calibrate the black level, and the colorburst as mentioned, which is a 'synchronization' signal for the color decoding circuitry.)
I think they mainly chose sync rates that were virtually identical to the AC power frequency to minimize interference.
"The Super Game Boy 2 does not use the SNES for a clock source, it has a Game Boy 4,194,304 clock in the cartridge, but it is still accelerated by the SNES to its framerate, not an LCD Game Boy's. "
Does the SNES insert duplicate frames for this acceleration or how does it work?
Does the Super GameBoy 2 run at 60.0988 fps or as a real GameBoy at 59.7275 fps?
I know that the GameBoy Interface ultra-low latency (for Gamecube) is supposed to run at 59.7276 fps (very close to the GB's 59.7275 fps). And I also know that many CRTs have a problem displaying that kind of framerate, something that does not seem to apply to the Super GameBoy 2 (because it runs as a SNES?!)
Does the usual GameBoy Player on the other hand run at 60.00222?
So... is the Super GameBoy 2 really as accurate as some say?
The SGB2 gives accurate music playback, which is why Youtube personalities prefer to use it over the original. But the SGB2 must have a little extra stutter, where it must hold one frame a little longer to compensate for the difference between 59.7275 Hz and 60.0988 Hz. An original SGB should not have extra stutter because it drives the SNES PPU faster; instead, it simply runs faster.
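As a rough sketch of the arithmetic behind that stutter (assuming the SNES side outputs 60.0988 Hz while the Game Boy logic runs at 59.7275 Hz):

SNES_HZ = 60.0988   # SNES output rate
GB_HZ = 59.7275     # Game Boy logic rate

diff = SNES_HZ - GB_HZ                # ~0.37 extra output frames per second
frames_between_holds = GB_HZ / diff
seconds_between_holds = 1 / diff

print(f"one held frame every {frames_between_holds:.0f} frames "
      f"(~{seconds_between_holds:.1f} s)")   # every ~161 frames, ~2.7 s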
A PAL SGB must drop frames because the NES and SNES PAL refresh is 50.0070 Hz.
Hmm. Made a test at home just now. Tested Castlevania on a Gameboy and recorded the sound from the start of the first level's music until it runs out and you die. By setting it up this way I will not have a specific frame count to check against, good or bad (?).
PAL Gamecube with Gameboy Interface Low Latency: assuming an fps of 59.8261.
PAL Gamecube with Gameboy Player: 59.8929 fps (very close to the 59.88814 interlaced rate stated above).
Gameboy Advance on batteries: 59.8587 fps.
If we assume the Gameboy Advance is a true 59.7275, then the others would look like:
PAL Gamecube with Gameboy Interface Low Latency: 59.6948 fps.
PAL Gamecube with Gameboy Player: 59.7615 fps.
The second assumption looks worse to me. Looks like the Gameboy Advance is lagging when the batteries are half run out?
"The Super Game Boy 2 does not use the SNES for a clock source, it has a Game Boy 4,194,304 clock in the cartridge, but it is still accelerated by the SNES to its framerate, not an LCD Game Boy's."
Sorry, but this statement is not true at all, because the SNES uses the SGB2's framerate and not its own framerate. If the SNES didn't use the SGB2's framerate, all GB games would actually run faster than on the GB Player or even an original GB, but that's not the case: games run at the same framerate on the SGB2 as on other platforms (besides the SGB1, obviously).
That's also the reason the SGB1 runs faster: the SNES is using the SGB1's framerate.
Also, there is a mistake at the beginning of your article, because the SGB2 runs at 59.7275 Hz and not 60.0988.
Is this true? So will the SGB2 be as good as GBI ULL?
Sorry, late to the party. Yes, it's true. The article should be edited to fix the SGB2 framerate. The SGB2 being accurate to a real Game Boy lets it be used in speedrunning for competitive games.
GBI ULL is superior to the SGB2 when it comes to framerate.
What happens when these consoles are connected to an LCD TV? Are LCD TVs somewhat flexible with the refresh rate the way that CRT televisions are? I would assume that LCD TVs are designed to operate with the 59.94 Hz NTSC refresh rate. When an NES is connected, would an LCD TV refresh at 60.0988 Hz?
I am unsure; I believe it would probably drop frames.
That's what I've always assumed. I know that at least one monitor displays very oddly if a frame rate of 59.93 is used vs 59.94. Windows 11 supposedly uses the former in at least some cases for some strange reason, whereas 10 will use the latter.
Very helpful article, but I spotted a mistake. 5,369,317.5/342/262 as well as 3,579,545/228/262 both yield ~59.92275, not ~59.9275. One of the 2s in the fractional part must have slipped past you when you were rounding. The similar-looking figure for the Game Boy family (~59.7275) must have tricked you into following its pattern.
This is true. It's actually 59.922743, so it doesn't even round up to 59.92275. He does give the correct value while describing the Apple II and CGA frame rates, which are the same as the Atari VCS/SMS/etc.
Excellent article! How would I work out the colour burst from the vsync and hsync? I can't, can I, unless I know the number of lines and/or the colour frequency?
There would be a crystal running at that frequency for either the entire system (typical for older systems) or for the video system (with a separate crystal for the CPU--typical for newer systems). The speed of that crystal will already have been determined for most systems. Otherwise, you can find the crystal on the PCB and determine its frequency from the info printed on it, or you can use an oscilloscope.
Kind people, please answer the question. Is the maximum output frame rate of the Sega Genesis really 59 or 60? Or are there tenths of frames displayed on the screen?
Genesis is 59.922751013550524 Hz for NTSC and 49.70146011994842 Hz for PAL.
Source: Tasvideos PlatformFramerates
Tasvideos is wrong to give excessive decimals for framerates. Framerates depend on the frequency of the crystal oscillator, which shifts constantly and has a tolerance (accuracy) and a stability error (precision) given in parts per million. A typical stability value of 50 ppm means the Genesis NTSC framerate is really 59.922 ± 0.003 Hz, not a figure with perfect accuracy and no annual drift.
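As a quick sketch of where that ± figure comes from (assuming a 50 ppm stability on the nominal rate):

NOMINAL_HZ = 59.9227    # Genesis NTSC frame rate, rounded
STABILITY_PPM = 50      # typical crystal stability

error_hz = NOMINAL_HZ * STABILITY_PPM / 1_000_000
print(f"{NOMINAL_HZ} +/- {error_hz:.4f} Hz")   # 59.9227 +/- 0.0030 Hz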
Delete"On odd frames, the last scanline has 341 pixels, and on even frames the scanline has 340 pixels. Nintendo did this to reduce composite color artifacts"
marqs was speculating that that was the reason; it's not a fact. That the RGB PlayChoice-10 does not alternate the pixel count is evidence for it. Evidence against is that the PAL NES and SNES don't alternate the pixel count, nor does either N64 console.