If you actually wanted to play Richard Garriott's masterpiece, Ultima IV - Quest of the Avatar, on real contemporaneous hardware and had your choice of platform, which would you choose? The original Apple II version, the Atari 8-bit version, the Commodore 64 version, the IBM PC version, the Atari ST version, the Amiga version, the NES version or the Sega Master System (Europe only) version? Outside of Japan, these were your choices.
Putting aside the console versions, which are not quite 100% faithful to Garriott's vision (the SMS version is very close, quite enjoyable, and easily puts the NES version to shame), the version to play is the IBM PC version. It may not have the nicest-looking introduction and it has no music, but what it does have is the advantage of speed. It is the only version that can officially be installed to a hard disk; all the other versions had to be run off floppy drives, and the 8-bit home computer versions typically required swapping among four disk sides.
Ultima IV accesses the floppy drives a lot. Every time you enter or leave a town or castle, the game loads the next area from disk. Every conversation requires a disk access to load the NPC's responses. Every combat requires a load, as does crossing each 16x16 tile boundary. On the 8-bit systems, entering or exiting a dungeon requires flipping the disk over. The 8-bit systems also have slow floppy drives. The Apple II's is the fastest, but the machine does not really have a hardware disk controller. The Commodore 64 and Atari 8-bit drives contain their own controllers, but transfer speeds are crippled by the slow serial buses these systems use to connect their drives. Try any of these versions in an emulator with authentic disk drive speed enabled and see how much fun you have. The IBM PC had a true floppy disk controller on a dedicated adapter card, so its disks transfer with comparative speed. Hard drives were not standard on the Atari ST and Amiga in the first few years of those machines' lives, so those ports were not designed to be hard drive installable.
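To put the floppy bottleneck in perspective, here is a back-of-the-envelope sketch in Python. The transfer rates are my own ballpark estimates for stock, unaccelerated drives (not measured figures), and the 16KB area size is an assumed stand-in for a typical town or castle load:

```python
# Rough, assumed sustained read rates (KB/s) for stock drive setups;
# these are ballpark estimates, not measurements.
drive_rates_kbps = {
    "Commodore 64 (1541, serial bus)":  0.4,
    "Atari 8-bit (810/1050, SIO bus)":  1.0,
    "Apple II (Disk II)":               5.0,
    "IBM PC (360KB floppy, FDC card)": 15.0,
    "IBM PC/XT hard disk":             80.0,
}

AREA_KB = 16  # assumed size of a typical town/castle load

for system, rate in drive_rates_kbps.items():
    print(f"{system:34s} ~{AREA_KB / rate:5.1f} s per area load")
```

Even if the exact numbers are off by a factor of two, the ordering is the point: the serial-bus machines spend tens of seconds on a load that a hard disk finishes in a fraction of a second.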
Ultima IV on the IBM PC suffers from a complete lack of music. When the port was released in 1987, supporting the EGA card was considered a big deal. Sound cards were at best an emerging technology, and the only PC-compatible systems with a music chip were the discontinued IBM PCjr and the Tandy 1000 series. There is a fan patch for the PC version that adds music, later combined with a patch that adds VGA graphics, but it requires a 386 to run at all. Moreover, the PC Speaker sound effects and spell effects are lost due to the increased speed, so the patch is not at all suitable for the machines Origin originally targeted.
The music is not really a big deal. Relatively few Apple II owners had both a Mockingboard and this game, and without that card there was no music on the Apple II at all. Atari 8-bit users did not get the music because Origin saw no need to support anything beyond the 64KB Atari machines. The Commodore 64's music is not that great, and neither is the Atari ST's or Amiga's. Besides, there are only eight music pieces in the game (Towne, Castle, Wanderer, Combat, Dungeon, Shoppe, Rule Britannia and Shrine), and they get old very quickly.
Ultima IV scales reasonably well on low-end machines. It runs extremely well on my Tandy 1000SX in the 16-color Tandy graphics mode at 7.16MHz, and it also runs well in 4-color CGA mode at 4.77MHz, making it suitable for IBM PCs and XTs. The IBM PC version requires 256KB of RAM; the Apple II and Commodore 64 versions require 64KB and do not use extra RAM. The extra memory lets the PC keep conversations and combat scenes in memory, and a hard drive loads the rest far faster than even the 16-bit machines manage from floppy. With real floppies the game checks that the genuine article is being used, but cracks exist and the CD compilations do not suffer from the checks.
Ultima IV is not atypical of the Ultima series, or of PC ports in general. While many PC ports were graphically or aurally inferior to their Apple, Commodore and Atari counterparts, they can run far better than any of those on real hardware. The 8-bit machines required accelerators (except for the IIgs, which could accelerate 8-bit Apple II software), which are hard to come by and can screw up timing, and the non-PC 16-bit machines also tended to be tricky to speed up. PC ports by the mid-to-late '80s were programmed to be at least somewhat flexible with regard to system speed. In this case, once you get beyond character creation, the 16-color EGA and Tandy graphics are almost on par with the Atari ST and Commodore Amiga graphics.
The only issue I have with the PC port is the keyboard movement buffer, which will queue up to eight keystrokes if you hold down a directional key. At the 4.77MHz speed, however, holding down a directional key reliably queues four or five presses. On AT-class machines you can cancel the buffered movement by pressing the space bar, but that does not work at the 4.77MHz speed.
Wednesday, July 24, 2013
Saturday, July 13, 2013
How to Destroy a Great Game Series - Eye of the Beholder
SSI will forever have a place in the hearts of computer RPG fans for bringing the Advanced Dungeons and Dragons ruleset to personal computers, beginning with the Gold Box games. The Gold Box engine was originally designed for the Commodore 64, and its top-down, tactical, turn-based combat was only one approach to interacting with a game world; others existed. Dungeon Master was very popular in the late 1980s on the Atari ST and Amiga. Its real-time, first-person 3D perspective was innovative, even if you could only turn in 90-degree increments. A port of this popular game to the IBM PC platform was slow in coming, so SSI commissioned Westwood Studios to make a game in the same style using the AD&D license it held.
The result was Eye of the Beholder, which was slightly simplified and streamlined compared with its inspiration. One feature EOB had over DM was that the player could create four characters from scratch, adjust (or max out) their stats, and choose their character classes. The graphics supported 256-color MCGA/VGA, an improvement over DM's 16 colors. The sound of EOB was fairly simple, supporting the PC Speaker, the Tandy 3-voice chip or the Adlib.
This game was all about slaying monsters, solving puzzles and trying to avoid traps. Since fights occurred in real time, fast maneuvering became important, especially against tough enemies. You can attack enemies on their flanks, but they can attack yours as well; being surrounded meant you could not move, and unless you killed your enemies quickly, you were doomed. Only the front row could attack with melee weapons; the second and third rows could only use thrown weapons or spells. Since this was a first-person-perspective game, all characters could only attack whatever was directly in front of the party. Automapping was not a feature of the original DOS version, though it was available in some of the later ports. Like DM, the EOB series required the player to find food for the party to eat, either in the form of rations or with a Create Food spell.
The style of the game meant that the actual implementation of the AD&D 2nd Edition system was simplified. Classes were basic, and the game engine imposed limits on the more unusual classes like the Bard and the Druid. Item manipulation in the main window required a mouse, just like DM. Spells were generally limited to attack, defense and healing, and magical items concentrated heavily on weapons, armor and wands. Attribute bonuses, saving throws, THAC0 and Armor Class were all present. Alignments were generally unimportant: your party is good, the monsters you fight are evil, and that is as morally complex as it gets.
This being 1990 and not 1987, players expected a bit more than to read the manual and then thrust themselves into the game. There was an opening cinematic outlining the plot, with music (which stopped once you entered the game proper). The plot was thin even by 1990's standards, but if people wanted a good plot in a computer RPG, they would play Ultima VI. The various levels of the dungeon had distinct designs, from the red brick and slime of the sewers to the stonework of the dwarven levels and the onyx designs of the drow. Sound effects were especially important, as the various noises could tell you how close enemies were. A special portal system allowed your party to warp between levels of the dungeon.
The game was a big success, and successes equal sequels. The next game in the series, 1991's Eye of the Beholder II : The Legend of Darkmoon, used the exact same engine but made several improvements to the gameplay. The opening cinematic was pretty spectacular when people first saw it. Outdoor areas were added, and the game was far larger than its predecessor. There was an ending cinematic, which was sadly absent from the MS-DOS version of the first Eye of the Beholder. The monsters were more varied and more devious, and the game was more difficult and could at times be cruel to the player. Items were not quite as nice as in the first game.
Characters started at just about the level where EOB left off. Westwood Studios took inspiration from the Gold Box games and allowed you to import your characters from EOB. Imported characters retained virtually all of their equipment, unlike in the Gold Box games, which generally found a way to nerf your characters almost every time you imported them from the previous game. Even with your high-powered, over-developed characters from the first game, it was still hardly a cakewalk. Perhaps due to the increased difficulty, the game offered multiple save slots, whereas the original only supported one save.
Even without support for digitized sound, the second game was successful and improved on the weaknesses of the first game without alienating its fans. Both games ran pretty well on modest (386SX) hardware and did not require EMS or other types of exotic memory management. For 1991, this was acceptable.
Dungeon Master was finally ported to MS-DOS in 1992, by which point it was looking really creaky: it was the same game people had played on their STs and Amigas in 1987 and 1988, and its expansion pack, Chaos Strikes Back, was never released for MS-DOS. However, with 1993's Eye of the Beholder III : Assault on Myth Drannor, everything took a turn for the worse, a really bad turn. Playing the port of Dungeon Master, going through EOB 1 & 2 again or trying Ultima Underworld suddenly seemed very attractive.
Westwood Studios was not involved in the development of EOB3; instead, Westwood used the engine to make the well-regarded Lands of Lore : The Throne of Chaos. SSI chiefly wanted to add digitized sound support to the EOB engine, and to do that they hired John Miles, later famous for his Miles Sound System middleware, to revise it. The original EOB1/2 engine was strictly meant for real mode and 640KB of RAM. Instead of just requiring EMS, which would have solved the problem of storing sound samples, Miles rewrote the engine, now called AESOP, to use the CPU's 16-bit protected mode.
16-bit protected mode was supported by the 286 and above; by contrast, DOOM used 32-bit protected mode and only worked on a 386 or better CPU. By the time the game was released in 1993, nobody cared about the 286, and SSI did not even state on the box that the game worked with one. The engine's performance on anything but high-end hardware was dreadful. There is a patch that converts the game to a 32-bit version of the engine; it has issues with sound stuttering on real hardware and in current versions of DOSBox (it works in DOSBox 0.73), but it apparently plays more smoothly than the original 16-bit engine. Regardless of engine, loading a saved game takes far longer than it should, especially compared with EOB 1 or 2.
EOB 1 and 2 shared a connected plot. EOB3 has no connection to the previous games other than taking place after your party returns victorious from Darkmoon. The plot is not particularly developed in the game, and the story in the manual has, at best, only a thematic connection to the game's plot. The opening cinematic for EOB3 is nowhere near as impressive as EOB2's was.
The digitized sound that SSI and Miles were so keen to incorporate detracts from the immersion instead of adding to it. The ghosts in the opening level and the undead warriors in the mausoleum make machine-like noises. The sound is extremely loud, usually unpleasant, and it never seems to stop; turn down your speakers, or your significant other will make you turn them down or order you to put on your headphones. On lower-end machines, the game pauses at times while sound samples load off the hard disk into memory, and playing with digitized sound enabled can make for a really choppy experience. In addition to FM music, the Roland MT-32 and compatibles are also supported, but outside the introduction, music is heard so infrequently that it does not really add to the game.
The greatest innovation this game could boast is the All-Attack button, which let all selected party members attack at the same time. There are some new portraits for your characters this time around, and characters with polearms can attack from the second row, but that is it for the positive innovations.
The difficulty in this game is all over the map. The first level of the mausoleum, usually the second level you encounter, is almost certainly the hardest in the game. Most of the rest is comparatively easy, even the final level, although there is a very difficult stretch just before you meet the Lich. Other than that, the game is easier than its predecessor. The Lich himself, supposedly the main antagonist, is a pushover. The game is (eventually) generous with items, but your mage will have a difficult time gaining the levels needed to memorize all the high-level spells in the game.
The opening level in the graveyard is extremely tedious due to all the time you spend hacking away at trees to find hidden alcoves and, eventually, the exit to the forest path to Myth Drannor. It's also almost entirely open space, and with no automap it is difficult to figure out where you are and where you need to go. Import axes with your characters from Darkmoon.

As a result of EOB3, SSI declined to make more games in the series, and the series is now fondly remembered by its fans as the last of the great games from SSI's AD&D license. SSI did release a tool called Dungeon Hack to allow players to make their own levels using the EOB engine, but it was not as successful as the similarly featured Gold Box engine tool, Forgotten Realms: Unlimited Adventures. SSI would make more AD&D games in the first-person format, but none of them garnered the critical acclaim or sales figures of the Gold Box or the first two Eye of the Beholder games. Westwood Studios would go on to make Dune 2 and the Command and Conquer series. FTL only made one more Dungeon Master game before going out of business.
[Image: Dungeon Master, the original 1st Person Dungeon Crawler]

[Image: Eye of the Beholder, of course they didn't copy everything about Dungeon Master, as you can see]

[Image: You will encounter many of these in your travels]

[Image: Drow architecture, ornate yet oppressive]

[Image: From the EOB2 Intro : Look into my eyes, you will accept my quest!]

[Image: Do you really have a choice?]

[Image: Priestly discipline at Temple Darkmoon]

[Image: From the EOB3 Intro : Seriously, would you accept a quest from this guy?]

[Image: Eye of the Beholder : Lumberjacking Simulator]

[Image: These guys will make you wet your pants, but they are the third monster you encounter in the game]

[Image: Faces of the Beholder, Eye of the Beholder 1 & 2]
Monday, June 24, 2013
The 64M GB Smart Card - Diamond in the Rough for your Game Boy
Update 10/15/17 : This device is on the list of risky, dangerous and flawed products for your retro consoles : http://nerdlypleasures.blogspot.com/2017/08/flawed-risky-and-dangerous-devices-for.html Don't buy, don't use, spend the extra money for a properly designed flash cart. I threw mine away.
Update 12/29/15 : This device is almost totally obsolete compared to the EverDrive GB. Read my review of the EverDrive GB here : http://nerdlypleasures.blogspot.com/2015/01/the-everdrive-gb-game-boy-and-game-boy.html
Some years ago, I felt my Game Boy Advance SP (backlit) was not receiving the 8-bit love it deserved. Tired of chasing down all the games I wanted to play on a real Game Boy, I decided to purchase a Game Boy flash cart. The days of the Game Boy were long past in 2010, and handheld systems get precious little attention from multi-cart designers. The old devices from Bung and others had not been manufactured in a very long time, and who wants to deal with parallel-port programmable devices in the 21st century? Nintendo manufactured an official flash cart, the Nintendo Power GB Memory Cartridge, but it only supported a select number of original Game Boy games, required a special commercial burner, and had only 1MB for ROM and 128KB for SRAM.
Fortunately, while interest in Game Boy games is fairly low, interest in Game Boy sound is high. Chiptune music has become more and more popular, and the Game Boy's Audio Processing Unit (APU) is very similar to the NES's APU. Moreover, a Game Boy is portable: it can be brought to a party, a club or a rave, controlled with the buttons on its face, and run off batteries. An NES requires a gamepad and a screen of some kind; it's not very portable and requires a free wall socket. While you can make music on a laptop, it's not very exciting to bring a laptop to a rave, and emulation does not have the allure of real hardware.
A ROM program called Little Sound Dj (LSDJ) was developed to allow music programmers easy access to the Game Boy. There was a need for a cart to store the music that would be played on the Game Boy at these parties and recording sessions, so a new breed of flash carts became available. One of the most common ones, and one I purchased, is the following : http://store.kitsch-bent.com/product/usb-64m-smart-card
You can also purchase it here : http://www.nonelectronics.com/catalog/index.php?main_page=product_info&cPath=2&products_id=112&zenid=b54fe49ab9c28ca36063e603674744bc
At $40.00, the price is very reasonable, and 64 Megabits / 8 Megabytes means the cart can hold two of the largest Game Boy Color games. There is also a 32Mbit / 4MB version. The cart also has 1Mbit / 128KB of SRAM and a mini-USB port, so no external programmer is required. It will not work with Game Boy Advance games.
That 8MB is divided into two 4MB pages, and ROMs can be stored on either page. When you insert the cartridge, the contents of the first ROM page are always displayed; to get to the second page, you must quickly turn your Game Boy off and back on. Game Boy Color games should not share a page with regular monochrome Game Boy games, or glitching will result. The first thing you see after the Nintendo scroll is the menu (unless only one game is on the page). The menu uses the name in each game's header, not the filename of your ROM, and some Game Boy and Game Boy Color games use a Japanese name even though the language of the game is English.
The cartridge is a cheap Chinese-manufactured device, but it is essentially the only device around. It does not fit in a Game Boy Advance SP slot quite as nicely as a real licensed cart; you need to listen and feel for the "click". It drains your Game Boy's batteries much faster than real cartridges, even Game Boy Advance games, and it backs up its SRAM via a coin cell battery. Writing to the card is slow, about 3.5 minutes for one 4MB page. Each page must be written to separately, and writing a new game or games to a page erases all the old games (because it is flash memory). A single game like Shantae or Dragon Warrior III can use one whole 4MB page. Multiple games sharing a page can use no more than 3.75MB, because the menu for the page is stored in the last 32KB. In practice that means no more than three Game Boy Color games fit on a single page, because almost all Game Boy Color games are at least 1MB.
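As a quick sanity check of that packing arithmetic, taking the stated 3.75MB usable limit and the roughly 1MB minimum size of Game Boy Color ROMs:

```python
# Capacity arithmetic for one shared ROM page, using the figures above.
USABLE_KB = int(3.75 * 1024)   # 3840 KB usable when the menu shares the page
GBC_MIN_KB = 1024              # almost every GBC ROM is at least 1 MB

print(USABLE_KB // GBC_MIN_KB)  # 3 -- at most three GBC games per page
```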
Game Boy (B&W) games range in size from 32KB to 1MB. As on the NES, the first games released for the Game Boy fit inside the CPU's address space and did not require any additional hardware inside the cartridge. Almost immediately, however, it was understood that 32KB (the limit of the CPU's addressing capabilities) was simply not going to be enough for games that aspired to something better than first-generation NES games. Nintendo kept much stricter control over mapper hardware than on the NES, which had dozens of different mappers. In the early days of the Game Boy, Nintendo used two "mappers" called MBC1 and MBC2 (Memory Bank Controller). All licensed third-party companies were required to use these two mappers (if their game was larger than 32KB) and did not use their own custom hardware except in a very limited way.
MBC1 could support 2MB of ROM with 8KB of SRAM, or 512KB of ROM with 32KB of SRAM. Games in the US were released with up to 512KB of ROM and 8KB of battery-backed SRAM. MBC2 games used 256KB ROMs with 512 nybbles of battery-backed SRAM integrated into the MBC chip. Much, much later, when the first Pokemon games were released, they used the MBC3, with support for 2MB of ROM and 32KB of SRAM. Additionally, the MBC3 included a battery-backed real-time clock driven by an external oscillator.
For the Game Boy Color, Nintendo made the MBC5 chip, which was included in virtually every Game Boy Color game. This chip could support up to 8MB of ROM (no US game ever required more than 4MB) and 128KB of battery-backed SRAM. It could also support a rumble feature, but not a real-time clock; only Game Boy Color games requiring a real-time clock used the MBC3.
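The header names and mapper types discussed above can be made concrete with a short sketch that reads a cartridge header, which is exactly where the flash cart's menu gets its game names. The offsets (title at 0x134, CGB flag at 0x143, cartridge type at 0x147, ROM size code at 0x148) follow the standard, well-documented Game Boy header layout; the demo bytes below are a synthetic header I made up for illustration, not a real ROM:

```python
# A few cartridge-type codes from the standard header layout.
MBC_TYPES = {
    0x00: "ROM only",
    0x01: "MBC1",
    0x03: "MBC1+RAM+Battery",
    0x06: "MBC2+Battery",
    0x13: "MBC3+RAM+Battery",
    0x1B: "MBC5+RAM+Battery",
}

def read_header(rom: bytes) -> dict:
    """Pull the fields a flash cart menu cares about out of a ROM image."""
    return {
        "title": rom[0x134:0x143].rstrip(b"\x00").decode("ascii", "replace"),
        "gbc": rom[0x143] in (0x80, 0xC0),  # Game Boy Color flag
        "mapper": MBC_TYPES.get(rom[0x147], f"unknown (0x{rom[0x147]:02X})"),
        "rom_size_kb": 32 << rom[0x148],    # 32 KB shifted by the size code
    }

# Synthetic header for demonstration only:
demo = bytearray(0x150)
demo[0x134:0x13B] = b"SHANTAE"  # title field, padded with zeros
demo[0x143] = 0x80              # GBC-enhanced
demo[0x147] = 0x1B              # MBC5+RAM+Battery
demo[0x148] = 0x05              # 32 KB << 5 = 1024 KB (1 MB)
print(read_header(bytes(demo)))
```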
The flash cart will work in a Super Game Boy or the GameCube Game Boy Player. In the Super Game Boy, any game with Super Game Boy features will work fine if it is the only game on the page; if it is not, you must start the game from the menu and then reset it to activate the Super Game Boy features, otherwise the Super Game Boy will treat it as a regular Game Boy game. The cart may be unreliable in a Game Boy Pocket due to its power draw; I would find the highest-capacity AAA batteries you can if you use it in one.
While many, many games work correctly on this cartridge, quite a few will not. Some write to the flash cart's registers and screw things up. Games written for MBC1-MBC3 hardware will sometimes fail to work properly, or at all, because they use a feature that does not work or works differently on an MBC5. The only solution is to patch your ROM. Virtually every game with problems reported on the NESDEV forum has a fix; most of them you can find here : http://thegaminguniverse.org/ninjagaiden4/mottzilla/ and some in this thread : http://forums.nesdev.com/viewtopic.php?f=20&t=5804. According to this FAQ, the first revision of the 32Mbit carts had a functional MBC1 emulation mode : http://blog.gg8.se/gameboyprojects/week09/EMS_FAQ.txt
The main reason I termed this flash cart a diamond in the rough is the stock flashing program. Engrish aside, the chief problem with the program as it runs on the Game Boy is that it only allows one game to save at a time! So if you play The Legend of Zelda : Link's Awakening, save, and do not download your save to your PC, the next game you play, say Metroid II : Return of Samus, will wipe out your Zelda save. Most Game Boy games use only 8KB to save the game, and the cart has 128KB of SRAM available. A NESDEV user named Mottzilla made a custom version of the PC loader that allows multiple saved games on the cart. It is a must-download; find it here : http://thegaminguniverse.org/ninjagaiden4/mottzilla/
With his addon to the flasher program, each game with save features has an 8KB slot reserved in the SRAM (MBC2 games still take 8KB even though they only save 256 bytes). One 32KB save (for games like Pokemon) is supported, so you can have up to sixteen 8KB saves, or twelve plus the one 32KB save, in the SRAM at any one time. You can see which games currently have saves, and you can delete saves from the flash cart's menu. However, the SRAM is shared between the two ROM pages, so it is generally best to use one ROM page for games with save features and the other for games that do not save.
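The slot arithmetic works out neatly, assuming the 128KB of SRAM, 8KB regular slots and the single 32KB slot described above:

```python
# Save-slot arithmetic for Mottzilla's loader, per the figures above.
SRAM_KB = 128   # total battery-backed SRAM on the cart
SLOT_KB = 8     # one ordinary save slot (MBC2 games still take a full slot)
BIG_KB = 32     # the single large slot for games like Pokemon

print(SRAM_KB // SLOT_KB)             # 16 saves if every slot is 8 KB
print((SRAM_KB - BIG_KB) // SLOT_KB)  # 12 ordinary saves alongside the 32 KB one
```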
There are a few weird games that will never work with this cart. Some Japanese games used more exotic mapping hardware like the HuC-1 and HuC-3, which supported an infrared sensor for wireless communication. None were released in the US or Europe except for a Pokemon clone called Robopon - Sun Version, which used the HuC-3 and came in an oversized black cartridge. It has an infrared port for communication with other Robopon carts and, uniquely, the ability to make simple sounds from the cartridge when it is not in use; it has a speaker and an extra (user-replaceable) battery for this function in addition to the battery-backed internal RAM. Finally, it has a real-time clock. Kirby's Tilt 'N Tumble used the MBC7 due to its motion sensor. The Game Boy Camera is another piece of unique hardware that includes a ROM which functions like a game cartridge. Finally, there is an official ROM of Mortal Kombat 1 & 2 that uses MBC1 in an odd way to support its 1MB size; just use the standalone versions of those games instead. Unlicensed games, such as those released by Sachen and Wisdom Tree, use their own custom mappers and are not playable with this flash cart.
Those of us who love Game Boy games have been yearning for years for a proper Game Boy flash cart like the PowerPak: one that accepts microSD cards, does not drain batteries, properly supports all four major MBCs and a real-time clock, and whose menu hardware can disconnect itself from the system once a game loads. It does not exist as of yet (though Game Gear carts do, strangely), so for the moment we are stuck with this device, which is much better than nothing.
Update 12/29/15 : This device is almost totally obsolete compared to the EverDrive GB. Read my review of the EverDrive GB here : http://nerdlypleasures.blogspot.com/2015/01/the-everdrive-gb-game-boy-and-game-boy.html
Some years ago, I felt my Game Boy Advance SP (backlit) was not receiving the 8-bit love it deserved. Tired of chasing down all the games I wanted to play on a real Game Boy, I decided to purchase a Game Boy flash cart. The days of the Game Boy were long past in 2010, and handheld systems get precious attention from multi-cart designers. The old devices from Bung and others had not been manufactued in a very long time, and who wants to deal with parallel port programmable devices in the 21st Century? Nintendo manufactured an official flash cart called the Nintendo Power GB Memory Cartridge, but it only supported a select number of original Game Boy games, required a special commercial burner, and only had 1MB for ROM and 128KB for SRAM.
Fortunately, while interest in Game Boy games is fairly low, interest in Game Boy sound is high. Chiptune music has become more and more popular, and the Game Boy's Audio Processing Unit (APU) is very similar to the NES's APU. However, a Game Boy is portable, it can be brought to a party, a club or a rave and be controlled with the buttons on its face and run off batteries. A NES requires a gamepad and a monitor screen of some kind, its not very portable and requires a free wall socket. While you can make music on a laptop, its not very exciting to bring a laptop to a rave and emulation does not have the allure of real hardware.
A ROM program called Little Sound Dj (LSDJ) was developed to allow music programmers easy access to the Game Boy. There was a need for a cart to store the music that would be played on the Game Boy at these parties and recording sessions, so a new breed of flash carts became available. One of the most common ones, and one I purchased, is the following : http://store.kitsch-bent.com/product/usb-64m-smart-card
You can also purchase it here : http://www.nonelectronics.com/catalog/index.php?main_page=product_info&cPath=2&products_id=112&zenid=b54fe49ab9c28ca36063e603674744bc
At $40.00, the price is very reasonable, and 64 Megabits / 8 Megabytes means that the cart can hold two of the largest Game Boy Color games. There is also a 32 Mbit / 4MB version. It also has 1 Mbit / 128KB of SRAM. It has a mini-USB port, so no external programmer is required. It will not work with Game Boy Advance games.
That 8MB is divided into two 4MB pages. ROMs can be stored on either page. When you insert the cartridge, the contents of the first ROM page are always displayed. To get to the second ROM page, you must quickly turn off and then turn back on your Game Boy. Game Boy Color games should not be on the same page as regular monochrome Game Boy games; glitching will result otherwise. The first thing you see after the Nintendo scroll is the menu (unless only one game is in the ROM page). The menu uses the name in the game's header, not the filename of your ROM. Some Game Boy and Game Boy Color games use a Japanese name even though the language of the game is English.
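Since the menu pulls its names from the cartridge header rather than the filename, you can check in advance what each game will be listed as. A minimal sketch (my own illustration, not part of any loader) that reads the title field at offset 0x134 of a ROM image:

```python
# Read the internal title a Game Boy ROM reports in its cartridge header.
# The title field starts at offset 0x134; on later Game Boy Color games
# some of its last bytes are repurposed, so titles run 11-15 characters.
def header_title(rom_bytes):
    raw = rom_bytes[0x134:0x144]
    title = b""
    for b in raw:
        # Stop at the first NUL or non-printable byte.
        if b == 0 or b < 0x20 or b > 0x7E:
            break
        title += bytes([b])
    return title.decode("ascii")

# Fabricated example: a blank header area with a title written in.
rom = bytearray(0x150)
rom[0x134:0x13A] = b"TETRIS"
print(header_title(rom))   # -> TETRIS
```

Japanese titles will show up here exactly as the menu will display them, whatever your ROM file happens to be named.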
The cartridge is a cheap Chinese-manufactured device, but it's essentially the only device around. It does not fit in a Game Boy Advance SP slot quite as nicely as a real licensed cart; you need to listen and feel for the "click". It drains your Game Boy's batteries much faster than real cartridges, even Game Boy Advance games. It backs up SRAM via a coin cell battery. Writing to the card is slow, about 3.5 minutes for one 4MB page. Each page must be written to separately, and writing a new game or games to the page will erase all the old games (because it's flash memory). A single game like Shantae or Dragon Warrior III can use one whole 4MB page. Multiple games in a ROM page can use no more than 3.75MB because the menu for the page is stored in the last 32KB of the ROM page. Usually that means that no more than three Game Boy Color games can fit in a single page, because almost all Game Boy Color games are at least 1MB.
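The page arithmetic is easy to get wrong when loading a batch of games, so here is a small sketch of the math (the 32KB menu figure is from above; the function itself is just my illustration):

```python
PAGE = 4 * 1024 * 1024   # one 4 MB ROM page
MENU = 32 * 1024         # menu occupies the last 32 KB when multiple games load

def fits_in_page(rom_sizes):
    """True if the given ROM sizes fit in one page, accounting for the
    menu that is appended whenever more than one game is present."""
    usable = PAGE if len(rom_sizes) <= 1 else PAGE - MENU
    return sum(rom_sizes) <= usable

MB = 1024 * 1024
print(fits_in_page([4 * MB]))       # -> True: a single 4 MB game, no menu
print(fits_in_page([1 * MB] * 4))   # -> False: the menu needs its 32 KB
print(fits_in_page([1 * MB] * 3))   # -> True: three 1 MB GBC games fit
```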
Game Boy (B&W) games range in size from 32KB to 1MB. Like the NES, the first games released for the Game Boy fit inside the CPU's addressing space and did not require any additional hardware inside the cartridge. However, it was understood almost immediately that 32KB (the limit of the CPU's addressing capabilities) was simply not going to be enough for games that aspired to something better than first generation NES games. Nintendo kept much stricter control over mapper hardware than on the NES, which had dozens of different mappers. In the early days of the Game Boy, Nintendo used two "mappers" called MBC1 and MBC2 (Memory Bank Controller). All licensed third party companies were required to use these two mappers (if their game was larger than 32KB) and did not use their own custom hardware except in a very limited way.
MBC1 could support 2MB of ROM with 8KB of SRAM or 512KB of ROM with 32KB of SRAM. Games in the US were released with up to 512KB of ROM with 8KB of battery-backed SRAM. MBC2 games used 256KB ROMs with 512 nybbles of battery-backed SRAM integrated into the MBC chip. Much, much later, when the first Pokemon games were released, they used the MBC3 with support for 2MB of ROM and 32KB of SRAM. Additionally, the MBC3 also included a battery-backed real time clock chip driven by an external oscillator.
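The MBC1's 2MB limit comes from a 5-bit bank register extended by two bits borrowed from the RAM bank register, and that register has a well-known quirk: writing 0 selects bank 1, which is one reason some MBC1 titles misbehave on MBC5-style flash cart hardware. A hedged sketch of the bank selection logic (based on the commonly documented MBC1 behavior, not on this cart's actual hardware):

```python
# MBC1 selects the switchable 4000-7FFF ROM bank from a 5-bit register,
# optionally extended by two upper bits in "ROM banking" mode.
# Quirk: the 5-bit register treats 0 as 1, so banks 0x20, 0x40 and 0x60
# are unreachable, which is why some >512KB MBC1 games need patched ROMs.
def mbc1_rom_bank(bank5, upper2=0, rom_mode=True):
    low = bank5 & 0x1F
    if low == 0:
        low = 1                      # hardware maps 0 -> 1
    return ((upper2 & 0x03) << 5) | low if rom_mode else low

print(hex(mbc1_rom_bank(0)))             # -> 0x1 : writing 0 gives bank 1
print(hex(mbc1_rom_bank(0x20)))          # -> 0x1 : bank 0x20 is unreachable
print(hex(mbc1_rom_bank(0, upper2=1)))   # -> 0x21 : next reachable bank
```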
For the Game Boy Color, Nintendo made a MBC5 chip that was included in virtually every Game Boy Color game. This chip could support up to 8MB of ROM (no US game ever required more than 4MB) and 128KB of Battery Backed SRAM. It could also support a rumble feature, but not a real time clock chip. Only Game Boy Color games requiring a real time clock used MBC3.
The flash cart will work in a Super Game Boy or the GameCube Game Boy Player. In the Super Game Boy, any game with Super Game Boy features will work fine if it is the only game in the page. If it is not, then you must start the game from the menu, then reset the game to utilize the Super Game Boy features. Otherwise, the Super Game Boy will play the game as if it were a regular Game Boy game. It may be unreliable in a Game Boy Pocket due to the power draw; I would find the highest-capacity (mAh) AAA batteries you can if you use it in one.
While many, many games work correctly on this cartridge, quite a few will not. Some write to the flash cart's registers and screw things up. Some games written for MBC1-MBC3 hardware will fail to work properly or at all because they use a feature which does not work or works differently on an MBC5. The only solution is to patch your ROM. Virtually every game that has had problems reported on the NESDEV forum has a fix. Most of them you can find here : http://thegaminguniverse.org/ninjagaiden4/mottzilla/ and some you can find in this thread : http://forums.nesdev.com/viewtopic.php?f=20&t=5804. According to this, the first revision of the 32 Mbit carts had a functional MBC1 emulation mode : http://blog.gg8.se/gameboyprojects/week09/EMS_FAQ.txt
I termed this flash cart a diamond in the rough mainly because of the stock flashing program. Engrish aside, the chief problem with the program as it runs on the Game Boy is that it only allows for one game to save at a time! So if you play The Legend of Zelda : Link's Awakening, save, and do not download your save to your PC, then when the next game you play is Metroid II : Return of Samus, you will have wiped out your Zelda save. Most Game Boy games only use 8KB to save the game, and the cart has 128KB of SRAM available. A NESDEV user named Mottzilla made a custom version of the PC loader that will allow for multiple save games on the cart. It is a must-download; find it here : http://thegaminguniverse.org/ninjagaiden4/mottzilla/
With his addon to the flasher program, each game with save features has an 8KB slot in the SRAM reserved for it (MBC2 games still take 8KB even though they only save 256 bytes). One 32KB game save (for games like Pokemon) is also supported, so you can have up to 16 8KB saves, or 12 plus the one 32KB save, in the SRAM at any one time. You can see which games currently have saves and you can delete saves from the flash cart's menu. However, the SRAM is shared between the two ROM pages, so it is generally best to use one ROM page for games with save features and the other ROM page for games that do not save.
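The slot arithmetic can be sketched out as follows (the exact packing used by Mottzilla's loader is my assumption; the 8KB and 32KB figures are from above):

```python
SRAM = 128 * 1024   # total battery-backed SRAM on the cart
SLOT = 8 * 1024     # one slot per saving game
BIG  = 32 * 1024    # the single large save (e.g. Pokemon)

def small_slots(use_32k_save=False):
    """How many 8KB save slots remain, with or without the 32KB save."""
    remaining = SRAM - (BIG if use_32k_save else 0)
    return remaining // SLOT

print(small_slots(False))   # -> 16 eight-KB slots
print(small_slots(True))    # -> 12 eight-KB slots beside the one 32KB save
```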
There are a few weird games that will never work with this cart. Some Japanese games used more exotic mapping hardware like the HuC-1 and HuC-3, which supported an infrared sensor for wireless communication. None were released in the US or Europe except for a Pokemon clone called Robopon - Sun Version, which used the HuC-3 and came in an oversized black cartridge. It has an infrared port for communications with other Robopon carts. Uniquely, it has the capability to make simple sounds from the cartridge when the cartridge is not in use. It has a speaker and an extra (user replaceable) battery for this function in addition to the battery backed internal RAM. Finally, it has a real time clock. Kirby's Tilt 'N Tumble used the MBC7 due to its motion sensor. The Game Boy Camera is another piece of unique hardware that includes a ROM which functions like a game cartridge. Finally, there is an official ROM of Mortal Kombat 1 & 2 which uses MBC1 in an odd way to support its 1MB size; just use the standalone versions of the games instead. Unlicensed games, such as those released by Sachen and Wisdom Tree, use their own custom mappers and are not playable with this flash cart.
Saturday, June 22, 2013
The Sound Blaster 2.0 and the C/MS Upgrade
I decided to make this post to gather all the known, accurate and current information about the C/MS Upgrade for the Sound Blaster 2.0. This post supersedes anything I have said in prior blog posts. As you should know, Creative Labs' first PC sound card of any note was the Creative Music System card (C/MS), later re-marketed as the Creative Game Blaster. This card was based on two sound chips CL labeled as CMS-301, but which were actually Philips SAA-1099s. The C/MS card was not a great seller, and even the name change failed to dislodge the Ad Lib card from its increasingly dominant position in the affordable PC sound card market. Thus Creative came up with the "killer card", the Sound Blaster. This card combined the full functionality of the Ad Lib card with almost all of the functionality (the detection chip was eliminated) of the Game Blaster card and more.
While the Sound Blaster, with its joystick/MIDI port and digital sound processor, began to sell well, the design was definitely in the past with lots of TTL logic chips. The card was originally marketed as a "stereo" card, but the only thing stereo about it was the Game Blaster chips. In the first production runs, the two Game Blaster chips, marked with a CMS-301 sticker, were soldered onto the card (so were the Ad Lib chips, marked FM1312 and FM1314). However, almost nobody really cared about the Game Blaster when the Adlib was also present, and virtually all games that supported the Game Blaster also supported the Adlib, so for the 1.5 version of the Sound Blaster the chips were not installed by default. The two empty sockets could be populated with chips purchased from Creative Labs for the low, low price of $29.95.
After CL released its new flagship product, the Sound Blaster Pro, it redesigned the original Sound Blaster as a budget card and released it in late 1991. CL enhanced the new card's capabilities by allowing it to record up to 44.1kHz, but it was still a mono card. The C/MS chips were left off the board again, but this time there was a third empty socket. A special version of the upgrade was required from CL, one not really well-identified in the catalog accompanying the card. The third chip had a sticker marking it as 0048013500, and underneath it was a pre-programmed PAL (Programmable Array Logic) chip. (A PAL16L8 chip with the security fuse blown so it could not be dumped.) Eventually, CL stopped advertising the upgrade completely, and without the PAL chip the CMS functionality would not work even if the Philips chips were installed. Given the rarity of boards with the upgrade found in the wild and the lack of advertising, few upgrade kits must have been sold.
However, in 2012, a long-time member of the Vintage Computer Forum named Chuck(G) devised a method to determine the way in which common PALs were programmed. He analyzed the PAL on an officially upgraded Sound Blaster 2.0 and released the instructions to replicate the programmed logic on a GAL (Generic Array Logic) chip. Unlike PALs, GALs can be reprogrammed and do not require expensive and hard to obtain hardware to program. The GAL required is a GAL16V8 and the file to program the chip can be found here : http://www.vintage-computer.com/vcforum/entry.php?328-Cloning-a-HAL-PAL-Part-11. The programmed chip is inserted in the only empty socket that can fit it, the one just above the FM1312 / YM-3812 chip. However, in the almost one year since the first successful report of a GAL SB 2.0 CMS upgrade, not all boards have worked with the upgrade. Below I try to identify each known board and whether or not it works with the upgrade.
Board Types :
The earliest known boards are CT-1350B boards marked with a rev 2, 3 or 4 and do not have "SOUND BLASTER" silkscreened. Here is a photo of a rev. 3 board :
And here is a rev. 4 board :
Between the rev 2 and 3 boards and the rev 4 and later boards, there is one obvious difference: the rev 4 added a DMACTL jumper. This jumper will disable the DMA capabilities of the Sound Blaster, virtually eliminating its ability to reproduce digitized sound. It may not be clearly visible in the photo, but both share a CT1336 Bus Interface Chip and a CT1351 DSP chip, version v2.01. You can identify the DSP chip by the silkscreened text "CT1351V201" on the chip, with the "V201" indicating the version number.
These boards have been proven time and time again to work with the CMS upgrade, whether with a GAL or a PAL. Apparently there is no difference between Lattice Semiconductor and National Semiconductor GALs, but SGS-Thomson GALs do not work.
Later boards look like this :
Now the words "SOUND BLASTER" are next to the CT1350B. This board's revision can be determined by the six digit number silkscreened to the lower left of the address jumpers. The above board is a 049151 and it works with the upgrade. 059316 and 069328 are also known to exist and shown below :
After further testing, it has been determined that the DSP version does not matter; the CT1336 version does. The upgrade can work whether your card has a v2.01 or a v2.02 DSP. If your card has a CT1336 Bus Interface Chip, the upgrade will work. On the other hand, if your card has a CT1336A Bus Interface Chip, the upgrade will not work! Since v2.02 DSPs tend to be paired with CT1336A chips, be very careful to check the bus interface chip before you buy. Make sure your seller shows you a photo of the exact card you will receive.
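The rule of thumb boils down to a single check, sketched here for clarity (this reflects only the findings with the originally cloned GAL programming; the 2019 update below notes a later PAL revision that also handles the CT1336A):

```python
def cms_upgrade_works(bus_interface_chip, dsp_version="v2.01"):
    """Compatibility of the original PAL/GAL CMS upgrade, keyed on the
    bus interface chip; the DSP version (v2.01 vs v2.02) is accepted
    but deliberately ignored, since it does not matter."""
    chip = bus_interface_chip.strip().upper()
    if chip == "CT1336":
        return True
    if chip == "CT1336A":
        return False
    raise ValueError("unknown bus interface chip: %s" % chip)

print(cms_upgrade_works("CT1336", "v2.02"))   # -> True
print(cms_upgrade_works("CT1336A"))           # -> False
```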
Note the new CT1336A and DSP v2.02 on these boards. The 059316 also has small surface mounted versions of the YM3812 and Y3014 chips and a 1993 copyright date silkscreened onto the board. The last four digits refer to the design date of the revision in year-week format. The 059316 and 069328 are the first boards confirmed not to work with the CMS upgrade, regardless of whether a CL PAL or a modern-programmed GAL is used. However, the 049151 is not guaranteed to work, as can be seen here :
It seems that the first two digits, 04, 05 or 06, refer to a revision number. There are some mysteries remaining. For example, will either one of these boards work :
The first is a 049151 with a v2.02 DSP and a CT1336 chip; the second one is identical except for the words "SOUND MACHINE".
2019 Update
In the past year or so, one or two people found a Sound Blaster 2.0 card with an original Creative PAL, the two Creative-marked CMS-301/SAA-1099P chips and a CT1336A Bus Interface Chip, and the Game Blaster functionality worked! Apparently CL revised the programming of the PAL to work with the CT1336A chip at some point, presumably when they started manufacturing CT1336A chips. That way, when a customer called with an upgrade request, one PAL would work for all SB 2.0 cards. This newer PAL has been reverse engineered and can be purchased as a GAL; see here for details : https://www.vogons.org/viewtopic.php?f=5&t=30242&view=unread#p726379
Monday, June 17, 2013
Ultima VII on Real Hardware
Ultima VII - The Black Gate is, in my opinion and that of many others, the greatest Computer Role Playing Game of the first half of the 1990s. But like many Origin games of this time period (Wing Commander I & II), it is extremely sensitive to hardware speed. It runs faster than the ideal on my 486DX2/66 (the characters move like they are on amphetamines).
While Wing Commander I & II can be tamed on my 486DX2/66 by disabling the internal cache, this method doesn't work with Ultima VII. The game re-enables the internal cache on my machine if I disable it beforehand, either with a software utility like ICD.EXE or through the BIOS option. Slowing the RAM speed in the BIOS does nothing substantial to slow down the game. The turbo button will show real slowdown, but the slowdown is a little too great and the game runs at a suboptimal speed; the resulting speed is close to a 386DX40. Like Goldilocks and the Three Bears, getting "too hot" or "too cold" is easy, but getting the speed "just right" is not. Pentium computers will run Ultima VII insanely fast. The game's sequel, Serpent Isle, runs on an updated version of the Ultima VII engine. Serpent Isle has a frame limiter to prevent the action from becoming sickeningly fast, but a powerful machine can still make it play faster than it should.
The ideal system to play Ultima VII on is a 486DX33 with 4MB of RAM, so I have been told. Scorpia made the claim in Computer Gaming World. Apparently this is indicated somewhere on or in the game's packaging, but I have reviewed all the items included in the box and I see no such assertion. The system requirements label on the underside of the box cover just says 2MB of RAM and a 386SX,386 or 486 PC is required and recommends 20+ MHz. Like most games, the more things that are happening on the screen, the slower the resulting gameplay.
The best advice I have for people with a 486DX2/66 is to use the jumpers to lower the FSB speed from 33 to 20MHz. This will turn your CPU into a 486DX2/40, a CPU that never existed in the wild. However, it will produce a correct balance of speed, very close to the ideal.
The next issue is the Voodoo Memory Manager. Ultima VII and Serpent Isle use a custom memory manager that is wholly incompatible with all expanded memory managers, including EMM386, QEMM, CEMM, you name it. All these drivers put the CPU into Virtual 8086 mode, whereas Ultima VII requires true Real Mode for its voodoo memory manager (which the game puts into the so-called "Unreal Mode") to work. Origin specifically informs the user that it used this memory manager to avoid the problems with EMS, but it does not indicate what those problems were. My guess is that the performance hit from using EMS was too much for the game to handle. Ultima VII is also incompatible with Windows for the same reason; it must be run in DOS unless you use the unofficial U7WIN9X patch (for Windows 95-ME).
The problem with these games' EMM386 incompatibility is that EMM386 provides Upper Memory Blocks. Device drivers, such as those for the mouse, CD-ROM and disk cache, can load into UMBs to save precious conventional memory. Ultima VII and especially Serpent Isle require lots of free conventional memory, up to approximately 585K for Serpent Isle with all sound options. No EMM386, no guarantee of UMBs. While CTMOUSE is small enough (3K) that it can be loaded without a substantial impact on conventional RAM, SMARTDRV and MSCDEX require about 25K apiece. Without UMBs it's virtually impossible to load a normal DOS configuration. The official solution was to create a barebones boot disk.
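To see why a normal configuration fails, here is the budget in rough numbers (the driver sizes are from above; the resident DOS footprint is my own ballpark assumption):

```python
# Conventional memory budget without UMBs, in KB.
DOS_RESIDENT = 15        # assumed: MS-DOS 6.x loaded with DOS=HIGH
CTMOUSE      = 3         # mouse driver
SMARTDRV     = 25        # disk cache, approximate
MSCDEX       = 25        # CD-ROM extensions, approximate

free = 640 - DOS_RESIDENT - CTMOUSE - SMARTDRV - MSCDEX
print(free)              # -> 572
print(free >= 585)       # -> False: short of Serpent Isle's worst case
```

Even with a generously small DOS footprint, the numbers come up short, which is exactly why the boot disk was the official answer.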
It may be possible to create UMBs without loading EMM386. On Pentium and later systems, UMBPCI works with proper PCI implementations. On 386 and 486 systems, you can try HIRAM, URAM (1988), RDOSUMB or The Last Byte Memory Manager, depending on whether the software supports your chipset and its use of memory shadowing. You will probably not get as many UMBs as you would with EMM386 (forget B000-B7FF), but if it works for you, it will avoid having to specially boot your system for these games.
The final issue is that this game will thrash a hard drive like a sailor running the gauntlet in the Age of Sail. To avoid those annoying short (or not so short) pauses as the game scrolls the screen, the standard 16-bit IDE controller is not going to cut it. Fortunately there are many options to speed up disk access. On the extreme end of the scale, if you have 32MB of RAM, consider creating a RAM drive with the DOS utility RAMDRIVE. A full install of Ultima VII + Forge of Virtue is 19.5MB without savegames. Serpent Isle and The Silver Seed run to 22.5MB. Thus your RAM disk needs to be about 1.5MB larger than the install size. Just remember to copy the game back to your system.
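For those with the RAM to spare, a sketch of the relevant CONFIG.SYS lines (the paths and exact size here are my assumptions; RAMDRIVE.SYS takes its size in kilobytes, and /E places the drive in extended memory):

```
DEVICE=C:\DOS\HIMEM.SYS
REM 24576 KB = 24 MB RAM drive in extended memory, about 1.5 MB more
REM than the 22.5 MB Serpent Isle + The Silver Seed install
DEVICE=C:\DOS\RAMDRIVE.SYS 24576 /E
```

Copy the game directory to the new drive letter before playing, and copy it back afterwards so your savegames survive the power-off.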
Other improvements include SCSI, whether ISA, VLB or PCI; VLB or PCI IDE; and the use of fast hard drives and CompactFlash cards (with an IDE or rare SCSI adapter). SMARTDRV, a disk caching program, will also assist with slower hard drives (i.e. hard drives you would actually have used at the time).
Ultima VII does not really care too much about the VGA card used, but I would recommend a Tseng ET4000AX if you are forced to use an ISA card; otherwise use a PCI or VLB card. It only supports Adlib, Sound Blaster and Roland MT-32 (and compatibles) for music and sound effects. Roland is obviously the best choice; the game uses stereo panning for sound effects and loads custom patches. Adlib and Sound Blaster rely entirely on FM synthesis for music and sound effects. Voice samples (for the Guardian in the intro and occasionally in-game) require a Sound Blaster or Sound Blaster Pro or compatibles.
Sunday, June 16, 2013
Roland MPU-401 : The Vintage Computing Part that Simply Works
I have, and have owned, lots of vintage PC items since 2006. I used to have an Apple II Platinum with dual Mockingboards; that was quite the system. I have an Atari 800 that works, but no software for it. I have also had lots of PC compatible cards and components. I have owned one of every major iteration of the 3dfx Voodoo cards except the Voodoo 2 (and I had one of those back in the day). At one time I possessed a representative of every major ISA Sound Blaster. I currently possess a Game Blaster, complete in box with all manuals, and while that is an exceptionally rare card, it is not what I consider the highlight of my collection. The quality of the music for the card or its chips is often third-rate (behind the MT-32 and Adlib). I have the true IBM VGA 8-bit card, and while it is also extremely rare, it's not especially impressive in what it can do. I have a Gravis Ultrasound ACE, but from what I hear the full Ultrasound does not particularly conflict with an Adlib or Sound Blaster, so the ACE's Adlib port disabling feature is not a great selling point. The 286 Express Tandy 1000/SX accelerator is also quite a find, but it can overheat with software. The ADP-50L is a fine 16-to-8 bit IDE card, but it has a very long bootup screen.
Perhaps the most prized item in my possession is my Roland MPU-401 MIDI Processing Unit. Equally important is the interface card that came with it, the MIF-IPC-A. The MPU-401 is essentially a computer in its own right, with a CPU, ROM, RAM and a bus interface. The external unit contains all the intelligent circuitry and can be connected to an Apple II, a Commodore 64, an MSX, a Sharp X1, a Fujitsu FM-7 or an NEC PC-88 or PC-98 machine in addition to a PC compatible. All you need is the appropriate adapter card or cartridge. The external unit connects to the interface card/cartridge through a DB-25 male/male cable. In my cable, pins 7, 8, 19, 20 and 21 are not connected and pins 13 & 25 are tied together (both GND).
Just as important, it came with all documentation and manuals : the Roland MPU-401 Technical Interface Manual, the MPU-401 Booklet and the sheet for the MIF-IPC(-A) with the schematic on the back. Finally, the equipment and manuals are in beautiful shape, though it didn't come with the box as I recall. Later a friend of mine donated a second MPU-401 to me, but I don't have a second interface card for it.
The most important thing about this interface is what it represented. Its principal function was to connect to a Roland MT-32 or CM-32L (I have both of them). Unlike the Game Blaster, when a game supported the MT-32, the game almost always sounded best on it (not too hard when the competition was the Adlib Music Synthesizer Card and, to a lesser extent, the Game Blaster). Roland LA Synthesis had a very long reign in the PC sound realm, from 1988 to 1992, when General MIDI devices began taking over.
The interface card provides two I/O ports to the interface at I/O 330 & 331. By its design, it cannot use I/O 332-337 because pins A1 & A2 are unconnected. The traces next to the comparator allow it to be placed virtually anywhere in the PC's 10-bit address space. It can also use any interrupt request available on the ISA bus, but to use an IRQ other than the default 2/9, you also have to cut a trace. Essentially the card is a traffic cop and a buffer for the 8 data bits and a few other signals. It also supplies an Interrupt Request when the MPU-401 needs to assert an interrupt. The card uses a DB-25 female connector and bracket, four 74LS TTL logic chips and decoupling capacitors, one resistor and one electrolytic capacitor. The card is as simple as you get and can easily be recreated. You can find the schematic in my Tutorial : How to Get the Roland MT-32 working with DOS Games. Without it, the complex interface unit is useless.
The card can be used in every PC compatible system, including PCs, XTs, ATs, Tandy 1000s and others. The original MIF-IPC was a more complex card and has issues working in AT-class machines. While the MPU-IPC and MPU-IPC-T can also work with everything, their interface circuitry is on the card, and if the card is lost the breakout box is useless. The last in the series, the MPU-401AT, is not guaranteed to work in anything less advanced than an AT system. However, it was released in 1994, when XT-class machines were no longer a strong market segment, so it may simply not have been tested in them. It is an 8-bit card, and probably will work without trouble in standard and near-standard (Tandy 1000) PCs. However, it, like the SCC-1 and LAPC-I cards, goes for very high prices when auctioned off on ebay. MPU-IPCs and MPU-IPC-Ts are often auctioned off incomplete. In those cases, the card is the important part, and the external MIDI IN and OUT ports can be recreated easily.
The MPU-401, MPU-IPC, MPU-IMC, MPU-IPC-T and LAPC-I all provide two MIDI OUTs. Thus you can control a Roland MT-32 and a Roland SC-55 with one of these devices and not have to use a MIDI THRU, which can add latency and eventually lead to loss of data integrity depending on how many modules are in the Thru chain. In practice, the first module in a MIDI THRU loop will be fine.
In my experience, the interface cards are much more difficult to find than the external interface. If you find one, some things to note :
On the card itself, the traces by the silkscreened I/O and Int boxes should be at the default positions (I/O 330 and IRQ 2/9, respectively). In the unlikely event they are not, you will need to connect them with wire and cut any other traces that were made. Otherwise the card will not be at the ideal settings, and your games will not work with it unless they have an option to manually set the I/O and IRQ values.
The external interface unit has a cover secured by four Phillips screws; remove them. Inside, depending on the age of the board, you may or may not have an EPROM/ROM. Versions 1.2A, 1.2B, 1.3, 1.4, 1.4A, 1.4B, 1.5 and 1.5A exist. Version 1.5A is the version found in the MPU-IPC, MPU-IMC, MPU-IPC-T and LAPC-I, so that is the version you should look for. If the CPU inside is a HD6801VOB55P, then the ROM is embedded in the chip. Serial numbers above 588000 are safe; look on the underside of the box to find it. If the version is not identified by an EPROM sticker, you can find the version number with a program called mputhru. In the end, I am uncertain whether the ROM version makes any difference, at least if you have 1.3 or above.
Second, inside the box there is a silkscreened "ADRS" block with four positions numbered 1-4. This must not be modified if your unit is to work with the MIF-IPC-A at I/O 330-331. If it has been, you must set it back to #1 with a jumper wire. This block was there to allow a single MIF-IPC (and similar cards) to control up to four external interfaces. With the MIF-IPC-A, only address #1 is valid; you need an original IF-MIDI/IBM or MIF-IPC (functionally identical) card to make multiple units work. Here is how the address lines were configured on the IF-MIDI/IBM and MIF-IPC :
A0 - 0 or 1 (selects command/data or status port)
A1 - 0 or 1
A2 - 0 or 1
A3 - 0 or 1 (Unconnected at MPU-401)
A4 - 1
A5 - 1
A6 - 0
A7 - 0
A8 - 1
A9 - 1
The MIF-IPC-A connects the lines as follows :
A0 - 0 or 1 (selects command/data or status port)
A1 - 0 (Not connected to MPU-401)
A2 - 0 (Not connected to MPU-401)
A3 - 0
A4 - 1
A5 - 1
A6 - 0
A7 - 0
A8 - 1
A9 - 1
For all cards, the traces can be cut and a jumper block can be soldered on to select 0 or 1 for A4-A9, which gives 64 choices, though not all of them would work.
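To illustrate how this decoding works (this is my own sketch based on the tables above, not anything from Roland's documentation), the card's comparator responds only when the fixed address lines match, with A0 picking the data/command port (330) or the status port (331):

```python
# Model of the MIF-IPC-A address decode, per the table above.
# A9..A1 must read 1 1 0 0 1 1 0 0 0; A0 selects data/command (0) or status (1).
def mif_ipc_a_selects(addr):
    """Return True if the card responds at this 10-bit ISA I/O address."""
    return (addr & 0x3FE) == 0x330

# The older IF-MIDI/IBM and MIF-IPC only fix A9..A4 (1 1 0 0 1 1) and pass
# A3..A1 through to the external unit, so the card responds across I/O
# 330-33F (A3 is ignored at the MPU-401 end).
def mif_ipc_selects(addr):
    return (addr & 0x3F0) == 0x330
```

Cutting the A4-A9 traces and strapping them differently just changes the fixed bits the comparator looks for, which is where the 64 possible base addresses come from.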
As I understand the function of the LS138 in the MPU-401 box, it is responsive to ISA A1 and A2. Typically both are 0, and since the third '138 input is always 0, only Y0 will be active (low). If A1 or A2 becomes 1, then the MPU-401 at address #1 should be unresponsive. You could thus control up to four MPU-401 boxes in software just by setting the right address : A1 = 1, A2 = 0 gives I/O 332-333; A1 = 0, A2 = 1 gives I/O 334-335; A1 = 1, A2 = 1 gives I/O 336-337.
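Put another way, the box's LS138 turns A1/A2 into a 1-of-4 select. A quick sketch (again my own, assuming the mapping just described, and only meaningful with a card that actually passes A1/A2 through, i.e. not the MIF-IPC-A):

```python
# Map a 10-bit I/O address to the ADRS position (1-4) the LS138 would select:
# 330-331 -> #1, 332-333 -> #2, 334-335 -> #3, 336-337 -> #4.
# A0 is ignored because it only picks the data/command vs. status port.
def adrs_position(addr):
    return ((addr >> 1) & 0x3) + 1
```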
The ADRS block in the MPU-401 should allow you to rearrange the addressing of the '138 by only cutting a trace and installing a jumper block or wire. I have never done this and see no reason to do it, but it is definitely possible, though only with the older cards. Those cards have flaky compatibility in an IBM AT or faster systems, whereas the MIF-IPC-A is solid in any system with an ISA slot, including Tandy 1000s.
Fascinatingly, this unit, released around 1984, combined with the MIF-IPC-A card released in 1986, functioned perfectly as a MIDI interface for well over fifteen years. I am positive that driver support exists for it in every version of Windows from 3.0 with Multimedia Extensions to Windows XP. It has no hanging note bugs, and any game or software requiring Normal (a.k.a. Intelligent) MPU-401 features will work with it.
Unlike virtually every other contemporary sound card, the Roland MPU-401 + MT-32 or SC-55 never requires a separately installed software driver for games or programs. Any driver needed always accompanies the software or can be found as a patch (as with the floppy versions of LOOM, Secret of Monkey Island & Day of the Tentacle). Usually the same can be said for the Adlib, Game Blaster and Sound Blasters, but some games do require drivers supplied on the card's installation disks to work. Once you get to the Sound Blaster Pros and 16s, or your Pro Audio Spectrums and your Gravis Ultrasounds, you will need your installation disks.
Unknown to me for the longest time, there was trouble in this paradise. One game that required an MPU-401 was It Came from the Desert. That game simply would not work with the MT-32 option on my 486 with the MPU-401 and MIF-IPC-A. I put it down to marginal code which did not like my system and did not think of it again for a long time. Eventually, however, I found there were problems in trying to use the MIDI IN of the MPU-401. I isolated the issue to the 74LS04 Hex Inverter on my MIF-IPC-A. The interrupt line goes through that IC and was not being raised to the system, thus breaking many MIDI recording programs as well as It Came from the Desert. I replaced that IC and everything worked thereafter, including that Cinemaware game. Roland did not use all six inverters on the chip and left the unused inputs floating. Apparently this is a bad thing for the life of the chip, and all the unused inputs (pins 1, 11 & 13) should be tied to ground (pin 7).
Saturday, June 15, 2013
8-bit Chip Music vs. Game Music
I love video game music, but I love it in the manner in which it is intended to be enjoyed, playing a video game. As any regular reader of this blog knows (is there anyone out there?), I also love retro consoles like the NES and the SNES. The NES was the first console to truly give music an important role in home video games. Before the NES, many video games, like those found on the Atari 2600 or the Colecovision, relied primarily on sound effects or very short and simple tunes. The sound functionality of the 2600 was not very musical, but the Intellivision, Colecovision, 5200 and Vectrex all had sound chips with basic musical functionality, using square waves and noise to produce sound. Like the consoles, most home computers released in the USA prior to 1985 had fairly primitive sound hardware, like the speaker in the Apple II, TRS-80 and IBM PC or the primitive DACs in the TRS-80 CoCo and early Macintoshes. Other home computers, including the Atari 8-bits, the VIC-20, the TI-99/4A and the IBM PCjr./Tandy 1000, along with the Mockingboard sound cards for the Apple II, had musical chips similar to those found in the pre-crash consoles.
However, the Commodore 64 showed the world it was possible to put an advanced synthesizer chip inside the machine to make music. In fact, it was designed to address, in the opinion of its designer, the unmusicality of its home computer predecessors. Thus the famous SID chip was born, with its three channels, each of which can have a square, triangle, sawtooth or noise waveform selected, plus ADSR envelopes, high-pass, low-pass and band-pass filters, ring modulation, etc.
At the time of the video game crash, video game music was splitting into three separate ideologies. First there was the American ideology of jingles and sound effects. Arcade games, from which this concept arose, needed little else. Since the consoles were considered to be dying and computers did not have exceptional sound capabilities across the board, little attention was given to music, except in certain instances. Richard Garriott and Origin Systems, beginning with Ultima III, supported full music scores, but this was an unusual idea, especially for role playing games. On the Apple II, you needed a special card called the Mockingboard to hear the music, and the board was not especially popular. Phillip Price's Alternate Reality series had a very elaborate musical intro for its time, especially considering it was developed for the Atari 8-bit series. Even so, most of these games, title music aside, were played in silence. Pitfall II : Lost Caverns was a rare exception on the 2600, but it required a complex and unique chip in the cartridge to help it play music.
The second method of music was the Japanese approach. This was focused around the Famicom (NES), the Sega Mark III (Master System) and the 8/16-bit Japanese computers of the time. More and more, the concept of constant music was in force. Konami's Gyruss (1983) and Gradius (1985) are good examples of early arcade games where real music tracks play throughout. While the Mark III had relatively anemic built-in sound hardware, it could be upgraded with an FM Synthesizer Module. The NEC PC-8801 and PC-9801 had Yamaha FM chips, the MSX had sound add-on upgrades, and the Famicom could support external audio chips in cartridges while its built-in sound was put through its paces. Composers like Yuzo Koshiro and Koji Kondo became internationally famous, or at least their music did.
The third method was the European method. Unlike their American and Japanese counterparts, who enjoyed cartridge and floppy disk games, European gamers generally played their games off cassette tapes until the Amiga and ST became popular. Cassette tapes take far longer to load than floppy disks, but the C64 Datasette was far cheaper than the disk drive in Europe. To compare, the Commodore 64 could store 170KB on each side of a formatted floppy disk. While a 90-minute cassette can store more data per side, the system can only handle a 64KB load at a time. Loading a game could be very slow; load times of up to ten minutes were not unheard of. Someone had the bright idea of having the computer do something while a game loaded, and the concept of the chiptune was born. While the tape was turning, a special piece of music would play on the computer. Soon these pieces became complex and pushed the limits of the SID's capabilities. Tunes were also made, on a much lesser scale, for the European Atari 8-bit machines, the ZX Spectrum 128 and the Amstrad CPC machines. The actual in-game music was generally much less impressive, due in part to the fact that the audio no longer had the primary attention of the C64's 1MHz 6510.
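To put those load times in perspective, here is a back-of-the-envelope calculation. The transfer rates are my own ballpark assumptions for illustration: the stock C64 KERNAL tape routines manage on the order of 50 bytes per second, while popular turbo loaders were roughly ten times faster:

```python
# Rough C64 cassette load-time arithmetic (rates are assumed ballpark figures,
# not measurements; real games often loaded less than a full 64KB).
STOCK_RATE = 50      # bytes/second, stock KERNAL loader (approximate)
TURBO_RATE = 500     # bytes/second, a typical turbo loader (approximate)

load_size = 64 * 1024  # a maximum 64KB memory load

stock_minutes = load_size / STOCK_RATE / 60   # roughly 22 minutes
turbo_minutes = load_size / TURBO_RATE / 60   # roughly 2 minutes
```

Even a turbo-loaded game left minutes of dead time, which is exactly the window the loading tune filled.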
Since the American game developers were comparatively muted, Japanese games began defining the music for two-thirds of the gaming world. Super Mario Bros., the game that firmly established the NES and gave people a reason to buy consoles again, has music for virtually every portion of the game except the title screen. The game's classic tunes accompany Mario as he proceeds across the Mushroom Kingdom to save Princess Toadstool (later Peach). There are not one but four major pieces of music depending on the level you are in, and the music plays non-stop until you complete the level. The main theme (da-da-da da-da da) is even today almost instantly recognizable, even by people who have never played the game. Series like Castlevania and Mega Man, both of which began on the NES, were famous for the quality of their soundtracks. Even RPGs like Dragon Warrior, Final Fantasy and the NES ports of the Ultima games had full soundtracks and were rarely silent.
One comparison I can make is between Super Mario Bros. and its C64 clone, The Great Giana Sisters. GGS slavishly copied the gameplay of SMB while adding very little. The C64 had the capability to host a very decent port of SMB, and GGS is something like one. One area where the developers failed to take inspiration, however, is GGS's music. While the game is never silent, having both title screen and in-game music, the in-game music leaves something to be desired. Unlike in SMB, the overworld music is also used for the underworld, so the only time the music actually changes is in the castles. SMB, in addition to the underground theme, also has water level music, invincibility star music and castle music.
The most important distinction, however, between Kondo's music in SMB and Huelsbeck's in GGS is that Kondo's music effortlessly flows with the game. It makes you want to keep going and complements the level: the underground has spooky music, the water worlds have slower-paced music and the castle levels sound harsh. It's catchy stuff. Huelsbeck's in-game music tries to capture that same "traveling along" spirit, but it just does not work nearly as well. The music in GGS calls too much attention to itself compared with what is happening on the screen. The main theme is full of odd notes and sound effects, and the castle theme is discordant. The music is also somewhat at odds with the whimsical and cute nature of the game.
The chief difference between chip music and game music, then, is that chip music is any music composed with a computer or console sound chip, regardless of whether a game is running. On the Commodore 64, the European composers went wild with chip music. However, the chip shows its greatest potential when the system does not have to keep track of input, scrolling or sprites; in other words, when the game is not actually being played. An excellent case in point is the music for Mayhem in Monsterland. This late C64 game from the UK is quite an achievement considering the limitations of the machine; the gameplay is smooth and the graphics are extremely detailed and colorful for the system. However, the in-game music is extremely simple, repetitive and monotonous. The port of R-Type to the system kept the original arcade music intact, and the results are very impressive. The gameplay, however, is less impressive compared to other ports of the game.
The NES showed that you can have good music and good gameplay. Mega Man 2 is one of the best examples of this. The level designs are not always perfect, but the quality of the music just makes you want to keep playing. Mega Man 2 also had some stunning large enemy graphics and animated backgrounds for the time. However, as much as I love the music in games, I do not have it stored on my smartphone, nor do I listen to it in the car. The music is best experienced, by far, when you are actually playing the game.
This brings me to an ideological issue I have with C64 chip tunes. You hear that music while the game's cassette tape is loading; you are not playing during that time. NES and other console games have no loading times, so I am listening and playing. With the C64, you listen, then you play, but rarely both at the same time. In my opinion, spending the first five to ten minutes waiting for your game to load listening to some musician demonstrate how well he can program the SID chip is no treat. I can find better ways to spend my time, thanks.
However, the Commodore 64 showed the world it was possible to put an advanced synthesizer chip inside the machine to make music. In fact it was designed to address, in the opinion of its designer, the unmusicality of its home computer predecessors. Thus the famous SID chip was born with its three channels, each of which can have a square, triangle, sawtooth or noise waevform selected, ASDR envelopes, high, low and bandpass filters, ring modulation, etc.
At the time of the video game crash, video game music was splitting into three separate ideologies. First there was the American ideology of jingles and sound effects. Arcade games, from which this concept arose, needed little else. Since the consoles were considered dying and computers did not have exceptional sound capabilities across the board, little attention was given to music, except in certain instances. Richard Garriott and Origin Systems, beginning with Ultima III, supported full music scores, but this was an unusual idea, especially for role playing games. On the Apple II, you needed a special card called the Mockingboard to hear the music, and the board was not especially popular. Phillip Price's Alternate Reality series had a very elaborate musical intro for its time, especially considering it was developed for the Atari 8-bit series. Even with title music, most of these games were played in silence. Pitfall II : Lost Caverms was a rare exception for the 2600, but that required a complex and unique chip to help it play music.
The second method of music was the Japanese approach. This was focused around the Famicom (NES), the Sega Mark III (Master System) and the 8/16-bit Japanese computers of the time. More and more, the concept of music always was in force. Konami's Gyruss (1983) and Gradius (1985) are good examples of early arcade games where there real music tracks playing throughout. While the Mark III had relatively anemic built-in sound hardware, it could be upgraded with an FM Synthesizer Module. The NEC PC-8801 and 9801 had Yamaha FM chips, the MSX had sound add-on upgrades, and the Famicom could support external audio chips in cartridges, and its built-in sound was put through its paces. Composers like Yuzo Koshiro and Koji Kondo became internationally famous, or at least their music did.
The third method was the European method. Unlike their American and Japanese counterparts, who enjoyed cartridge and floppy disk games, European gamers generally played their games off cassette tapes until the Amiga and ST became popular. Cassette tapes take exponentially longer to load than floppy disks but the C64 Datasette was far cheaper than the disk drive in Europe. To compare, the Commodore 64 could store 170KB on each side of a formatted floppy disk. While a 90-minute cassette can store more data per side, the system can only handle a 64KB load at a time. Loading a game could be very slow, up to ten minute load times were not unheard of. Someone had the bright idea of having the computer do something while a game loaded, and the concept of the chiptune was born. While the tape was turning, a special piece of music would play on the computer. Soon these pieces would become complex and push the limits of the SID's capabilities. Tunes were also made, on a much lesser scale, for the European Atari 8-bit machine, the ZX Spectrum 128 and Amstrad CPC machines. The actual in-game music was generally much less impressive, due in part because the audio no longer had the primary attention of the 1MHz 6510 of the C64.
Since the American game developers were comparatively muted, Japanese games began defining the music for two-thirds of the gaming world. Super Mario Bros., the game that firmly established the NES as giving people a reason to buy consoles again, has music for virtually every portion of the game except the title screen. The game's classic tunes accompany Mario as he proceeds across the Mushroom Kingdom to save Princess Toadstool (later Peach). There are not one but four major pieces of music depending on the level you are in. This music plays non-stop until you complete the level. The main theme (da-da-da da-da da)is even today almost instantly recognizable, even by people who have never played the game. Series like Castlevania and Mega Man, both of which began on the NES, were famous for the quality of their soundtracks. Even RPGs like Dragon Warrior, Final Fantasy and the ports of the Ultima games to the NES had full sountracks and were rarely silent.
One comparison I can make is between Super Mario Bros and its C64 clone, Great Gianna Sisters. GGS slavishly copied the gameplay of SMB while adding very little. The C64 had the capabilities to make a very decent port of SMB, and GGS is something like a decent port. One area the developers failed to take inspiration is in GGS's music. While the game is never silent, containing title screen music and in-game music, the in-game music leaves something to be desired. Unlike SMB, the overworld music is also used for the underworld, so the only time the actual music changes is in the castles. In SMB, in addition to the underground, there are also water levels with music, music when you get an invincibility star and the castle music.
The most important distinction, however, between Kondo's music in SMB and Huelsbeck's in GGS is that Kondo's music effortlessly flows with the game. It makes you want to keep going and complements the level. The underground has spooky music, the water worlds have slower paced music and the castle levels sound harsh. It's catchy stuff. Huelsbeck's in-game music tries to capture that same "traveling along" spirit, but it just does not work nearly as well. The music in GGS calls too much attention to itself compared with what is happening on the screen. The main theme is full of odd notes and sound effects, and the castle theme is discordant. The music is also somewhat at odds with the whimsical and cute nature of the game.
The chief difference between chip music and game music, then, is that chip music is any music composed with a computer or console sound chip, in or out of a game. On the Commodore 64, the European composers went wild with chip music. However, the chip shows its greatest potential when the system does not have to keep track of input, scrolling or sprites, in other words when the game is not actually being played. An excellent case in point is the music for Mayhem in Monsterland. This late C64 title from the UK is quite an achievement considering the limitations of the system: the gameplay is smooth and the graphics are extremely detailed and colorful. However, the in-game music is extremely simple, repetitive and monotonous. The port of R-Type to the system kept the original arcade music intact, and the results are very impressive. The gameplay, however, is less impressive compared to other ports of the game.
The NES showed that you can have good music and good gameplay. Mega Man 2 is one of the best examples of this. The level designs are not always perfect, but the quality of the music just makes you want to keep playing. Also, Mega Man 2 had some stunning large enemy graphics and animated backgrounds for the time. However, as much as I love the music of these games, I do not have it stored on my smartphone, nor do I listen to it in the car. The music is best experienced, by far, when you are actually playing the game.
This brings me to an ideological issue I have with C64 chiptunes. You are hearing that music while the game's cassette tape is loading. You are not playing during that time. NES and other console games do not have loading times; I am listening and playing. With the C64, you listen, then you play, but rarely both at the same time. In my opinion, the first five to ten minutes of waiting for your game to load exist only to let some musician demonstrate how well he can program the SID chip. I can find better ways to spend my time, thanks.
First Generation Roland Sound Canvas Devices
In this article, I will give an overview of the various first generation devices Roland released bearing its "Sound Canvas" branding and GS standard, and their importance to PC gaming. I characterize a "first generation Roland GS device" as any device with the features of an SC-55 but not the full features of an SC-55mkII.
In 1991, Roland released its SC-55 MIDI Sound Generator, its first GS product. This machine supported 16 MIDI channels (percussion on channel 10), 128 capital tones, 69 variation tones and 8 percussion/drum sets. It could support up to 24-voice polyphony at one time (each instrument using 1-2 voices or partials). It also had an effects processor with chorus and reverb, which games (Ultima VIII: Pagan, for example) have used. It has a battery to remember the user's settings. This is a standard CR2032 coin cell in a holder and should be replaced or removed when you purchase a unit.
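Games drive those reverb and chorus effects with ordinary MIDI Control Change messages: CC#91 (Effects 1 Depth) sets a part's reverb send and CC#93 (Effects 3 Depth) sets its chorus send. As a minimal sketch, here is how such messages are built as raw bytes; the specific levels chosen are arbitrary examples, not values any particular game sends.

```python
# Sketch: raw MIDI Control Change messages a game could send to set a
# Sound Canvas part's reverb and chorus send levels.
# CC#91 (Effects 1 Depth) = reverb send, CC#93 (Effects 3 Depth) = chorus send.

def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message (channel is 0-15)."""
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, controller, value])

REVERB_SEND = 91   # Effects 1 Depth
CHORUS_SEND = 93   # Effects 3 Depth

# Heavy reverb, light chorus on MIDI channel 1 (status byte 0xB0):
msgs = control_change(0, REVERB_SEND, 100) + control_change(0, CHORUS_SEND, 20)
print(msgs.hex(' '))   # b0 5b 64 b0 5d 14
```

These six bytes, written to the module's MIDI IN, are all it takes; the module applies the sends to whatever notes subsequently play on that channel.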
Its backlit LCD screen will show which tone is being played by which part, and games can also control the text at the top of the display (Lands of Lore). All first generation devices also supported 128 MT-32 tones and an MT-32 + CM-32L percussion set (63 sounds). The total Sound Canvas patch set is 3MB in size, 1MB of which is probably dedicated to the MT-32 tones.
The SC-55 has two MIDI IN ports, one MIDI OUT and one MIDI THRU port. It has two RCA input jacks, two RCA output jacks and a headphone mini-jack. It also comes with a handheld remote but no software. It has 18 front panel buttons. From here you can mute all sound output from the module, which is very useful if you are sending the input from an MT-32 or CM-32L through it. You can set the MT-32 emulation mode by pressing the instrument left button as you turn the power on and then pressing All. You can set the GS mode by pressing the instrument right button and pressing All. You can do a full factory reset by holding both instrument buttons and pressing All.
The earliest Roland SC-55s simply have a "GS STANDARD" logo on the bottom of the front panel, while slightly later SC-55s have a "GS" logo. Modules with the GS logo may still have General MIDI firmware.
Between these Roland SC-55s and the later devices, there were two changes in the instrument map. Early modules have Fl. Key Click at Program Change #121 and no Breath Noise tone. Later modules in the series would put Breath Noise as the default tone on PC#121 and make Fl. Key Click a variation tone. Breath Noise is the canonical General MIDI tone for PC#121. Early modules also lack a sine wave variation tone at PC#81. These changes can be seen in all General MIDI/GS units and even some simply "GS" branded units. GS-branded SC-55s with a Control ROM version of 1.10 will have the old instrument table, while units with a 1.20 or 1.21 Control ROM will have the new instruments.
If you want to find the version number of your unit, put the SC-55 in standby mode and press both Instrument buttons followed by both MIDI CH buttons. The information will flash on the LCD panel.
A hypothesis has circulated that certain PC game composers may have used an early "GS STANDARD" module to compose their music. Examples include Descent and Descent 2, DOOM, DOOM II and Duke Nukem 3D. However, the composer for DOOM I & II and Duke Nukem 3D, Robert Prince, has stated that he did not use an early SC-55. Moreover, at least six composers contributed to Descent and Descent 2, and there is little likelihood of more than one or two of them having used an early module. I would suggest that this hypothesis is a debunked myth.
Next, Roland released the SCC-1 IBM PC sound card. This card contained the synthesis portion of the SC-55 with a cut-down Roland MPU-401 interface. No battery, no display. Nonetheless, if a game had an install option for the Roland SCC-1, the exact same functionality will be present through a Roland MPU-401 card/breakout box connected to an SC-55 via MIDI OUT. The SCC-1 has two RCA output jacks, one headphone mini-jack, and one MIDI OUT and one MIDI IN mini-DIN; these need adapters to full-size DINs. Pinouts can be found here: http://nerdlypleasures.blogspot.com/2010/03/tutorial-how-to-get-roland-mt-32.html
Soon after, Roland released the GS compatible devices in its "computer module series", the CM-300 and CM-500. The CM-300 has the exact same capabilities as the SC-55 except no LCD panel or buttons and fewer ports. The CM-500 is a CM-300 combined with a CM-32L. Both CM modules lack a battery. Both have one MIDI IN, one MIDI OUT and one MIDI THRU port. They also have two 1/4" audio output jacks and one 1/4" stereo headphone jack. The CM-300 has two 1/4" audio input jacks, intended for an MT-32 or CM-32L's output. The CM-500 has a four-position mode switch instead. The CM-500 is a true Roland LA Synthesis device, but its vibrato has been criticized as noticeably harsher than the MT-32, MT-100, CM-32L, CM-64 and LAPC-I. The SCC-1, CM-300 and CM-500 also support the first 64 tones ("built-in") of the CM-64 module, although the instrument sample quality is superior on the true CM-64.
By no later than March of 1992, Roland added official support for the General MIDI standard to the SC-55, CM-300 and CM-500. These modules are identified by the General MIDI logo next to the GS logo. Unlike the SC-55mkII, the SC-55 cannot initialize GM mode via the front panel. However, it can turn the GS features off by pressing the part buttons simultaneously, using the All and Mute buttons to get to the Rx GS Reset item, and pressing the instrument left button to turn it off (and the right button to turn it on).
What does the addition of General MIDI support mean in practice?
General MIDI is a derived subset of the GS standard. In other words, GS is General MIDI with some extra features. In a pre-General MIDI module, a GS reset and a GM reset sent to the module had the identical effect: the module would be reset. In a General MIDI module, a GS reset will turn on the GS features (variation tone support and NRPN) if the device is in GM mode. You would think that a GM reset would turn those GS features off, but this is not the case unless the Rx GS Reset setting is turned off. With Rx GS Reset off, however, variation tones will not be heard in games that use them (Might and Magic IV-V).
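The two resets discussed above are distinct System Exclusive messages. GM System On is a universal SysEx message, while GS Reset is a Roland "data set" (DT1) message whose last data byte is a Roland checksum over the address and data bytes. As a minimal sketch, here is how both are built as raw bytes:

```python
# Sketch: the GM System On and GS Reset SysEx messages as raw bytes.
# The GS Reset carries a Roland checksum: the low 7 bits of
# (address bytes + data bytes + checksum) must sum to zero.

def roland_checksum(payload):
    """Checksum byte for a Roland DT1 message's address + data bytes."""
    return (128 - sum(payload) % 128) % 128

def gs_reset():
    # F0 41 10 42 12 <addr 40 00 7F> <data 00> <checksum> F7
    body = bytes([0x40, 0x00, 0x7F, 0x00])
    return (bytes([0xF0, 0x41, 0x10, 0x42, 0x12])
            + body + bytes([roland_checksum(body), 0xF7]))

GM_SYSTEM_ON = bytes([0xF0, 0x7E, 0x7F, 0x09, 0x01, 0xF7])

print(gs_reset().hex(' '))  # f0 41 10 42 12 40 00 7f 00 41 f7
```

A game or driver that wants the GS extras sends the GS Reset at startup; one that only targets General MIDI sends GM System On and leaves the module in whatever GS state Rx GS Reset dictates.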
At the beginning of 1992, Roland released the SC-155. This device added sliders to control various settings of the synthesizer and otherwise looks like an SC-55. This is the first device that officially supported the General MIDI standard on its release, as would all Roland products released thereafter.
Later in 1992, Roland released the JV-30 keyboard. This was a music synthesizer keyboard with 61 keys that could save the user's modifications to individual tones. It appears to contain the extra tones found in the SC-55mkII. However, unlike the mkII, it does support capital tone fallback.
What is capital tone fallback and why is it important?
In the GS standard, each of the 128 Capital Tones can support 127 Variation Tones, though that upper limit never came close to being reached. If a composer used variation tone #5 of capital tone #1 (Grand Piano) on his MIDI synthesizer for a certain piece of music, it would not play if a listener played the song back on another device that did not contain variation tone #5. In the early first generation GS devices, if the specified variation tone did not exist in the module, the module would fall back and play the capital tone instead. Since variation tones are supposed to be similar to their capital tone, some semblance of the sound the composer intended would be played. Beginning with the SC-55mkII, however, no sound will be heard if a song selects an invalid variation tone. This manifests itself when programmers got lazy and relied on the fallback instead of sending proper Program Change and Bank Select messages (Space Quest V: The Next Mutation).
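The fallback behavior can be sketched in a few lines. This is purely illustrative (not Roland firmware): a tone is selected by a Bank Select value (the variation number) plus a Program Change (the capital tone), and the tone table and names here are hypothetical stand-ins.

```python
# Illustrative sketch of capital tone fallback. A tone is addressed by
# (bank, program): bank = variation number, program = capital tone.
# The tone table below is a hypothetical example, not a real GS map.

available = {
    (0, 0): "Piano 1",     # capital tone (bank 0)
    (8, 0): "Piano 1w",    # variation 8 of Piano 1
}

def select_tone(bank, program, fallback):
    """Return the tone that actually sounds for a Bank Select + Program Change."""
    if (bank, program) in available:
        return available[(bank, program)]
    if fallback:
        # Early first-generation modules substitute the capital tone...
        return available.get((0, program))
    return None  # ...while the SC-55mkII and later play nothing

print(select_tone(5, 0, fallback=True))    # early module: plays "Piano 1"
print(select_tone(5, 0, fallback=False))   # SC-55mkII behavior: silence
```

This is why a song relying on fallback sounds merely approximate on an early module but drops instruments entirely on a mkII.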
Finally, Roland released an update to the SCC-1. The board itself is labeled the SCC-1A. It has the extra sounds of the JV-30 keyboard but does not support capital tone fallback, a feature apparently patented by Yamaha.
Also during 1992, Roland released the General MIDI-only SC-7. This cut-down module does not support the GS standard, so no variation tones. However, it does support a serial MIDI interface (no longer useless for DOS games thanks to SoftMPU) and 28-voice polyphony. It also supports six of the nine drum sets found in a true Sound Canvas, as well as reverb and chorus. The SC-55mkII would incorporate the increased polyphony of the SC-7 and the sounds of the SCC-1A.