I can tell you what we had been working on before Commodore's management changed, and extrapolate from that where things were likely to go, had they not hired the managers who bankrupted Commodore, and had they spent money more wisely, investing in technology instead of managers' salaries.
If you don’t recognize me, I’m Dave Haynie. I was a senior hardware engineer at Commodore from 1983 to 1994, and found myself working on high-end Amiga machines after taking on the Amiga 2000 design in 1986.
Amiga 4000
There was always going to be an Amiga 4000. But the Amiga 4000 you got, designed by Greg Berlin and Scott Schaeffer, was kind of a last-minute thing… that's yet another story, and I can write about where we did go in another article.
So in 1991, I had been developing the prototype of the next-generation high-end system at Commodore, which had been dubbed the Amiga 3000+. At the time, this was the first system using the Amiga Pandora chipset, which lots of people called “AA” and the marketroids had dubbed “AGA” to make it sound PCish. In fact, this machine almost booted up first shot in January of 1991, but the original Alice chip had a bug, so I didn’t get a display. The next revision was ready in February, and it was up and running. I found a few other Alice bugs and fixed them in the system design. They never fixed them in the chip!
The Amiga 3000+ at Rev 1 did have a 68030 processor, though the plan was to switch over to the 68040 in the next revision, including revised versions of the Buster, Gary, and RAMsey chips. The big improvement, though, was that the A3000+ had the AT&T DSP3210 digital signal processor as a secondary processor, with AT&T’s VCOS/VCAS operating system integrated to cooperate fully with AmigaOS. Amigas were already commonly used for video rendering, and the DSP3210 had the advantage of running 32-bit floating point up to 10x faster than an MC68040.
Connected to the DSP, at least on the prototype, we had two audio CODECs. One was a bidirectional stereo audio CODEC for 16-bit, 48kHz audio. The other was a mono phase-correcting CODEC for telephony, like modems, etc. The deal with AT&T gave us most of their mathematics libraries and modems to 2400 baud in the box. There would probably have been an add-on for a 9600 baud modem, running on the DSP… that was kind of the crown jewel with them, and they didn’t want to bundle it with the basic features.
Amiga 1000+
While not so well known, at the time I was working on the A3000+, Joe Augenbraun was working on a thing we had dubbed the A1000+. This was intended to be another full 32-bit machine, expandable with a CPU slot and two Zorro slots, but selling for under $1,000, in a low-profile case with a detachable keyboard. We were kind of shooting for the Amiga 500 upgrade market there; pretty much everyone thought that machine would put Commodore on the map by selling like crazy. This was another fatality of the mid-1991 management change.
Amiga 5000
In 1988, Commodore started working on the Advanced Amiga Architecture, which of course had no other possible nickname but “AAA” — this is, in fact, why the Pandora chipset was dubbed AA, which stood for nothing other than, well, one less “A”. AAA was a full 32-bit/64-bit graphics system with 1280x1000 display capability, 24-bit graphics in various modes, chunky pixels to 8 bits, planar graphics to 10 bits, a number of “hybrid” pixel modes that basically worked a bit like MPEG, HAM8, HAM10, chunky HAM8, all kinds of stuff. It had 8-channel, 16-bit audio, and a floppy-style interface fast enough for CDs and even lower-end HDDs. It was a four-chip set in the 32-bit implementation, doubling up on the Linda and Mary chips to make a 64-bit version. It supported both fast page mode and dual-port video RAM.
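The HAM (“hold-and-modify”) modes mentioned above all share one trick: each pixel either loads a complete color from the palette or holds the previous pixel's color while modifying just one of its R, G, or B components. Here is a minimal sketch of classic 6-bit HAM decoding in Python; HAM8 and HAM10 extend the same idea with wider data fields. The function name is mine, but the control-bit encoding is the one the original chipset used.

```python
def decode_ham6(pixels, palette):
    """Decode a scanline of 6-bit HAM pixel values into (r, g, b) tuples.

    Each pixel value: bits 5-4 are the control field, bits 3-0 the data.
      00 -> load color `data` from the 16-entry palette
      01 -> hold the previous color, replace its blue component with `data`
      10 -> hold the previous color, replace its red component
      11 -> hold the previous color, replace its green component
    Components are 4-bit (0-15), as on the original chipset.
    """
    out = []
    r, g, b = palette[0]  # the leftmost pixel modifies the background color
    for p in pixels:
        ctrl, data = (p >> 4) & 0x3, p & 0xF
        if ctrl == 0:
            r, g, b = palette[data]
        elif ctrl == 1:
            b = data
        elif ctrl == 2:
            r = data
        else:
            g = data
        out.append((r, g, b))
    return out
```

The payoff is that six bits per pixel can approximate 12-bit color, at the cost of fringing artifacts when a color changes more than one component between adjacent pixels.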
I built a test system, dubbed Nyx, that exploited this chipset. This was for testing purposes only, not really intended as a product per se, but it was intended to hash out some product concepts. For example, it had modules for chip RAM, which could be either VRAM or DRAM, up to 16MB. It had ROM on a SIMM, and that SIMM specification supported flash memory devices, even though we hadn’t actually built a flash device in 1993 when I was working on this. I put in a low-speed (2.5Mb/s) network using ARCnet protocols… not sure if we could have justified Ethernet just yet, but networking was a given.
The prototype used Amiga 3000 chips and the Amiga 3000 processor modules, just because they existed. I had been working on a whole new system architecture, starting back in 1991, dubbed “Acutiator”. This originally defined a modular, CPU-independent system bus, but when PCI came out, there was no question: we would use PCI.
It’s always a little difficult to say just where Commodore would have wound up. The Amiga systems were doing quite well up until the end of Commodore, but not necessarily the systems you could buy. Commodore’s rampant mismanagement, starting in 1991, managed to kill off popular computers like the Amiga 500, replace them with a machine no one at the time wanted, the Amiga 600, and intentionally delay the Pandora systems for six months, mostly just to make the previous engineering management team look bad, which no one believed anyway.
However, there had been years of underfunded development underpinning that final effort at suicide. For example, the AAA project was started in 1988. It should have been shipping in 1992 if not 1990. A 64-bit graphics system with a full four-operand blitter and all that would have been pretty amazing in 1990. It would have been okay in 1992. By 1994 — the earliest it could have been out if things hadn’t completely tanked in 1993 — it would have been okay, but others were already getting serious about 3D by then.
Getting Serious About 3D
Commodore was also getting serious about 3D near the end. Dr. Ed Hepler had been working on a system dubbed Hombre. This was a complete reinvention of the idea of the graphics chips, intended to learn from Amiga but not worry about compatibility — after all, we’d all have RTG by then, so any graphics system would run on an Amiga, eh?
Hombre was chunky-pixel only, and optimized for 16-bit pixels. It supported four simultaneous playfields, and it had its own processor with 3D instructions. So a full GPU, even if different in concept from the GPUs that actually made it to market by 1999. This was intended to be dual use: a two-chip set concentrating mostly on graphics, living on the PCI bus (Dr. Hepler and I had independently decided PCI was our logical future).
The CPU in Hombre was a PA-RISC architecture processor, leading many to speculate that future Amigas would use PA-RISC. Dr. Hepler chose PA-RISC mainly because of the simplicity of the design and the ability to extend the instruction set. Those of us in the systems and software groups, management, etc., had not really had that big talk about where we’d go beyond the 68K. Some felt that PowerPC was a logical next move, since Apple was pushing that into being a proper desktop CPU, but of course, that also meant that Apple got exclusive use of new processors, as long as they were paying for development. The video rendering people were lusting after DEC/Samsung Alpha, simply because it was the big dog in terms of CPU performance in the early 1990s. There was no easy answer, and at least the way history turned out, the only right answers — ARM or x86 — were not on the table.
Investigations of Multiprocessing
Near the end, between 1991 and 1994, I was doing as many projects as I could fit in, hoping that if some interested party came along to buy the company, they’d see all this stuff and understand something about what they were buying.
One of the curious things I made was the Gemini board. This was a Zorro III board with two 68030 processors and FPUs, each with 4MB of DRAM, which could talk to the main system via DMA and interrupts. This was a bit of a clumsy way of doing it, but the idea was to work on a loosely coupled multiprocessing system. Randell Jessup, who had also worked on the DSP software, was working on this concept on the software side.
Unfortunately, this counted on a new feature I had put into the last Buster chip, which allowed it to be used as a board controller rather than a host controller. That worked. But the CSG folks had used that chip to test out the early version of Commodore’s sea-of-gates gate array technology, designed to replace our channeled arrays. Those parts only ran at about 1/4 the designed speed: enough to know it kind of worked, but I really didn’t get very far. By the time the revised chips came along, the window for this project had closed. You never know what to expect when things are spiraling down and completely out of your control.
AmigaOS
I was of course designing the hardware, and that tended to be a full time job… and then some. So I can’t tell everything that was being done in software. But I know a few things.
The big one was Retargetable Graphics. Chris Green was working on a fully modular system that would allow add-on graphics boards to be supported with drivers that worked into the Amiga graphics libraries and everything built on top of that (layers, Intuition, etc). If you’ve followed the Amiga market, this was hackable anyway, more or less by replacing most of the graphics library, but the intent was to formalize this in a much simpler way.
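The idea, in outline: the graphics library stops assuming the custom chips and instead calls through a driver interface, so the same library-level code renders to whatever board the driver knows about. Here is a hypothetical sketch in Python; the class and method names are illustrative inventions, not the actual RTG API.

```python
class GfxDriver:
    """Hypothetical retargetable-graphics driver interface (illustrative only)."""
    def open_screen(self, width, height, depth):
        raise NotImplementedError
    def fill_rect(self, x, y, w, h, pen):
        raise NotImplementedError

class NativeDriver(GfxDriver):
    """One possible back end; here the chip RAM is just a list of rows."""
    def open_screen(self, width, height, depth):
        self.depth = depth
        self.bitmap = [[0] * width for _ in range(height)]
    def fill_rect(self, x, y, w, h, pen):
        for row in self.bitmap[y:y + h]:
            row[x:x + w] = [pen] * w

def draw_window_frame(driver, w, h):
    # Library-level code: it knows nothing about the board underneath,
    # only the driver interface. An add-on board would supply its own
    # GfxDriver subclass and this code would work unchanged.
    driver.open_screen(w, h, depth=8)
    driver.fill_rect(0, 0, w, 1, pen=1)  # title bar, one pixel tall
```

The point of formalizing this, rather than patching the graphics library per board, is that every layer above (layers, Intuition, applications) inherits the new target for free.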
Another thing they had been working on was the imaging model. Amigas largely did things in terms of bitmaps, which was good for videogames and all, but bad for publishing and more complex graphics, and even more clumsy as we started to get to larger displays. In fact, we already had them — most of the folks in engineering who used Amigas for work had some version of Hedley Davis’ “Hedley Hires” monitor, which delivered 1000x800 pixels in two-bit monochrome. In fact, if you put AmigaOS 2.0 on such a monitor, it looked just a bit like NeXTOS. This was not a coincidence.
The plan was to offer PostScript as an imaging engine built into AmigaOS. Just as you could open a bitmapped window at a low level or a console window at some higher level, you’d be able to open a PostScript window. I don’t know much more of the details, but it seemed like a pretty cool idea, using existing technology that already solved much of the problem of targeting images for publishing.
At some point we had also discussed doing OpenGL windows in a similar way. The existing Amiga hardware didn’t do 3D in interesting ways, but Hombre would. And rather than invent something new, OpenGL was the obvious choice. Of course, as I pointed out, I wasn’t developing the software. There were probably all sorts of cool features in the minds of the Software Team. Not just features, of course, but directions.
Lots was happening with object-oriented programming, between BOOPSI for programming user interface stuff under Intuition and Datatypes for dealing with data classes. This was going to keep growing and ultimately become the preferred system for many kinds of programming. There was some effort to make devices work in a similar fashion… the Amiga driver model already supported that to an extent, but true driver classes, subclassing, and reflection hadn’t been implemented yet, though they seemed likely to happen.
The Missing Onwards
I had a habit of trying to think about five years down the road, when I was building computers for Commodore. It wasn’t always an accurate view, of course, but that was about the extent of my crystal ball. I do think, looking back, it was probably inevitable that we merge with mainstream x86 CPUs at some point. The hardware was going to be CPU agnostic in the A5000 generation, because that’s certainly about the time we’d have to deal with going beyond the 68000… unless we had interest in making our own CPUs. That had been discussed a couple of times.
Commodore was also stumbling a little on the technology in the 1990s. Of course, Commodore was Commodore because of their custom chips. We had more chip designers than system designers! And we were pushing the envelope on contract semiconductor fabs, too. That really wasn’t much of a thing in those days. But Commodore Semiconductor Group was behind the times. They had 1.5 micron CMOS in the early 1990s. You could make a Lisa chip in 1.5 micron CMOS, but 1.0 micron was better. And all the AAA chips were 1.0 micron. Hewlett Packard made Lisas and did the AAA prototypes as well.
Going to outside chip fabs also changed the dynamics of chip development. This really meant more simulation and longer revision cycles than Commodore had been used to. I could get a new chip in less than a month, maybe three weeks if it was a Commodore gate array and CSG wasn’t super busy with other chips. But it’s also the case that every last system design house got out of the chip business except for a few giants like IBM… in fact, IBM kept the chips and got out of the PC business.
That is, in fact, what I did for the PiOS One in 1996/1997, when, together with Andy Finkel, Stephan Domeyer, and Geerd Ebeling, I started PiOS AG with the intent of making personal computers. In this design, CPU and memory were on a card that connected via PCI to a CPU-less main board. I think an A5000 or A6000 would have been made something like this.
Now, I was building computers for Commodore based on what they did well and didn’t do well. Commodore had never been the company to crank out a whole new system every year. In fact, when it was a relatively small team, and custom silicon was a key part of it, there just wasn’t that possibility. I’m not sure how well that would have scaled into a computer industry that’s become so influenced by the consumer electronics market that you need to make two new models every year. When I knew that an Amiga main board was going to need to stick around for five years, building in better modularity was far more important than if each board was a throw-away after a year.
Oddly enough, today I have something similar on the embedded system boards I designed at Rajant: though there is a board management processor on the main board, we keep the main Linux CPU on a module in order to lower costs and allow flexibility, particularly when dealing with x86 CPUs…
Read More
http://www.devili.iki.fi/mirrors/haynie/research/acutiatr/docs/acu1.pdf
http://www.devili.iki.fi/mirrors/haynie/research/nyx/docs/AAA.pdf
View More, Cry a Little Too