
| From: Alvin Starr <alvin@netvel.net>
| Geac did the same thing.
| Several years later when I was with ISG we developed a 128bit processor
| that we jokingly called a VRISC processor because it had something like
| 6 instructions.
| We were using the processor in a graphics display system.

My impression is that GEAC was in a really special place in the market. It was highly vertical -- hardware through applications deployment. It may have helped vendor lock-in to have their own hardware.

I don't actually know what their hardware advantage was, if any. Perhaps they understood transaction processing better than designers of minis or micros.

| The Geac system was originally designed with core memory where the
| access times were in the range of micro-seconds and the clock speed of
| the microcode in the CPU was about 4Mhz built using 4bit bit-slice ALU's
| and a lot of random logic.

If you were interested in hardware in those days, that was an attractive approach. But if you were really interested in supporting credit unions and libraries, I don't see that this was a good use of your energy.

(I did know Gus German before GEAC. Interesting guy. One of the original four undergrads who wrote the WATFOR compiler (for the IBM 7040/44).)

| Microcode also helped with reusing gates.
| For example coding a multiply instruction as a loop of adds and shifts.
| now days most processors have ripple multipliers.

Some of the original RISC machines had a "multiply step" instruction. You just wrote a sequence of them. The idea was that each instruction took exactly one cycle and a multiply didn't fit that.

| RISC also benefited from increased transistor density.

Every processor benefited from that. I guess RISC processors had more regularity, and that made the design cycle take less engineering, which in turn could improve time-to-market. But Intel had so many engineers that that advantage disappeared.
But there was a point where all of a RISC CPU would fit on a single die while a comparable CISC CPU would not. This made a big difference. But we passed that roughly when the i386 came out. Of course Intel's process advantage helped a bit.

| The x86 although popular is not the best example of a CISC design.
| The National Semiconductor NS32000 which I believe was the first
| production 32bit microprocessor.

It sure wasn't the first if you count actually really working silicon. I have some scars to prove it.

| The current x86 64bit is just the last of a long set of patches from the
| 8086.

Yes, but the amazing thing is that the i386 and AMD patches were actually quite elegant. If I remember correctly, Gordon Bell said roughly that you could update an architecture once, but after that things get to be a mess. I'm impressed that the AMD architecture is OK.

I'm not counting all the little hacks (that I don't even know) like MMX, AVX, ...

| I believe the last original CPU design from intel was the iAPX 432.

i860? i960 (with Siemens)? Itanium (with HP)?

| Intel had plans to dead end the x86 in favour of the Itanium as the step
| up to 64bit but AMD scuttled those plans by designing a 64-bit
| instruction set addition.

Yeah. It kind of pulled an Intel. The AMD architecture filled a growing gap in x86's capability and was good enough.

One of Intel's motivations for Itanium seemed to be to own the architecture. It really was unhappy with having given a processor license to AMD.

| A number of RISC processors still live on, mostly in embedded applications.
| MIPS.

I didn't mention MIPS, mostly because they seem to be shrinking. Most MIPS processors that I see are in routers, and newer models seem to be going to ARM.

| It was a shame to see the end of the Alpha it was a nice processor and
| opened the door to NUMA interprocessor interconnects that just came into
| the Intel world.

If I remember correctly, some Alpha folks went to AMD, some went to Sun, and some went to Intel.
Intel doesn't have all the good ideas even if it had all the processor sales. AMD's first generation of 64-bit processors was clearly superior to Intel's, up until the Core 2 came out. AMD still lost to Intel in the market. It's sad to see AMD's stale products now.

I think that ARM's good idea was to stay out of Intel's field of view until it grew strong. Intel had an ARM license (transferred from DEC) from the StrongARM work. They decided to stop using it after producing some chips focused on networking etc. ("XScale"). They sold it to Marvell.

There were and are a lot of architectures in the embedded space, but ARM seems to be the one that scaled. It could be the wisdom of ARM Inc., but I don't know that.