In the beginning

I started my university career in 1970, studying computer science and especially architecture. At the time there was no such thing as a microprocessor, and "architecture" meant NAND gates and flip-flops. Then in 1974 Intel came out with the 8080 microprocessor, and life as we know it changed.

Over the years I watched the 8080 evolve into the 8086, 80186, 80286, 80386, 80486, and 80586 (the "Pentium"), and then into cores with various nicknames from there on out. (Apparently Intel found out they couldn't trademark numbers!) Each year these processors became bigger, more power-hungry, and more (and more) complex. And each year I thought, "Couldn't it be made simpler instead? Why are they adding all this complexity?"

One answer to that question is that each new model in this progression had to run *absolutely* every bit of software created for the previous model. Even old features that had been superseded by new ones had to be left in. The designers could only add new things, never remove old ones, and they could never change any of the fundamental underpinnings of the machine's operation. Machine-language code today looks almost identical to what it looked like 50 years ago; the code I wrote in 1975 will probably still run today.

All of this was good for sales and marketing, and the planet transformed. But it put us into a paradigm "rut" that killed a lot of potential creativity, a rut we are still in to this day, a half-century later.