The Era of Compute Chaos
The Bad Old Days
I often make reference to the bad old days when I made specific career choices owing to my reading of the computer industry. So I had ChatGPT write an essay because I don’t have time this morning. Here it is.
The Unsettled Architecture of Computing (1983–1995)
The computing industry between the early 1980s and mid-1990s was not merely competitive; it was structurally indeterminate. Multiple incompatible models of hardware, operating systems, networking, and even artificial intelligence coexisted without a clear trajectory toward convergence. In retrospect, the emergence of commodity personal computers running Microsoft operating systems, connected via TCP/IP over Ethernet, appears inevitable. At the time, however, that outcome was only one possibility among many, and not necessarily the most technically compelling.
At the operating system level, fragmentation was pervasive. “UNIX” existed less as a unified platform than as a loose family of mutually incompatible implementations. Vendors such as Sun, IBM, Hewlett-Packard, and Digital Equipment Corporation each maintained their own variants, derived from either AT&T’s System V or Berkeley’s BSD lineage. Standards efforts like POSIX attempted to impose coherence, but progress was slow and partial. Meanwhile, IBM and Microsoft jointly advanced OS/2 as a successor to DOS, offering features such as protected memory and preemptive multitasking that surpassed the capabilities of early Windows. Yet OS/2’s ecosystem failed to gain sufficient developer momentum, and the partnership itself fractured. Microsoft redirected its efforts toward Windows NT, while IBM retained control of OS/2 in a manner that constrained its broader adoption.
This lack of standardization at the software layer was mirrored—and in some respects amplified—by divergence in hardware strategy. A significant portion of the industry placed its bets not on commodity microcomputers but on high-performance engineering workstations. Companies such as Apollo Computer built workstations optimized for technical computing and computer-aided design, while Symbolics sold dedicated Lisp machines for artificial intelligence workloads. These machines were often tightly integrated with proprietary operating systems and networking stacks, delivering performance and capabilities well beyond what early PCs could offer.
Parallel to this, a distinct branch of computing—centered on artificial intelligence—was gaining significant attention. Firms like Gold Hill Computers participated in an ecosystem focused on Lisp-based development environments and expert systems. The prevailing assumption was that knowledge representation and rule-based inference engines would become a dominant paradigm in enterprise computing. Expert systems were deployed in domains such as medical diagnosis, financial analysis, and industrial control, and for a time they attracted substantial investment.
However, both the workstation model and the expert systems movement suffered from similar structural weaknesses. They depended on specialized hardware, bespoke software environments, and highly trained operators. As general-purpose microprocessors rapidly improved, the performance gap between workstations and commodity PCs narrowed. At the same time, expert systems proved brittle outside narrowly defined problem domains. Their reliance on explicitly encoded rules made them difficult to scale, maintain, and generalize. By the early 1990s, the so-called “AI winter” had set in, and many of the companies and research programs associated with expert systems declined sharply.
The collapse of these parallel trajectories reinforced a broader industry shift. The minicomputer paradigm, exemplified by Digital Equipment Corporation, was already under pressure from the economics of commodity hardware. DEC’s VAX systems and VMS operating environment had defined enterprise computing for a generation, but their proprietary nature and cost structure rendered them vulnerable. As microcomputers and networked workstations matured—and then themselves gave way to standardized PCs—the rationale for vertically integrated systems weakened further. DEC’s eventual decline was emblematic of a broader transition away from institution-centered computing toward distributed, personal, and networked systems.
At the same time, some of the most advanced conceptual work in computing failed to translate into sustained commercial influence. Xerox PARC had pioneered the graphical user interface, Ethernet networking, and object-oriented programming environments. However, Xerox did not successfully capitalize on these innovations, and its retreat from mainstream computing left a vacuum. The ideas developed at PARC diffused into the industry, but without a single steward to unify and propagate them. This diffusion contributed to the proliferation of partially realized implementations across competing platforms, including those in the workstation and AI domains.
The trajectory of Steve Jobs during this period further illustrates the ambiguity of technological progress. After leaving Apple in 1985, Jobs founded NeXT, which produced a sophisticated operating system and development environment grounded in object-oriented principles. Despite its technical elegance, NeXT failed commercially due to high costs and limited market penetration—conditions similar to those faced by workstation vendors and AI-focused firms. Yet its software architecture would later underpin Apple’s resurgence, demonstrating that technical merit and immediate market success were not tightly coupled during this era.
Networking further exemplified the lack of consensus. Competing protocol stacks and architectures divided the landscape. IBM’s Systems Network Architecture (SNA), including LU6.2 for application communication, coexisted with DECnet and Novell’s IPX/SPX. At the physical layer, Ethernet contended with Token Ring, the latter benefiting from IBM’s influence and perceived reliability. TCP/IP, now the universal standard, was initially associated with academic and research environments rather than enterprise deployments. Its eventual dominance was contingent on the growth of the internet, a development that only became commercially significant in the early 1990s.
Distribution and market structure added another layer of uncertainty. The personal computer had not yet become a fully standardized consumer product. Retail channels were fragmented, and purchasing decisions often involved bespoke configurations or enterprise procurement processes. This limited the speed at which any single platform could achieve ubiquity, reinforcing the persistence of multiple competing ecosystems.
The cumulative effect of these dynamics was a prolonged period in which no single layer of the computing stack—hardware, operating system, networking, or even higher-level paradigms like artificial intelligence—had achieved dominance. Each layer exhibited local optimizations without a clear global equilibrium. Vendors pursued strategies that combined technical innovation with attempts at ecosystem control, often resulting in lock-in that deterred broader adoption. At the same time, the economic center of gravity was shifting toward lower-cost, high-volume systems, undermining incumbents whose business models depended on proprietary integration or specialized expertise.
Resolution emerged gradually rather than through a single decisive breakthrough. By the mid-1990s, several forms of convergence had taken hold. Commodity x86 hardware established itself as a de facto standard platform. Microsoft's Windows achieved dominance, through Windows 95 on the desktop and the NT lineage in servers and many enterprise contexts. TCP/IP, running over Ethernet, became the universal networking protocol, driven by the rapid expansion of the internet. UNIX persisted, but increasingly through open or semi-open implementations such as Linux and BSD derivatives, rather than through proprietary vendor forks. Meanwhile, artificial intelligence retreated from symbolic, rule-based systems toward a quieter period of statistical and data-driven approaches that would only resurface decades later.
This period can be understood as a pre-standardization phase in which multiple viable architectures competed without a clear mechanism for selection. The eventual winners were not determined solely by technical superiority but by a combination of economic scalability, developer ecosystem alignment, and distribution efficiency. The failure of expert systems and specialized workstation companies underscores this point: systems that were in many respects more advanced did not survive because they could not align with the emerging economics of general-purpose computing.
In that sense, the instability of the 1980s and early 1990s was not an anomaly but a necessary precursor to the relatively stable computing environment that followed. It was a phase in which the fundamental abstractions of modern computing—standardized hardware, interoperable networks, widely adopted operating systems, and scalable software ecosystems—were contested, discarded, and ultimately consolidated.

You're probably familiar with the story, but the late Jerry Pournelle was there at the moment IBM snubbed Bill Gates at the OS/2 rollout, filling him with a burning desire for revenge that led to Windows and Office putting OS/2 in an early grave.
https://www.jerrypournelle.com/reports/intellectual/intcap1.html#7