Unix turns 40

After four decades, the future of a once revolutionary operating system is clouded, but its legacy will endure

Forty years ago this summer, a programmer sat down and knocked out in one month what would become one of the most important pieces of software ever created.

In August 1969, Ken Thompson, a programmer at AT&T subsidiary Bell Laboratories, saw the month-long departure of his wife and young son as an opportunity to put his ideas for a new operating system into practice. He wrote the first version of Unix in assembly language for a wimpy Digital Equipment Corp. (DEC) PDP-7 minicomputer, spending one week each on the operating system, a shell, an editor and an assembler.

Thompson and a colleague, Dennis Ritchie, had been feeling adrift since Bell Labs had withdrawn earlier in the year from a troubled project to develop a time-sharing system called Multics (Multiplexed Information and Computing Service). They had no desire to stick with any of the batch operating systems that predominated at the time, nor did they want to reinvent Multics, which they saw as grotesque and unwieldy.


After batting around some ideas for a new system, Thompson wrote the first version of Unix, which the pair would continue to develop over the next several years with the help of colleagues Doug McIlroy, Joe Ossanna and Rudd Canaday. Some of the principles of Multics were carried over into their new operating system, but the beauty of Unix then (if not now) lay in its less-is-more philosophy.

“A powerful operating system for interactive use need not be expensive either in equipment or in human effort,” Ritchie and Thompson would write five years later in the Communications of the ACM (CACM), the journal of the Association for Computing Machinery. “[We hope that] users of Unix will find that the most important characteristics of the system are its simplicity, elegance, and ease of use.”

Apparently they did. Unix would go on to become a cornerstone of IT, widely deployed to run servers and workstations in universities, government facilities and corporations. And its influence spread even farther than its actual deployments, as the ACM noted in 1983 when it gave Thompson and Ritchie its top prize, the A.M. Turing Award for contributions to IT: “The model of the Unix system has led a generation of software designers to new ways of thinking about programming.”

Early steps

Of course, Unix’s success didn’t happen all at once. In 1971 it was ported to the PDP-11 minicomputer, a more powerful platform than the PDP-7 for which it was originally written. Text-formatting and text-editing programs were added, and it was rolled out to a few typists in the Bell Labs Patent department, its first users outside the development team.

In 1972, Ritchie wrote the high-level C programming language (based on Thompson’s earlier B language); subsequently, Thompson rewrote Unix in C, which greatly increased the OS’s portability across computing environments. Along the way it picked up the name Unics (Uniplexed Information and Computing Service), a play on Multics; the spelling soon morphed into Unix.

It was time to spread the word. Ritchie and Thompson’s July 1974 CACM article, “The UNIX Time-Sharing System,” took the IT world by storm. Until then, Unix had been confined to a handful of users at Bell Labs. But now, with the Association for Computing Machinery behind it — an editor called it “elegant” — Unix was at a tipping point.

“The CACM article had a dramatic impact,” IT historian Peter Salus wrote in his book The Daemon, the Gnu and the Penguin. “Soon, Ken was awash in requests for Unix.”

Hackers’ heaven

Thompson and Ritchie were the consummate “hackers,” when that word referred to someone who combined uncommon creativity, brute-force intelligence and midnight oil to solve software problems that others barely knew existed.

Their approach, and the code they wrote, greatly appealed to programmers at universities, and later at startup companies without the mega-budgets of an IBM, Hewlett-Packard or Microsoft. Unix was all that other hackers, such as Bill Joy at the University of California, Rick Rashid at Carnegie Mellon University and David Korn later at Bell Labs, could wish for.

“Nearly from the start, the system was able to, and did, maintain itself,” wrote Thompson and Ritchie in the CACM article. “Since all source programs were always available and easily modified online, we were willing to revise and rewrite the system and its software when new ideas were invented, discovered, or suggested by others.”

Korn, an AT&T Fellow today, worked as a programmer at Bell Labs in the 1970s. “One of the hallmarks of Unix was that tools could be written, and better tools could replace them,” he recalls. “It wasn’t some monolith where you had to buy into everything; you could actually develop better versions.” He developed the influential Korn shell, essentially a programming language to direct Unix operations, now available as open-source software.
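Korn’s point about replaceable tools still shows up in everyday shell use. The hypothetical sketch below (a POSIX-style shell, in the ksh tradition; the function and sample text are this sketch’s own, not from the article) shows a user-defined function transparently standing in for a standard command, so callers get a “better version” without anything else changing:

```shell
# A "better version" of an existing tool, swapped in without changing callers:
# this function shadows grep and adds line numbers; "command" bypasses the
# function and reaches the real grep underneath, so it doesn't call itself.
grep() {
  command grep -n "$@"
}

# Callers invoke grep as usual, but now get the matching line number too.
printf 'alpha\nbeta\n' | grep beta   # prints 2:beta
```

The same shadowing trick works for any command, which is one concrete sense in which Unix was never “some monolith where you had to buy into everything.”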

Author and technology historian Salus recalls his work with the programming language APL on an IBM System/360 mainframe as a professor at the University of Toronto in the 1970s. It was not going well. But the day after Christmas in 1978, a friend at Columbia University gave him a demonstration of Unix running on a minicomputer. “I said, ‘Oh my God,’ and I was an absolute convert,” says Salus.

He says the key advantage of Unix for him was its “pipe” feature, introduced in 1973, which made it easy to pass the output of one program to another. The pipeline concept, invented by Bell Labs’ McIlroy, was subsequently copied by many operating systems, including all the Unix variants, Linux, DOS and Windows. Another advantage of Unix — the second “wow,” as Salus puts it — was that it didn’t require a million-dollar mainframe to run on. It was written for the tiny and primitive DEC PDP-7 minicomputer because that’s all Thompson and Ritchie could get their hands on in 1969. “The PDP-7 was almost incapable of anything,” Salus recalls. “I was hooked.”
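McIlroy’s pipeline idea is easy to see in any Unix-style shell today: each small program reads the previous program’s output as its input. A minimal sketch using standard POSIX utilities (the sample sentence is this sketch’s own illustration):

```shell
# Count the distinct words in a sentence, one small tool per pipeline stage:
echo "to be or not to be" |
  tr ' ' '\n' |   # split the sentence into one word per line
  sort |          # group identical words together
  uniq |          # collapse duplicates
  wc -l           # count what remains: 4 distinct words
```

None of these tools knows about the others; the pipe is the only contract between them, which is exactly why better stages can be swapped in at will.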

A lot of others got hooked as well. University researchers adopted Unix in droves because it was relatively simple and easily modified, it was undemanding in its resource requirements, and the source code was essentially free. Startups like Sun Microsystems and a host of now-defunct companies that specialized in scientific computing, such as Multiflow Computer, made it their operating system of choice for the same reasons.

Unix offspring

Unix grew up as a non-proprietary system because in 1956 AT&T had been enjoined by a federal consent decree from straying from its mission to provide telephone service. It was okay to develop software, and even to license it for a “reasonable” fee, but the company was barred from getting into the computer business.

Unix, which was developed with no encouragement from management, was first viewed at AT&T as something between a curiosity and a legal headache.

Then, in the late 1970s, AT&T realized it had something of commercial importance on its hands. Its lawyers began adopting a more favorable interpretation of the 1956 consent decree as they looked for ways to protect Unix as a trade secret. Beginning in 1979, with the release of Version 7, Unix licenses prohibited universities from using the Unix source code for study in their courses.

No problem, said computer science professor Andrew Tanenbaum, who had been using Unix V6 at Vrije Universiteit in Amsterdam. In 1987 he wrote a Unix clone for use in his classrooms, creating the open-source Minix operating system to run on the Intel 80286 microprocessor.

“Minix incorporated all the ideas of Unix, and it was a brilliant job,” Salus says. “Only a major programmer, someone who deeply understood the internals of an operating system, could do that.” Minix would become the starting point for Linus Torvalds’ 1991 creation of Linux — if not exactly a Unix clone, certainly a Unix look-alike.

Stepping back a decade or so, Bill Joy, who was a graduate student and programmer at the University of California at Berkeley in the ’70s, got his hands on a copy of Unix from Bell Labs, and he saw it as a good platform for his own work on a Pascal compiler and text editor.

Modifications and extensions that he and others at Berkeley made resulted in the second major branch of Unix, called Berkeley Software Distribution (BSD) Unix. In March 1978, Joy sent out copies of 1BSD, priced at $50.

So by 1980 there were two major lines of Unix, one from Berkeley and one from AT&T, and the stage was set for what would become known as the Unix Wars. The good news was that software developers anywhere could get the Unix source code and tailor it to their needs and whims. The bad news was they did just that. Unix proliferated, and the variants diverged.

In 1982 Joy co-founded Sun Microsystems and offered a workstation, the Sun-1, running a version of BSD called SunOS. (Solaris would come about a decade later.) The following year, AT&T released the first version of Unix System V, an enormously influential operating system that would become the basis for IBM’s AIX and Hewlett-Packard’s HP-UX.

The Unix Wars

In the mid-’80s, users, including the federal government, complained that while Unix was in theory a single, portable operating system, in fact it was anything but. Vendors paid lip service to the complaint but worked night and day to lock in customers with custom Unix features and APIs.

In 1987, Unix System Laboratories, a part of Bell Labs at the time, began working with Sun on a system that would unify the two major Unix branches. The product of their collaboration, called Unix System V Release 4.0, was released two years later and combined features from System V Release 3, BSD, SunOS and Microsoft’s Xenix.

Other Unix vendors feared the AT&T/Sun alliance. The various parties formed competing “standards” bodies with names like X/Open, Open Software Foundation, Unix International and Corporation for Open Systems. The arguments, counter-arguments and accomplishments of these groups would fill a book, but they all claimed the high road to a unified Unix while taking potshots at each other.

In an unpublished paper written in 1988 for the Defense Advanced Research Projects Agency (DARPA), the noted minicomputer pioneer Gordon Bell said this of the just-formed Open Software Foundation, which included IBM, HP, DEC and others allied against the AT&T/Sun partnership: “OSF is a way for the Unix have-nots to get into the evolving market, while maintaining their high-margin code museums.”

The Unix Wars failed to settle differences or set a true standard for the operating system. But in 1993, the Unix community received a wake-up call from Microsoft in the form of Windows NT, an enterprise-class, 32-bit multiprocessing operating system. The proprietary NT was aimed squarely at Unix and was intended to extend Microsoft’s desktop hegemony to the data center and other territory dominated by the likes of Sun.

Microsoft users applauded. Unix vendors panicked. All the major Unix rivals united in an initiative called the Common Open Software Environment, and the following year more or less laid down their arms by merging the AT&T/Sun-backed Unix International group with the Open Software Foundation. That coalition evolved into today’s The Open Group, certifier of Unix systems and owner of the Single Unix Specification, now the official definition of “Unix.”

As a practical matter, these developments may have “standardized” Unix about as much as possible, given the competitive habits of vendors. But they may have come too late to stem a flood tide called Linux, the open-source operating system that grew out of Prof. Tanenbaum’s Minix.

The future of Unix

While many firms continue to think twice about migrating from Unix, the continued lack of complete portability across competing versions of Unix, as well as the cost advantage of Linux and Windows on x86 commodity processors, will prompt IT organizations to migrate away from Unix, suggests a recent poll by Gartner Group.

“The results reaffirm continued enthusiasm for Linux as a host server platform, with Windows similarly growing and Unix set for a long, but gradual, decline,” says the poll report, published in February 2009.

“Unix has had a long and lively past, and while it’s not going away, it will increasingly be under pressure,” says Gartner analyst George Weiss. “Linux is the strategic ‘Unix’ of choice.” Although Linux doesn’t have the long legacy of development, tuning and stress-testing that Unix has seen, it is approaching and will soon equal Unix in performance, reliability and scalability, he says.

But a recent survey by Computerworld suggests that any migration away from Unix won’t happen quickly. In the survey of 130 Unix users among 211 IT managers, 90 per cent said their companies were “very or extremely reliant” on Unix. Slightly more than half said, “Unix is an essential platform for us and will remain so indefinitely,” and just 12 per cent said, “We expect to migrate away from Unix in the future.” Cost savings, primarily via server consolidation, was cited as the number one reason for migrating away.

Half of the businesses that have deployed Linux on the desktop have rolled it out to fewer than 20 per cent of their workers because of perceived and real obstacles, according to a survey released today by UK analyst firm Freeform Dynamics.

Weiss says the migration to commodity x86 processors will accelerate because of the hardware cost advantages. “Horizontal, scalable architectures; clustering; cloud computing; virtualization on x86 — when you combine all those trends, the operating system of choice is around Linux and Windows,” he says.

“For example,” Weiss says, “in the recent Cisco announcement for its Unified Computing architecture, you have this networking, storage, compute and memory linkage in a fabric, and you don’t need Unix. You can run Linux or Windows on x86. So, Intel is winning the war on behalf of Linux over Unix.”


The Open Group, owner of the Single Unix Specification and certifier of Unix systems, concedes little to Linux and calls Unix the system of choice for “the high end of features, scalability and performance for mission-critical applications.” Linux, it says, tends to be the standard for smaller, less critical applications.

AT&T’s Korn is among those still bullish on Unix. Korn says a strength of Unix over the years, starting in 1973 with the addition of pipes, is that it can easily be broken into pieces and distributed. That will carry Unix forward, he says: “The [pipelining] philosophy works well in cloud computing, where you build small reusable pieces instead of one big monolithic application.”

The Unix legacy

Regardless of the ultimate fate of Unix, the operating system born at Bell Labs 40 years ago has established a legacy likely to endure for decades more. It can claim parentage of a long list of popular software, including the Unix offerings of IBM, HP and Sun, Apple’s Mac OS X and Linux. It has also influenced systems with few direct roots in Unix, such as Microsoft’s Windows NT and the IBM and Microsoft versions of DOS.

Unix enabled a number of startup companies to succeed by giving them a low-cost platform to build on. It was a core building block for the Internet and is at the heart of telecommunications systems today. It spawned a number of important architectural ideas such as pipelining, and the Unix derivative Mach contributed enormously to scientific, distributed and multiprocessor computing.

The ACM may have said it best in its 1983 Turing Award citation in honor of Thompson and Ritchie’s Unix work: “The genius of the Unix system is its framework, which enables programmers to stand on the work of others.”

Gary Anthes is a former Computerworld national correspondent.
