On July 1, 1970, the doors opened at Xerox’s Palo Alto Research Center (PARC) with the charter to create “The Office of the Future.” Fifty years later, the impact of that opening still resonates. Computing wouldn’t be the same without the developments that emerged from what is now a wholly owned subsidiary of Xerox Corporation. Why? Check out this sampling and see how much of what we take for granted today originated at PARC. Unless otherwise specified, all images in this slideshow are courtesy of Xerox Corporation.
In 1971, scientists at PARC “modulated a laser to create a bitmapped electronic image on a xerographic copier drum”. In other words, they created the laser printer, which not only fueled a multi-billion-dollar business for Xerox and others such as Canon and HP, but also created an industry devoted to supplies and maintenance and made respectable-looking business documents much easier to produce (remember dot matrix … ugh!).
What has evolved into the standard for computer networking, Ethernet, was born in 1973 at PARC when Bob Metcalfe and David Boggs proposed a system of linking workstations, files, and printers together using a coaxial cable within a local area network. The benefit was that components could join or leave the network without disturbing data traffic. The first functioning example of it was demonstrated six months later. Ethernet has evolved since – blessedly, coaxial cable is mostly a thing of the past – but it still is the underpinning of our corporate and home networks.
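Every Ethernet frame still carries the basic addressing scheme Metcalfe and Boggs devised: a destination address, a source address, and a type field identifying the payload. A minimal sketch in Python (the frame bytes here are invented purely for illustration):

```python
import struct

def parse_ethernet_frame(frame: bytes):
    """Parse the 14-byte Ethernet II header: destination MAC,
    source MAC, and the 2-byte EtherType field."""
    if len(frame) < 14:
        raise ValueError("frame too short for an Ethernet header")
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    fmt = lambda mac: ":".join(f"{b:02x}" for b in mac)
    return fmt(dst), fmt(src), ethertype, frame[14:]

# A hand-built broadcast frame carrying an IPv4 payload (EtherType 0x0800)
frame = (bytes.fromhex("ffffffffffff")       # destination: broadcast
         + bytes.fromhex("001122334455")     # source MAC (made up)
         + struct.pack("!H", 0x0800)         # EtherType: IPv4
         + b"payload")
dst, src, ethertype, payload = parse_ethernet_frame(frame)
```

The broadcast destination address hints at why stations could join or leave freely: every station on the shared cable saw every frame and simply kept the ones addressed to it.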
The graphical user interface
Most computers, regardless of the operating system they’re running, now offer a graphical user interface (GUI), but those of us who’ve been around for a while will remember when typing on a command line was the only game in town. The GUIs we see on Windows and Macintosh systems had their genesis in 1973 at PARC on the Xerox Alto personal workstation. The Alto also premiered the WYSIWYG (what you see is what you get) editor.
Once upon a time, you just couldn’t do animations or video editing on a computer. In 1973, PARC researcher Richard Shoup developed a system called SuperPaint. The SuperPaint software lived on a customized Data General Nova minicomputer (now in the collection of the Computer History Museum) that included an 8-bit video digitizer. It was the first program to offer features now standard in computer graphics programs: changing the hue, saturation, and value of graphical data, choosing from a preset color palette, custom polygons and lines, virtual paintbrushes and pencils, and auto-filling of images. It was also one of the first to use a GUI and to feature anti-aliasing.
Its inventors received an Emmy Award (1983) and an Academy Award (1998) for their creation of the system.
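The hue/saturation/value manipulation SuperPaint pioneered is now a routine operation in graphics software. A minimal sketch using Python’s standard-library colorsys module (not SuperPaint’s actual implementation, which worked directly on frame-buffer hardware):

```python
import colorsys

def adjust_hsv(rgb, dh=0.0, ds=0.0, dv=0.0):
    """Shift a color's hue, saturation, and value.
    All components are floats in [0, 1]."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    h = (h + dh) % 1.0                  # hue wraps around the color wheel
    s = min(max(s + ds, 0.0), 1.0)      # clamp saturation
    v = min(max(v + dv, 0.0), 1.0)      # clamp value
    return colorsys.hsv_to_rgb(h, s, v)

# Fully desaturating pure red leaves a gray of the same brightness
gray = adjust_hsv((1.0, 0.0, 0.0), ds=-1.0)
```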
Image: Marshall Astor / Life on the Edge / CC BY-SA (https://creativecommons.org/licenses/by-sa/2.0)
VLSI circuit design
In 1981, PARC researcher Lynn Conway and Caltech professor Carver Mead received the annual Electronics magazine achievement award for their joint work in creating a common design culture for very large-scale integrated circuits. “The work they have done, individually and together, brought to fruition in their seminal textbook, Introduction to VLSI Systems, is truly monumental,” said the Electronics article about their work.
The first purely object-oriented programming language, Smalltalk, was designed at PARC in the 1970s by a group led by Alan Kay; Kay later wrote a long history of its evolution in which he revealed that the language grew out of a boast to his colleagues that he could define ‘the most powerful language in the world’ in a page of code. Although it was in use at PARC during the 70s, the first general release was Smalltalk-80.
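Smalltalk’s central idea was that everything is an object and all computation happens by sending messages, which the receiver looks up at runtime. That idea can be mimicked, very loosely, in Python; the `Obj` class and counter below are illustrative inventions, not Smalltalk syntax:

```python
class Obj:
    """Toy model of Smalltalk's core idea: an object is just something
    that responds to messages, looked up by selector at runtime."""
    def __init__(self, methods):
        self.methods = methods

    def send(self, selector, *args):
        if selector not in self.methods:
            # Smalltalk objects get a doesNotUnderstand: message instead
            raise AttributeError(f"doesNotUnderstand: #{selector}")
        return self.methods[selector](self, *args)

def make_counter():
    state = {"n": 0}
    return Obj({
        "increment": lambda self: state.update(n=state["n"] + 1),
        "value": lambda self: state["n"],
    })

counter = make_counter()
counter.send("increment")
counter.send("increment")
```

Even control flow in Smalltalk worked this way: an `if` was just a message sent to a Boolean object, which is part of what let Kay’s “most powerful language” stay so small.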
Image by Gerd Altmann from Pixabay.
Anyone who has typed a character on their keyboard and been rewarded by something completely different on the screen knows well the need for a single encoding scheme for characters on any platform, program, or in any language. This, BTW, now includes emojis.
The standard in use today is known as Unicode, which grew out of work on the Xerox Character Code Standard in the 1980s. In 1987, Joe Becker from Xerox along with Lee Collins and Mark Davis from Apple began looking into the creation of a universal character set. Becker then published a draft proposal in August 1988 for an “international/multilingual text character encoding system, tentatively called Unicode”.
Unicode is now managed by the Unicode Consortium, a nonprofit established in 1991. Its goal is to eventually replace existing character encoding schemes with Unicode and its standard Unicode Transformation Format (UTF).
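A Unicode code point is an abstract number; the UTF encodings define how that number is laid out in bytes. A quick look in Python, which uses Unicode strings natively:

```python
# "é" is code point U+00E9; UTF-8 spreads it across two bytes.
ch = "é"
cp = ord(ch)                   # the code point as an integer: 233
utf8 = ch.encode("utf-8")     # b'\xc3\xa9'

# An emoji lies beyond the Basic Multilingual Plane (above U+FFFF),
# so UTF-8 needs four bytes and UTF-16 needs a surrogate pair.
emoji = "🙂"                            # U+1F642
utf8_emoji = emoji.encode("utf-8")      # four bytes
utf16_emoji = emoji.encode("utf-16-be") # surrogate pair 0xD83D 0xDE42
```

The same code point always means the same character; only the byte layout changes between UTF-8, UTF-16, and UTF-32, which is what lets text survive the trip between platforms and languages.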
Image by Michael Schwarzenberger from Pixabay
Encoding formats for marking things and generating machine-readable links include the familiar barcodes and QR codes. In 1989, PARC researchers came up with their own take, DataGlyphs. A DataGlyph is composed of tiny forward and backward slashes that represent binary values; those slashes can vary in color and thickness, so they can be embedded in logos or graphics on documents. At 600 dots per inch, up to 1K of data can be encoded per square inch.
A 2005 press release noted, “DataGlyphs are distinct from barcodes in their features. They are more flexible in shape, size and color. Their structure and ability for error correction also make them suitable for curved surfaces and other situations where barcodes fail. They have high data density of nearly twice the capacity of PDF417, and DataGlyphs can also be used with cryptography.”
Today, Xerox has readers in some of its printers that determine the type of form being output by reading a DataGlyph on the paper, and touts the technology as a security feature. A company called Microglyph now holds an exclusive license for DataGlyphs.
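The slash-as-bit scheme at the heart of DataGlyphs is easy to sketch. This toy round-trip maps each bit to a forward or backward slash; it omits the error correction and graphic embedding that make the real format robust:

```python
def to_glyphs(data: bytes) -> str:
    """Render bytes as a DataGlyph-style pattern: each 1 bit becomes a
    forward slash, each 0 bit a backslash. Purely illustrative."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join("/" if b == "1" else "\\" for b in bits)

def from_glyphs(glyphs: str) -> bytes:
    """Recover the original bytes from a glyph pattern."""
    bits = "".join("1" if g == "/" else "0" for g in glyphs)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

pattern = to_glyphs(b"Hi")   # 16 glyphs encoding two bytes
```

Because every glyph has the same footprint regardless of its value, a field of slashes reads as a uniform gray texture from a distance, which is what lets DataGlyphs hide inside logos and backgrounds where a barcode would stand out.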
Methane detection system
In 2018, PARC won an IDTechEx Technical Development Materials Award for its low-power, low-cost, leak-detection system for methane based on arrays of printed modified carbon nanotube sensor elements that operate at ambient temperature and humidity. The sensor materials developed by PARC and NASA Ames Research Center enable printed-carbon nanotube methane sensors to be used in practical applications outside of the laboratory, replacing the current expensive detection systems. Initial field tests at simulated gas wells successfully identified and quantified methane leaks. The sensors can be adapted to detect other gases.
PARC scientists also developed data collection technologies that allow communication of the readings off-site.
3D liquid metal printing
Most 3D printers used in additive manufacturing (so called because, instead of carving away material that doesn’t belong from a lump of raw stock, the process “prints” layers of material to build up the finished product) rely on selectively printing a fusing material onto layers of a powder base. Once the item is complete, finishing steps such as removal of excess powder are necessary. Xerox and PARC are developing 3D printing with liquid metal, which eliminates those extra steps. Their AI-based 3D design software adds more efficiencies.
Xerox says that parts created with this technology are denser, faster to make and cheaper than those made with metal powders.
Fifty years in, PARC is not slowing down. Its current focus areas are AI and human-machine collaboration, the digital workplace, novel printing (which includes 3D printing and flexible electronics – yep, printing circuits is in the cards), Internet of Things and machine intelligence, digital design and manufacturing, and microsystems and smart drives.
The next fifty years should be really interesting.