A History of Silicon Valley


These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"


(Copyright © 2010 Piero Scaruffi)

10. The Artists (1984-87)

by Piero Scaruffi

The GUI

A new era was born in january 1984 when Apple introduced the Macintosh, the successor to the Lisa and the result of the project started in 1979 by Jef Raskin. Based on the 16/32-bit Motorola 68000 CPU, it still featured a proprietary Apple operating system with the Lisa GUI. It went on sale for $2,495. The Macintosh created a new industry: desktop publishing. In 1985 Apple also introduced the LaserWriter, the first printer to ship with PostScript; and Aldus of Seattle (founded in 1984 by Paul Brainerd, who coined the term "desktop publishing") introduced PageMaker, the software application that made it easy to create books on a Mac (Aldus was acquired by Adobe in 1994), followed in 1987 by Adobe with Illustrator, a PostScript-based drawing application.

The Macintosh emphasized the user interface over everything else. For example, before the Macintosh each application had its own set of keyboard commands. The Macintosh introduced a standard set: Z for "Undo", X for "Cut", C for "Copy", V for "Paste", W to close a window, etc. Each and every Macintosh application had to comply with this standard. The Macintosh also introduced a new marketing concept: "buy me because I'm cool". It was the "look and feel" that mattered. Previously, personal computers had sold because of the killer application, and all computer manufacturers still thought of software as a means to the end of selling hardware. Apple turned the concept upside down: the hardware was a means to power appealing software. In a sense, fashion had come to the computer industry. Apple became the master of style, the equivalent of Italian fashion designers for digital devices. Nonetheless, the Macintosh was still conceived with a killer application in mind, and that was desktop publishing. However, that per se would not have been enough to justify the company's decision to ignore legacy compatibility (Apple II applications did not run on the Mac).

The Macintosh marked a dramatic change in philosophy for Apple. The Apple II was Wozniak's machine: an open platform for which (given the limits of the technology of the time) anyone could write software and build hardware extensions. The Macintosh was Jobs' machine: a closed platform that could only run Apple-sanctioned software and attach to Apple-sanctioned hardware. Crucially, Apple refused to license the Mac's operating system, whereas Microsoft's operating system ran on any IBM clone. In a sense Jobs had hijacked Wozniak's vision of an open world of computing and turned it into a walled garden. (By then Wozniak's health had been severely impacted by a 1981 airplane accident that he miraculously survived). Unlike IBM (which spawned an entire generation of clones), Apple did not tolerate any clone. Ironically, the combination of IBM's open platform and Apple's closed platform contributed to turning Microsoft into the world's largest software company (and to making Bill Gates the richest person in the world). Apple had the better product, but Jobs' decision to opt for a closed world handed the victory to the lesser product, the Intel/Microsoft machines.

There was no room for hobbyists in an industry that was trying to turn the personal computer into a commodity. In 1983 Apple had hired Pepsi's John Sculley as its CEO. In 1985 Steve Wozniak and Steve Jobs left Apple (the former saw the writing on the wall, the latter was effectively forced out). At the same time the company, following its first quarterly loss, laid off 20% of its workforce, an action that went against the old ethics of the company. The first Apple computer with color graphics, the Macintosh II, debuted in march 1987, priced at $3,900. Its other improvement was a plug-and-play bus architecture that made it easier to add expansion cards. The "Mac" was a critical success and helped cement the community of Apple fans, but the "open architecture" created by the IBM-Microsoft axis was winning (in the broader market) over Apple's closed, proprietary architecture. That closed architecture, which didn't allow anyone else to use Apple software, was Steve Jobs' ultimate legacy, built on a culture of carefully guarded industrial secrets. Apple's executives liked to joke that Apple had created more secrecy than the CIA. That paranoia was perhaps the result of Jobs being raised in the middle of the "phreaking" movement: aware that even the technology of the most powerful company in the world (AT&T) could be "hacked", Jobs had wanted to build the super-secure architecture and the super-paranoid company.

During 1984 Ashton-Tate announced Framework for the IBM PC. It integrated word-processing, database management and business graphics within a windowing environment. VisiCorp's VisiOn had been a flop (in january 1985 VisiCorp went bankrupt), but Digital Research, the maker of CP/M, thought that it had the muscle to create a mass-market GUI for the IBM PC, and in 1985 it launched GEM (Graphical Environment Manager), designed by former Xerox PARC employee Lee Jay Lorenzen. In november Microsoft responded with Windows 1.0 for MS-DOS, a rather mediocre imitation of the Lisa GUI. Unlike Apple, which controlled its own hardware, Microsoft had to deal with the hardware delivered by PC manufacturers, and PC manufacturers were only interested in cutting prices to be more competitive, not in tweaking their hardware to run a better GUI. Therefore Microsoft couldn't match the Apple GUI until the hardware of PC clones became adequate. No surprise then that these operating environments for the PC market stagnated.

Also in august 1984 IBM introduced a multitasking operating environment named TopView for its new 80286-based PC AT. It never became popular and eventually lost to Windows. A Bay Area programmer, Nathan Myhrvold, had the idea of cloning TopView for MS-DOS and founded Dynamical Systems Research in Oakland. To be on the safe side, Microsoft bought the company in 1986 and hired Myhrvold who, five years later, would establish Microsoft Research and eventually become the company's chief technology officer.

Office Automation

Desktop publishing was not new to the users of (more expensive) Unix workstations. Boston-based Interleaf, founded by David Boucher and Harry George (two former associates of Kurzweil Computer), had introduced a document processor that integrated text and graphics editing for the Unix workstation market. Steve Kirsch saw an opportunity in that idea and founded Frame Technology in 1986 in San Jose to commercialize FrameMaker, a publishing platform invented by British mathematician Nick Corfield (the company was acquired by Adobe in 1995). But those were products for the high end of the market.

Both on the software and on the hardware fronts there was a push towards making it easier to produce high-quality documents on a PC. In 1984 Hewlett-Packard introduced its own laser printer for the desktop market, the LaserJet. HP had originally decided to sell printers for its handheld scientific calculators, and John Vaught's team had been working since 1979 on thermal-inkjet technology. HP launched the ThinkJet in 1984 and Japan's Canon introduced the BubbleJet-80 in 1985. Soon the inkjet printer began to replace the dot-matrix printer in the world of personal computers. Color printing came with HP's PaintJet, introduced in 1987, which evolved out of HP's plotter technology. Inkjet technology began to offer laser-printing quality at a much lower price with the HP DeskJet of 1988. Until then Epson's dot-matrix printers had ruled the market, but now HP was rapidly becoming the leader, challenged only by Canon (whose laser engines powered HP's own LaserJet printers).

In january 1987 Aldus released PageMaker for Windows.

Office tools (spreadsheet, word-processing and presentation programs) represented one of the fastest-growing markets. In many cases they were the real reason to own a PC. In 1987 Microsoft unveiled a spreadsheet program for Windows, Excel. These applications began to make Windows more appealing. The most popular word-processors for MS-DOS were WordStar and WordPerfect. Microsoft's own word-processor, Word, created in 1983 by the team led by Charles Simonyi of Xerox PARC fame, was not successful until, ironically, Microsoft made it available on the Apple Macintosh in 1985. Only in 1989 would Microsoft release a version of Word for Windows. In early 1987 Robert Gaskins and Dennis Austin developed PowerPoint, an application for the Macintosh to create slide presentations. In august 1987 Microsoft bought their whole company, Forethought, and ported the product to Windows. For a while the leader in this sector was Software Publishing Corporation, whose Harvard Graphics presentation program (introduced in 1986 for MS-DOS) dominated the market.

Graphics

The Macintosh was just one of many events of 1984 that signaled a phenomenal acceleration in the use of computers as graphic media. In 1984 Wavefront, founded near Los Angeles by Bill Kovacs, who had been creating graphic applications with Evans & Sutherland's Picture System at an architecture firm, introduced the first commercial 3D-graphics software, Preview, which ran on a Silicon Graphics workstation.

In 1985 Commodore launched the Amiga 1000, a 16-bit home computer with advanced graphics and audio (multimedia), designed by former Atari employee Jay Miner and running a multitasking operating system whose kernel was written by Carl Sassenrath. It was Commodore's response to the Macintosh. Atari's entry in this market was the ST. Both used the Motorola 68000 microprocessor. The two companies engaged in a major feud because Commodore's founder Jack Tramiel had been ousted and had bought Atari, bringing key engineers with him to the rival company. Neither computer could attract the kind of third-party software that flowed to Apple and especially to the MS-DOS camp, and therefore both languished regardless of their technological merits. Software had become the key to selling a computer. In 1986 a third-party vendor based in the Bay Area, Berkeley Softworks (later renamed GeoWorks), founded by videogame expert Brian Dougherty, created GEOS (Graphic Environment Operating System), a GUI for the Commodore 64 that provided the look and feel of the Macintosh even on old 8-bit computers with very limited RAM. It rapidly became the third most popular operating system after MS-DOS and the Mac OS.

Steve Jobs himself had launched a new company, NeXT, to build the next generation of computers with an even more advanced GUI than the Mac's, but opted for proprietary hardware and a proprietary operating system, which dramatically inflated the investment (the machine would not be released until 1989) and discouraged third-party software developers. Unusually for the time, NeXT invested in sophisticated audio features, mostly designed by Julius Smith of Stanford's CCRMA from 1986 on. Furthermore, the NeXT computer was the first to implement Adobe's brand new Display PostScript, which "printed" directly on the computer's screen, thus ensuring that the user saw on the screen exactly what he would get from the printer.

Research labs were contributing actively to the progress in computer graphics. In 1984 Nicholas Negroponte, who had conducted research on human-computer interfaces at MIT's Architecture Machine Group, founded the MIT Media Lab to foster the development of multimedia technologies.

There were two obvious kinds of audience for graphics computers outside the computer industry: artists and film studios. In 1984 San Jose State University established the CADRE laboratory ("Computers in Art, Design, Research, and Education"), which bridged the artistic and high-tech communities. In 1986 Steve Jobs bought Lucasfilm's computer-animation division, Pixar, and turned it into an independent company (and eventually a film studio) run by computer-graphics veteran Ed Catmull. Pixar introduced the Pixar Image Computer, the most advanced graphics computer yet, although a commercial flop.

Nobody could yet offer photo-quality images on a computer, but at least in 1987 an international committee, the Joint Photographic Experts Group (JPEG), began defining a standard for compressed digital images (the standard itself would be finalized in 1992). The timing was perfect because in 1986 camera manufacturer Kodak had built the first megapixel sensor, capable of representing a photograph with 1.4 million pixels (a pixel being the fundamental unit of a digital image). The world was getting ready for scanning, storing, manipulating and transmitting images.
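To put that number in perspective with a back-of-the-envelope calculation: a 1.4-megapixel photograph stored at 24 bits per pixel occupies roughly 4 megabytes uncompressed, about three high-density floppy disks' worth of data, which is why a compression standard was a precondition for images to be casually stored and exchanged on personal computers.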

Clearly, the personal computer was mature enough that attention was now focused on making it easier to use as well as capable of dealing with images and sound.

Virtual Reality

In 1984 NASA Ames in Mountain View created the first virtual-reality environment. "Virtual reality" was basically an evolution of the old computer-simulation systems, such as the ones pioneered by Evans & Sutherland. The software was interactive, meaning that it recreated the environment based on the user's movements, i.e. the user was able to interact with the computer via body movements.

The story of virtual reality dated back to the 1960s. Charles Comeau and James Bryan at Philco built a head-mounted display in 1961 ("Headsight"). Meanwhile, Bell Helicopter designed a head-mounted display for pilots that communicated with a moving camera. Ivan Sutherland at ARPA had speculated about the "Ultimate Display" in 1965. In 1966 he moved to Harvard University, where he took Bell Helicopter's head-mounted display and connected it to a computer: the images were generated by the computer instead of by a camera. When he moved to the University of Utah, he created a rudimentary virtual-reality system in 1969 on a PDP-1 attached to a Bell Helicopter display, with funding from the Central Intelligence Agency (CIA), ARPA, the Office of Naval Research and Bell Laboratories. Thomas Furness at Wright-Patterson Air Force Base in Ohio started work in 1969 on a helmet for pilots that displayed three-dimensional computer graphics (the Visually Coupled Airborne Systems Simulator), first demonstrated in september 1981, and then used it to design a virtual cockpit (the "Super Cockpit"), first announced in 1986, that allowed a pilot to fly a plane through a computer-simulated landscape by moving his head and his hand. Furness went on to establish in 1989 the University of Washington's Human Interface Technology Lab (HITL) in Seattle. Meanwhile, in 1979 Eric Howlett in Boston invented an extreme wide-angle stereoscopic photographic technology, LEEP (Large Expanse Extra Perspective). Also in 1979 Michael Naimark at MIT's Center for Advanced Visual Studies debuted the Aspen Movie Map, a project directed by Andy Lippman that allowed the user to navigate a representation of a city (Aspen) stored on laserdiscs. The "movie map" had been created over two years by wide-angle cameras mounted on top of a car.

In 1984 Berkeley alumnus Michael McGreevy joined NASA Ames Research Center and started the project for the Virtual Planetary Exploration Workstation, a virtual-reality system for which he built the first low-cost head-mounted display, the Virtual Visual Environment Display (VIVED). The system was hosted on a DEC PDP-11 interfaced with an Evans & Sutherland Picture System 2. In 1985 Scott Fisher, an MIT alumnus who had worked both at the Center for Advanced Visual Studies in 1974-76 and at Negroponte's Architecture Machine Group in 1978-82 (contributing to the Aspen Movie Map), and who had moved to the Bay Area to join Alan Kay's research group at Atari, joined NASA Ames and built the Virtual Environment Workstation (VIEW), incorporating the first "dataglove". By moving the dataglove the user moved in the virtual world projected into her head-mounted display. In 1985 Jaron Lanier, another self-taught videogame expert, established VPL Research at his house in Palo Alto, the first company to sell virtual-reality products, notably the "DataGlove" invented by Thomas Zimmerman. VPL developed the glove for NASA, based on the one designed by Scott Fisher. Timothy Leary, the prophet of LSD, saw VPL's virtual reality as a way to experience alternative realities just like hallucinogenic drugs.

The history of virtual reality also overlaps with the history of computer games. A MUD (Multi-User Dungeon) is a computer game played by many simultaneous users on different computers, all of them connected to the same virtual world. There were predecessors, but the game that created the term and started the trend on the Internet was MUD, created in 1978 in Britain by Essex University student Roy Trubshaw and launched online in 1980. In 1986 Lucasfilm launched "Habitat", a social virtual world created by Randy Farmer and Chip Morningstar and running on Commodore 64 computers connected via dial-up lines. Each user in this virtual world was represented by an "avatar".

New Paradigms of User-Computer Interaction

With new technologies came new paradigms of computer-human interaction. In 1987 a Virginia-based company, Linus Technologies, introduced the first pen-based computer, WriteTop, which allowed the user to write directly on the screen. It was PC-compatible and cost $2,750. Also in 1987 Jerry Kaplan, formerly chief technologist at Lotus and co-founder of Teknowledge, started GO Corporation in Silicon Valley to manufacture similar portable computers with a pen-based user interface. GO never delivered anything of any consequence, but went down in the history of the valley for the impressive amount of venture capital that it managed to amass: $75 million.

In 1987 Apple demonstrated the HyperCard software, which allowed Macintosh users to create applications using interconnected "cards" that could mix text, images, sound and video. The cards constituted a hypertext. Designed by Bill Atkinson, it was another idea derived from Xerox PARC, which had built a hypertext system called NoteCards in 1984, based in turn on the old experiments of Ted Nelson and Douglas Engelbart. HyperCard also pioneered the idea of "plug-ins": external software that is allowed to access an application's internal data in order to extend its functionality.

An early handheld mobile computer also came out. It went largely unnoticed in the USA, but in 1984 Psion (established in Britain in 1980 by David Potter, initially to write software for Sinclair's home computers) introduced a hand-held computer, the Psion Organiser, that was the archetype of a "personal digital assistant".

Artificial Intelligence

During those years the decline of the field of Artificial Intelligence was as rapid as its rise had been in the late 1970s. The Bay Area had invested in the "symbolic school" of A.I., the knowledge-based approach. In reality, those were the years when the foundations for a new boom of A.I. were laid in the leading places of research on neural networks. Yann LeCun at Bell Labs was working on "convolutional" neural networks (which would eventually solve the problem of image recognition), in 1985 Rodney Brooks at MIT introduced a new concept of robot grounded in the body, in 1986 David Rumelhart at UC San Diego and Geoffrey Hinton rediscovered the "backpropagation" algorithm (which would become the backbone of future systems), and Fred Jelinek at IBM was working on a statistical method for automatic language translation (first published in 1988), which would eventually solve that problem. Hinton, in particular, was influential in starting a school of neural networks at Carnegie Mellon University (where Dean Pomerleau would debut the first self-driving vehicle in 1988) and at the Canadian Institute for Advanced Research (CIFAR), where "deep learning" would be invented.

Very few scientists of the Bay Area were interested in neural networks. However, two leading figures of the local semiconductor world, Federico Faggin and Carver Mead, founded Synaptics in 1986 in San Jose precisely to explore hardware implementations of neural networks (although the startup would become more famous as a maker of touchpads).

The Semiconductor Wars

Meanwhile, in 1985 Intel introduced the 32-bit 80386, which contained 275,000 transistors and was capable of performing three million instructions per second. The first 32-bit microprocessor had already been shipped in 1983 by National Semiconductor (the NS32032), and the second had come from Motorola in 1984 (the MC68020), but it was the Intel 80386 (abbreviated as 386) that shook the market. It boasted more than 100 times the transistors of the original 4004 and it could run both MS-DOS and Unix.

However, 1985 was the year of the first crisis of the semiconductor industry, brought about by cheaper Japanese products. The Japanese government, via the Ministry of International Trade and Industry (MITI), had sponsored a project headed by Yoshio Nishi at Toshiba for Very Large-Scale Integration (VLSI) with the primary goal of conquering the DRAM market. In 1984 Japanese firms introduced 256K DRAM chips, and Silicon Valley companies just could not compete with the low prices of those chips. The truth was that Silicon Valley had gotten its start by selling customized military systems, not by selling commodities. It relied on a network of local know-how and on intimate relationships with the customer. Commodities, instead, rely on economies of scale. By 1985 Japanese firms had gained 70% of the DRAM market. Intel, AMD and Fairchild had to exit the DRAM market. (It was mainly a Silicon Valley problem, because non-Silicon Valley firms such as Motorola, Texas Instruments and Micron continued to manufacture competitive DRAMs). In 1981 USA manufacturers had enjoyed a 51.4% share of the world's semiconductor market, whereas Japanese companies had 35.5%. By 1986 the situation had been reversed, with Japan's share reaching 51% and USA companies reduced to a 36.5% share. Thousands of hardware engineers were laid off in Silicon Valley, one of the factors that pushed the region towards software.

What saved Intel was the microprocessor. The "computer on a chip" was too complex and required too big a manufacturing investment to be handled like a commodity. Japanese microprocessor technology was simply licensed from the USA. In 1984 the world market for microprocessors was worth $600 million: 63% of those sales went to USA companies, 30% to Japanese companies and 7% to European companies. But the situation was even better for the USA: 99% of those microprocessors were designed under license from a USA manufacturer. It also helped that in 1984 the USA government passed the Semiconductor Chip Protection Act, which made it much more difficult to copy a chip, and that in 1987 the USA government set up Sematech (SEmiconductor MAnufacturing TECHnology), a consortium of USA-based semiconductor manufacturers funded by DARPA (basically the antidote to the MITI program). The semiconductor industry recovered, and Silicon Valley-based companies such as VLSI Technology, Linear Technology, LSI Logic, Cypress Semiconductor, Maxim, Altera and Xilinx went on to become international juggernauts.

Intel's corporate culture changed dramatically after the crisis that almost sank it. Power shifted from Noyce to Andy Grove, who replaced Noyce's idealistic philosophy and casual management with a brutal philosophy of Darwinian competition ("Only the paranoid survive") and with iron discipline. Born in Hungary, Andy Grove (real name Andras Grof) had studied at U.C. Berkeley and worked at Fairchild, and had been Intel's third employee in 1968.

Meanwhile, in 1984 Phil Moorby, working in Boston at Prabhu Goel's Automated Integrated Design Systems (later renamed Gateway Design Automation), wrote AIDS Sim (later renamed Verilog), a language with a C-like syntax but meant to describe electronic chips. Moorby based it on the HILO-2 language that he and others had developed in Britain at Brunel University under a contract for the British Ministry of Defence. Introduced in 1985 (Gateway was acquired by Cadence in 1990), the hardware description language Verilog greatly increased the productivity of hardware designers. For several decades there would be two competing industry standards in electronic design automation: Verilog and VHDL (standardized in 1986). For example, Synopsys, a General Electric spinoff started as Optimal Solutions by Aart de Geus and David Gregory in 1986 in North Carolina, soon to become one of the giants in this field alongside Cadence and Mentor, supported both Verilog and VHDL.

Lithography to the Rescue

Progress in optical lithography was essential to extend Moore's Law. Until the 1980s, chipmakers had used visible light for the lithography that produces integrated circuits. In 1982 TRE Semiconductor (the successor company of Electromask) developed the world's first ultraviolet "i-line" stepper, and during the 1980s chipmakers shifted from visible light to steppers employing the ultraviolet "i-line" (365 nm) of mercury lamps. Leading the way was ASML, the spin-off created by Philips in the Netherlands, which introduced its i-line stepper in 1987. At this point, in order to further extend Moore's Law, it was necessary to develop wafer steppers that could operate at even shorter wavelengths, but there was no lamp capable of emitting enough light at those wavelengths. However, during the 1970s a new kind of laser had been developed: the excimer laser. That solved the problem. A new generation of steppers, called "deep UV" (DUV), appeared, at first at 248 nanometers (using krypton-fluoride or KrF excimer lasers) and then at 193 nm (using argon-fluoride or ArF excimer lasers). In 1985 GCA built the first deep-ultraviolet (DUV) stepper (for Bell Labs), and remained a technological leader in the sector thanks to funding from Sematech (funding that lasted until 1993). Its technology was eventually acquired in 1998 by Ultratech Stepper.
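The reason wavelength matters so much is captured by the Rayleigh criterion used throughout optical lithography: the smallest feature that a lens can print is roughly

    CD = k1 x (wavelength / NA)

where NA is the numerical aperture of the projection lens and k1 is a process-dependent constant. Moving from the i-line's 365 nm down to 248 nm and then 193 nm therefore shrank, almost proportionally, the smallest printable feature.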

Experimenting with Business Models

In 1985 IBM had shipped its four millionth PC, but the following year IBM made one of its historic mistakes. Determined to stamp out the clones, IBM decided to introduce a new computer based on a proprietary architecture built on top of the old Intel 286 microprocessor. Compaq, which had been growing faster than any other company in the history of the USA, did not miss the chance to introduce (in september 1986) a faster machine based on the 386 (although its Deskpro 386 was quite expensive at $6,500). When (in april 1987) IBM at last delivered a 386-based machine, the Personal System/2 (PS/2), it was meant to run a new operating system, OS/2, co-developed by IBM and Microsoft, which greatly confused the customers. Its lasting legacy would be VGA graphics (Video Graphics Array).

Unlike Apple (a Silicon Valley company), which was losing money because of its proprietary operating system, Microsoft (not a Silicon Valley company) was booming, thanks to an operating system (MS-DOS) that worked on so many computers. Microsoft went from 40 employees and $7.5 million in revenues in 1980 to 910 employees and $140 million in revenues in 1985. In 1987 Microsoft's stock hit $90, catapulting its main owner, Bill Gates, who was just 31, into the list of billionaires.

In 1985 AT&T introduced a Unix PC for $5,000, quite expensive by the standards of personal computers. It was followed by the Commodore 900 (announced but never actually released). This too was a historic mistake: the market was not ready for Unix, and Unix (still mired in endless internecine wars) was not ready for the market.

A new business model was introduced by Michael Dell, still a student at the University of Texas at Austin when in 1984, in his dormitory room, he founded PCs Limited, later renamed Dell. He decided to specialize in custom PC-compatible computers (relieving the customer of the tedious and risky business of assembling components to customize the machine) and to deal directly with the customer, initially by mail order only. It almost represented a return to the business model of the early hobbyists. Dell's success was colossal and its revenues rose exponentially, relying on an automated supply-chain system that removed the need for inventories: its PCs were "made to order". That success mirrored Compaq's success in the early 1980s: both owed their low prices more to a distribution strategy than to a technological breakthrough. Another company that was created by a young hobbyist, built to order, and sold directly to customers was Gateway 2000, formed in a South Dakota barn by Ted Waitt, which introduced its first PC in 1987. In 1991 it was ranked the fastest-growing company in the USA. Dell was the heir to a glorious dynasty of Texas electronics businesses that started with Texas Instruments and continued with Tandy and Compaq.

All of them would soon have to face another warfront. In april 1985 Japanese manufacturer Toshiba launched its T1100, one of the earliest IBM-compatible laptops (the project of Atsutoshi Nishida). That machine set the standard in terms of features: internal rechargeable batteries, an LCD (Liquid Crystal Display) screen and a floppy-disk drive. HP had already debuted its first laptop, the HP-110, in 1984; it too was IBM-compatible (running MS-DOS on the Intel 8086), but Toshiba took the idea to a new level.

Therefore there were several business models for the personal computer industry:

  • Lock customers in with a proprietary operating system (IBM and Apple);
  • Copy the de-facto standard and get to market fast (Compaq);
  • Compete with Unix workstations (AT&T);
  • Copy the de-facto standard and make to order, "just-in-time" (Dell);
  • Produce not just desktop PCs but also portable "laptops" (Toshiba); and
  • Focus on a cross-platform software platform (Microsoft).

Networks

As personal computers became more powerful and easier to use, the client-server architecture became a serious alternative to the monolithic mainframe. In a client-server architecture the software application was split into a client portion, which ran on a personal computer, and a server portion, which ran on a more powerful computer. Many clients (MS-DOS PCs or Macintosh machines) were connected to the server (and a Unix minicomputer was much easier to connect than an IBM mainframe). The server hosted the database. By distributing software away from centralized mainframes and onto networked personal computers, companies created more flexible environments and saved money. Mainframes were rapidly abandoned. Software companies built fortunes by porting "legacy systems" (applications created for the mainframe) to minicomputers. Thousands of mainframe programmers of the Cobol generation lost their jobs to software engineers of the C-language and Basic generation (Basic having become the language of choice on personal computers).
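The division of labor is easy to see in a minimal sketch (written in Python, a modern language, purely to illustrate the idea; the port number and the tiny "employee" table are hypothetical): the server owns the data and answers queries over the network, while the client holds no data at all and merely sends requests and displays the replies.

    # Minimal client-server sketch: the server owns a tiny in-memory "database"
    # and answers one lookup per connection; the client only sends a key and
    # prints the reply. Port and records are made up for illustration.
    import socket
    import threading
    import time

    DATABASE = {"1001": "Smith, accounting", "1002": "Jones, engineering"}

    def server(port=5050):
        # Server portion: runs on the "big" machine and owns the data.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("127.0.0.1", port))
            srv.listen()
            while True:
                conn, _ = srv.accept()
                with conn:
                    key = conn.recv(1024).decode().strip()
                    conn.sendall(DATABASE.get(key, "not found").encode())

    def client_query(key, port=5050):
        # Client portion: runs on the personal computer, holds no data of its own.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect(("127.0.0.1", port))
            cli.sendall(key.encode())
            return cli.recv(1024).decode()

    if __name__ == "__main__":
        threading.Thread(target=server, daemon=True).start()
        time.sleep(0.5)                      # give the server time to start listening
        print(client_query("1001"))          # -> Smith, accounting
        print(client_query("9999"))          # -> not found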

In 1984 Oracle executive Umang Gupta (one of Oracle's earliest employees) started Gupta Technologies to port client-server relational database technology (its product was named SQLBase) to the personal computer.

Computer networks began to proliferate, both within corporations and among corporations (thanks also to the Internet). A router is a computer-based device for routing and forwarding data to the computers of a network. Judy Estrin, a pupil of Vint Cerf at Stanford and a former Zilog engineer, had already started a company to sell routers, Bridge Communications, in 1981 in Mountain View. In 1981 Stanford had a team working on a project to connect all of its mainframes, minis, LISP machines and Altos. William Yeager designed the software (on a PDP-11) and ubiquitous student Andy Bechtolsheim designed the hardware. Leonard Bosack was a support engineer who worked on the network router that allowed the computer network under his management (at the Computer Science lab) to share data with another network (at the Business School). In 1984 he and his wife Sandy Lerner (manager of the other lab) started Cisco in Menlo Park to commercialize the Advanced Gateway Server, a revised version of the Stanford router (the product was developed in their garage and first sold in 1986 through word of mouth). They had correctly guessed that connecting networks to networks would become more important as more corporations needed to connect geographically distributed offices, each having its own network. In 1983 Bruce Smith, a former executive at a satellite communications company on the East Coast, founded Network Equipment Technologies (NET) in Redwood City to provide high-end multiplexers to large companies. In 1985 two Xerox PARC engineers, Ronald Schmidt and Andrew Ludwick, started SynOptics in Santa Clara to develop Ethernet products.

In 1985 Washington DC bar owner Jim Kimsey founded Quantum Computer Services, which introduced a new business model: providing dedicated online services for personal computers (initially only Commodore models), so that one could use a personal computer to connect to a bigger computer where a lot of applications could be found (such as videogames). In 1988 Quantum would add services for Apple and PC-compatible computers, and would later rename itself America Online (AOL).

In 1986 there were already 30 million personal computers in the USA, but very few of them were "online" (capable of connecting to a service run on a remote computer) because modems were slow and expensive. In 1987 U.S. Robotics of Chicago unveiled a 9600-baud modem, but it cost $1,000. In 1985 the first domain name of the Internet, Symbolics.com, was registered.
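Even at 9,600 bits per second, moreover, transmitting a single four-megabyte file (say, an uncompressed megapixel image) took the better part of an hour, and most home modems were far slower than that: the online world was, by necessity, a world of text.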

For decades most directors of Information Technology (or Electronic Data Processing, as it was more commonly called) had reported to chief financial officers, the reason being that a computer was a massive investment and it was typically not trivial to justify that investment to the company's management. The personal computer and office productivity software put a computer and an application on every desk. This had two effects. First of all, a lot of information technology no longer required the approval of the chief financial officer, because it was as cheap as buying a desk or a typewriter. Secondly, the proliferation of such software and hardware, coupled with the chaotic, rapidly evolving nature of the computer industry, created the need for someone to keep track of which I.T. tools the company's productivity depended on. The role of the person in charge of data processing changed: instead of being simply the liaison between various departments and the massive mainframe computer, that person became the czar in charge of deciding which hardware and which software should be used across the company. In 1986 BusinessWeek magazine published an article titled "Management's Newest Star: Meet the Chief Information Officer" that launched the trend. The success of the client-server model was indirectly due to the growing power of the CIO and to the main duty of that figure: to bring order within the company. The client-server model centralized power again, as in the old days of the mainframe, but did so in a highly distributed world. The CIO became much more important than the EDP manager had ever been, because it soon became apparent that the company's productivity depended in large part on the client-server strategy.

Far away from Silicon Valley, Martin Cooper's 1973 prototype of a cellular phone finally became a commercial product when Motorola introduced the DynaTAC in 1984, a phone capable of making wireless phone calls over a network developed by Bell Labs and deployed in 1983, the Advanced Mobile Phone System (AMPS). This was "1G", the first generation of mobile wireless telephony, first launched in 1979 in Japan, and it was still analog. Just like computers in their infancy, this was not a product that ordinary families could afford, but its ubiquity in movies and on television fired up the imagination of millions of potential consumers. The most important event of the era, however, was one that few noticed, especially since it came from a government agency. In 1985 Michael Marcus, an engineer working for the Federal Communications Commission (FCC) of the USA, the agency in charge of regulating telecommunications, had the idea of liberalizing the use of three little-valued bands of wireless spectrum: 900MHz, 2.4GHz and 5.8GHz. These came to be known as the "garbage bands". The US government basically allowed anybody to use those bands without the need for a license, a freedom that until then had existed only for the old ham-radio channels. Little did Marcus know that he had just started a major revolution, the Wi-Fi revolution.

Storage

Meanwhile, the saga of storage devices that had started with Alan Shugart's floppy disc continued to spawn new ideas and companies.

In 1984 SUN had unveiled the Network File System (NFS), designed by Bill Joy and managed by Bob Lyon, a software component that allowed computers to access data storage over a Unix network. When DEC, HP, IBM and eventually AT&T adopted NFS, it became an industry standard for distributing data storage over a Unix computer network.

Following Novell's NetWare for networks of personal computers (1983) and SUN's NFS for networks of UNIX workstations (1984), computer networks saw the birth of the first machines dedicated to data storage and shared by many computers. 3Com's 3Server (1985) treated data storage as a new type of computer appliance that could be shared by all the computers of an Ethernet local area network. In 1988 3Com and Microsoft introduced software for OS/2 called LAN Manager, and IBM followed suit with its own LAN Server.

Middleware for local area networks such as NFS enabled new architectures for storing data. Auspex Systems, founded in 1987 in Santa Clara by Adaptec's boss Larry Boucher, introduced the first data storage appliances for a computer network. Among the young engineers hired by Boucher were MIPS' file-system expert David Hitz and Chinese-born Berkeley and Stanford engineering alumnus James Lau. Auspex's dedicated NFS server for the UNIX market popularized the concept of the data storage server for a computer network, the "network attached storage" (NAS) appliance.

The Age of Japan

The Bay Area was beginning to claim the largest share of computer-related innovations. In the mid-1980s only a handful of events elsewhere compared with the boom of Silicon Valley. A few came from Japan, which was going through its own technological boom. In 1984 Sony and Philips introduced the CD-ROM, which adapted their audio CD to data storage. In 1984 Fujio Masuoka at Toshiba invented flash memory, a cheaper kind of EEPROM. This caused a significant revolution in the design of storage. Since the invention of "Williams tubes", computer architectures had split memory into a small, fast, short-term unit (by then DRAM) and a larger, slower, long-term storage drive. The common forms of storage were hard-disk drives and floppy-disk drives, i.e. units that contained spinning disks and movable read-write heads. Flash memory made solid-state drives possible: drives with no motor (and, in general, no significant mechanical parts). The most popular technology of flash memory, NAND, represented a serious threat to the EEPROMs used to hold configuration data. In 1983 Nintendo, which until then had mostly copied other people's games (1978's "Block Fever" was a clone of Atari's "Breakout" and 1979's "Space Fever" was a clone of "Space Invaders"), launched the Family Computer, a videogame console designed by Masayuki Uemura and renamed Nintendo Entertainment System two years later in the USA, where it single-handedly resurrected the videogame-console market.

Changing of the Guard

In the venerable tradition of the Bay Area, already a new generation was taking over and the old one was rapidly being buried. In 1987 National Semiconductor acquired the glorious Fairchild from Schlumberger, which had acquired it in 1979 together with the whole of Fairchild Camera and Instrument. In the same year ComputerLand (which in 1985 had been valued at $1.4 billion) was purchased for very little by a private-equity firm. Zilog had long succumbed to Exxon, and the managers of its historic Z80 era had already quit.

SUN was causing another revolution. Between 1986 and 1987 its revenues almost tripled. By the end of 1988 it would pass DEC in workstation market share (SUN 38.3%, DEC 23.1%, Apollo 16.7%, HP 10.6%). SUN ended up eroding DEC's supremacy first in academia and then in the engineering market, which had traditionally been the bedrocks of DEC's success. Their business ideologies were, in fact, opposite. DEC still belonged to the era of vertically-integrated manufacturers that produced in-house virtually all the hardware and software components of the computer. SUN, by contrast, pioneered a manufacturing industry that relied on third parties to provide all the components. The DEC generation believed that a company needed to make the key components of its products itself. The SUN generation believed that such key components ought to be delegated to specialty shops in Silicon Valley (and, eventually, around the world): in-house development was unlikely to match across the board the "best of breed" quality guaranteed by a portfolio of specialized suppliers. SUN departments were only in charge of designing, coordinating, assembling and selling. The complexity of creating a product had shifted from a network of internal laboratories to a network of external suppliers. What had changed was the pace of technological innovation. The small start-up SUN had been able to introduce more products in its market segment in its first five years than the multi-billion dollar corporation DEC. The DEC generation relied on proprietary components to keep the competition at bay. The SUN generation relied on a frenzied pace of product releases, knowing that each product was easy for the competition to clone but difficult to clone before a new product made it obsolete. In the end, the SUN model greatly increased revenues per employee. It also reduced the company's exposure to the risks of capital-intensive operations. This model would create a huge secondary economy in Silicon Valley of hyperspecialized companies that would never become household names despite achieving considerable revenues. (For the record, Apple adopted the SUN model, whereas HP managed to fare better than DEC even with the old model of vertical in-house integration). In 1987 SUN seemed to renege on its own "open-architecture" ideology when it switched from off-the-shelf hardware and software to its own RISC microprocessor, the SPARC, and its own version of Unix (SunOS, later renamed Solaris). However, it was still outsourcing the production of its components.

Cyberculture

Those were the days when the media and the intelligentsia were fascinated by the possibilities of "cyberspace", the invisible medium/dimension of data. Thanks to the networks, those data now traveled through a space, living a life of their own. William Gibson invented a whole new genre of science fiction with his novel "Neuromancer" (1984), the book that popularized the term "cyberspace". In 1983 Bruce Bethke had published the story "Cyberpunk", which introduced another term in the genre: the punk who roams cyberspace. The media had been creating a mythology of hackers, of software engineers who could manipulate programs and data. The media had also been speculating on the possibilities of Artificial Intelligence. All these threads resonated with a society that was haunted by fear of nuclear holocaust and by alienating urban life.

In january 1986 a "computer virus" nicknamed "Brain" started spreading among IBM PCs: every time a user copied something from an infected floppy disc, the user also involuntarily copied the virus onto the PC, and the virus then replicated itself onto any other floppy disc used by that machine. The virus had been created in faraway Pakistan by the owners of a Lahore computer shop called Brain (Basit Farooq Alvi and Amjad Farooq Alvi). The original personal-computer virus, "Elk Cloner", had done relatively little damage because it was confined to the Apple II world, but the widespread adoption of the IBM PC standard had created a whole new world of opportunities for digital contagion.

The 1980s had witnessed a rapid rise in the status of the software engineer. No longer a nameless cog in vast corporate bureaucracies, the software engineer had emerged from the personal-computer revolution as the cyberspace equivalent of the medieval knight-errant. Steven Levy's book "Hackers - Heroes of the Computer Revolution" (1984) glorified them. The decade would end with Hans Moravec's "Mind Children" (1988), which quipped "Robots will eventually succeed us: humans clearly face extinction", and with Fereidoun "FM-2030" Esfandiary's futuristic vision "Are You a Transhuman?" (1989), while the following decade would open with Raymond Kurzweil's book "Age of Intelligent Machines" (1990), predicting the coming of machines smarter than humans, what Vernor Vinge had called "the singularity". Suddenly, the future of humankind was in the hands of this obscure new worker, the hacker.

In 1981 Wau Holland formed a club of sociopolitically-aware Berlin hackers, the Chaos Computer Club (CCC). In 1984 this club started organizing a conference for hackers in Hamburg, the Chaos Communication Congress (C3), while publishing the magazine Die Datenschleuder. In 1984 David Ruderman and Eric Corley (aka Emmanuel Goldstein) founded the hacker magazine 2600 in New York. Another early society of hackers was the Cult of the Dead Cow (cDc Communications), founded in 1984 in Lubbock (Texas) with the motto "Global Domination Through Media Saturation", which spawned the hacker conference HoHoCon in 1990, modeled after Summercon (held in St Louis in 1987), the first underground hacker conference in the USA. Another influential magazine was Hack-Tic, established in 1989 in the Netherlands and modeled after the CCC's Die Datenschleuder.

Synthetic Biology

In may 1985 Robert Sinsheimer organized a meeting of biologists in Santa Cruz (south of the bay) to discuss the feasibility of sequencing the entire human genome. In a few months Leroy Hood's team at the California Institute of Technology in Pasadena refined an automated method to sequence DNA, i.e. the first automated DNA sequencer, which made it practical (not just theoretically possible) to sequence the entire human genome. Lloyd Smith was the main developer of the machine, thanks to his background in both engineering and chemistry. Within one year that sequencer was launched on the market by Sam Eletr's Applied Biosystems in Foster City, which also provided an automated protein synthesizer, a protein sequencer and a DNA synthesizer (these were easier technologies to develop). This gave Applied Biosystems a virtual monopoly in DNA synthesis for several years. Leroy Hood's team included a young Mike Hunkapiller, who was also one of the first employees of Applied Biosystems.

A brand new discipline was born when in 1984 Steven Benner, then at Harvard University and later at the University of Florida, created a gene encoding an enzyme, the first artificially designed gene of any kind. Synthetic biology is to biology what mechanical engineering is to physics: its goal is to build biological systems that do not exist in nature. In 1988 Benner organized the conference "Redesigning the Molecules of Life", the first major conference on synthetic biology.

Meanwhile a third company joined Alza and Genentech among the successes of the Bay Area's pharmaceutical industry. A 29-year-old employee of the venture-capital firm Menlo Ventures, Michael Riordan, founded Oligogen (later renamed Gilead Sciences) in august 1987 in Foster City, a company that, after a number of acquisitions, would experience exponential growth thanks to a focus on antiviral drugs to treat chronic and global diseases such as AIDS, hepatitis C and the flu (Tamiflu for the flu, Viread for AIDS, Sovaldi for hepatitis C), becoming the biggest biotech company in the world.

Anthropology of High-Tech Individualism

Only a fraction of the high-tech workforce was born and raised in the Bay Area. The others were, by definition, "strangers". Some of them had come to study, and therefore could count on a network of friends from college. Many of them had come in their 20s or 30s for work. Their social life was not easy in a region where individualism was pushed to the extreme. People lived alone most of the day: commuting by car (one person per car) because public transportation was inefficient, working in a cubicle, living in apartments. The housemate was often a social choice, not an economic one: it was a chance to occasionally talk to somebody. Companies encouraged employees to mingle by throwing company parties and the like. Some companies (notably SUN) even organized their workplace to mirror a college campus.

The connections tended to be very weak. Friendships tended to be rather superficial. Most people's "friends" were just random acquaintances who would not hesitate to "flake out" on an appointment.

The weakness of social life was, however, offset by the broad range of summer and winter activities, which quickly became a core aspect of the regional psyche. Geography blessed the Bay Area with proximity to the skiing area of Lake Tahoe, the beaches of the Pacific coast, the forests and waterfalls of Yosemite, the desert of Death Valley, and the mountains of the Sierra Nevada. Furthermore, this part of California enjoyed six months of virtually no rain, a strong motivation to spend weekends outdoors. During the week, people who lived in the ubiquitous apartment complexes could enjoy the attached amenities, from the swimming pool to the gym.

The ethnic Babel, the appeal of the outdoors and apartment life caused a decline of quintessential USA pastimes such as bowling, billiards, baseball, fishing and hunting.

Social events were monopolized by work-related issues. High culture was virtually nonexistent, completely subcontracted to San Francisco and Berkeley. Restaurants, not politics, made news.

Silicon Valley engineers were also the users of the technology invented there. The region posted a higher percentage of users of computer technology than any other region of the world. An emblem of this recursive lifestyle was Fry's Electronics, the first electronics superstore, which opened in 1985 in Sunnyvale selling everything from cables to computers. The technology manufactured there therefore had a direct influence in shaping the lifestyle of this heterogeneous workforce. In fact, it was a unifying factor. High-tech (not the church or the government) provided an identity to the community.

An endemic problem in Silicon Valley was the turnover of engineers, who could easily switch jobs overnight, and who were generally more interested in short-term financial success than in long-term career advancement within a large organization.

The volatility of the job market in Silicon Valley was even higher than in other parts of the USA. The lifespan of a company was totally unpredictable. A sense of insecurity was inherent in the lives of these highly-paid professionals. At the same time, it was a lot easier to land a high-tech job in Silicon Valley than anywhere else on the globe just because of statistics (the sheer number of high-tech companies). A sense of arrogance was therefore also inherent in the lives of this population. Sometimes the psychological relationship was upside down: the company had to be grateful that an engineer worked for it (whereas in the rest of the world it was usually the worker who was grateful to the company).

Insecurity and arrogance coexisted in the same mind, with wild swings from one to the other depending on the company's performance.

The typical career consisted of feeding, parasite-like, on a company's success until that success began to taper off, and then jumping onto another company's bandwagon. It was a career path of quantum jumps. It also implicitly required a process of lifelong training in order to avoid obsolescence. It wasn't just instability: it was accelerated and self-propelled instability.

Culture and Society

The main event of those years in the cultural life of San Francisco was probably the WELL. Started in 1985 by Stewart Brand of Whole Earth fame, and modeled after Murray Turoff's EIES, the "Whole Earth 'Lectronic Link" (or "WELL") provided a virtual community of computer users, structured as bulletin boards for online discussions. Brand had just invented social networking. Its impact on the "alternative" lifestyle was significant. It was the first time that a computer-based system could have such an impact on a computer-illiterate public.

On the WELL people freely exchanged knowledge without expecting to be paid for it. It was a miniature "sharing" economy. In 1969 the physicist Gerard O'Neill at Princeton University had envisioned a human colony in outer space. The WELL implemented O'Neill's space colony not in outer space but in what William Gibson had just nicknamed "cyberspace". In 1987 Howard Rheingold coined the term "virtual community". Among the most followed members of the WELL was John Perry Barlow, an alumnus of Timothy Leary's acid trips in New York who had written lyrics for the Grateful Dead.

Stewart Brand's nonprofit Point Foundation (the umbrella for the various Whole Earth projects) and the Homebrew Computer Club organized the first large-scale meeting of computer hackers, in November 1984 at Fort Cronkhite, in the Marin Headlands just north of San Francisco. Some 150 hackers attended, including Steve Wozniak and Bill Atkinson from Apple, Lee Felsenstein from Osborne, Richard Greenblatt from MIT, Richard Stallman from GNU, and hypertext pioneer Ted Nelson.

Cyberspace continued the hippies' metaphors of consciousness expansion, the fifth dimension, mystical experience and transpersonal communion, with computer networks replacing drugs as the vehicle.

One decade after Daniel Bell's "The Coming of Post-industrial Society" (1973), the new rising knowledge-based society was being hijacked by the counterculture of the Bay Area and began shaping not a new political order but a new economic and social order. In particular, this new order privileged networks over hierarchies.

The new utopia, promoted by thinkers like Rheingold, was that computers would create a new world (in cyberspace) that would be fully free and democratic, in which any person, no matter how humble or isolated, would be able to express and spread opinions.

In 1984 Ken Goffman, then better known as R.U. Sirius, started the underground magazine High Frontiers, billed as "the Space Age Newspaper of Psychedelics, Science, Human Potential, Irreverence & Modern Art", making yet another connection between the counterculture of the Bay Area and the emerging high-tech scene (in fact, it would change its name to Reality Hackers before becoming Mondo 2000).

In 1984 British-born broadcasting veteran Harry Marks and graphic designer Richard Saul Wurman organized in Monterey (two hours south of San Francisco) the first Technology, Entertainment and Design (TED) conference, which would become an annual series starting in 1990.

In 1986 Judy Malloy published the computer-mediated hyper-novel "Uncle Roger" on the WELL. In 1983 Christina Augello had founded the Exit Theatre, which was becoming a reference point for the performance scene.

The Whole Earth Catalog had provided the first link between the art crowd of San Francisco and the high-tech crowd of Silicon Valley. During the 1980s the links multiplied: in 1981, Trudy Myrrh Reagan organized in San Francisco the first YLEM meeting of artists working with new technologies; in 1984, Roger Malina, an astronomer at UC Berkeley, established Leonardo in San Francisco to foster the integration of art and science; in 1984, Marcia Chamberlain at San Jose State University organized the first CADRE conference.

Science was never quite predictable in the Bay Area. In 1984 the "Search For Extraterrestrial Intelligence" or SETI Institute, a non-profit organization supported by private philanthropists, opened its doors in Silicon Valley. It implemented what NASA Ames' "Project Cyclops" had recommended in 1971 under the leadership of HP's director of research, Bernard Oliver.

In january 1985 Kevin Kelly launched the magazine "Whole Earth Review", the successor to Stewart Brand's "Whole Earth Catalog", except that it was now an opinion journal. It introduced Virtual Reality, the Internet and Artificial Intelligence to the masses of Silicon Valley hackers, and its articles embodied the idealistic and futuristic aspects of software development in the Bay Area.

The marriage of academia, the corporate world and the military-industrial complex was becoming even more promiscuous now that the Bay Area was strategic. For example, in 1987 in Berkeley former SRI strategist Peter Schwartz, by then working for an oil corporation, cofounded the consulting firm Global Business Network with SRI marketing analyst Jay Ogilvy, and hired Stewart Brand of the WELL, who was already consulting for the same oil corporation. This quickly became one of the most influential "think tanks" in the USA, specializing in the kind of "scenario planning" pioneered by Herman Kahn at the Hudson Institute.

Driven by college radio stations and alternative magazines, music for young people underwent a major transformation. Despite the limitations of the instruments, the musical avant-garde (for example, Negativland, Naut Humon, Constance Demby and Robert Rich) was experimenting with techniques once reserved for research centers. Rock music, ranging from avant-metal band Faith No More to avant-folk ensemble American Music Club, displayed a preference for destabilizing established genres.

The Mission District continued to be a center of street art. In 1984 Patricia Rodriguez of Las Mujeres Muralistas fame and Ray Patlan (a Vietnam War veteran and Chicago muralist who in 1970 had helped jump-start the Pilsen community center Casa Aztlan for Mexican immigrants and who had moved to San Francisco in 1975) launched the PLACA project, which gathered more than 30 artists to protest the USA's involvement in the civil wars of Central America. This was the beginning of Balmy Alley, between 24th Street and Garfield Square, soon to boast the highest concentration of agit-prop murals in the state.

In 1986 Larry Harvey started the first "Burning Man" on Baker Beach in San Francisco. By simply burning a sculpture, he launched one of the most influential grass-roots festivals of the age. In a sense it represented a fusion of the psychedelic and hobbyist cultures of the Bay Area. In a few years Burning Man moved to the desert, and attracted thousands of independent artists willing to burn their art after displaying it. Somehow, that phenomenon mirrored the whole Silicon Valley experience (and, not coincidentally, it would become extremely popular among Silicon Valley "nerds" who otherwise had no interest in art). "Burning Man", born out of a counterculture that reacted against what Silicon Valley represented, was an appropriate metaphor for what Silicon Valley represented.


