A History of Silicon Valley

These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"


(Copyright © 2010 Piero Scaruffi)

8. The Entrepreneurs (1976-80)

by Piero Scaruffi

The Apple Vision

Apple's hometown, Cupertino, was a young city, incorporated in october 1955 as Santa Clara County's 13th city and originally known simply as "Crossroads" because it developed around the crossroads of Stevens Creek Blvd and Saratoga-Mountain View Road (later Saratoga-Sunnyvale Road, and later still De Anza Boulevard). The area had one large employer, owned by one of the state's richest men, Henry Kaiser: the local rock quarry and cement plant (renamed Kaiser Permanente Cement Plant in 1939). After the war Cupertino's high school, Homestead High School, started a class on electronics under John McCollum, one of the first such classes in the world; one of his students and assistants was Steve Wozniak.

In april 1976 hobbyist Steve Wozniak (now a college dropout and a Hewlett-Packard employee) and hippie Steve Jobs (another college dropout, who had joined Atari in 1974, had experimented with the LSD and Buddhism popular in the 1960s, had grown up on Stewart Brand's "Whole Earth Catalog", and actually knew very little about semiconductors) started Apple Computer in Cupertino. Wozniak had built their first microcomputer in Jobs' garage in nearby Los Altos using MOS Technology's 6502 microprocessor ($20) because he could not afford the more advanced Motorola 6800 or Intel 8080 (both about $170). The user had to provide his or her own monitor, but the Apple I could be hooked up to an inexpensive television set. Their friend Paul Terrell of the Byte Shop was the first one to promote it (he paid Jobs $666.66 for each unit).

Wozniak was unique in that he designed both the hardware and the software of the Apple I. He was still an HP employee when he did that. He was reluctant to resign and did so only in october 1976.

The key difference between the Apple I and its predecessors (such as the famous Altair) was actually in the amount of memory. Wozniak felt that a computer without a programming language was an oxymoron and strove to build a computer powerful enough to run a real programming language. The main requirement was a larger memory than the first personal computers had. Unfortunately, static RAM was expensive. Therefore he had to turn to the cheaper dynamic RAM. The 4K DRAM had just been introduced in 1974. It was the first time that RAM (a semiconductor memory) was cheaper than magnetic core memory. The DRAM was much cheaper than the static RAM used by all previous microcomputers, which allowed Wozniak to pack more memory into the Apple I than the Altair had. The key design issue was how to continuously refresh the dynamic RAM so that it would not lose its information (the static RAM used by the Altair did not lose its information). Roberts had basically just dressed up an Intel microprocessor in order to create his Altair. Wozniak instead dressed up a memory chip in order to create the Apple I. This 4K computer was capable of running a real programming language. Since no programming language yet existed for that microprocessor, Wozniak also had to write the BASIC interpreter for the Apple I in assembly language. Note that Wozniak's motivation in creating the Apple I was not a business plan but simply the desire to own a computer, a desire constrained by a lack of money. The Apple I was the result of an optimization effort more than anything else: Wozniak had to minimize the parts and simplify the structure.
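
The refresh constraint can be illustrated with a small conceptual sketch (in Python, with made-up numbers; this is not Wozniak's actual solution, which was done in hardware): every row of a dynamic RAM must be touched again within a short retention window or its bits fade away, whereas a static RAM needs no such attention.

    # Conceptual sketch of the dynamic-RAM refresh constraint (toy numbers).
    RETENTION_MS = 2.0      # assumed: how long a DRAM row holds its charge
    NUM_ROWS = 64           # assumed: rows that must each be re-read in that window

    def refresh_all_rows(last_touched_ms, now_ms):
        """Re-read every row before its charge leaks away; reading restores it."""
        lost = [row for row, t in enumerate(last_touched_ms) if now_ms - t > RETENTION_MS]
        for row in range(NUM_ROWS):
            last_touched_ms[row] = now_ms
        return lost          # rows whose data would already have faded

    last_touched = [0.0] * NUM_ROWS
    print(refresh_all_rows(last_touched, 1.5))   # []: refreshed in time, no data lost
    print(refresh_all_rows(last_touched, 4.5))   # all 64 rows: waited too long
    # A static RAM has no such constraint, which is why the Altair could ignore it;
    # the Apple I instead had to guarantee that this "touch every row" duty
    # happened continuously.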

Wozniak, however, shared the same "business" vision that Roberts had with the Altair: his personal computer was meant for hobbyists, i.e. for technically savvy users who were going to program the computer themselves to solve whatever problems they had. Wozniak had simply made it much easier to write a program. He did not envision that the average user of a personal computer would be someone who "buys" the application programs (already written and packaged).

It was, however, the Apple II, still based on a 6502 and released in april 1977, which really took off. Funded by former Fairchild and Intel executive Mike Markkula, who had retired at 32 thanks to the stocks of those two companies, this desktop computer was fully assembled, requiring almost no technical expertise, and boasted the look and feel of a home appliance. It had a keyboard integrated with the motherboard's case, could be hooked up to a monitor or television set, and came with a ROM hosting a BASIC interpreter and 4 kilobytes of RAM (but no operating system). Part of its success was due to Apple's Disk II, the first affordable floppy-disk drive for personal computers, which replaced the cassette as the main data storage medium. The other factor in its early success was the software: a dialect of BASIC licensed from Microsoft (popularly known as "Floating Point BASIC") and the word-processor EasyWriter, written in 1978 by John Draper while in jail for phone phreaking.

Apple was just one of many companies that used the 6502. Commodore was another one: after the PET (released in october 1977) came the VIC-20 in 1980, the first personal computer to sell one million units. The Atari 800, announced in late 1978 and designed by Jay Miner, was also based on the 6502. British computer manufacturer Acorn, founded in 1978 near Cambridge University by Austrian-born Hermann Hauser, also used the 6502 for the BBC Micro that was deployed throughout the British educational system. However, most companies used the 6502 for something else. In october 1977 Atari introduced a videogame console, the VCS (Video Computer System, later renamed 2600). Previous generations of videogame machines had used custom logic. The videogame console that pioneered the use of a microprocessor was Fairchild's Video Entertainment System, released in august 1976 and based on Fairchild's own F8 microprocessor.

The other popular low-cost microprocessor was the Zilog Z80, used for example in Tandy/Radio Shack's TRS-80 microcomputer, another bestseller of 1977. One of the many startups that preferred the Z80 over the Intel 8080 was Cromemco, started in 1976 by Stanford students Harry Garland and Roger Melen in their dormitory, Crothers Memorial (hence the name). The companies that missed the train were, surprisingly, the ones that dominated the market for calculators. Texas Instruments' TI 99/4 (december 1979) was based on its own 16-bit TMS9900 processor that was simply too expensive; and the even more expensive Hewlett-Packard HP-85 (january 1980), based on an HP custom 8-bit processor, was something halfway between a mini and a micro.

48,000 personal computers were sold worldwide during 1977. The following year more than 150,000 were sold, of which 100,000 were Tandy/Radio Shack TRS-80s, 25,000 Commodore PETs, 20,000 Apple IIs, 5,000 IMSAIs and 3,000 Altairs.

Steve Jobs' vision was to create a computer that was a home appliance. In reality the Apple II was still a hobbyist novelty, like most small computers based on microprocessors. It never became a home appliance, but it got transformed into something no less pervasive: an office tool. The transformation was not due to a hardware idea but to software. In 1979 Harvard University's business-school student Dan Bricklin developed VisiCalc, the first spreadsheet program for personal computers. That's when sales of the Apple II truly started to take off. A software application made the difference between selling thousands of units and selling millions of units. Apple went public the following year. Its IPO (Initial Public Offering) raised a record $1.3 billion. VisiCalc was ported to the Tandy TRS-80, Commodore PET and the Atari 800, becoming the first major application that was not tied to a computer. WordStar, a word-processor developed in 1979 by former IMSAI engineer Rob Barnaby for the CP/M operating system and published by MicroPro, a company founded in 1978 by another former IMSAI employee, Seymour Rubinstein, became the first widely known word-processor for personal computers (and the new bestselling package for this class of machines); and its success propelled MicroPro to become the first major software company of the Bay Area. WordStar and VisiCalc were the two top-selling programs of the early 1980s, each boasting sales of more than half a million copies by 1983. The company that understood the value of software was Tandy, an old Texas leather-goods business that in 1963 had converted to electronics by buying the RadioShack chain and, thanks to a former Byte Shop employee, Steve Leininger, suddenly found itself at the vanguard of personal computer technology. Its TRS-80 of 1977 boasted an unprecedented library of applications (mostly games, but also word processors and spreadsheets).

The age of VisiCalc represented a major shift in the use of computers. Other than for scientific research and military applications, commercial mainframe computers had been useful mainly for business applications such as accounting, while minicomputers were mainly employed in factories. The new software for personal computers opened up the market of regular offices that were still mostly based on paper or that were using expensive time-sharing services.

After the Japanese company Epson launched the MX-80 dot-matrix printer in 1979, home printing too became feasible.

Of course, not everybody shared Steve Jobs' vision. Just as in the old days an IBM executive had predicted a very small market for computers, so in 1977 DEC's founder Kenneth Olsen proclaimed "there is no reason anyone would want a computer in their home."

Microprocessor Wars

Meanwhile, the war of the microprocessors was still raging on. Intel had assigned the task of designing the 16-bit 8086 (released in june 1978) to a software engineer, Stephen Morse: it was the first time that a microprocessor was designed from the perspective of software. In june 1979 Intel introduced the 8088, a version of the 8086 with an 8-bit external bus (containing 29,000 transistors), and in september 1979 Motorola introduced the 16-bit 68000 microprocessor (containing 68,000 transistors). In between the two, Zilog introduced the 16-bit Z8000 (only 17,500 transistors). At the same time sales of DRAM continued to skyrocket. In 1974 the capacity of DRAM chips had reached 4 kilobits and in 1975 it was already 16 kilobits. By 1979 there were 16 companies selling 16-kilobit DRAMs, five of them based in Japan. In 1977 the semiconductor industry of Silicon Valley employed 27,000 people.

Progress in semiconductor technology was no longer making the headlines, but continued faster than ever. By 1980 integrated circuits (the vast majority of which were manufactured in the USA) incorporated 100,000 components. In 1978 George Perlegos at Intel created the Intel 2816, an EEPROM (Electrically Erasable Programmable Read-Only Memory), basically an EPROM that did not need to be removed from the computer in order to be erased. In 1977 the market for memory chips of all kinds was twice the size of the market for microprocessors. Combined, the two markets had grown from $25 million in 1974 to $550 million in 1979. However, it was not obvious yet that computers were going to be the main market for microprocessors. In 1978 the industry sold 14 million microprocessors, but only 200,000 personal computers were manufactured. The vast majority of microprocessors were going into all sorts of other appliances, calculators and controllers.

The competition was intense and forced Intel to experiment with new architectures, but that resulted in one of the most embarrassing failures in its history. Already in 1975 Intel wanted to replace the 8080 with a very advanced 32-bit microprocessor, initially codenamed 8800, that Intel viewed as a "micro-mainframe". The project, based at Intel's Oregon facilities, called for an object-oriented architecture à la Smalltalk and a multitasking operating system written in the high-level language Ada, and designed using the latest CAD methods. One year later Intel also started the project for a more conventional successor to the 8080, the 16-bit 8086. The micro-mainframe was eventually introduced in 1981 as the iAPX 432, but the 8086 was already a hit and the 432 never made news. In 1982 Intel and Siemens formed a joint venture to use a RISC version of the 432 for a fault-tolerant multiprocessor architecture dubbed BiiN, but that wasn't too successful either.

An impressive phenomenon of the era was the number of hardware spinoffs launched by enterprising Chinese immigrants trained in some of Silicon Valley's most advanced labs: Compression Labs (CLI) by Wen Chen (1976) to make video conferencing and digital television components; Solectron by Roy Kusumoto and Winston Chen (Milpitas, 1977) to make printed circuit boards; Data Technology Corporation (DTC) by David Tsang (Milpitas, 1979) for floppy-disk and hard-disk drives; Lam Research by David Lam (Fremont, 1980) for equipment for chip manufacturing (or "etching"); Integrated Device Technology by Chun Chiu, Tsu-Wei Lee and Fu Huang (San Jose, 1980) for semiconductor components; Weitek by Edmund Sun, Chi-Shin Wang and Godfrey Fong (San Jose, 1981) for chips for high-end computers; fiber-optic pioneer E-Tek Dynamics by Ming Shih (San Jose, 1983); magnetic-disk manufacturer Komag by Tu Chen (Milpitas, 1983); etc.

Data-driven Business

Another front was being opened by companies studying how to store data. More computers around meant more data to store. It was intuitive that some day the industry for data storage would be a huge one. Audiocassettes were used for data storage by most microcomputers of the first generation, including the Apple II, the Radio Shack TRS-80 and the Commodore PET. Floppy disks had become increasingly popular, especially after Alan Shugart developed the smaller version (originally in 1976 for Wang). The first "diskettes" were manufactured by Dysan, a storage-media company formed in 1973 in Santa Clara by Norman Dion. The growing number of applications running on personal computers required a growing number of floppy units. Finis Conner, working for Shugart of Memorex fame, had the idea of building a fixed, rigid disk of the same physical size as Dysan's flexible diskette that would provide both high performance and high capacity, equivalent to a whole stack of floppy disks. Shugart and Conner formed Shugart Technology in december 1979 in Scotts Valley (south of San Jose), later renamed Seagate Technology, with funding from Dysan. In 1980 Seagate introduced the first hard-disk drive for personal computers (capable of storing 5 Megabytes), and soon hard disks would greatly improve the usability of small machines. That same year Sony introduced the double-sided, double-density 3.5" floppy disk that could hold 875 kilobytes.

Seagate also published the specifications of a computer interface that would allow users to connect different peripherals to the same personal computer, a device-independent "parallel" connection. They named it SASI (Shugart Associates Systems Interface), later renamed SCSI (Small Computer System Interface) when it was adopted as an industry standard. In 1981 the manager of the SASI project, Larry Boucher, quit Seagate to found Adaptec in Milpitas (north of San Jose), taking with him several Seagate engineers in a move that evoked Shugart's own exodus from IBM in 1969. Adaptec specialized in manufacturing computer cards (at Singapore factories) to solve the growing problem of input/output bottlenecks in personal computers as the machines had to deal with ever higher volumes of data traffic.

Storing data was not enough. It was also important to guarantee that transactions on those data were reliable. Since computers and their software were prone to crashes, this was not a trivial problem, particularly in the arena of financial transactions. Former HP employee James Treybig convinced a few HP engineers to work on a fault-tolerant machine and started Tandem Computers in Cupertino. In 1976 they delivered the first product, based on a CPU derived from the HP3000 and running a proprietary operating system. Tandem servers were ideal for mission-critical business applications carried out by banks.

They immediately had a competitor in Boston: Stratus, founded in 1980 by Bill Foster from Data General (and a former coworker of the Tandem founders at HP), Gardner Hendrie from Computer Control Company and Bob Freiburghouse, founder in 1974 of Translation Systems, one of the earliest software companies specializing in compilers.

The exponential growth of digital communications led to the rapid expansion of the field of cryptography. The most pressing problem was how to secure communications between parties that had never met before, i.e. who could not exchange a secret key in private before beginning their digital communications. This problem was solved in 1976 by combining ideas from Stanford (Whitfield Diffie and Martin Hellman) and UC Berkeley (Ralph Merkle), opening the era of public-key cryptography. The following year Ron Rivest, the Israeli cryptographer Adi Shamir and Leonard Adleman at the MIT invented the RSA algorithm, which added digital signatures to public-key cryptography.
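
The trick can be sketched with a toy Diffie-Hellman exchange (shown here in Python with the tiny textbook parameters p=23 and g=5; real systems use numbers hundreds of digits long): two parties derive the same secret while only ever exchanging public values.

    # Toy Diffie-Hellman key agreement (textbook parameters, for illustration only).
    import random

    p = 23   # a small public prime (real systems use primes of hundreds of digits)
    g = 5    # a public generator

    a = random.randrange(2, p - 1)   # Alice's private key, never transmitted
    b = random.randrange(2, p - 1)   # Bob's private key, never transmitted

    A = pow(g, a, p)   # Alice sends g^a mod p over the open channel
    B = pow(g, b, p)   # Bob sends g^b mod p over the open channel

    # Each side raises the other's public value to its own private exponent.
    secret_alice = pow(B, a, p)
    secret_bob = pow(A, b, p)
    assert secret_alice == secret_bob   # identical secret, never sent over the wire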

The ever larger amount of data stored on disks created ever bigger databases, which, in turn, required ever more powerful database management systems. Larry Ellison, a college dropout from Chicago who had moved to California in 1966, had been employed at Ampex in Redwood City, where he worked as a programmer on a database management system for the Central Intelligence Agency (CIA) codenamed "Oracle" under the management of his boss Bob Miner, the son of Middle-Eastern immigrants. In august 1977 Bob Miner and Ed Oates (also an Ampex alumnus, now at Memorex) founded the Software Development Laboratories to take advantage of a software consulting contract facilitated by Larry Ellison at his new employer, Precision Instruments (also a manufacturer of tape recorders, based in San Carlos, mainly serving NASA and the Navy). The start-up used offices in PI's Santa Clara building. When Ellison joined them, he steered them towards developing an SQL relational database management system of the kind that IBM had just unveiled in San Jose but targeting the minicomputer market. Miner and their fourth employee Bruce Scott (another former member of Miner's team at Ampex) wrote most of it in the assembly language of the PDP-11, the company was renamed Relational Software and relocated to Menlo Park, and in 1978 the CIA purchased the first prototype. In 1979 Relational officially shipped the first commercial SQL relational database management system, Oracle. In 1982 the company would be renamed one more time and would become Oracle Corporation.
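
To give a flavor of what these start-ups were commercializing, here is the relational/SQL style of data access in a minimal form (illustrated with Python's built-in SQLite engine, not with Oracle's product; the table and names are invented):

    # A taste of the relational/SQL model, using Python's built-in SQLite engine.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
    db.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                   [("Miner", "engineering", 100),
                    ("Oates", "engineering", 90),
                    ("Scott", "engineering", 80)])

    # A declarative query: the user states *what* is wanted and the engine
    # decides *how* to fetch it, instead of navigating records by hand.
    for row in db.execute("SELECT dept, AVG(salary) FROM employees GROUP BY dept"):
        print(row)   # ('engineering', 90.0)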

A rival project, Michael Stonebraker's relational database system Ingres at U.C. Berkeley, was demonstrated in 1979. For all practical purposes, Ingres looked like a variant of IBM's System R for DEC minicomputers running the Unix operating system. Being open-source software like Berkeley's Unix (BSD), within one year Ingres was deployed by many universities around the country, as the first available relational database system (IBM's System R was not available outside IBM). In 1980 Stonebraker himself started a company, Relational Technology, later renamed Ingres, to market the system. That same year Roger Sippl and Laura King, who had implemented an experimental relational database system, started Relational Database Systems in Menlo Park, a company that was later renamed Informix. In 1984 some of Stonebraker's students (notably Mark Hoffman and Bob Epstein) formed Systemware, later renamed Sybase, in Berkeley. All of these start-ups had something in common: they did not target the huge market of mainframe computers. They targeted the smaller market of minicomputers, in particular the ones running the Unix operating system. IBM's IMS dominated the database market for mainframe computers, but IBM had failed to capitalize on the experimental System R developed at its San Jose laboratories. IBM eventually released a relational database management system, the DB2, in 1983, but it was running only on its mainframe platform: IBM still showed little interest in smaller computers. This allowed Oracle, Sybase and Informix to seize the market for database systems on minicomputers.

The old practices of "business intelligence" and "decision support" were evolving into "data mining" and "data analytics" as the amount of data increased. The pioneers were Britton Lee, founded in 1979 by David Britton, Geoffrey Lee and Ingres' alumnus Robert Epstein (before he founded Sybase), and Teradata, a Caltech spinoff of 1979 that would acquire Britton Lee in 1990 (one year before being acquired by NCR). Teradata came out of research by Phil Neches and others on a high-performance specialized hardware architecture to run large databases, a database management appliance in the tradition of Tandem's machines, implemented by multiple microprocessors working in parallel. Teradata (whose system first shipped in 1983) and Tandem's NonStop SQL (1984) popularized the "shared nothing" architecture, in which each server of the cluster computes its own data, an architecture that would become popular in the age of the Web.
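
The "shared nothing" idea can be reduced to a toy sketch (in Python, with invented data; real systems like Teradata's ran each partition on separate hardware): each node owns a disjoint slice of the data, computes over its own slice only, and a coordinator merges the small partial results.

    # Toy "shared nothing" query: the data is partitioned across nodes, each
    # node scans only its own partition, and only tiny partial results travel.
    records = [("alice", 30), ("bob", 45), ("carol", 10), ("dave", 70), ("eve", 25)]

    NUM_NODES = 3
    partitions = [[] for _ in range(NUM_NODES)]
    for name, amount in records:
        partitions[hash(name) % NUM_NODES].append((name, amount))  # each row lives on one node

    def node_sum(partition):
        """Work done locally by one node, without touching the others' data."""
        return sum(amount for _, amount in partition)

    partial_sums = [node_sum(p) for p in partitions]  # computed in parallel in a real cluster
    print(sum(partial_sums))   # 180: the coordinator only merges the partial sums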

Those database management systems were fine for larger machines but obviously not for the personal computer. In 1980 most personal computers had only 32 Kbytes of RAM and a cassette tape drive. The 120-Kbyte floppy drive was still a rarity but random access to storage was coming. Hard drives did not exist yet for personal computers. The first major database program for this class of machines was Personal Filing System (PFS), written by John Page for the Apple II. This would become part of a suite of software products (or "productivity tools") sold by Software Publishing Corporation, founded in 1980 by three former Hewlett-Packard employees to implement Fred Gibbons' vision: he was one of the first people to view a personal computer as the equivalent of a music player, playing not music but applications. In 1983 the same company would start selling the word-processor PFS:Write, followed by a spreadsheet program, a reporting program and a business graphics program.

Communication-driven Business

The other field still in its infancy was the field of telecommunications. The company that put Silicon Valley on the map of telecommunications was probably ROLM, founded by a group of Stanford students. In 1976 they introduced a digital switch, the CBX (Computerized Branch Exchange), a computer-based PBX (private branch exchange) that competed successfully with the products of Nortel and AT&T.

Meanwhile, the Southern Pacific Railroad of Burlingame (south of San Francisco) renamed its Southern Pacific Communications (SPC), which had been selling private phone lines since 1972 during the days of the AT&T monopoly on public telephony, as "Sprint" (Switched PRIvate Network Telecommunications) in 1978 when it was finally allowed by the government to enter the long-distance telephony business. Its Burlingame laboratory was another early source of know-how in telecommunications. Sprint would later be acquired by GTE (1982) and then by United Telecom of Kansas (1986).

In 1979 Bob Metcalfe, the "inventor" of the Ethernet, left Xerox PARC to found 3Com (Computers, Communication and Compatibility) in Santa Clara. The idea was to provide personal computer manufacturers with Ethernet adaptor cards so that businesses could connect all the small computers in one local-area network. 3Com became the first company to ship a commercial version of TCP/IP (1980). In 1979 Zilog's cofounder Ralph Ungermann and one of his engineers at Zilog, Charlie Bass, formed Ungermann-Bass in Santa Clara to specialize in local-area networks, particularly in the Ethernet technology.

Nestar Systems, another Stanford spinoff, started in Palo Alto in 1978 by Harry Saal and Leonard Shustek, introduced a pioneering client-server network of personal computers. Shustek had worked as a student with Forest Baskett at Stanford on a graphics terminal that in 1976 had been licensed to Tektronix.

The main competitor of Ethernet was "token ring", pioneered in Britain after 1974 by the Cambridge Ring developed at Cambridge University by Maurice Wilkes' Polish-born student Andrew Hopper. In 1979 the MIT developed its own token-ring network that was commercialized in 1981 by Howard Salwen's Proteon as the ProNet. Also in 1981 Apollo of Boston introduced their proprietary Apollo Token Ring (ATR), and in 1985 IBM introduced its own token-ring solution, developed in their Zurich laboratories. Also in 1981 Sytek (founded in 1979 in Sunnyvale by Michael Pliner and other former employees of Ford Aerospace of Sunnyvale) introduced LocalNet, a broadband LAN that covered longer distances than Ethernet. The most successful of the proprietary LAN solutions came from Corvus, founded by Michael D'Addio and Mark Hahn in 1979 in San Jose. Ethernet boards were too expensive and too large to fit inside an Apple II. More importantly, hard drives were expensive and Apple II users wanted to share one. So Corvus developed its own variant of the Ethernet, first Constellation in 1980 and then Omninet in 1981, that allowed multiple Apple IIs to share the same hard drive.

Meanwhile, in Georgia in 1977 Dennis Hayes, a hobbyist who was employed at National Data Corporation on a project to provide bank customers with modems for electronic money transfers and credit card authorizations, started working on a modem for personal computers, a device that converted between analog and digital signals and therefore allowed personal computers to receive and transmit data via telephone lines. He soon founded his own company, Hayes Microcomputer Products, and announced the Micromodem 100, which could transmit at 110 to 300 bits per second (baud). This modem was a lot simpler and cheaper than the ones used by mainframes, and, more importantly, it integrated all the functions that a modem needed to perform. Texas Instruments too introduced a 300-baud modem for its TI 99/4 in 1980.

Meanwhile, the Bell Labs deployed the first cellular phone system (in Chicago in 1978).

The mother of all computer networks was still largely unknown, though. In 1980 the Arpanet had 430,000 users, who exchanged almost 100 million e-mail messages a year. That year the Usenet was born, a distributed discussion system divided into "newsgroups", originally devised by two Duke University students, Tom Truscott and Jim Ellis. It ran not on the Arpanet but over ordinary telephone lines, using a protocol called UUCP (Unix-to-Unix Copy), originally written in 1978 by Mike Lesk at AT&T Bell Laboratories for transferring files, exchanging e-mail and executing remote commands. Despite the fast growing number of users, at the time nobody perceived the Arpanet as a potential business.

In 1977 the DARPA, working closely with SRI, chose the San Francisco Bay Area to set up a "packet" radio network (Prnet) capable of exchanging data with Arpanet nodes. It was the beginning of wireless computer networking. After early experiments by Canadian ham-radio amateurs, in december 1980 Hank Magnuski set up a ham radio station in San Francisco to broadcast data (the birth certificate of the AmPrnet). The first wireless products for the general market would not appear for a decade, but, not coincidentally, would come from a company based in Canada, Telesystems, and a company based in the Bay Area, Proxim (founded in 1984 in Sunnyvale). In 1980 CompuServe introduced a chat system, the CB Simulator, that was modeled after Citizen's Band radio (CB radio), the most popular form of short-distance two-way radio communications. CB clubs had multiplied all over the world during the 1970s as more and more ham-radio hobbyists had been able to purchase the equipment thanks to a drop in the price of electronics.

On a smaller scale, another influential communications technology was getting started in those days. Radio-frequency identification (RFID) is basically a combination of radio broadcast technology and radar. It is used to have objects talk to each other. The idea, which dates back at least to 1969, when Mario Cardullo sketched it while working at the Communications Satellite Corporation (Comsat) and before starting ComServ in Washington, was tested by pioneering projects like Raytheon's "Raytag" (1973) and Fairchild Semiconductor's "Electronic Identification System" or EIS (1975), but it truly matured at the Los Alamos National Laboratory in the 1970s. Its first mass-scale applications came thanks to two spinoffs from that laboratory: Amtech in New Mexico and Identronix in California (established in 1977 in Santa Cruz and soon managed by Vic Grinich, one of the original "traitorous eight" who had split off from Shockley Labs to start Fairchild Semiconductor). Their work originated the automated toll payment systems that became common on roads, bridges and tunnels around the world after Norway pioneered them in 1987, as well as the remote keyless entry systems.

Entertainment-driven Business

The business of gaming software came to the Bay Area via Broderbund, founded after hobbyist Doug Carlston had written the game "Galactic Empire" for the TRS-80 in 1979. This came one year after Toshihiro Nishikado, a veteran game designer who had developed Japan's first video arcade game in 1973, created the first blockbuster videogame, "Space Invaders". In 1980 it was ported to the Atari 2600, and broke all records of sales, creating the demand for videogame consoles that made the video arcade obsolete. "Galactic Empire" thrived in its wake.

Hobbyists were also finding ever newer uses for microprocessors. For example, in 1977 Dave Smith, a former U.C. Berkeley student who had started a company to make music synthesizers, built the "Prophet 5", the first microprocessor-based musical instrument, and also the first polyphonic and programmable synthesizer. Dave Smith also had the original idea that led (in 1983) to the MIDI (Musical Instrument Digital Interface), a standard to attach musical instruments to computers.

Fundamental for the development of the multimedia world was the introduction in 1978 of Texas Instruments' TMS5100, the first digital signal processor.

Friendliness

Progress was also needed in developing user-friendly computers. The Xerox Alto had been the single major effort in that area. Xerox never sold it commercially but donated it to universities around the world. That helped trigger projects that later yielded results such as the Stanford University Network (SUN) workstation. In 1979 Steve Jobs of Apple had his first demonstration of an Alto at Xerox PARC, and realized that the mouse-driven GUI was the way to go. Xerox eventually introduced in april 1981 the 8010 Star Information System, which integrated the optical mouse (the creation of Richard Lyon), a GUI (largely designed by Norm Cox), a laser printer, an Ethernet card, an object-oriented environment (Smalltalk) and word-processing and publishing software. Programming this computer involved a whole new paradigm, the "Model-View-Controller" approach, first described in 1979 by Trygve Reenskaug. Xerox PARC was also experimenting with portable computers: the NoteTaker, unveiled in 1976 but never sold commercially, was basically a practical implementation of Alan Kay's Dynabook concept (by a team that included Adele Goldberg). In 1977 Xerox PARC gave a presentation to Xerox's management of all the achievements of the research center, titled "Futures Day". Despite the spectacular display of industrial prototypes, the management decided that Xerox should continue focusing on document processing. This started the exodus of brains from Xerox PARC towards the Silicon Valley start-ups.
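
The pattern can be conveyed in a minimal sketch (in Python rather than the Smalltalk of Reenskaug's original formulation, and greatly simplified): the model holds the data, the view only displays it, and the controller turns user input into changes to the model.

    # Minimal Model-View-Controller sketch (illustrative only).
    class CounterModel:                       # the data, unaware of the screen
        def __init__(self):
            self.value = 0

    class CounterView:                        # presentation only
        def render(self, model):
            print(f"count = {model.value}")

    class CounterController:                  # translates user input into model updates
        def __init__(self, model, view):
            self.model, self.view = model, view

        def handle(self, command):
            if command == "increment":
                self.model.value += 1
            self.view.render(self.model)      # refresh the view after every change

    controller = CounterController(CounterModel(), CounterView())
    controller.handle("increment")            # prints: count = 1
    controller.handle("increment")            # prints: count = 2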

In 1979 filmmaker George Lucas hired Ed Catmull to open a laboratory (later renamed Pixar) in San Rafael (far away from Hollywood, on the quiet sparsely populated hills north of San Francisco) devoted to computer animation for his San Francisco firm Lucasfilm. Catmull had studied with Ivan Sutherland at the University of Utah, established the Computer Graphics Laboratory at the New York Institute of Technology in 1975, and helped create a computer animation in a scene of the film "Futureworld" (1976) that was the first ever to use 3D computer graphics.

The 1980s also witnessed the birth of the first computer graphics studios. Carl Rosendahl started Pacific Data Images (PDI) in 1980 in his Sunnyvale garage. Richard Chuang and Glenn Entis created a 3D software platform (initially running on a DEC PDP-11) that turned PDI into the first mass producer of computer animation, initially for television networks and countless music videos but later (after Thaddeus Beier and Shawn Neely perfected feature-based morphing on Silicon Graphics' workstations) also for feature films.

These projects were way ahead of their time. The ordinary world of computing was content with the VT100, the "video" display introduced by Digital (DEC) in 1978, the computer terminal that definitively killed the teletype. Progress in video terminals had been steady, driven by cheaper and more powerful memory. In 1978 it reached the point where a video terminal was cheap enough for virtually every organization to purchase one.

The Bay Area was also a hub for the lively debate on artificial intelligence. In 1950 Turing had asked: "when can the computer be said to have become intelligent?" In 1980 Berkeley philosopher John Searle replied: "never". Searle led the charge of those who attacked the very premises of artificial intelligence. Nonetheless, in the same year Stanford's Ed Feigenbaum and others founded IntelliGenetics (later renamed Intellicorp), an early artificial intelligence and biotech start-up, the first of many to capitalize on the "expert systems" pioneered by Stanford University.

French conglomerate Schlumberger acquired in 1979 the whole of Fairchild Camera and Instrument, including Fairchild Semiconductor, and the following year hired Peter Hart from SRI to establish the Fairchild Laboratory for Artificial Intelligence Research (FLAIR), later renamed Schlumberger Palo Alto Research (SPAR) center, a clear reference to Xerox PARC.

Design in Silicon Valley

Design thinking originated at Stanford in 1957, when John Arnold, an eccentric MIT professor, moved there to teach creativity. In 1959 he delivered a talk titled "Creativity in Engineering", in which he accused engineers of lacking creativity. At the time, the most influential figure in industrial design was Henry Dreyfuss, the author of "Designing for People" (1955). Arnold thought that design was not just about art and technology, but required creative thinking to start with. Design thinking became popular after Herbert Simon published "The Sciences of the Artificial" (1969). Robert McKim, who had succeeded Arnold at Stanford, wrote another seminal book, "Experiences in Visual Thinking" (1973). The minds behind design thinking, and behind the Joint Program in Design, met in a loft of Building 500. In 1978 McKim's student David Kelley formed his first design firm, which in 1983 designed the computer mouse for Apple. Rolf Faste became director of the Joint Program in Design in 1984 (and would remain its director for almost 20 years) and gave a further twist to design thinking: he merged spiritual Zen practice, improvisation and engineering. For the last decade of his life Faste worked on a book titled "Zengineering". Out of this movement would come the firm IDEO, founded in 1991 by David Kelley, Bill Moggridge and Mike Nuttall, and in 2005 the Stanford dSchool. The other important school of design in Silicon Valley was the one specifically for personal computers. In 1977 Alan Kay and Adele Goldberg of Xerox PARC wrote a paper titled "Personal Dynamic Media" that enunciated general principles for home computers, a paper that can be considered the beginning of "user-experience" (UX) design. Tim Mott, also at Xerox PARC, was equally important for the birth of user-centered design: he envisioned the digitally-emulated office that became the Xerox STAR. Bruce Tognazzini was another influential figure, hired by Apple in 1978 to write the "Apple Human Interface Guidelines" that would lead to the Macintosh. In 1993 he would be succeeded by Don Norman, a cognitive psychologist from UC San Diego, who is credited with popularizing the expression "user experience".

The Unix Generation

Betting on the Unix operating system was a gamble. The dominating computer company, IBM, had no intention of adopting somebody else's operating system. AT&T (the owner of Bell Labs) had made Unix available to anyone who wanted to use it. No major computer manufacturer was interested in an operating system that all its competitors could use too. The vast majority of the users of Unix were at Bell Labs and in universities around the world. Universities that received the source code of the operating system began to tinker with it, producing variants and extensions. U.C. Berkeley had received its copy in 1974. Within three years its version, assembled by graduate student Bill Joy as the "Berkeley Software Distribution" (BSD), became popular outside Berkeley. The second BSD of 1978 included two pieces of software developed by Joy himself that became even more popular: the "vi" text editor and the "C shell". Berkeley made it very easy for other universities and even companies to adopt BSD. Unix had become by far the world's most portable operating system. Until then, however, the vast majority of Unix implementations used a PDP-11. Eventually, in 1980, a company, Onyx, started in Silicon Valley by former Harvard professor Bill Raduchel, had the idea of building a microcomputer running Unix. The Onyx C8002 was based on a Zilog Z8000, had 256 kilobytes of RAM and included a 10-megabyte hard disk for the price of $11,000, a cheaper alternative to the PDP-11. It was followed by Apollo in the same year, then SUN Microsystems in 1981 and Silicon Graphics in 1982.

In 1979 Larry Michels founded the first Unix consulting company, Santa Cruz Operation (SCO), another major act of faith in an operating system that had no major backer in the industry. In 1980 Microsoft announced the Xenix operating system, a version of Unix for the Intel 8086, Zilog Z8000 and Motorola M68000 microprocessors. What was missing was the killer application. Help arrived from the USA government: in 1980, when the time came to implement the new protocol TCP/IP for the Arpanet so that many more kinds of computers could be interconnected, the DARPA (Defense Advanced Research Projects Agency) decided not to go with DEC (which would have been the obvious choice) but to pick the Unix operating system, specifically because it was a more open platform. Until that day there had been little interaction between the Internet world and the Unix world. After that day the two worlds began to converge. It is interesting that DARPA decided to unify "nodes" of the network at the operating system level, not at the hardware level.

Membership in the Unix world mainly came through academia. Just about every Unix user had been trained in a university. All software refinements to the Bell Labs code had come from universities. However, the Unix community soon came to exhibit a "counterculture" dynamic that mirrored the dynamics of the computer hobbyists who had invented the personal computer. Unix was another case of a technology ignored by the big computer manufacturers and left in the hands of a community of eccentric independents who could not avail themselves of the financial, technological and marketing infrastructure of the computer business. The big difference, of course, was that in this case the universities served as local attractors for the community more than magazines, clubs or stores. The Internet played the role that magazines had played in the 1970s, helping to disseminate alternative ideas throughout the nation. Another difference was that the average Unix innovator was a highly educated scientist, not just a garage engineer (hence the widely used expression "Unix guru" instead of the more prosaic "computer hobbyist"). However, just like the hobbyists, Unix users came to constitute a counterculture that reenacted rituals and myths of the counterculture of the 1960s. Both movements were founded on dissent, on an anti-establishment mood. Last but not least, both the personal computer and the Unix account appealed to this generation as media of individual expression in an age in which the mass media were castrating individual expression.

Both in the case of the personal computer and of the Internet, it was not a surprise that the invention happened: it was feasible and there was a market for it. The surprise is how long it took before it happened. The business "establishment" created a huge inertia that managed to postpone the inevitable. Viewed at a macroscale, government funding (from the 1910s till the 1960s) had accelerated innovation whereas large computer corporations in the 1970s had de facto connived to stifle innovation (outside their territory).

The Visible Hand of Capital

The amount of money available to venture capitalists greatly increased at the end of the decade thanks to two important government decisions. Venture capital became a lot more appealing in 1978, when the USA government enacted the "Revenue Act", which reduced the capital-gains tax rate from 49.5% to 28%. Even more importantly, in 1979 the government eased the rules on pension funds, allowing them to engage in high-risk investments. Perhaps most important were just the numbers themselves: Arthur Rock had invested less than $60,000 in Apple in january 1978 and reaped almost $22 million in december 1980 when Apple went public. For several years Kleiner-Perkins was able to pay a 40% return to the customers of its high-tech fund. The base of the Bay Area's venture capital started moving from San Francisco to 3000 Sand Hill Road in Menlo Park, a complex of low-rise wooden buildings a few blocks from the Stanford Research Park. Within a few years several more venture-capital funds were founded and several of the East-Coast funds opened offices there.

The Invisible Hand of Government

Government spending also helped in less visible manners. In 1977 the Defense Department hired Bill Perry, the founder of ESL, as Under Secretary of Defense for Research and Engineering. The USA had just lost the war in Vietnam, and one country after another was signing friendship treaties with the Soviet Union. The USA government decided that it was likely to lose a conventional war against the Soviet Union. The only hope to defeat the Soviet Union lay in launching a new generation of weapons that would be driven by computers, a field in which the Soviet Union lagged far behind. In the next four years the budget for the Defense Advanced Research Projects Agency (DARPA) was increased dramatically, leading to a number of high-tech military projects: the B-2 stealth bomber, the JSTARS surveillance system, the Global Positioning System (GPS), the Trident submarine and the Tomahawk cruise missile. Many of these projects depended on technology developed in Silicon Valley.

The GPS was a by-product of the "space race" of the 1960s. Now that the USA had satellites orbiting around the Earth, it was possible to imagine that a constellation of satellites provided a better way to navigate than the traditional methods. The signals from four satellites overhead are enough to pinpoint your position (three would suffice for triangulation if the receiver's clock were perfect; the fourth corrects its clock error). If there are enough satellites to cover the entire Earth, the GPS can pinpoint your position anywhere. The GPS project (originally a military project) was started in 1973 by Air Force colonel Bradford Parkinson, a Stanford alumnus (and future Stanford professor). The first satellites were launched in 1978. After 1978 military planes started using the GPS to figure out their position and their route. Until then they had used the old systems of navigation. Civilian airplanes continued to use the old systems, and they frequently ended up off route: in 1983 the Soviet Union shot down a Korean airliner, killing 269 people, because it had accidentally entered Soviet air space.
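
In modern notation (a simplified sketch that ignores atmospheric delays), the receiver solves for its unknown coordinates $(x, y, z)$ and clock bias $\delta t$ from the measured pseudoranges $\rho_i$ to satellites at known positions $(x_i, y_i, z_i)$:

    \rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\delta t, \qquad i = 1, \dots, 4

Four unknowns (three coordinates plus the clock bias) require at least four such measurements, hence at least four visible satellites.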

Biotech

The age of biotech started in earnest in the Bay Area with Genentech, formed in april 1976 by Herbert Boyer (the co-inventor of recombinant DNA technology, or "gene splicing") and by the 28-year-old venture capitalist Robert Swanson, who set up shop at the Menlo Park offices of their investor Kleiner-Perkins and subcontracted experiments to the laboratories of U.C. San Francisco, City of Hope and the California Institute of Technology in Pasadena (whose student Richard Scheller became one of their early employees) to genetically engineer new pharmaceutical drugs. Genentech's first success came in 1977 when they produced a human hormone (somatostatin) in bacteria, the first cloning of a protein using a synthetic recombinant gene. In 1978 Genentech and City of Hope produced human insulin using recombinant DNA technology (approved for sale in 1982), and in 1979 Genentech cloned the human growth hormone. Biotech business began with human proteins made in bacteria.

The field got another boost in 1977 when Fred Sanger at Cambridge University in Britain developed a method for "sequencing" DNA molecules (genomes), i.e. for deciphering the sequence of the constituents of a DNA molecule, a process not all too different from deciphering the sequence of characters in a computer message. (For the record, the first genome that Sanger sequenced was the genome of the Phi X 174 bacteriophage). Another method was developed by Walter Gilbert's team at Harvard University. Gilbert joined forces with MIT's professor Phillip Sharp and founded Biogen in Geneva in 1978.

In 1979 Walter Goad of the Theoretical Biology and Biophysics Group at Los Alamos National Laboratory established the Los Alamos Sequence Database to collect all known genetic sequences from a variety of organisms and their protein translations (basically, a catalog of genes and their functions), hiring the consulting firm BBN (Bolt Beranek and Newman), the same firm that had built the Arpanet.

Thanks to Sanger's method, it was possible to sequence DNA, but it was a cumbersome and expensive process. GenBank (Genetic Sequence Data Bank) was created in 1982 as a successor to Los Alamos' Sequence Database and maintained for a while by IntelliGenetics, before being assigned in 1989 to the newly created National Center for Biotechnology Information. This public database of nucleotide sequences and their protein translations allowed scientists all over the country to share their results. In 1983 John Wilbur and David Lipman created the first "search engine" for DNA, so that a biologist could search the GenBank for sequences. In 1990 Stephen Altschul's team would introduce an even faster algorithm, BLAST (Basic Local Alignment Search Tool), and since then the GenBank would double in size every 18 months, mirroring Moore's Law of electronic chips.
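
The core idea of such searches can be conveyed with a toy sketch (in Python, with invented sequences; the real BLAST algorithm extends and statistically scores these "seed" matches rather than just listing them): short substrings of the query are looked up in every database entry.

    # Toy DNA search by exact "seed" (k-mer) matching; sequences are invented.
    K = 4   # seed length, chosen arbitrarily for the example

    def seeds(sequence, k=K):
        """All overlapping substrings of length k."""
        return {sequence[i:i + k] for i in range(len(sequence) - k + 1)}

    database = {
        "entry_A": "GAGTTTTATCGCTTCCATGACGCAG",
        "entry_B": "ACGTACGTTTGCAAGGCCTTAA",
    }

    query = "TTCCATGA"
    query_seeds = seeds(query)

    for name, sequence in database.items():
        shared = query_seeds & seeds(sequence)   # k-mers common to query and entry
        if shared:
            print(name, "matches seeds:", sorted(shared))   # only entry_A is reported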

The field of "regenerative medicine" was born in 1981 when, independently, Martin Evans and Matthew Kaufman at Cambridge University and Gail Martin at UC San Francisco isolated embryonic stem cells of the mouse. Stem cells are the mothers of all the cells of our body. Once they specialize in a specific job, they cannot be used to make cells of a different kind; but, before they specialize, when they are still "pluripotent", they can develop into all cell types. The stem cells of the embryo are pluripotent. The stem cells of your nose are adult stem cells: they can develop into nose cells, not into liver cells. For more than a decade these studies were limited to animals, but then scientists started studying human embryonic stem cells. William Haseltine coined the expression "regenerative medicine" in 1992.

More biotech companies surfaced in those years on both coasts of the USA. In 1979 Sam Eletr, who had been the manager of a medical instruments team at HP Labs, founded the biotech start-up GeneCo (later renamed Applied Biosystems) in Foster City, near Oracle, to build biotech instrumentation: first a protein sequencer and later a DNA synthesizer. Also notable in the Bay Area was Calgene, formed in 1980 by U.C. Davis scientists. Scientists from U.C. San Francisco and U.C. Berkeley formed Chiron in 1981 in Emeryville. Biotech became a "hot" field for venture capitalists.

A decision by the Supreme Court opened the floodgates of biotech start-ups: in 1980 it ruled that biological materials (as in "life forms") could be patented. Thanks to these scientific and legal developments, the Bay Area's first biotech company, Cetus, went public in 1981, raising a record $108 million. In 1983 Kary Mullis at Cetus would invent the "polymerase chain reaction", a process capable of amplifying DNA, i.e. of generating a myriad copies of a DNA sequence.

Outside the Bay Area the most successful company in recombinant DNA technology was perhaps Los Angeles-based Applied Molecular Genetics (later abbreviated to Amgen), founded in april 1980 by four venture capitalists (notably William Bowes) who hired a stellar team of scientists from Caltech and UCLA: in 1983 Taiwanese-born physiologist Fu-Kuen Lin cloned the hormone erythropoietin (better known as EPO), later patented as Epogen, into the ovarian cells of hamsters, and in 1985 Larry Souza cloned another hormone, granulocyte colony-stimulating factor (G-CSF), later patented as Neupogen. Revenues passed $1 billion in 1992.

Culture and Society

The Bay Area's cultural life was booming at the same time that the computer industry was beginning to boom; and it was still rather eccentric by the standards of mainstream culture. The Residents started the new wave of rock music with their bizarre shows and demented studio-processed litanies. In 1976 William Ackerman launched Windham Hill to promote a new genre of instrumental music, "new age" music. It was the soundtrack to a "new age movement" that simply updated Esalen's "human potential movement" and the spiritual element of the hippie generation for the new "yuppies" (young urban professionals), thereby creating an alternative spiritual subculture that promoted Zen-like meditation, astrological investigation, extra-sensory powers, crystal healing and holistic medicine. A huge influence on the new-age movement, besides the Esalen center, was the "human potential movement" launched by George Leonard's "The Transformation" (1972). At the same time punk-rock reached California, where it mutated into a particularly abrasive and vicious form, hardcore, notably with the Dead Kennedys, while the gay community patronized disco-music. Punk-rock was headquartered at the Mabuhay Gardens (on Broadway) and disco-music at the I-Beam (Haight-Ashbury). The logo of the Dead Kennedys, designed by collage artist Winston Smith, became an international symbol of rebellion. In 1977 rock keyboardist Vale Hamanaka (aka V Vale) started the punk-rock fanzine Search & Destroy, the rare link between the beat and punk cultures because it was originally funded by beat poets Allen Ginsberg and Lawrence Ferlinghetti. In 1980 Vale and Andrea Juno turned it into a much more intellectual magazine, Re/Search, that became one of the most influential publications of the Bay Area counterculture. In 1976 playwright Sam Shepard relocated to San Francisco to work at the Magic Theatre. The Herbst Theatre was established in 1977 on the site of the 1945 signing of the United Nations' charter. In 1977 George Coates founded his multimedia theater group, Performance Works. In 1978 Mark Pauline created the Survival Research Laboratories, which staged performances by custom-built machines. In 1979 Martin Muller opened the art gallery Modernism. In 1980 Sonya Rapoport debuted the interactive audio/visual installation "Objects on my Dresser".

San Francisco, an old port city, was transitioning from a shipping economy to a banking and tourism economy. This was leaving many old buildings empty. The Survival Research Laboratories were the main group taking advantage of those abandoned buildings for staging unorthodox (and mildly illegal) events, but not the only one. For example, the Suicide Club, founded in 1977 by Adrienne Burk, Gary Warne and Nancy Prussia, staged all sorts of provocative activities: costumed street pranks, sewer walks, a vampyre party in an abandoned funeral home, a naked cable-car ride by Nancy Prussia, a treasure hunt that would become a staple of the Chinese New Year parade, pie fights, and especially bridge climbing. Also in 1977 Jack Napier, a member of the Suicide Club, started the Billboard Liberation Front, which coordinated graffiti artists (including the young Shepard Fairey) devoted to "improving" commercial billboards, an entertaining form of anti-capitalistic warfare. If SRL was about machine exploration, the Suicide Club was about urban exploration. Both had something in common: an unusual degree of intensity (paralleled in music by the local punk-rock scene).

There were already many examples of philanthropy. For example, in 1979 Stanford University's professor and former Syntex scientist Carl Djerassi purchased land in the Santa Cruz Mountains west of Stanford and started the Djerassi Resident Artists Program. The program would attract dozens of world-class artists to create sculptures in the forest. In the mid-1980s John Rosekrans would establish the Runnymede Sculpture Farm on the family's vast estate in Woodside, acquiring over 160 outdoor monolithic sculptures.

The main scientific achievement of those years (not only in California) was the "Inflationary Theory", devised by Alan Guth at the Stanford Linear Accelerator (SLAC) at the end of 1979, a theory that changed the way cosmologists viewed the history of the universe.

Incidentally, in 1977 Harvey Milk won a seat on San Francisco's Board of Supervisors, becoming the first openly gay man elected to public office in California.

Meanwhile, Silicon Valley was just a place to work. The only major entertainment was the amusement park Great America, which opened in 1976 in Santa Clara. However, IBM's laboratories in San Jose inaugurated a new way to work: from home. It wasn't the Internet yet, but IBM used telephone lines to connect the homes of five employees to the laboratories and allowed them to "telecommute".


(Copyright © 2010 Piero Scaruffi)
