A History of Silicon Valley

These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"


(Copyright © 2010 Piero Scaruffi)

7. The Hobbyists (1971-75)

by Piero Scaruffi

The Microprocessor

A microprocessor is a programmable set of integrated circuits: basically, a computer on a chip. It had been theoretically possible for years to integrate the CPU of a computer on a chip; it was just a matter of perfecting the technology. In 1970 Lee Boysel at Four Phase Systems had already designed the AL1, an 8-bit Central Processing Unit (CPU), de facto the first commercial microprocessor. Yet the microprocessor that changed the history of computing was being developed at Intel. Ted Hoff at Intel bet on silicon-gate MOS technology to fit a 4-bit CPU onto a single chip. In 1970 Intel hired Federico Faggin, who had developed silicon-gate technology at Fairchild. Faggin implemented Hoff's design in silicon, and in november 1971 Intel unveiled the 4004, a thumbnail-sized electronic device containing 2,300 transistors, built with a 10,000-nanometer process, and capable of processing 92,000 instructions per second. It had started as a custom project for Busicom, a Japanese manufacturer of calculators (which in January had introduced the world's first pocket calculator, the LE-120A Handy). Intel's tiny 4004 chip was as powerful as the ENIAC, but millions of times smaller and ten thousand times cheaper.

By august 1972 Intel had ready an 8-bit version of the 4004, the 8008, whose eight-bit word could represent 256 distinct values, enough for the full ASCII character set: all ten digits, uppercase and lowercase letters, and punctuation marks. Intel was not convinced that a microprocessor could be used to build a computer. It was up to Bill Pentz at California State University in Sacramento to prove the concept. In 1972 his team built the Sac State 8008, the first microcomputer, and helped Intel fine-tune the microprocessor for the task of building computers.

Intel's initial motivation for making microprocessors was that they helped sell more memory chips. A few months earlier Intel had introduced another important invention, the EPROM (Erasable Programmable Read Only Memory), developed by the Israeli-born engineer Dov Frohman: a non-volatile memory whose contents can be erased and rewritten. By making it possible to reprogram a microprocessor-based system at will, the EPROM also made the microprocessor more versatile. The 4004 and the 8008 had been produced in small quantities (the 8008 had originally been commissioned by Computer Terminal Corporation for its Datapoint terminals), but in april 1974 Intel unveiled the 8080, designed at the transistor level by Japanese-born Masatoshi Shima, which lowered both the price and the complexity of building a computer while further increasing the power (290,000 instructions per second).

(In 1968 a reclusive electrical engineer, Gilbert Hyatt, had founded Micro Computer in the Los Angeles region and filed for a patent on what would become known as the microprocessor, though apparently he never built one.)

It was bound to happen. At least 60 semiconductor companies had been founded in Santa Clara Valley between 1961 and 1972, many by former Fairchild engineers and managers. It was a highly competitive environment, driven by highly educated engineers.

The center of mass for venture capital had steadily shifted from San Francisco towards Menlo Park. In 1972 the venture-capital firm Kleiner-Perkins, founded by Austrian-born Eugene Kleiner of Fairchild Semiconductor and former Hewlett-Packard executive Tom Perkins, opened offices on Sand Hill Road, followed by Don Valentine of Fairchild Semiconductor, who founded Capital Management Services, later renamed Sequoia Capital. Around that time the electronics writer Don Hoefler popularized the term "Silicon Valley" as the new nickname of the area between Palo Alto and San Jose, mostly consisting of Santa Clara Valley. In 1974 Reid Dennis (a member of "The Group") and Burton McMurtry (of Palo Alto Investment) founded the investment company Institutional Venture Associates (which in 1976 split into two partnerships, McMurtry's Technology Venture Associates and Dennis' Institutional Venture Partners), while Tommy Davis launched the Mayfield Fund. In 1968 Harvey Wagner and several UC Berkeley professors had founded Teknekron, one of the world's first startup incubators focused on IT.

In 1970 Regis McKenna, a former marketing employee of General Microelectronics (1963) and National Semiconductor (1967), started his own marketing agency, one of many that were proliferating to help engineering startups with advertising and public relations (in the days when engineers read magazines such as "Electronic News"). McKenna was one of the people responsible for making New York listen to the West Coast. In those days most magazines had little interest in the West Coast. McKenna and the other marketers managed to get New York's attention, and often the "inventor" captured the attention of the New York media more than the product itself. In a sense, the "cult of personality" that would become a staple of Silicon Valley was born back then, when the eccentric personality of founders and engineers was often more easily "sold" to influential magazines than their exotic technologies. Regis McKenna was one of the people who promoted Silicon Valley as an "attitude". This process would culminate in 1982, when Time magazine put Steven Jobs (then 26) on its front cover.

Very few people knew what "silicon" was, but many began to understand that it was important to build smaller and cheaper computers that could be embedded into just about any device. For example, in 1973 Automatic Electronic Systems (AES) of Canada introduced the "AES-90", a "word processor" that combined a screen (a cathode ray tube, or CRT, monitor), a floppy disk and a microprocessor. The most direct impact was on calculators. MITS (Micro Instrumentation and Telemetry Systems) of New Mexico had entered the market with one of the first calculator kits, the MITS 816 (1971). In 1972 alone Hewlett-Packard, Texas Instruments, Casio (a Japanese manufacturer of mechanical calculators) and Commodore (a manufacturer of typewriters founded by Polish-Canadian Jack Tramiel that had relocated from Silicon Valley to Pennsylvania) all debuted small calculators. Texas Instruments soon produced its own microprocessors, notably the 4-bit TMS 1000 series (1974), which integrated the CPU, the ROM and the RAM on a single chip. In 1973 Japan's Sharp developed LCD (Liquid Crystal Display) technology for the booming calculator market.

The Intel 8008 was used by the companies targeting the electronic hobbyist market, which was surprisingly large. These companies mostly sold kits by mail order that hobbyists could buy to build exotic machines at home. The Scelbi (SCientific ELectronic BIological), first advertised in march 1974 by a Connecticut-based company, and the Mark-8, developed by Virginia Tech student Jon Titus and announced in july 1974, were the first ones. Magazines such as "Radio Electronics", "QST" and "Popular Electronics" were responsible for creating excitement about the microprocessor. Thanks to the magazines, the microprocessor reached a far wider audience than its inventors had intended. Otherwise it would have been known only to the few large corporations that were willing to buy microprocessors in bulk. The most creative and visionary users were not working in those corporations.

However, they were all beaten to the finish line by the French company R2E, founded by Vietnamese-born engineer Andre Truong Trong Thi, whose engineer François Gernelle used the 8008 to build the Micral in february 1973 for a governmental research center in France (and it was an assembled computer, not just a kit).

Networking Computers

At the same time that microprocessors were revolutionizing the concept of a computer, enormous progress was underway in the field of networking, although its impact would be felt only decades later. In 1972 Ray Tomlinson at Boston's consulting firm Bolt, Beranek and Newman invented e-mail for sending messages between computer users, choosing the "@" sign to separate the user name from the computer name.

In 1971 computer-science students from UC Berkeley, including Lee Felsenstein (a former activist of Berkeley's 1964 Free Speech Movement), formed Resource One, an organization operating out of an abandoned warehouse in San Francisco and aiming to create a public computer network. Lee Felsenstein (the hardware specialist), Efrem Lipkin (the software specialist) and Mark Szpakowski (the user-interface specialist) had access to a Scientific Data Systems time-sharing machine. The first public terminal of what came to be known as the "Community Memory" was set up in 1973 inside Leopold's Records, a record store run by the Student Union of UC Berkeley, and was later moved to the nearby Whole Earth Access store. These college dropouts had created the first public computerized bulletin-board system.

In 1973 Bob Metcalfe at Xerox PARC coined the term "Ethernet" for the local-area network that his team was building. PARC wanted all of its computers to be able to print on its one laser printer. Unlike the Internet, which connected remote computers using phone lines, the Ethernet was to connect local computers using special cables and adapters. Unlike the Internet, which was very slow, the Ethernet had to be very fast to match the speed of the laser printer. The first Ethernet was finally operational in 1976. Metcalfe also enunciated his law: the value of a network of devices grows with the square of the number of connected devices. This was popularly translated in terms of users: the value of a network grows with the square of the number of people that it connects.
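
The quadratic form can be made concrete by counting the possible pairwise links (a worked illustration, not part of Metcalfe's original formulation):

```latex
V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2},
\qquad V(10) \propto 45, \qquad V(100) \propto 4950
```

Doubling the number of connected devices therefore roughly quadruples the value of the network.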

Meanwhile, the Arpanet had 2,000 users in 1973, the year when the first international connection was established (to University College London) and when Vinton Cerf of Stanford University nicknamed it the "Internet". The following year Cerf and others published the Transmission Control Protocol (TCP), which became the backbone of Internet transmission: it enabled Arpanet/Internet computers to communicate with any computer, regardless of its operating system and of its network. Cerf settled on a 32-bit address space, sized for an estimated 16 million time-sharing machines on each of two networks in each of 128 countries (the pool of 32-bit addresses would be exhausted in 2011, accelerating the worldwide transition to the 128-bit IPv6).
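
The arithmetic behind that 32-bit choice (a reconstruction for illustration, not Cerf's actual notes) works out exactly in powers of two:

```latex
\underbrace{128}_{\text{countries}} \times \underbrace{2}_{\text{networks each}} \times \underbrace{2^{24}}_{\text{16,777,216 hosts per network}}
  = 2^{7} \times 2^{1} \times 2^{24} = 2^{32} \approx 4.3\ \text{billion addresses}
```
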
Built into Cerf's engineering principle was an implicit political agenda: the "packets" represent an intermediate level between the physical transport layer and the application that reads the content. The packets, in other words, do not know what content they are transporting. All packets are treated equally, regardless of whether they are carrying an emergency alert or tonight's movie schedule. Each computer in the network picks up a packet and ships it to the next computer without any means to decide which packet is more important. The routing is "neutral". This hidden policy would later be called "net neutrality" and vehemently defended by activists: the Internet was being designed so that participants in the network (later called Internet Service Providers) could not favor some content over other content (e.g. a Hollywood movie over an amateur's YouTube movie).
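
A minimal sketch of that content-blind forwarding, in modern Python purely for illustration (the packet fields and hostnames below are invented):

```python
from collections import deque

# The router's only "policy" is a first-in, first-out queue:
# packets are forwarded in arrival order and the payload is never inspected.
queue = deque([
    {"dst": "hospital.example", "payload": "EMERGENCY: send help"},
    {"dst": "cinema.example", "payload": "tonight's movie schedule"},
])

while queue:
    packet = queue.popleft()               # next packet, whatever it carries
    print("forwarding to", packet["dst"])  # content plays no role in routing
```
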
A transformation was taking place in the nature of the Arpanet that was not visible from outside. The Arpanet, a military tool, had been handed (out of necessity, not by design) to the Unix hackers. These hackers were imbued with a counterculture that was almost the exact opposite of the military culture. De facto, the Arpanet was increasingly being hijacked by a bunch of hackers and turned into a social tool (although originally only for chatting and playing games).

The Arpanet had not been designed with any particular application in mind. In fact, its "application neutrality" would remain one of the main drivers of innovation. It actually appeared to be ill-suited to any application. The Arpanet worked pretty much like a post office: it made a "best effort" to deliver "packets" of data in a reasonable timeframe, but packets might be delayed or even lost. Email could live with a delay of minutes or even hours, but not with a loss of text. A phone call, on the other hand, could live with the loss of a few milliseconds of voice but not with a delay. Being application-agnostic, the Internet solved no problem well.

The Hobbyist Market

By that time a large number of young people were children of engineers, raised in technology-savvy environments. Many of them picked up electronic kits as teenagers and eventually continued the local tradition of the high-tech hobbyists. In fact, that tradition merged with the mythology of the juvenile delinquent and with the hippie ideology in legendary characters like John Draper (better known as Captain Crunch), the most famous "phone phreak" of the age, who in 1971 built "blue boxes" capable of fooling the phone system. Phreaking went viral, and in october 1971 Esquire magazine published an article exposing the underground phenomenon, which resulted in Draper's arrest. One of his fans was Steve Wozniak, back then an engineer at the Cupertino public radio station KKUP.

The year 1974 ended in december with the advertisement in hobbyist magazines of Ed Roberts' kit to build a personal computer, the Altair 8800, based on Intel's 8080 microprocessor and sold by mail order for $395. It was the first product marketed as a "personal computer". Roberts' company MITS, which used to make calculators, was based in Albuquerque, New Mexico. Harvard University student Bill Gates and his friend Paul Allen wrote the BASIC interpreter for it, and then founded a company named Micro-soft, initially also based in Albuquerque. MITS sold 2,000 Altair 8800 systems in one year.

One of the most daring architectures built on top of the Intel 8080 came from Information Management Science Associates (IMSAI), a consulting company for mainframe users founded by William Millard in 1972 in San Leandro, in the East Bay. Its engineers realized that a number of microprocessors tightly coupled together could match the processing power of a mainframe at a fraction of the price. In october 1975 they introduced the Hypercube II, which cost $80,000 (an IBM 370 mainframe cost about $4 million). Ironically, they were more successful with the IMSAI 8080, a clone of the Altair 8800 that they sold to the hobbyist market starting in december 1975, while only one Hypercube was ever sold (to the Navy).

In 1973 Gary Kildall, an instructor at the Naval Postgraduate School in Monterey, developed the first high-level programming language for Intel microprocessors, PL/M (Programming Language/Microprocessor). It was "burned" into the Read Only Memory (ROM) of the microprocessor, and Intel marketed it as an add-on that could help sell its microprocessors. However, when Kildall developed an operating system for Intel's 8080 processor, CP/M (Control Program/Microcomputer), which managed a floppy drive, Intel balked. Intel was not interested in software that allowed users to read and write files to and from the disk; but makers of small computers were. The 8080 had inspired several companies to create 8080-based kits, notably MITS and IMSAI, and both needed software for the ever more popular floppy disk. MITS offered its own operating system; IMSAI bought Kildall's CP/M. CP/M was largely based on concepts of the PDP-10's operating system (TOPS-10). Kildall then rewrote CP/M, isolating the interaction with the hardware in a module called the BIOS (Basic Input/Output System). This way CP/M became hardware-independent, and he could sell it to any company in need of a disk operating system for a microprocessor. In 1974 Kildall started his own company, Digital Research, to sell his product through hobbyist magazines. CP/M soon became a standard. Kildall's operating system was a crucial development in the history of personal computers: it transformed a chip invented for process control (the microprocessor) into a general-purpose computer that could do what minicomputers and mainframes did.
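
The layering idea behind the BIOS can be sketched in a few lines of modern Python, purely for illustration (CP/M itself was written in PL/M and 8080 assembly, and all names below are hypothetical):

```python
from abc import ABC, abstractmethod

class BIOS(ABC):
    """Hardware-dependent layer: each machine gets its own implementation."""
    @abstractmethod
    def console_out(self, ch: str) -> None: ...
    @abstractmethod
    def read_sector(self, track: int, sector: int) -> bytes: ...

class SomeVendorBIOS(BIOS):
    """One vendor's port: only this class changes from machine to machine."""
    def console_out(self, ch: str) -> None:
        print(ch, end="")
    def read_sector(self, track: int, sector: int) -> bytes:
        return bytes(128)  # stub standing in for a 128-byte disk sector

class BDOS:
    """Hardware-independent layer (CP/M's 'Basic Disk Operating System'):
    written once and reused unchanged on every machine."""
    def __init__(self, bios: BIOS) -> None:
        self.bios = bios
    def type_string(self, s: str) -> None:
        for ch in s:
            self.bios.console_out(ch)

BDOS(SomeVendorBIOS()).type_string("A>")  # same OS code on any hardware
```

Porting the operating system to a new machine meant rewriting only the BIOS; everything above it shipped unchanged, which is what allowed Kildall to license CP/M to any hardware maker.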

In 1975 Alan Cooper, a pupil of Gary Kildall's in Monterey and a Digital Research alumnus, started his own software company, Structured Systems Group (SSG), in Oakland to market his General Ledger, a pioneering piece of business software for personal computers, sold only through computer magazines.

In 1972 the Portola Institute had spawned a magazine, the People's Computer Company, run by Bob Albrecht and devoted to computer education, as well as a coworking space, the People's Computer Center (PCC) in Menlo Park, where proto-nerds could play with a minicomputer; one of them was Lee Felsenstein of the "Community Memory". In march 1975 a group of PCC regulars, such as Bob Marsh, Lee Felsenstein and Steve Wozniak, met in Gordon French's garage to discuss the Altair. This was the first meeting of what would be known as the Homebrew Computer Club, which later relocated to SLAC's auditorium. These were young people who had been mesmerized by the do-it-yourself kits to build computers. Some of them would go on to build much more than amateur computers. For example, Hewlett-Packard engineer Steve Wozniak demonstrated the first prototype of his Apple at the Homebrew Computer Club meeting of december 1976. Several of them rushed to mimic the Altair's concept. For example, Bob Marsh and Lee Felsenstein used the Intel 8080 to design the Sol-20 for Processor Technology in Berkeley, released in june 1976: the first microcomputer to include a built-in video driver, and the archetype for the mass-produced personal computers to come.

Another influential member of the Homebrew Computer Club was Li-Chen Wang, who in 1976 signed his Tiny BASIC with the motto "Copyleft - All Wrongs Reserved" to mock the standard "Copyright - All Rights Reserved" of proprietary software, an idea that predated Richard Stallman's GNU project (1983).

The role played by hobbyists should not be underestimated. The computer market was split along the mainframe-mini divide: IBM and the "BUNCH" sold mainframes; DEC, HP and others sold minis. These large corporations had the know-how, the brains and the factories to produce desktop computers for the home market. They did not do it. The market for home computers was largely created by a culture of hobbyists: highly individualistic home-based entrepreneurs who worked outside the big bureaucracies of corporations, academia and government. Many of them did not have higher education. Many of them had no business training. Many of them had no connections whatsoever with universities or government agencies. Yet it was this grassroots movement of hobbyists that created what the colossal apparatus of the corporate world had been unable to. They created their own community (via magazines, stores and clubs) to make up for the lack of financial, technological and marketing infrastructure. The personal computer was not invented by an individual, by a laboratory or by a company; it was invented by a community. In fact, its dynamics were not too different from those of a community that had taken hold a decade earlier in the Bay Area: the community of the counterculture (agit-prop groups, hippie communes, artistic societies).

Until then progress in computer technology had been funded by governments, universities and corporations. The next step would be funded by humble hobbyists spread all over the nation.

To serve the growing community of computer hobbyists, in 1975 a Los Angeles hobbyist, Dick Heiser, opened Arrowhead Computers, the first computer retail store in the world. In december 1975 a member of the Homebrew Club, Paul Terrell, opened a store in Silicon Valley, the Byte Shop, which became a reference point for the local hobbyists and sold the first units of Wozniak's Apple. In 1976 William Millard of IMSAI 8080 fame opened the "Computer Shack", a store located in Hayward (again in the East Bay) that offered everything a personal-computer user needed. That store would soon become a nationwide chain, Computerland, selling computers to the public, a proposition that only a few years earlier (when computers were astronomically expensive and impossible to use) would have been inconceivable. The retail sale of a computer represented a monumental paradigm shift, not only for the industry but for society as a whole.

Journalists and store owners were the true visionaries (not corporate executives with their gargantuan staffs of planners and strategists). They relayed news across the country. They organized the newsletters, clubs and conferences that cemented the community. It was the editor of one such magazine (Dr Dobb's editor Jim Warren) who in april 1977 organized the first personal-computer conference in San Francisco, the "West Coast Computer Faire". It was attended by 13,000 people, making it the largest computer conference yet. Newsletters, clubs (such as the Southern California Computer Society, formed in september 1975) and user conferences proliferated in the following years. This network was necessary to compensate for the fact that most of those early microcomputers came with no customer support, very little quality control and only the most elementary software. The network provided the education and the support that large computer manufacturers traditionally provided to their customers. The network even did the marketing and proselytizing. At the same time the network influenced the manufacturers: the community of users probably helped shape the personal-computer market more than any technological roadmap.

Many of the real geniuses of this microcomputer revolution never made money out of it. Often they did not even get recognition from the ones who did make money. In particular, software was still a strange "good" that the law did not know how to treat. When it came to software, intellectual property basically did not exist. IBM had officially lost the monopoly on software but de facto still owned most of the business software in the world, and software did not have a perceived value. The pioneers of microcomputers freely exchanged software or, better, borrowed from each other. No surprise, then, that some of Bill Pentz's original code ended up in Gary Kildall's CP/M, and that Kildall claimed Microsoft had "stolen" some of his code. (Microsoft did not develop DOS, but bought it from Tim Paterson, who clearly plagiarized some of Kildall's code.) And, of course, later on, when software became more valuable than hardware, the promiscuous origins of the software industry would cause major controversies. Financial success almost never reflected technical merits. Few people doubt that CP/M was not only the original but also the better operating system, and Kildall had even conceived, before Microsoft, the low-cost licensing model that would make Microsoft rich.

Calculator and Microprocessor Wars

Meanwhile, a business decision in Texas involuntarily launched another wave of personal computers. Texas Instruments owned the market for CPUs used in calculators. In 1975 it decided to increase the price of the CPU to favor its own calculators, leaving the other manufacturers scrambling for alternatives. The market for calculators collapsed. Out of the ruins, one of them decided to change business: Commodore. Tom Bennett at Motorola in Arizona had created the 8-bit 6800 in 1974, a more advanced microprocessor than anything Intel had introduced yet. Chuck Peddle, a former employee of Tom Bennett's at Motorola, developed the 8-bit 6502 at MOS Technology (1975) in Norristown (Pennsylvania), much cheaper ($25) than the 6800 ($180) or Intel's 8080 ($360), and was then hired by Commodore to build an entire computer, the Commodore PET (Personal Electronic Transactor), demonstrated in january 1977.

Texas Instruments' move reflected how difficult it was getting to compete with Intel, which boasted a full line of state-of-the-art semiconductor products: RAMs, EPROMs and CPUs. Microprocessors drove sales of memories, and sales of memories funded improvements in microprocessors. Competition for Intel eventually came from Silicon Valley itself. In 1975 Jerry Sanders' Advanced Micro Devices (AMD) introduced the Am9080, a reverse-engineered clone of the Intel 8080 microprocessor, putting further pressure on prices. AMD then developed the 4-bit 2901 chip, which used the faster Schottky bipolar transistors instead of the unipolar MOS transistors used by Intel. Federico Faggin left Intel with coworker Ralph Ungermann right after finishing the 8080, taking Shima with them, and, having convinced Exxon to make a generous investment, started his own company, Zilog, which became a formidable competitor of Intel when in july 1976 it unveiled the 8-bit Z80 microprocessor, faster and cheaper than the 8080 (and designed at the transistor level by the same Shima). National Semiconductor had already introduced the PACE, the first 16-bit microprocessor, in december 1974.

Relational Databases

On the software front, a new field was born at IBM's San Jose laboratories (later renamed the Almaden Research Center). In 1970 Edgar Codd had written an influential paper, "A Relational Model of Data for Large Shared Data Banks", in which he explained how one could describe a database in the language of first-order predicate logic. A relational-database group was set up in San Jose. In 1974 Donald Chamberlin defined an algebraic language to retrieve and update data in relational database systems, SEQUEL, later renamed SQL (Structured Query Language). It was part of the development of the first relational database management system, code-named System R, begun in 1973 and finally unveiled in 1977 (running on a System/370). However, IBM's flagship database system remained IMS, originally developed in 1968 for NASA's Apollo program on IBM's 360 mainframe; that was, by far, the most used database system in the world. Since IBM was not eager to adopt the new technology, it did not keep it secret, and the idea spread throughout the Bay Area.
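
The gist of Codd's model can be conveyed with a few lines of SQL, the descendant of Chamberlin's SEQUEL. The sketch below uses Python's built-in sqlite3 module and an invented table, purely for illustration:

```python
import sqlite3

# An in-memory relational database with one relation (table).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employee (name TEXT, dept TEXT, salary INTEGER)")
db.executemany("INSERT INTO employee VALUES (?, ?, ?)",
               [("Ada", "Research", 90), ("Bob", "Sales", 70),
                ("Eve", "Research", 80)])

# The query states *what* rows are wanted (a logical predicate over tuples),
# not *how* to navigate to them, as a hierarchical system like IMS required.
for row in db.execute(
        "SELECT name, salary FROM employee WHERE dept = ? ORDER BY salary DESC",
        ("Research",)):
    print(row)  # ('Ada', 90) then ('Eve', 80)
```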

In particular, IBM's work on relational databases triggered interest in a group of Berkeley scientists led by Michael Stonebraker, who started the Ingres (INteractive Graphics REtrieval System) project in 1973, a project that would transplant the leadership in the field of databases to the Bay Area and create colossal fortunes.

Meanwhile, computers started sharing data. In 1974 DEC introduced DECnet, a product to connect two PDP-11 minicomputers and one of the earliest peer-to-peer network architectures; the following year it added the ability to access files on other machines (via DEC's Data Access Protocol, or DAP).

User Interfaces

Experiments with new kinds of hardware and software platforms were changing the concept of what a computer was supposed to do. Engelbart's group at the SRI had lost its ARPA funding (and was eventually disbanded in 1977), so several of his engineers started moving to Xerox's PARC. In 1973 the PARC unveiled the Alto, the first workstation with a mouse and a Graphical User Interface (GUI). Inspired by Douglas Engelbart's old On-Line System and developed by Charles Thacker's team, it was a summary of all the software research done at the PARC, and it was way ahead of contemporary computers. More importantly, it was not just a number cruncher: it was meant for a broad variety of applications, from office automation to education. It was not based on a microprocessor yet, but on Texas Instruments' 74181 chip, an arithmetic-logic unit that was not a full microprocessor. In 1974 Hungarian-born Charles Simonyi developed Bravo, the word processor that introduced the "what you see is what you get" (WYSIWYG) paradigm to document preparation.

Other Industries

While the semiconductor industry was booming, another Bay Area industry was in its infancy. A local biomedical drug industry had been created by the success of Alza, founded in 1968 by former Syntex president Alejandro Zaffaroni in Palo Alto. Silicon Valley soon developed into a center for biomedical technology, the industry of medical devices that draws from both engineering and medicine.

Meanwhile, several groups of biologists were trying to create artificial DNA in the lab by extracting a gene from one organism and inserting it into another ("recombinant DNA"). In 1972 Paul Berg's team at Stanford University synthesized the first recombinant-DNA molecule.

In 1973 Stanford University medical professor Stanley Cohen and UC San Francisco biochemist Herbert Boyer invented a practical technique to produce recombinant DNA. They transferred DNA from one organism to another, creating the first recombinant-DNA organism. That experiment virtually launched the discipline of "biotechnology": the industrial creation of DNA that does not exist in nature but can be useful for human purposes. Boyer had just discovered that an enzyme named EcoRI allowed him to slice DNA molecules to produce single strands that could be easily manipulated. Cohen had just devised a way to introduce foreign DNA into a bacterium. They put the two processes together and obtained a way to combine DNA from different sources into a single DNA molecule. Cohen decided to continue research in academia, while Boyer opted to go into business.

The Asilomar Conference on Recombinant DNA, organized by Paul Berg in february 1975 near Monterey, set ethical rules for biotechnology. Meanwhile, in 1974 the Polish geneticist Waclaw Szybalski coined the term "synthetic biology", opening an even more ambitious frontier: the creation (synthesis) of new genomes (and therefore of biological forms) that don't exist in nature.

That attention to biology was not coincidental. In 1970 Stanford had set up an interdepartmental Human Biology Program focused on undergraduate students. The founders were all distinguished scholars: Joshua Lederberg, a Nobel laureate who was the head of Genetics at the Medical School; David Hamburg, chair of Psychiatry at the Medical School; Norman Kretchmer, chair of Pediatrics; Donald Kennedy, chair of Biology; Paul Ehrlich, who had jump-started environmental science with his book "The Population Bomb" (1968); Sanford Dornbusch, former chair of Sociology; and Albert Hastorf, former chair of Psychology. This was an impressive cast, and Lederberg and Hamburg had been teaching a pioneering course titled "Man As Organism" since 1968. However, there was little support for this idea from the establishment. The reason that Stanford went ahead was money: the Ford Foundation (not the Stanford establishment) believed in multidisciplinary approaches, and its generous funding made the program possible. The class on "Human Sexuality" started in 1971 by Herant Katchadourian attracted a record 1,035 students in its first year. Well into the 21st century the Human Biology Program would still remain the single most successful program at Stanford.

A major oil crisis hit the world in 1973. It was a wake-up call that the USA did not control the main material required by its economy: oil. It provided the first impulse to seriously explore alternative sources of energy, although it would take many more crises and wars before the government would launch a serious plan to get rid of fossil fuels. In 1973 the Lawrence Berkeley Lab founded the Energy and Environment Division, which came to specialize in lithium-ion batteries. The federal government, instead, chose Colorado as the site for the main research center on alternative sources of energy: in 1974 it mandated the establishment of the Solar Energy Research Institute, later expanded to wind and biofuel and renamed the National Renewable Energy Laboratory (NREL).

Research on lasers at Stanford University yielded a major discovery in 1976: John Madey invented the "free-electron laser", which differed from previous varieties (ion, carbon-dioxide and semiconductor lasers) because it could work across a broader range of frequencies, from microwaves to (potentially) X-rays.

In 1962 Texas Instruments introduced what would become known as LED (Light-Emitting Diode) technology, invented by James Biard and Gary Pittman. A few months later Nick Holonyak at General Electric developed the first visible-light (red) LED, soon followed by all the other colors. Progress in LED displays was slow. In 1968 Hewlett-Packard introduced red LED displays developed by Monsanto, but the technology was still expensive and rudimentary. In the 1970s Thomas Brandt at Fairchild Semiconductor finally used Hoerni's planar process to mass-produce LEDs. From this point on, progress in LEDs would become exponential. In fact, a Hewlett-Packard scientist, Roland Haitz, formulated the LED equivalent of Moore's law: the efficiency of LEDs was doubling approximately every 36 months.

Culture and Society

The New Games Movement was an offshoot of the participatory spirit of the hippie era. Its practitioners staged highly physical public games whose goals were therapy, creativity and community bonding, not unlike Steve Russell's computer game "Spacewar" (1962) and Stewart Brand's mock-simulation anti-war game "Soft War" (1966). The most famous prophet of the movement was Bernie De Koven, who in 1971 created the Games Preserve in Pennsylvania, a farm where people could convene and play physical games; but San Francisco had pioneered the idea in the 1960s, new-age writer George Leonard had promoted it, and in 1973 Patricia Farrington created the New Games Foundation, whose first "tournament" was held in the Marin Headlands north of San Francisco.

The utopian sharing economy of the hippies manifested itself again in Menlo Park in 1974, when Dick Raymond of the Portola Institute, just before he helped to jumpstart the Homebrew Computer Club, founded the Briarpatch society, a community of mutually supporting businesses linked together via a newsletter (edited by Gurney Norman), kept alive by local philanthropists, and spreading thanks to the missionary work of Andy Phillips.

By the mid-1970s San Francisco's art scene was shifting towards video, performance art, participatory installations, mixed media and other time-based art, often accompanied by live electronic music. Alternative art spaces popped up in the Mission and South of Market (SOMA) districts, notably Southern Exposure in 1974 (a nonprofit collective that gathered at Project Artaud, a "live and work" artist space), New Langton Arts in 1975 and Gallery Paule Anglim in 1976. "Conceptual" artists Howard Fried and Terry Fox pioneered video art and performance art. Lynn Hershman's "The Dante Hotel" (1973) pioneered site-specific installations. The ultimate site-specific installation was David Ireland's own house at 500 Capp Street, which the artist began to remodel in 1975 with sculptures made of found objects. Chip Lord's Ant Farm created one of the most influential installations in 1974 in the desert of Texas, "Cadillac Ranch", using parts of old cars. The Ant Farm also organized multimedia performances such as "Media Burn" (july 1975), during which they burned a pyramid of television sets in public. In 1970 Tom Marioni had founded the Museum of Conceptual Art (MOCA), one of the first alternative art spaces in the nation. There he debuted his "Sound Sculpture As" (1970), which, together with Paul Kos' "The Sound of Ice Melting" (1970), a recording of melting ice, pioneered sound sculpture.

The Mission District had a large Mexican and Chicano (descendants of Mexican immigrants) population. In 1972 two Chicanas, Patricia Rodriguez and Graciela Carrillo, started painting murals in their neighborhood, soon joined by other women, mostly students of the San Francisco Art Institute. They came to be known as Las Mujeres Muralistas and specialized in murals of sociopolitical activism, such as the large "Panamerica" (1974).

In 1973 British painter Harold Cohen joined Stanford University's Artificial Intelligence Lab to build AARON, a program capable of making art, thus creating an artistic equivalent of the Turing test: can a machine be said to be a good artist if the experts appreciate its art? The project would continue for several decades. In 1975 John Chowning founded Stanford's laboratory for computer music, later renamed the Center for Computer Research in Music and Acoustics (CCRMA).

Lloyd Cross was a physicist who in Michigan had worked on lasers and holograms, and in 1970 organized the first exhibition of holographic art (at the Cranbrook Academy of Art in Bloomfield Hills). In 1971 he moved to San Francisco and founded the San Francisco School of Holography in the basement of a warehouse known as Project One, which also functioned as a hippie commune. There he invented the "integral hologram", which combined holography with cinematography and created moving three-dimensional images (a holographic film). In 1976 Simone Forti (by then based in New York) worked with Cross to develop an innovative technique for holographic dance pieces, such as "Striding/Crawling" (1977). These "movement holograms" were first exhibited in 1978 at the Sonnabend Gallery in New York.

The Bay Area stole a bit of Hollywood's limelight in 1971, when film director George Lucas founded Lucasfilm, the production company that went on to create "American Graffiti" (1973), "Star Wars" (1977) and "Raiders of the Lost Ark" (1981).

The "underground comix" movement continued to prosper, but now the reference point was the magazine "Arcade" (1975-76), started by Bill Griffith (of "Young Lust" fame) and Swedish-born cartoonist Art Spiegelman (later more famous for the character "Maus").

The city's main sociocultural development was the rapid rise of the homosexual community. The first "Gay Pride Parade" was held in 1970. Around that time gays and lesbians started moving to the "Castro" district in large numbers. It was the first openly gay neighborhood in the US, and later that decade it would elect one of the first openly gay politicians in the US (Harvey Milk).

Arthur Evans formed the "Faery Circle" in San Francisco in 1975. It evolved into the "Radical Faeries" movement at a conference held in Arizona in 1979, and later became a worldwide network of groups that mixed gay issues with new-age spirituality and staged hippie-style outdoor "gatherings".

Meanwhile, the "Seed Center", which was really just the backroom of DeRay Norton's and Susan Norton's Plowshare Bookstore in downtown Palo Alto, had become a new centre of counterculture by (re)publishing Thaddeus Golas' "The Lazy Man's Guide to Enlightenment" (1971), the Bible of California-style spirituality.

Down in Silicon Valley the entertainment and intellectual scene was far less exciting. On the other hand, credit must be given to hard-drinking establishments like Walker's Wagon Wheel (282 E. Middlefield Road, Mountain View, demolished in 2003) for the "technology transfer" that Silicon Valley became famous for. There was little else for engineers to do but meet at the Wagon Wheel and talk about technology.

In 1973 the Chinese painter Paul Pei-Jen Hau opened the Chinese Fine Arts Gallery in Los Altos, the precursor to the American Society for the Advancement of Chinese Arts (1979), but at the time few paid attention.

