A History of Silicon Valley


These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"


(Copyright © 2010 Piero Scaruffi)

12. The Surfers (1990-95)

by Piero Scaruffi

The Web

The 1990s opened with one of the most influential inventions of all time: the British engineer Tim Berners-Lee of CERN, Geneva's multinational high-energy physics laboratory (funded by multiple European governments) and the largest Internet node in Europe, realized that applying the hypertext paradigm to the Internet would create a worldwide network beyond the imagination of the original visionaries of hypertext. He set out to define a HyperText Markup Language (HTML) to write hypertext documents that link to each other. He then implemented the server on a NeXT computer, and wrote the first client, a "browser" that he named "World-wide Web". The server transferred ("served") webpages to the client according to a simple protocol, HTTP (HyperText Transfer Protocol). A major leap in high-tech had come from a government-funded laboratory. The browser was inspired by DynaText, developed in 1990 by Electronic Book Technologies, which had been founded in Rhode Island by original hypertext visionary Andries van Dam and some of his collaborators at Brown University, notably Steven DeRose (who had done most of the design of DynaText). The web was disclosed to the whole Internet world in august 1991. In december 1991 physicist Paul Kunz set up the first World-wide Web server in the USA at the Stanford Linear Accelerator Center (SLAC). In april 1992 there was already another browser, Erwise, written for Unix by four Finnish students at the Helsinki University of Technology. The first major browser in the USA was ViolaWWW, completed by Taiwanese-born student Pei-Yuan Wei at U.C. Berkeley in december 1992, a project inspired by the look and feel of Apple's Hypercard. Both were notable for being graphical browsers.
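The protocol was deliberately minimal: the browser opens a connection to the server, sends a one-line request naming the document, and the server replies with the hypertext document. The following sketch (in modern Python, purely as an illustration; neither the language nor the placeholder host has anything to do with Berners-Lee's original NeXT software) shows the whole request/response cycle:

    # A minimal sketch of an HTTP GET exchange, the kind of protocol
    # Berners-Lee defined. "example.com" is a placeholder host.
    import socket

    HOST = "example.com"

    with socket.create_connection((HOST, 80)) as sock:
        # An HTTP request: method, document path and protocol version,
        # followed by headers and a blank line
        request = f"GET / HTTP/1.0\r\nHost: {HOST}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        response = b""
        while chunk := sock.recv(4096):
            response += chunk

    # The reply carries a status line, headers and the HTML document itself
    print(response.decode("utf-8", errors="replace")[:500])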

At just about the same time (december 1991) the USA government passed the High-Performance Computing and Communication Act, originally proposed by senator Al Gore. Gore envisioned a "National Information Infrastructure" that would create a vast network of public and private information systems to deliver potentially all the information of the nation to potentially all the citizens of the nation. The High-Performance Computing and Communications Initiative funded a number of research projects around the USA, and in particular it funded the project for a graphical browser at the National Center for Supercomputing Applications of the University of Illinois. The employees in charge of it were Marc Andreessen and Eric Bina. Their goal was to create a user-friendly browser with a graphical user interface. They completed their Mosaic web browser in 1993. The project existed because in 1985 the National Science Foundation had set up four (then five) "supercomputing centers", the main one being at the University of Illinois because the original proposal had come from its professor Larry Smarr. The center's first hit had also been related to the Internet: the first (and free) Telnet for the Macintosh and the IBM PC (1986), allowing anyone with a modem to access the Internet from home. In retrospect, the Mosaic browser was simply a smarter Telnet for the newly invented World-wide Web. Mosaic's graphical user interface made all the difference. It also made it easier to display documents containing both texts and images. Originally developed for UNIX, it was soon ported to Windows, turning any PC into a client for the World-wide Web. The National Center for Supercomputing Applications released Mosaic in november 1993. Within six months more than one million people were using Mosaic to access the World-wide Web. Andreessen found a job in Silicon Valley. There he met Silicon Graphics' founder Jim Clark, who encouraged him to commercialize Mosaic. In april 1994 the duo opened Mosaic Communications Corporation, later renamed Netscape Communications, in Mountain View. In october 1994 Mosaic Netscape was available for download. In 1995 about 90% of World-wide Web users were browsing with Netscape's Navigator. Netscape went public in august 1995 even before earning any money. By the end of its first trading day, the company was worth $2.7 billion and overnight Clark had become a half-billionaire.

In 1994 Berners-Lee and others standardized the Uniform Resource Locator (URL), which mapped the Internet's hierarchy of domain names into World-wide Web addresses (e.g., www.stanford.edu). The most popular domain was ".com", which became known as "dot com": it was originally meant to identify commercial activities, as opposed to ".edu" (for educational institutions), ".gov" (for government agencies) and ".org" (for non-profit organizations). The craze that followed Netscape's IPO became known as the "dot-com craze".
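The anatomy of a URL mirrors that description: a protocol, a host name within the domain-name hierarchy, and a path to a document on that host. A small sketch (modern Python, for illustration only):

    # Decomposing a URL into the parts described above
    from urllib.parse import urlparse

    url = urlparse("http://www.stanford.edu/index.html")
    print(url.scheme)   # 'http': the protocol used to fetch the page
    print(url.netloc)   # 'www.stanford.edu': the host, under the ".edu" domain
    print(url.path)     # '/index.html': the document on that server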

Netscape did more than simply start a new gold rush. It made the Web easy to navigate for anybody who knew how to type on a keyboard and could find a computer connected to the Internet. It leveled the field so that the computer illiterate could browse the Web the same way a pro did. Thanks to Netscape's browser, the shapeless and non-intuitive cluster of digital information that had accrued on the Internet became intelligible and meaningful to everybody. This in turn prompted more and more people to add content to the Web. It now became clear that one boom had enabled another: the personal computer boom of the 1980s had placed a computer in millions of households, and those households now constituted the vast public of the Web. A key factor was that the Netscape browser was free for individuals and non-profit organizations. Netscape also "protected" the Internet from monopolies that would have loved to hijack it: it used open standards and indirectly forced much larger corporations to adopt those same standards, thus avoiding the kind of wars that still plagued the world of operating systems.

Meanwhile the World-wide Web had created another kind of application. Already in 1990, before anyone had heard of the Web, some students in Montreal had created a "search engine" named "Archie" to find sites on the Internet, which in those days were accessed via FTP (File Transfer Protocol). Before the Web took hold, the most popular way to catalog and transmit documents over the Internet was Gopher, created by Mark McCahill at the University of Minnesota, which also debuted in 1991; two applications were immediately born to search Gopher catalogs, Veronica (in Nevada) and Jughead (in Utah). EINet's Galaxy, launched in january 1994 in Texas, was the first catalog of websites. WebCrawler, created by Brian Pinkerton at the University of Washington and launched in april 1994, was the first search engine for the web, a website that indexed and then searched the texts it found on the web.

At about the same time Michael Mauldin at Carnegie Mellon University started the Lycos project to catalog pages of the web, which went live in july 1994. By 1999 Lycos, one of the first dotcoms to post profits from advertising, had become the most visited website in the world. For the record, Mauldin also developed an Artificial Intelligence-based chatterbot, Julia, which in 1997 mutated into Sylvie and in 2000 into Verbot.

In 1993 the Web had clearly won over Gopher, and a catalog of websites was casually circulating at Stanford University. In january 1995 the authors of that catalog, Stanford students Jerry Yang (originally from Taiwan) and David Filo, launched Yahoo! (Yet Another Hierarchical Officious Oracle!), which was simply a website dedicated to cataloging all the existing websites in some predefined categories. In october 1994 the Internet already consisted of 3,864,000 hosts, having increased in size by 61% in one year. The need for search tools was becoming obvious. Tools like Yahoo! greatly increased the usefulness of the Web: instead of knowing only the few websites run by friends, one could now find out about websites run by complete strangers.

It was writer Jean Armour Polly who coined the phrase "Surfing the Internet" (1992). Surfing soon became an activity for a growing population of Internet users. Some did it for entertainment and some did it for work/study. The Internet had existed for a long time. It took the Web to turn it into a major attraction. And, thanks to the Web, applications that had been around for more than a decade became widely popular, notably e-mail.

Email had created a powerful alternative to "snail" mail. A new technology would soon provide a powerful alternative to the telephone. The origins of instant messaging for personal computers go back at least to the CompuServe CB Simulator of 1980 and Q-Link, the original America OnLine chat system (acquired from PlayNET, which had operated it since 1984; it was basically a version for Commodore machines of the old Unix "talk" command). However, it was only with Tribal Voice, founded in december 1994 in Colorado by John McAfee (of antivirus-software fame) and later relocated to Scotts Valley, ICQ, introduced by the Israeli company Mirabilis in november 1996, and AOL Instant Messenger, launched by AOL in may 1997, that instant messaging reached the masses and became a viable alternative to a phone call.

Microsoft responded in 1999 with the text chat service MSN Messenger, that made "texting" popular with teenagers.

In theory, Netscape allowed anyone to see any website. In practice, however, most people used services like America OnLine to access the Internet. AOL provided a simple way to connect a home computer to the Internet: the customer would receive a floppy disc in the mail with all the software needed to perform the magic. The price to pay was freedom. The customers of AOL would typically only see what AOL wanted them to see, i.e. an AOL-sanctioned subset of the World-wide Web. Most people were content to visit the AOL pages and rarely ventured outside the AOL world.

The Net Economy

For decades the Internet had been used only for research and entertainment (if e-mail and Usenet groups can be defined as "entertainment"). Commercial activity was de facto banned from the Internet. Somehow the advent of the Web led to the relaxation of that ethical rule, and the most shameless commercial activities began to surface. The impact on society was colossal.

Technically speaking, commerce on the Internet had always been illegal, but de facto most corporations maintained an Internet node and did so for business purposes. However, it was still illegal to blatantly market and sell products or services on the Internet (with the exception of the Usenet, because UUCP was administered separately). The Internet backbone (the NSFnet) was run by the National Science Foundation (NSF). In 1992 the USA government allowed commercial networks to link to the NSFnet, despite protests from academia. The result was that in a few years the commercial networks made the NSFnet look expensive and obsolete, and in 1995 the government finally decided to relieve the NSF of the responsibility for the backbone, thereby de facto legalizing commerce over the entire Internet.

In 1994 about 100 million consumers in the USA purchased goods for about $60 billion using the telephone, mostly based on mail catalogs and television shopping channels. This was the alternative to "brick-and-mortar" shops until the mid 1990s. In 1994 for the first time a consumer (Phil Brandenberger of Philadelphia) purchased a good (a CD) using instead a personal computer connected via a modem to the Internet, which in turn sent his request to a startup in New Hampshire, Net Market Company, founded by a 21-year-old Swarthmore College graduate, Daniel Kohn. The customer's personal computer ran the Unix operating system and the Unix version of the Mosaic browser (the X-Mosaic), and Net Market used the encryption program PGP (Pretty Good Privacy), developed by Philip Zimmermann in Colorado and based on the RSA encryption algorithm of Ron Rivest, Adi Shamir and Leonard Adleman. At the same time, Enterprise Integration Technologies, founded in Palo Alto by Marty Tenenbaum (the former director of Schlumberger's AI lab in Palo Alto), developed a platform for e-commerce called CommerceNet in collaboration with Stanford University. CommerceNet's encryption software was developed by RSA Data Security in Redwood City, the startup founded by the RSA inventors themselves (later acquired in 2006 by EMC).

In 1991 William Porter, who already owned a stock-brokerage firm in Palo Alto, founded E*Trade with Bernard Newcomb; launched in 1992, it was the first online stock brokerage, offering electronic trading directly to individual investors via America OnLine and CompuServe. David Chaum, then at UC Berkeley, had invented electronic cash, published in the article "Blind Signatures for Untraceable Payments" (1983), and in 1990 he founded a company, DigiCash, that survived only a few years. The WWW basically provided a friendlier (and free) user interface, which encouraged many more businesses to go online. In 1994 the first online bank opened, First Virtual, based in San Diego and designed by two email experts, Nathaniel Borenstein (author of the MIME standard) and Marshall Rose (author of the POP3 protocol). First Virtual also introduced the first online payment service, rivaled by CyberCash in Virginia, also launched in 1994.
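Chaum's "blind signature" let a bank sign a digital coin without ever seeing it, which is what made the cash untraceable. A toy sketch of the idea with textbook RSA numbers (real systems use enormous primes; the values below are purely illustrative, in Python):

    # Chaum's RSA blind signature with tiny textbook parameters (insecure,
    # for illustration only). n, e are the bank's public key; d its private key.
    n, e, d = 3233, 17, 2753

    m = 1234          # the "coin" the user wants signed
    r = 13            # the user's secret blinding factor, coprime with n

    blinded = (m * pow(r, e, n)) % n             # user blinds the coin
    blind_sig = pow(blinded, d, n)               # bank signs without seeing m
    signature = (blind_sig * pow(r, -1, n)) % n  # user removes the blinding

    assert pow(signature, e, n) == m             # anyone can verify the signature

The bank's signature on the unblinded coin is valid, yet the bank cannot link it to the signing request it served: hence "untraceable payments".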

In 1996 Douglas Jackson in Florida debuted E-gold, a digital currency on the Web that by 2009 would count 5 million users before being shut down by the US government (it had also become a safe haven for money launderers).

The venture capital world of Silicon Valley was ready for it. A number of founders of successful companies of the Bay Area had retired and had become venture capitalists themselves, so-called "angels". Hans Severiens knew many of them and proposed that they join forces. So in 1994 the "Band of Angels" was born. In the true spirit of Silicon Valley, it wasn't just a scheme to pool money: the primary goal was to pool knowledge. They met every month. They led the way. Collaboration among venture capitalists had always been a trademark of Silicon Valley, and probably one of the reasons for its success. Frequently sharing a history in the valley's high-tech industry, venture capitalists and angels formed a highly interconnected network of firms. Successful entrepreneurs became successful because of that network, and were expected to join the network after they had become successful. Since venture-capital firms frequently invested with other firms in the same start-up, they depended on each other's well-being. Since they invested in multiple companies at the same time, their main interest was not in a particular start-up but in the broader picture. In a sense, the venture-capital world of Silicon Valley did not invest in a company but in Silicon Valley as a whole. Last but not least, venture-capital firms in Silicon Valley exhibited a high degree of technological competence, either directly through their partners or indirectly through their consultants. Here venture capitalists nurtured start-ups, shaping their management structure and providing advice at every stage of development. They relied on informal networks of high-tech specialists and knowledge workers. Venture capital had not grown much since the heyday of the microprocessor: it was about $3 billion in 1983 and about $4 billion in 1994. Then it skyrocketed to $7.64 billion in 1995.

Netscape's dazzling IPO in august 1995 is a dividing line in the history of Silicon Valley just like the 1956 foundation of Shockley Transistor and the 1971 Intel microprocessor. Internet companies multiplied and many of them received shocking amounts of funding. It had never been so easy for a start-up to go public. The dot-com craze had reinvented yet again the landscape of the Bay Area. This time the repercussions on Wall Street were direct and immediate. The new Silicon Valley propelled the technology-heavy stock index Nasdaq to the stars, creating wealth all over the world.

A software industry that was not glamorous but was becoming increasingly strategic had to do with Internet security. In particular, Midwestern entrepreneur Kevin O'Connor invested in Internet Security Systems, founded in 1994 by Georgia Institute of Technology student Christopher Klaus. (The company would be purchased by IBM in 2006 for $1.3 billion).

Not everybody was trying to make money out of the Internet. In 1993 three students at the University of California in Santa Cruz (Rob Lord, Jeff Patterson and Jon Luini) launched IUMA (Internet Underground Music Association), a platform for independent musicians to publish their works and share them.

Multimedia, Networking and Mobility

Progress in desktop publishing continued at an ever more rapid pace. Apple's most impressive product of those years was perhaps QuickTime, introduced in december 1991, which allowed developers to incorporate video and sound in Macintosh documents. In 1992 Macromedia was born in San Francisco from the merger of Authorware, which sold a graphical programming environment, and MacroMind, whose Director was a multimedia-authoring environment. Director turned its users into "directors" of a film: it was a novel metaphor for building applications, mainly useful for creating the software of stand-alone kiosks. In 1993 Adobe Systems introduced the file format PDF (or Portable Document Format) to create and view professional-quality documents, and the free Acrobat reader for it.

After original founder John Walker relocated to Switzerland, in 1992 Carol Bartz from SUN revitalized Autodesk and became one of the first women of power in Silicon Valley. In 1990 Autodesk had released 3D Studio, a 3D modeling, rendering and animation tool designed by Tom Hudson that would be used for films such as "Johnny Mnemonic" (1995) as well as best-selling videogames such as "Tomb Raider" (1996, the first of the Lara Croft series) and "World of Warcraft" (2004). By 1994 Autodesk had become the sixth-largest personal computer software company in the world. Bartz was credited as having coined the "3F" philosophy: "fail fast-forward", i.e. risk failure but recognize it when it happens and move on quickly (a concept later popularized by Marissa Mayer).

The boom of graphic applications led to a demand for better graphic processors. Santa Clara-based Nvidia was a fabless semiconductor company founded in 1993 by Jen-Hsun Huang, previously at LSI Logic and AMD, and two SUN engineers (Chris Malachowsky and Curtis Priem) to design graphic chipsets for personal computers. In 1995 Nvidia debuted the NV1, the first commercial graphics processor that integrated 3D rendering, video acceleration and GUI acceleration; but the timing was unfortunate: just a few months after the NV1 started shipping, Microsoft released its DirectX specifications.

Founded in 1989 in Fremont by Dado Banatao (who had pioneered fabless manufacturing with Chips and Technologies), S3 Graphics introduced in 1991 the first single-chip graphics accelerator, the 86C911; in 1994 the Trio64 chipset, which became a hit in the high-end OEM market; and in 1995 the Virtual Reality Graphics Engine (ViRGE), a low-cost graphics chipset that was a 2D/3D hybrid.

Rambus was founded in 1990 by Mike Farmwald and Stanford professor Mark Horowitz to commercialize a new DRAM architecture for the rapidly growing graphic and video applications, which required high data transfer rates to display high-resolution images.

Communications were also driving rapid progress. Cisco had entered the Ethernet switch business by acquiring Crescendo Communications in 1993 and Kalpana in 1994. By 1997 Ethernet switching was producing more than $500 million in annual revenues for Cisco.

C-Cube, started in August 1988 in Milpitas by Weitek's Edmund Sun and Alexandre Balkanski, had already begun making chips for video compression technology (MPEG codecs).

In 1990 Marc Porat at Apple started a project code-named Paradigm that aimed to build an innovative hand-held mobile device. In may 1990 Porat and two legendary Apple software engineers, Bill Atkinson and Andy Hertzfeld, decided to start a company to develop the idea, General Magic. Their vision was now more ambitious: they wanted to put the power of a real computer into the hands of a casual mobile user. At the time this was technologically impossible, so they thought of creating a "cloud" of services running on interconnected devices: by roaming the cloud, even a simple, weak device could muster the computing power of a real computer. They came up with the Telescript programming language to write applications for a hand-held device (a "personal intelligent communicator") that would physically and opportunistically spread onto remote computers but eventually deliver back a result to the user of the hand-held device. Telecom and IT (Information Technology) giants such as Sony, Motorola, Matsushita, Philips and AT&T invested in the idea. Commercially, it was a spectacular flop, but a new paradigm had indeed been introduced: "cloud computing". And their vision was basically the vision of the future "smartphone". Not coincidentally, both Tony Fadell (future leader of the iPod project at Apple) and Andy Rubin (founder of Android) worked at General Magic.

EO, launched in 1991 by French-born C-Cube executive Alain Rossmann, manufactured a personal digital assistant that was also a cellular telephone, using Go's PenPoint operating system that recognized handwritten commands.

Meanwhile Apple invested in developing a pen-based tablet computer with software for handwriting recognition, eventually released in 1993 as the Newton platform. Newton was another flop, but it launched the vogue for small, mobile Personal Digital Assistants (PDAs) in Silicon Valley (a decade after the Psion). Incidentally, it ran the ARM processor from Britain: Newton failed, but indirectly it helped ARM survive by targeting small devices. In 1991 HP too had entered that market with the Jaguar. Apple therefore was a latecomer, but its MessagePad (the first device based on Newton) came with a stylus and handwriting recognition. More importantly, it looked "cool".

In 1990 Los Angeles-based Dycam, probably the first company founded (in 1988) to link electronic photography and computers, introduced the first digital camera sold in the USA, the Model 1, capable of storing pictures as digital files on an internal one-megabyte RAM chip and of downloading them to a PC. The Kodak DCS100, built on a Nikon F3 body, arrived half a year later and set the standard for digital single-lens reflex cameras. It had taken 15 years from invention to commercialization: Kodak's engineer Steven Sasson had invented the digital camera back in 1975. (The first consumer digital camera had been introduced in 1988 in Japan by Fujifilm). Meanwhile in 1990 Kodak had launched the Photo CD, a system to convert negatives or slides to image files on a CD, but it used a proprietary format instead of JPEG. Apple's QuickTake 100 (introduced in february 1994), which could store up to 32 images at a resolution of 320x240 pixels in flash memory, brought downloading images into a personal computer to the mass market, although it was the Kodak DC40 (march 1995) that made the concept popular worldwide. In may 1994 Epson introduced the Stylus Color, the first color inkjet printer that allowed households to print their own digital photos. Early digital cameras used an analog process to convert the image into a set of pixels (the charge-coupled device, originally invented at Bell Labs in 1969 by Willard Boyle and George Smith for computer data storage). Image sensors made with CMOS technology, the same technology used to make computer processors and memories, were invented in 1993 by Eric Fossum at NASA's Jet Propulsion Laboratory in southern California, although these Active Pixel Sensors (APS) would not become popular until the 2000s.

In a rare instance of cross-industry collaboration, IBM, Intel, Microsoft, Compaq, DEC and others joined together to define a Universal Serial Bus (USB) for personal computers, eventually introduced in 1996. It would make it a lot easier to connect peripherals to computers, and it would enable gadgets such as digital cameras to be treated like computer peripherals.

Meanwhile, Finland was the leader in the field of mobile (cellular) phones. In 1982 a European consortium of national telecommunications agencies, notably France Telecom and Deutsche Bundespost, had joined together to create a common standard for mobile phone communications, the Groupe Special Mobile (GSM). They had envisioned that every mobile phone would be equipped with an integrated circuit, called a SIM (Subscriber Identity Module) card, to contain information about the user, so that the information would be independent of the mobile phone and therefore portable across phones (an idea that had been pioneered in 1972 by the German mobile phone system B-Netz). The first SIM card was made in 1991 in Germany by Giesecke & Devrient for the Finnish wireless network operator Radiolinja, which launched the first GSM service (GSM now renamed Global System for Mobile Communications), heralding the second generation (2G) of mobile telephony, digital instead of analog, which enabled the transfer of data (e.g. text) instead of just voice. Meanwhile, Qualcomm in San Diego was developing a different technology, CDMA (Code Division Multiple Access), launched in 1992 but adopted only in North America (Qualcomm's handset business would eventually be sold to Japan's Kyocera), while Europe (and most of the rest of the world) adopted a technology called Time Division Multiple Access (TDMA), the one incorporated into GSM. GSM had the additional advantage of being an open standard. Hence, no surprise that by the end of 1998 there would be 130 million GSM mobile phones in the world (90% of Europe's mobile phones, but only 35% of Asia's mobile phones) versus only 20 million CDMA subscribers (almost all in North America).

GSM and CDMA made Bell Labs' AMPS obsolete.

In 1993 Nokia introduced a feature to send text messages to other cell phones, the Short Message Service (SMS). British computer scientist Neil Papworth is credited with having sent the first text message ("Merry Christmas") in december 1992, but he sent it from a computer, not from a mobile phone (phones didn't have a full keyboard yet).

Despite the fact that there were already 34 million cell-phone subscribers in the USA in 1995, the field was virtually nonexistent in Silicon Valley.

Qualcomm was computerizing low-power mobile communications at a time when the emphasis was still on high-power land lines, which were getting faster and faster thanks to fiber optics. Its founders, in 1985, were Andrea Viterbi and Irwin Jacobs, who had studied at the MIT with Claude Shannon. Viterbi was the main mind behind Code Division Multiple Access (CDMA), a system to maximize capacity in a medium (the air) where the wireless spectrum was a limited resource. Qualcomm understood that collapsing chip costs made computing a resource less precious than the wireless spectrum, and therefore used computing to maximize what could be done with and in the wireless spectrum. (In 2012 Qualcomm's market capitalization would surpass Intel's.)
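The core idea of CDMA can be shown in a few lines: every user spreads each bit over the same spectrum using a code orthogonal to everyone else's, and the receiver recovers a given user's bits by correlating the shared signal with that user's code. A toy sketch (not Qualcomm's actual implementation; Python with NumPy, for illustration only):

    # Two users transmit simultaneously over the same "spectrum",
    # separated only by orthogonal spreading codes (Walsh codes)
    import numpy as np

    codes = np.array([[1,  1,  1,  1],     # user 0's code
                      [1, -1,  1, -1]])    # user 1's code (orthogonal to user 0's)

    user_bits = np.array([[ 1, -1,  1],    # user 0's data bits (as +1/-1)
                          [-1, -1,  1]])   # user 1's data bits

    # Spread each bit into "chips" and let both signals overlap on the channel
    channel = sum(np.outer(bits, code).ravel()
                  for bits, code in zip(user_bits, codes))

    # Despread: correlating with a user's code cancels the other user's signal
    received = channel.reshape(-1, codes.shape[1])
    for i, code in enumerate(codes):
        print(f"user {i}:", np.sign(received @ code))  # recovers each user's bits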

Cell-phone technology truly came to the Bay Area in 1992, when Martin Cooper (Motorola's prophet of cellular phones) established ArrayComm in San Jose to improve the capacity and coverage of cellular systems. While not particularly successful, ArrayComm would train and spread alumni in the region.

Elsewhere, in 1995 the Israeli company Vocaltec, a maker of sound cards for personal computers, introduced the first "Voice over IP" software, i.e. the first Internet phone system, based on technology acquired in 1993 from Opher Kahane's and Ofer Shemtov's ClassX. Even in the USA, the pioneers of the Voice-over-IP industry were not in Silicon Valley. Net2Phone was launched in 1996 by Howard Jonas's International Discount Telecommunications (IDT) from New Jersey, and Dialpad, the world's first free Internet telephony service when it launched in october 1999, was based in Santa Clara but was a spin-off of Serome, founded in South Korea in 1993 by Wongyu "Ted" Cho.

Redwood City-based Unwired Planet (founded in december 1994 by Alain Rossmann as Libris, and later renamed Unwired Planet, Phone.com and finally Openwave) pioneered mobile Internet browser software (the "microbrowser"), for which it devised HDML (Handheld Device Markup Language), basically an HTML for handheld devices. While most companies in the mobile-phone business were busy adopting the "push" paradigm of SMS, Openwave adopted its own "pull" paradigm. A few years later Openwave and three giants of mobile communications (Ericsson, Motorola and Nokia) would turn HDML into WML (Wireless Markup Language), the international standard for cell phones to access the Internet.

Speech-recognition technology would turn out to be crucial for the user interfaces of mobile devices, and one of the main research centers was the old SRI International. Michael Cohen led a team that developed the technology used in the Air Travel Information System (ATIS), a project originally funded by DARPA, a technology that combined two in-house backgrounds, one in voice recognition and one in natural language processing. In 1994 Cohen quit and founded Nuance in Menlo Park, which would become one of the leaders in the sector (Nuance's technology would be licensed by Siri for the Apple iPhone, and Cohen would be hired by Google in 2004).

Thad Starner, a student at the MIT Media Laboratory, started wearing a self-made custom computer device in 1993. He would eventually graduate with a thesis titled "Wearable Computing and Contextual Awareness". At roughly the same time Daniel Siewiorek at Carnegie Mellon University designed wearable computing devices, mainly for the military, such as the VuMan3.

A precursor of wearable computing was the "active badge" developed in 1990 at Olivetti's British labs, a badge that broadcast a person's location. In 1994 Steve Mann, a student at the MIT Media Lab, started experimenting with wearable devices, which years later would make him the hero of the documentary "Cyberman" (2001). In 1998 he built the first smartwatch, a watch that ran Linux. In 1997 Carnegie Mellon University, the MIT and Georgia Tech organized the first IEEE International Symposium on Wearable Computers. In 1994 DARPA and the US Army had funded a project code-named "Land Warrior" to redesign the clothing of the infantry soldier. This was originally assigned to Hughes Aircraft Company and other giant defense contractors but in 1999 it was wisely transferred to Exponent, a Palo Alto-based engineering firm with roots in Stanford University, originally called Failure Analysis Associates, that specialized in investigating aviation and space disasters. Exponent would deliver a wearable computer, a head-mounted display and wireless communications. (In 2001 the contract would be granted to a consortium of giant corporations under General Dynamics, which would lead to the program's cancellation after a huge hemorrhage of taxpayers' money).

Meanwhile, "augmented reality" was implementing the "ultimate display" that Ivan Sutherland had described in his paper "A Head-Mounted Three-Dimensional Display" (1968). Augmented reality referred to systems capable of displaying virtual objects in a real environment in real time. They became fashionable again after Robert Zemeckis' film "Who Framed Roger Rabbit?" (1988). The first successful systems included Michael Bajura's medical data display application at the University of North Carolina at Chapel Hill (1992), Louis Rosenberg's Virtual Fixtures at a military laboratory in Ohio (1992), Steven Feiner's KARMA (Knowledge-based Augmented Reality for Maintenance Assistance) at Columbia University (1992), and Tom Caudell's manufacturing application at Boeing (1992), Caudell being credited with coining the expression "augmented reality". See-through head-mounted displays already existed, for example those made by Hughes Electronics. In 1993 Paul Milgram's group at the University of Toronto built ARGOS (Augmented Reality through Graphic Overlays on Stereovideo). During the 1990s research focused on the problem of calibrating see-through head-mounted displays so that the real world and the virtual world could be synchronized. Then came augmented-reality conferencing system (that allowed users to see each other as well as virtual objects) such as Dieter Schmalstieg's Studierstube at Vienna University of Technology (1996) and Jun Rekimoto's Transvision at Sony in Japan (1996). In 1999 Hirokazu Kato in Japan developed ARToolKit, an open-source software for the creation of augmented-reality applications.

Contributions to wearable and virtual reality also came from the world of videogames, such as Nintendo's Virtual Boy console (1995). In 1989 Mattel had released the Power Glove, a wearable three-dimensional input device for Nintendo's console. The history of virtual reality intersects with the history of videogames. For a while Britain, where Roy Trubshaw had pioneered the idea, was the leader in the field of MUDs ("multi-user dungeons") with games such as "MIST" (1986) and "AberMUD" (1989), followed by Denmark with "DikuMUD" (1991); but their virtual worlds were text-based, not graphical. In 1990 "GemStone III" was launched in Missouri, a commercial MUD that would spread on CompuServe, Prodigy and America OnLine (AOL).

There was an excitement in the virtual-reality community comparable to the excitement in the personal-computer community a decade earlier. In 1990 Mark Bolas and others founded Fakespace in Mountain View, a spin-off of NASA's Ames Research Center, to build devices for virtual reality. In 1990 Brenda Laurel and NASA veteran Scott Fisher, both alumni of the Atari Systems Research Laboratory, founded Telepresence Research. 1990 also saw the first sales of the virtual-reality system developed by W Industries (founded by former IBM scientist Jonathan Waldern in 1985 in Britain and later renamed Virtuality), targeting the research labs of big corporations. In 1991 Ben Delaney started the magazine "CyberEdge Journal" and Brenda Laurel published "Computers as Theatre" (1991). Virtual Research Systems, founded in 1991 in Sunnyvale by Bruce Bassett, began selling a cheap "Flight Helmet", based on the old design of NASA's VIVED. In 1993 the IEEE organized the first academic conference on virtual reality: the Virtual Reality Annual International Symposium (VRAIS), held in Seattle. In 1994 Linda Jacobson published "Garage Virtual Reality", a survey of the independent 3D-graphics scene. Brett Leonard directed the film "The Lawnmower Man" (1992) and Robert Longo directed "Johnny Mnemonic" (1995). Sega demonstrated (but never released) the Sega VR in 1993. In 1995 Future Vision Technologies, a spinoff of the University of Illinois at Urbana-Champaign, developed a head-mounted display for the consumer market, the Stuntmaster, and in the same year a company based in Sacramento marketed the iGlasses goggles. They all had the same problems: a true stereo display made of two high-resolution color LCD screens coupled with motion tracking was too expensive and caused serious motion sickness. Unfortunately, the excitement vastly exceeded the reality of the technology. Nonetheless, virtual-reality platforms such as EON Reality, founded in 1998 in Sweden by Mats Johansson, Dan Lejerskar and Mikael Jacobsson, would soon begin to appear.

IBM vs Microsoft

Microsoft continued to dominate the operating-system market. In may 1990 Windows 3.0 finally obtained the success that had eluded previous versions of Windows. The difference was that Windows now boasted a vast portfolio of well-tested applications, starting with Microsoft's own Word, Excel and Powerpoint. Windows 3.0 gained widespread third-party support. When release 3.1 shipped in april 1992, three million copies were sold in just two months. When Windows 95 was released in august 1995, the frenzy was even bigger. In 1991 Microsoft had revenues of $1,843,432,000 and 8,226 employees. Bill Gates was becoming one of the richest men in the nation. More importantly, millions of computer users were abandoning the text commands of MS-DOS for the overlapping windows, the pull-down menus and the mouse clicks of Windows.

The holy alliance between IBM and Microsoft that had turned Microsoft into a software powerhouse came abruptly to an end in 1990. Microsoft realized that Windows 3.0 was a much more successful product than OS/2 could ever become: it was available on a lot more platforms, it was sold already installed by a lot of computer manufacturers, and it boasted a fast-growing catalog of third-party applications. On the other hand, OS/2 was sponsored only by IBM and it was much more expensive. Microsoft decided to part ways and continue to focus on Windows. IBM had lost its grip on the computer industry: for the year 1992 it reported a loss of $4.96 billion, at the time the largest annual loss in the history of corporate America. In january 1993, as one stock kept rising while the other kept plunging, Microsoft's market value ($26.78 billion) passed IBM's ($26.76 billion), even though IBM still employed many more people and still had revenues of $64 billion. The market just didn't believe that IBM's business model had a future.

Digitalk and ParcPlace had introduced commercial versions of the object-oriented environment Smalltalk and had created a small but devoted following for it. In 1991 IBM launched its own Smalltalk-based project that in 1995 yielded the object-oriented environment VisualAge. Apple, in turn, had started Pink, a project to design an object-oriented operating system written in the object-oriented version of C, the C++ programming language. In 1992 IBM and Apple banded together and formed Taligent in Cupertino with the goal of completing Pink and porting it to both platforms. The whole plan was widely viewed as an anti-Microsoft move, now that Microsoft was a common enemy. Again the project failed, but it yielded at least one intriguing idea: the "People, Places and Things" metaphor that provided procedures for these three categories to interact at a high level of conceptualization.

In fact, the deluge of Windows could not be stopped: Microsoft became the largest software company in the world with annual sales in 1994 of over $4 billion. It even began to draw the attention of the USA government, which feared a monopoly in software. The USA government forced Microsoft to be less "evil" towards the competition, but it was only the beginning of a series of lawsuits and investigations into Microsoft's practices both in the USA and in Europe.

The rise of Microsoft and the decline (and sometimes demise) of the traditional giants of IT came at a price: the shrinking research lab. When the IT and telecom worlds were ruled by the likes of IBM and AT&T, the research laboratories were huge and their ambitions were huge. Those labs invented the transistor, the Internet, the programming language, the operating system, the hard disk, the relational database and Unix. Microsoft's research laboratories invented nothing. AT&T's descendants (the regional Bell companies and the new telecom companies) invented nothing. IBM and AT&T did not need to acquire other companies: their products came from their research labs (the likes of Rolm and Lotus contributed relatively minor product lines to IBM). Microsoft and later Google bought their most famous products from start-ups. Their cash did not buy them great research teams: it bought them great intellectual-property lawyers who filed patents for countless trivial features of their products, a tactic meant to discourage competitors from venturing into the same areas of development. The Microsoft era was the era of the business plan: companies relied on a business plan, not on technological innovation, in order to achieve domination. No wonder that they did not produce anything comparable with the transistor or the many inventions of IBM.

In Silicon Valley, in particular, the rate at which companies were created and destroyed was one practical explanation for why the old-fashioned research lab became less and less viable. It was basically inherent to the Silicon Valley model of frenzied growth that no single company could afford to have a long-term plan, and especially one that was not focused on any one product. Life expectancy was very low. And the only way to prolong your life was, in fact, to live by the day. It was the law of the wilderness transferred from the lawless towns of the Far West to the high-tech industry. That was, ultimately, the lesson learned from the experience of Xerox PARC.

Free Unix

While the great Unix wars were still going on, in 1991 a Finnish student, Linus Torvalds, a believer in the philosophy of Stallman's Free Software Foundation, developed a new Unix kernel, called Linux, and equipped it with GNU tools (at the time GNU did not yet offer a kernel). In this way Torvalds accomplished what Stallman had advocated: a free and open-source version of Unix. However, initially the only support came from independents, certainly not from the big corporations who were fighting the Unix wars. In 1994 Marc Ewing, a graduate from Carnegie Mellon, completed his Red Hat Linux and started Red Hat in North Carolina. Novell engineer Bryan Sparks in Utah founded Caldera, with funding from Novell's former boss Ray Noorda, to distribute a high-end version of Linux. In january 1993 Novell itself bought all the rights to the Unix source code from AT&T for $150 million.

Storage

EMC, founded in 1979 near Boston in Massachusetts by Richard Egan and Roger Marino, had begun as a humble maker of memory boards for workstations (the first one in 1981 for the Prime workstation); then it practically invented the independent storage business. Before EMC the external storage of a computer came with the computer. EMC started selling external computer storage independently in 1989, and the following year it launched its best-selling product, Symmetrix, the first "storage array" for computer mainframes sold by an independent company (IBM was selling its own 3390 disk subsystem). EMC would become the fastest-growing high-tech company in the nation with the best-performing stock of the decade.

Competing with storage specialists such as East-Coast colossus EMC and Japanese colossus Hitachi Data Systems, Network Appliance, an Auspex spin-off founded in 1992 in Sunnyvale by David Hitz and James Lau, introduced a lower-cost and more scalable "network attached storage" (NAS) appliance of the kind pioneered by 3Com. Data storage over Ethernet was a simple concept but also a gold mine, one that allowed NetApp to double revenues every year throughout the 1990s. Theirs was a Copernican revolution: instead of using (expensive) custom hardware to run a general-purpose operating system as fast as possible, they used standard hardware running proprietary software that did only one thing: store and retrieve data.

One of the first consumer products to use flash memory as storage was the Psion MC 400 of 1989. In 1991 SanDisk created an affordable flash-based solid-state drive. Up to that point flash memory had mostly been used inside the computer, not for removable storage. In 1995 Toshiba introduced a NAND-type flash memory card, the SmartMedia card, as a replacement for the floppy disk, while Intel introduced a different NAND format, the MiniCard, backed by Fujitsu and Sharp. Flash memory would become ubiquitous in digital cameras and other consumer electronics, but soon the winning format would be CompactFlash, unveiled by SanDisk in 1994 and adopted by Canon and Nikon for their digital cameras and camcorders.

ERP

ERP and Supply Chain Management were thriving in the Bay Area. PeopleSoft added a financial module in 1992, pushing revenues up to $575 million in 1994, and a manufacturing module in 1995 after acquiring the supply-chain management system developed by Red Pepper Software, pushing revenues to $816 million in 1997. It was one of the most sensational success stories of the client-server era. Red Pepper had been founded in 1993 in San Mateo by Monte Zweben, a former scientist at NASA's Ames Research Center, where he had developed an artificial intelligence-based scheduling system for NASA's Kennedy Space Center. In 1993 former Oracle sales executive Thomas Siebel started Siebel to market a software application for sales force automation, the first step towards Customer Relationship Management (CRM). Meanwhile, in 1992, SAP had launched R/3, moving its ERP system from the mainframe to a three-tiered client-server architecture and to a relational database. It was an immediate success. In 1994 SAP's revenues increased 66% and then 47% the following year to $1.9 billion, three times what they had been in 1991.

In 1994 Japan's Denso invented the QR Code (Quick Response Code), an alternative to the barcode with greater storage capacity, the first major innovation in identifying manufacturing parts since 1974 (when a store first used a scanner to read an item's barcode, encoded in the Universal Product Code invented by IBM).

Another software invention that came from Germany was the multi-dimensional array data model, originally conceived by Peter Baumann in 1994, an extension of the relational data model (with a query language that extended SQL). This technology, organizing data in multi-dimensional arrays instead of one-dimensional sets, would introduce a more powerful way to handle data such as images.
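The difference from the relational model is easy to visualize: a relational database would flatten a 2-D image into one row per pixel, whereas the array model addresses whole regions of the array directly, with a query language along the lines of "SELECT img[100:200, 100:200] FROM images". A minimal sketch (Python with NumPy, purely to illustrate the data model, not Baumann's actual system):

    # Contrasting the relational view and the array view of the same image
    import numpy as np

    image = np.random.randint(0, 256, size=(1024, 1024), dtype=np.uint8)

    # Relational view: one (x, y, value) tuple per pixel,
    # i.e. a million rows for a single image (tiny excerpt shown)
    tuples = [(x, y, int(image[x, y])) for x in range(2) for y in range(2)]

    # Array view: a sub-image is a single declarative slice
    region = image[100:200, 100:200]
    print(region.shape)   # (100, 100)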

Competing Economic Models

In 1991 the USA computer industry posted its first "trade deficit": more computer technology was imported than exported (in dollar value). The dynamics of the various regions were wildly different. In the USA the computer industry was constantly changing, with large companies disappearing overnight and new giants emerging almost as quickly. In Europe, on the other hand, the computer market was dominated by old bloated companies: Nixdorf in Germany, Bull in France, ICL in Britain and Olivetti in Italy. The Europeans had generally been more eager to jump on the bandwagon of "open systems", despite the fact that their revenues largely depended on proprietary custom design, notably in operating systems. Their business model typically emphasized the one-stop shopping experience for their customers, and those customers were typically very large businesses (notably banks). Europeans preferred to sell "solutions" rather than individual pieces of hardware and software, a solution being a tailored combination of hardware and software. For many years these components had been made in house. It was basically a scaled-down and nationalistic version of the IBM business model. They could successfully compete against IBM because of two factors: political protection from national governments, and national non-English languages (in an age in which only a tiny minority of Europeans understood English). They all employed hardware and software from the USA, but they made a point of changing the name and the interface to turn it into a proprietary product. Their products could be very creative. Olivetti, in particular, had developed a Unix-like real-time operating system, Multi-functional Operating System (MOS), designed by Alessandro Osnaghi and initially implemented at their Cupertino labs. They mostly endorsed Unix (as an alternative to IBM), and joined the Open Systems Foundation and similar standardizing alliances. An OSF laboratory was opened at Grenoble in France. However, the large computer manufacturers of Europe had begun a decline that would rapidly lead to disaster: in 1982 Bull has been nationalized by the French government, in 1990 German conglomerate giant Siemens acquired Nixdorf, in 1990 Japanese conglomerate Fujitsu acquired 80% of ICL, and Olivetti struggled to survive (it would capitulate in the mid 1990s). The transition to the open systems turned out to be a form of mass suicide. Europe did not fare much better in the realm of software: only SAP began a worldwide power, dominating the ERP sector even in the USA. Japan managed to become a powerhouse of hardware but failed to make a dent in software.

However, it was not clear what gave the USA computer industry its advantage. After all, the World-wide Web was "invented" at CERN by a pan-European team, and Finland was the first country to implement a GSM network. If one had to name a region that was forward-looking, it wouldn't be Silicon Valley but Switzerland, the world's most energy-efficient economy, which had pioneered both cleantech (renewable energy, green buildings, waste management, sustainable transportation) and biotech since the late 1980s. Nonetheless, only a specialist in Swiss affairs could name a Swiss company, and even that specialist would probably not be able to name a single Swiss invention of the era.

The overall economic model clearly played a role. The German economic model favored gradual innovation and long-term planning (the kind of thinking that was useful, say, in metallurgy); whereas the USA model favored disruptive innovation in real time (besides favoring overconsumption and overborrowing). Silicon Valley was the ultimate implementation of the USA philosophy. At the same time the Silicon Valley model was not just unbridled capitalism as often depicted elsewhere. For example, in 1993 leaders from government, business, academia and the community established Joint Venture Silicon Valley Network. They sponsored the Economic Development Roundtable (later renamed Silicon Valley Economic Development Alliance) and other initiatives to make the region as business-friendly as possible, while another spin-off, Smart Valley, helped schools, local government and community centers get on the Internet.

Intel benefited from the booming market for personal computers, and in 1992 became the world's largest semiconductor company, ahead of NEC and Toshiba. Its revenues had doubled in five years, reaching $8 billion. It was taking its revenge on the Japanese who had almost bankrupted it a few years earlier.

In 1991 the USA computer industry posted its first trade deficit, but it was good news neither for Europe nor for Japan. Unseen by most, it was good news for the still very poor countries of India and China.

Biotech

Thanks to the success of Genentech, the biotech industry was expanding in the Bay Area, albeit at a modest pace. In 1990 Michael West from the University of Texas' Southwestern Medical Center in Dallas started Geron with funding from oil-industry tycoon Miller Quarles who wanted a "cure" against aging (in other words, immortality). In 1992 the company relocated to Menlo Park where West had found more venture capital, and in 1998 its scientists, led by Calvin Harley, would isolate human embryonic stem cells (but never get any closer to marketing immortality). Geron was one of the earliest startups of regenerative medicine. In 1992 Calgene, a spin-off of U.C. Davis, near Sacramento, created the "Flavr Savr" tomato, the first genetically modified food to be sold in stores (in 1994). The Flavr Savr tomato caused a new bubble in biotech. In 1996 there was another spike in biotech IPOs, with 53 companies raising almost $1.5 billion.

The science itself, however, was progressing rapidly. To start with, the Human Genome Project was finally underway. In 1992 the saga of Craig Venter got its start. Venter, raised in the San Francisco peninsula, had joined in 1984 the National Institutes of Health in Maryland, whose genome program was at the time run by James Watson in the manner of a traditional biomedical center. In 1992 Venter, frustrated that the center wouldn't move faster towards automation of genetic processing, quit his job and the Human Genome Project to set up the Institute for Genomic Research (TIGR) a few kilometers away, in Rockville, funded by venture capitalist Wallace Steinberg of New Jersey with $70 million over seven years and staffed with many of Venter's old coworkers at the NIH. Meanwhile, Steinberg hired William Haseltine, who had pioneered research on AIDS at Harvard University since the 1970s in collaboration with Robert Gallo of the National Cancer Institute (who a few years later would discover the cause of AIDS, HIV), and then had formed several biotech start-ups during the 1980s. Steinberg put Haseltine in charge of a new company named Human Genome Sciences (HGS), the business plan being that Venter's TIGR would create a database of genetic information and Haseltine's HGS would sell it to pharmaceutical companies. This was a bold plan because until then no biomedical company had ever made a profit by simply selling information, and it corresponded to a vision of future medicine as "bioinformatics". In 1993 HGS sold its genetic database to SmithKline Beecham for $125 million. In 1995 Robert Fleischmann of TIGR used research by Nobel laureate Hamilton Smith of Johns Hopkins University in nearby Baltimore to sequence ("map") the first genome of a free-living organism, the bacterium Haemophilus influenzae, responsible for ear infections. This success triggered a series of genome sequencing projects around the USA.

The ability to modify a complex genome with precision remained one of the goals of synthetic biology. In 1994 Srinivasan Chandrasegaran's team at the Johns Hopkins Bloomberg School of Public Health invented a technique for editing genomes, called Zinc Finger Nuclease (ZFN), that opened the door to gene therapy. In 1995 Edward Lanphier founded Sangamo BioSciences in Richmond (north of Berkeley) to commercialize the technology.

At the Affymax Research Institute, a pharmaceutical company founded in 1988 by Alza's Zaffaroni in Palo Alto, Stephen Fodor was doing research on fabricating DNA chips using the same manufacturing techniques used to make semiconductors, while Peter Schultz was a pioneer in combinatorial chemistry at the Lawrence Berkeley Lab. They both wanted to overcome the slow pace at which genetic testing was carried out and find a method for simultaneously testing thousands of molecules. In 1991 Fodor succeeded in creating a "DNA chip" using photolithography. With help from Zaffaroni the duo started Affymetrix in Santa Clara in 1992 to produce "gene-chips" (microarrays), the biological equivalent of electronic chips, by printing a huge number of DNA molecules on a silicon wafer.

Affymetrix introduced the first DNA chip in 1994, the GeneChip. Pat Brown and Mark Schena at Stanford University worked on a different (robotic) method and in 1995 introduced the term "DNA microarray". Edwin Southern at Oxford University (and the founder of Oxford Gene Technology in 1995) was working on a technique based on inkjet printing, and so was Alan Blanchard at the University of Washington, who in 1996 invented the technique adopted by Agilent. NimbleGen Systems adopted an improved version of Affymetrix's technique. Illumina adopted the method invented by David Walt at Tufts University in 1998. They all wanted to leverage techniques originally developed for silicon semiconductors and use them to improve the speed at which DNA tests could be performed. Their microarrays made it possible to simultaneously test thousands of molecules.

Along the same lines in 1992 South African-born geneticist Sydney Brenner (a pupil of Francis Crick at Cambridge University who had just moved to the Scripps Institute in San Diego) joined Applied Biosystems' founder Sam Eletr to start Lynx Therapeutics in Hayward, a company that developed a massively parallel method for simultaneous interrogation of multiple DNA samples in a single test.

After the Human Genome Project had enumerated all human genes, the next step would be to understand what those genes mean. Molecular Applications Group, founded in 1993 in Palo Alto by Stanford's biologists Michael Levitt and Christopher Lee, applied the techniques of data-mining software to genetic information in order to help biotech companies figure out the function of a protein from its DNA.

The business potential of human-genome science was obvious from the beginning to pharmaceutical companies: knowing how the genome works would allow medicine to understand which genes cause which diseases, and possibly how to cure them. The first "genomic" start-up to have an IPO (in november 1993) was the invention of a New York venture-capital firm that in 1991 acquired a St Louis-based biotech start-up, Invitron, founded by scientist Randall Scott, and transformed it into Incyte Pharmaceuticals, a Palo Alto start-up. In 1994 Incyte launched a database for personal genomics, LifeSeq, accessible by yearly subscription for several million dollars and basically containing two databases: a catalog of the genes of the genome, and a catalog of where each was expressed and what its probable function was. LifeSeq had actually been constructed for Incyte by the consulting biotech firm Pangea Systems, founded in 1991 in Oakland by Joel Bellenson, who had worked at Stanford on an early DNA synthesizer, and Dexter Smith. Pangea later (1997) introduced a search engine for genetic databases, and eventually (december 1999) put it online as DoubleTwist.

In 1995 Greg Schuler at the National Center for Biotechnology Information designed a way to automatically refine GenBank into a new database, UniGene.

Meanwhile, in 1993 Cynthia Kenyon, a pupil of Brenner at Cambridge University by then at U.C. San Francisco, discovered that a single-gene mutation could double the lifespan of the roundworm Caenorhabditis elegans, a finding that stimulated research in the molecular biology of aging.

A few years later (1999) Leonard Guarente at MIT would discover a gene that increases the lifespan of yeast; the family of such genes, the "sirtuins", became known as "the anti-aging genes". In 1999 Guarente and Cynthia Kenyon founded Elixir Pharmaceuticals to make anti-aging products.

The media focused on biotechnology for pharmaceutical applications ("red biotech") and for genetically-modified food ("green biotech"), but there was also a third application: biotechnology to produce biochemicals, biomaterials and biofuels from renewable resources ("white biotech"). This technology was based on fermentation and biocatalysis. It typically required "bioreactors" fueled with tailored micro-organisms (e.g. yeast, fungi and bacteria) to convert crops or other organic materials into sugars and then into useful chemicals. The "products" of white biotech ranged from sophisticated ingredients for red and green biotech to bulk chemicals such as biofuels. The promise was to replace the synthetic, petroleum-based materials that had become popular in the 20th century with biodegradable materials (which would also require less energy to manufacture and create less waste when disposed of). Biologists such as Chris Somerville at Stanford University and Yves Poirier at the University of Lausanne in Switzerland carried out pioneering experiments to prove the feasibility of environmentally friendly (biodegradable) plastics, a line of research picked up by Kenneth Gruys at agricultural giant Monsanto in St Louis. In 1992 the Bio/Environmentally Degradable Polymer Society was established in Ontario (Canada).

Robotic surgery was pioneered by Computer Motion, founded in 1989 in Goleta (southern California) by Yulun Wang. It developed AESOP, a voice-activated robotic system for endoscopic surgery that in 1993 became the first robot approved in the USA for surgery. A few years later, in 1999, Intuitive Surgical (founded in 1995 in Sunnyvale by Frederic Moll) introduced the da Vinci Surgical System, a commercial version of work done at SRI Intl in the 1980s, the Green Telepresence Surgery system.

French Anderson carried out the first human gene therapy in 1990 at the National Institutes of Health (NIH). The news helped generate enthusiasm for the new technology: within a decade there would be, worldwide, more than 400 clinical trials to test gene therapy. (Alas, Anderson would later become more famous as a glaring example of a miscarriage of justice, spending several years in jail for a crime that he did not commit.)

Greentech

Another nascent industry had to do with energy: fuel cells. Their promise was to open an era of environmentally clean power plants. In 1991 United Technologies Corporation, a large East-Coast defense contractor, became the first company to market a fuel-cell system, a technology until then used mainly by NASA (notably in the Space Shuttle). Meanwhile, research on lithium batteries at the Lawrence Berkeley Lab spawned PolyPlus Battery, which opened offices in Berkeley in 1991.

Anthropology of the Digital Frontier

Ironically, all the technological changes had not changed the nature of society all that much. Society in Silicon Valley still embodied the "frontier" model: most individuals were single and male. Most immigrants were male, whether coming illegally from Mexico to clean swimming pools or legally from India to write software. The main difference from the old days of the Frontier was that, at some point, they would bring their wives with them (as opposed to patronizing brothels). The domestic migrants (coming from other states of the USA) were also mostly male, and almost exclusively male in the engineering and executive jobs. Graduates from San Jose State University in the late 1990s were still only 18% female. Women, however, did not just stay home: more and more women studied business, law and marketing, to mention only the professions that could potentially lead to highly paid jobs. The "saloon", too, had changed dramatically: the new saloon was itinerant, located at whatever home or bar was hosting a party, and the party was often meant as a business opportunity (an event to find a new job, not an event to find a wife).

Spare time was still chronically limited, with many people working weekends and even holidays. The standard package granted only two weeks of vacation, and immigrants (whether from other countries or other states) used them mostly to visit family back home. The real vacation often came in the form of a "sabbatical" of six or twelve months. It was difficult to take a few days off during a product's lifecycle. It was easier to just take six months and be replaced by someone else during that period.

At the same time there was increased pressure to know the world, because business was increasingly international. And the world increasingly meant the Pacific Ocean, not the Atlantic one.

After many venture capitalists moved to Menlo Park (in the 1980s) and Palo Alto (in the 1990s), San Francisco lost its role as the financial center of the Bay Area and resumed its old role as the entertainment hub for the "Frontier". Silicon Valley engineers and entrepreneurs moved to San Francisco for, essentially, the nightlife. Compared with the sterilized and structured lifestyle of Silicon Valley, where people's main forms of entertainment were the gym and the movie theater (or, at home, the videogame and the video) and everything closed at 10pm, San Francisco promised a "wild" lifestyle.

However, Palo Alto itself was rapidly becoming an ideal concentration of restaurants, cafes and stores, most of them lining University Avenue (the continuation of Stanford's Palm Drive). It had lost its "student town" charm and become one of the most expensive places to live. The nearby Stanford Shopping Center was becoming the most celebrated shopping experience in the whole Bay Area.

Mountain View, which historically had hosted no fewer influential startups, created something similar to University Avenue in its "intellectual" heart, Castro Street (originally lined with bookstores), via a seven-block renovation project that ended in 1990. Mountain View had also inaugurated in 1986 its outdoor 22,000-seat Shoreline Amphitheatre, a major venue for music concerts.

Culture and Society

The marriage of the World-wide Web and of the utopian WELL culture recast old debates about personal freedom within the new distributed (and government-funded) digital medium. Notably, the Electronic Frontier Foundation was formed in San Francisco in july 1990 by Lotus' founder Mitch Kapor, by John Perry Barlow (a former lyricist for the Grateful Dead turned libertarian activist), and by Usenet and GNU veteran John Gilmore (a former Suicide Club member who had become rich as one of the early employees of SUN), to defend civil liberties on the Internet. It evoked the myth of the hippy commune settled in a rural world. It was created in response to "Operation Sundevil", launched in 1990 by the Secret Service to arrest hackers involved in credit card and calling card fraud: the first major operation of its kind, since the authorities had rarely bothered to investigate (let alone prosecute) the "phreakers". In 1996 John Perry Barlow would also publish "A Declaration of the Independence of Cyberspace".

The cyberlibertarian movement got its manifesto in 1994, when Esther Dyson, the daughter of the famous physicist Freeman Dyson, who had joined the Global Business Network in 1988, published "The Magna Carta for the Knowledge Age", co-written with George Gilder (author of "Microcosm - The Quantum Revolution In Economics And Technology" of 1989), Hewlett-Packard's physicist George Keyworth, and Alvin Toffler (author of "Future Shock" of 1970 and "The Third Wave" of 1980). That Magna Carta caused quite a stir among those who thought of cyberspace as a potential space for the counterculture. Despite its title, the manifesto directly or indirectly supported the big corporations lobbying the government for maximum deregulation, which hardly justified its claim that cyberspace belonged to the people. Many saw it as an attempt to steal cyberspace from the people and hand it over to the big telecom corporations eager to reshape cyberspace to their economic advantage, which was just what the conservative politicians were advocating. Two years later (in 1996) the USA adopted a landmark Telecommunications Act that boosted the online activities of the telecom giants.

The debate around the new digital technologies yielded books such as Howard Rheingold's "The Virtual Community" (1993), Kevin Kelly's "Out of Control" (1994), Nicholas Negroponte's "Being Digital" (1995) and, a little later, "The Long Boom" (1999) by Peter Schwartz, Peter Leyden and Joel Hyatt.

In 1993 Kevin Kelly, the publisher of the "Whole Earth Review", joined the magazine Wired, where he continued to deal with the interaction of technology, culture and society. Wired's founder, Louis Rossetto, had run Electric Word in Amsterdam; he founded Wired in San Francisco with his employee Jane Metcalfe, using funds provided by the Global Business Network and by Nicholas Negroponte of the MIT Media Lab. Incidentally, its online emanation, the commercial web magazine HotWired, launched one year later with a new form of advertising, the "banner", which would soon spread around the web, turning websites into billboards. Wired had been preceded by a far more radical publication, Mondo 2000, started in 1989 by Ken Goffman and Alison Kennedy in Berkeley: a glossy magazine devoted to underground cyberculture that dealt with both the technology of the Internet age and the "smart drugs" of the rave age.

Mondo 2000 and Wired were instrumental in moving technology into the lifestyle of young college-educated people. They served, and helped grow, the hacker culture created in the 1980s. Hacker conferences multiplied, notably Defcon, started in Las Vegas in 1993 by Jeff Moss, and HOPE (Hackers On Planet Earth), founded in New York in 1994 by the hacker magazine 2600.

Echo, an online community founded in 1990 by Stacy Horn in New York's Greenwich Village, was the East Coast equivalent of the WELL, but with a much higher rate of female participation. In 1995 a former SUN marketing executive, Dan Pelson, launched the online magazine Word and hired as creative director Jaime Levy, one of the few people with experience in digital publishing: she had produced a floppy-disk magazine, Electronic Hollywood (1990), and started a salon for programmers and animators at her East Village loft, CyberSlacker (1994). Levy, gathering a cast of creators met on Echo, turned Word into a futuristic website devoted to multimedia storytelling: audio, animated graphics, video streaming, the interactive social game Sissyfight (2000), and even (in 1998) the chatbot Fred The Webmate (styled by Japanese-born art director Yoshi Sodeoka after Joseph Weizenbaum's artificial intelligence Eliza of 1966). Word was one of the earliest online magazines and arguably the most experimental. (When Word shut down in 2000, it revealed the fundamental problem of Internet-based culture: the original tools used to create its multimedia contents rapidly disappeared, and the magazine's issues are virtually impossible to recreate, unlike, say, the Dead Sea Scrolls that we can still read 2,000 years later.) In 1998 another online magazine, The Drudge Report, run by Matt Drudge from his Florida home, would make history by breaking the Monica Lewinsky scandal to the public before the print media.

The 1990s were the age of the raves, all-night dance parties often held illegally in abandoned warehouses. The longest economic expansion in the history of the USA helped fuel the decade-long party. The music scene expanded dramatically, yielding not pop stars but all sorts of alternative concepts: prog-rock trio Primus, folk-rock combo Red House Painters, acid-rock project Subarachnoid Space, stoner-rockers Sleep, the surreal electroacoustic act of the Thessalonians, and electronic/digital composer Pamela Z. The visual arts were also expanding dramatically, fueled by a growing art market. A new generation of art galleries emerged in the early 1990s, such as the Catharine Clark Gallery (established in 1991), and the Yerba Buena Center for the Arts was inaugurated. San Francisco continued to specialize in creating subcultural movements (as opposed to high-brow art and music). The 1990s witnessed a boom in mural and graffiti art, notably Ricardo "Rigo 23" Gouveia, Margaret Kilgallen, Barry McGee ("Twist") and Ruby "Reminisce" Neri. Students from the San Francisco Art Institute originated the "Mission School", an art movement centered in the Mission district that was inspired more by street art than by museum art; they often built their artworks with found objects. Chris Johanson emerged from this crowd. In 1990 even San Francisco's main dump began an artist-in-residence program, mainly an idea of pioneering environmental artist Jo Hanson. Harrell Fletcher and Jon Rubin, both students at the California College of Arts and Crafts, opened a "gallery" in Oakland where they created art installations about the neighborhood using neighborhood residents. In 1995 Amy Franceschini founded the artist collective Futurefarmers to promote participatory art projects.

In 1992 a group of San Francisco street artists (Aaron Noble, Michael O'Connor, Sebastiana Pastor, Rigo 92, Mary Snyder and Aracely Soriano) started painting murals in Clarion Alley, between Mission and Valencia Streets and between 17th and 18th Streets: the beginning of what came to be known as the Clarion Alley Mural Project (CAMP), formally established in october 1992 by local residents.

Mark Pauline's Survival Research Labs had raised an entire generation of machine artists, several of whom were responsible for the apocalyptic sculptures of Burning Man. Kal Spelletich, founder of the influential machine-art collective Seemen in Austin, moved to San Francisco in 1989 and joined SRL. The 1990s opened with the robotic opera "Trigram" (1991) by Chico MacMurtrie, who in 1992 founded Amorphic Robot Works, a San Francisco collective of artists/engineers intent on creating robotic art performances. A few years later (in 1994) Marc Thorpe organized the first "Robot Wars" at Fort Mason, in which remote-controlled cybernetic gladiators battled to the death as in medieval tournaments. Meanwhile, Death Guild, a club founded in 1993 by David "DJ Decay" King and modeled after London's legendary Slimelight, defined the look of the gothic/industrial scene.

San Francisco was also becoming the hub of one of the most vibrant design schools in the world, thanks to designers such as the four Michaels: Michael Vanderbyl, Michael Manwaring, Michael Cronin and Michael Mabry. While the first graphic designer to embrace the Macintosh had been April Greiman in Los Angeles, San Francisco was the base of Rudy VanderLans, who started Emigre (the first major magazine produced on the Mac), and of John Hersey, another pioneer of computer-based graphic design.

The environmental movement was still vibrant, sometimes bordering on guerrilla warfare. In 1992 a small crowd of bicyclists staged an event called "Commute Clot" that began a tradition of invading the streets of San Francisco on the last friday of every month. The tradition spread to many cities around the world and became better known as "Critical Mass".

However, the Bay Area that had witnessed the hippies and the punks was falling into the hands of high-tech nerds, as its universities graduated thousands of software and hardware engineers every year, and thousands more immigrated into the area. The age witnessed another wave of immigration of youth from all over the world, just like 30 years earlier. But, unlike in 1966, this time the drivers were not "peace and love" but vanity and greed.

The metropolitan area of the Bay expanded dramatically in all directions, but especially south and east. Its population was cosmopolitan and young. Agriculture had definitively been wiped out. The demographic change made the Bay Area even more tolerant and open-minded. In 1993 political science professor Condoleezza Rice became Stanford's youngest-ever provost, as well as its first female and first non-white one.

