A History of Silicon Valley


These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"

(Copyright © 2010 Piero Scaruffi)

16. The Downsizers (2003-2006)

by Piero Scaruffi

Distributed and Small

The early 2000s were the age of downsizing. Silicon Valley companies had to learn the art of cost cutting, and start-ups had to learn the art of actually developing a product and selling it. One more time the beneficiary was India. Creating a lab in India (where software engineers earned a fraction of Silicon Valley engineers) was a relatively painless way to dramatically cut costs. By 2005 more than 50% of all jobs outsourced by Silicon Valley companies went to India.

The Smartphone

Computing devices had been getting smaller since ENIAC was unveiled. That trend had never really stopped; it just proceeded by discontinuous jumps: the minicomputer was a significant downsizing from the mainframe, and so was the personal computer from the minicomputer. The laptop/notebook, however, was just a variation on the personal computer, the only major difference being the screen. In 2005 sales of notebook computers accounted for 53% of the computer market: the traditional desktop computer was on the way out, and IBM pulled out of the market for desktop computers. There was a clear trend towards a portable computing device, but the laptop per se did not truly represent a quantum leap forward, just a way to stretch personal-computer technology to serve that trend.

At the same time sales of smartphones were booming too, but there was a lesson to be learned. In 2004 Motorola introduced the Razr mobile phone, an elegant-looking device that by July 2006 had been bought by over 50 million people, propelling Motorola to second position after Nokia; but sales started dropping dramatically in 2006. Motorola learned the hard way an important rule of the cell-phone market: phones went in and out of fashion very quickly. There was room for more players, and Silicon Valley had largely been on the sidelines until then.

Ironically, the mobile smartphone of the 2000s rediscovered the three-line navigation menu button (known as the "hamburger icon") that the Xerox Star personal workstation had used in 1981 (because of graphical limitations).

A humbler form of wearable computing was born in April 2003 with the Nokia HDW-2, the first Bluetooth headset for mass consumption.

No Silicon Valley company (or, for that matter, US company) was part of the consortium formed in 2004 to develop and promote Near Field Communication (NFC), basically the smartphone equivalent of the old RFID. This was a method to allow smartphones to exchange data by simply pointing at each other at close range. The founders were Nokia, Philips and Sony. It would take seven years for Silicon Valley to catch up (when Google would introduce the technology in its NFC-enabled smartphones).

The Dotcoms

The positive note for the dotcoms was that the Web was spreading like wildfire all over the world. By 2006 Google had indexed more than eight billion pages, coming from the 100 million websites registered on the Web. In March 2006 the English version of Wikipedia passed one million articles. The Internet was being accessed by 1.25 billion people in the world. The dotcom bubble had not been completely senseless: one just had to figure out how to capitalize on that massive audience.

By 2005 Yahoo, Google, America OnLine (AOL) and MSN (Microsoft's Network) were the four big Internet "portals", totaling a combined audience of over one billion people. Never in history had such a large audience existed. Never in history had Silicon Valley companies controlled such a large audience (most of that billion used Google and Yahoo). There were only two threats to the Internet: spam (undesired marketing emails) and viruses (malicious software that spread via email or downloads and harmed computers).

Free Software

The desktop computer was still the dominant device, and the Windows operating system was still the dominant platform, but there was grumbling, especially about the costs and the new (and often unpopular) releases of the WIMP camp. Ubuntu, a free and user-friendly variant of the Linux operating system for personal computers, was first released in Britain in 2004 by South African entrepreneur Mark Shuttleworth and quickly became popular in the open-source community. It was a descendant of Debian, created in 1993 by a German-born student of Purdue University (in Indiana), Ian Murdock. One could run free applications such as the VLC player, developed in 2001 by students at the École Centrale of Paris, and a full Microsoft-compatible office productivity suite, Sun's open-source project OpenOffice, launched in 2000 (which would evolve into LibreOffice after Sun was purchased by Oracle in 2010). Within a few years Ubuntu would become a popular alternative to Windows, used to run Wikipedia and to provide cloud services for Netflix, Uber, Lyft, Dropbox, Paypal, Snapchat, Pinterest, Reddit, and Instagram. It would even be used in 2013 on China's Tianhe-2 supercomputer.

In July 2003 AOL spun off Mozilla, originally founded by Netscape to foster third-party development on its browser under a free open-source license, and the foundation quickly built a reputation for its new browser. The first chair of the Mozilla Foundation was Lotus' founder Mitch Kapor. The lesson learned by Netscape through Mozilla was that the open-source model works, but it is a Darwinian process, and, just like in nature, it works very slowly. The Mozilla community hated Microsoft's Internet Explorer and therefore loved the Netscape browser. Unfortunately, this meant that dozens of people added features to Mozilla to the point that it became famously fat and slow. The reaction was that a new batch of developers produced a "lighter" version of the Mozilla browser, eventually named Firefox. Firefox was indeed a state-of-the-art browser that could match IE, but precious time had been lost. In 2003 Microsoft's Internet Explorer (IE) owned 95% of the browser market. The market share of Apple's Safari was negligible, but Apple made its WebKit rendering engine open source in 2005, a fact that would have far-reaching consequences a few years later in the age of the smartphone.

Another influential project by Linux inventor Linus Torvalds was Git, launched in 2005 originally for Linux developers but later opened to everybody. Git was a system for controlling changes made to software, a "version control system". Git became a sort of Wikipedia for software: people from all over the world were able to collaborate on it, and it became a sort of social-networking platform for open-source software developers. Git was heir to a long lineage of version-control systems: the Concurrent Versions System (CVS), developed in 1986 by Dick Grune in Holland, had originally been a front-end to the Revision Control System (RCS), developed in 1982 by Walter Tichy at Purdue University, and RCS followed in the footsteps of the first successful version control system, the Source Code Control System, developed in 1972 at Bell Labs by Marc Rochkind on the IBM /370 and later ported to Unix.
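The core idea that distinguished Git from its predecessors can be sketched in a few lines: every snapshot of content is stored as an immutable object whose name is derived from a hash of its content, so identical content is stored only once and any change yields a new, distinct identifier. The following toy Python sketch illustrates only this content-addressing concept; it is not Git's actual object format (the function names are invented for illustration):

```python
import hashlib

# Toy content-addressable store: objects are named by the hash of
# their content, the scheme Git historically used (with SHA-1).
store = {}

def commit(content: bytes) -> str:
    """Store a snapshot and return its content-derived identifier."""
    oid = hashlib.sha1(content).hexdigest()
    store[oid] = content
    return oid

def checkout(oid: str) -> bytes:
    """Retrieve a snapshot by its identifier."""
    return store[oid]

v1 = commit(b"hello world\n")
v2 = commit(b"hello, world\n")   # a one-character edit -> a new snapshot
assert v1 != v2                  # different content, different identifier
assert commit(b"hello world\n") == v1  # identical content, same identifier
```

Because identifiers are derived from content alone, two developers anywhere in the world who produce the same file arrive at the same object name, which is part of what made distributed collaboration practical.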

The end of Moore's Law

Intel capitalized on the popularity of the Web with a new generation of microprocessors. Wi-Fi became a household name after Intel introduced the Centrino for laptops in March 2003 (largely the vision of Indian-born Anand Chandrasekher). From that point on a laptop would be associated with wireless Internet as much as with mobility.

In 2004 the first Wi-Fi-certified cell phones reached the market. These were portable devices that provided both Wi-Fi and cellular communications. In other words they merged the worlds of email (and web browsing) and phone conversation (and text messaging). Nokia's Symbian-based 9500 Communicator and Motorola's Windows-based MPx were the vanguard of this generation. Both were equipped with a QWERTY keyboard, and the MPx also incorporated a 1.3-megapixel camera (which by this point had become a fairly standard feature on high-end phones).

Mobile television, already available in South Korea since 2005, spread worldwide in a few years, finding millions of customers in Asia, Africa and Latin America. Ironically, the West lagged behind, and in 2010 mobile TV was still a rarity in the USA. But even in this case Silicon Valley was actually at the vanguard: the leading mobile-TV chip maker, Telegent Systems, a fabless company founded in 2004 by LSI Logic veteran Samuel Sheng, was based in Sunnyvale.

In reality, Moore's Law started failing in 2005, when Intel and AMD introduced their first "dual-core" processors. The problem was the heat generated by denser and denser circuits. In 2001 Patrick Gelsinger, an Intel engineer, had extrapolated the trends and showed that computer chips were heading for temperatures as hot as nuclear reactors. Intel's solution was to put multiple cores on the same chip. The original microprocessor was basically a computer on a chip. A multi-core processor was the equivalent of putting many computers on one chip. Moore's Law appeared to continue to hold because the consumer cared about the chip, not about how the chip was configured. The clock speed was actually reaching a plateau, but transistors were becoming cheaper and more power-efficient, thanks to progress in lithography at the foundries.

For the record, AMD beat Intel to market with its Opteron dual-core microprocessor in April 2005. Intel's first dual-core microprocessor, the Pentium D, came out in May. Within one year Intel already had a quad-core processor, and soon the single-core processor would become an ancient relic.


Google's growth had been particularly stunning, dwarfing even the excesses of the dotcom bubble. In 2003 Google had 10,000 servers working nonstop to index the Web (14 times more servers than employees). In 2003 Google acquired Blogger and in 2004 it acquired Keyhole, the source for its application Google Earth. More than a search engine, Google was expanding in all directions, becoming a global knowledge provider. In early 2004 Google handled about 85% of all search requests on the Web. In fact, a new verb was coined in the English language: to "google" something (search for something on the Web). Google's IPO in August 2004 turned Google's founders, Sergey Brin and Larry Page, into billionaires. In 2004 an ever more ambitious Google launched a project to digitize all the books ever printed. In 2004 Google hired German-born natural-language expert Franz-Josef Och, whose machine-translation system at the University of Southern California had been selected by DARPA; and in 2005 Google introduced its own automatic-translation system to translate webpages written in foreign languages. In October 2004 Google acquired Danish-born, Australian-based Berkeley alumnus Lars Rasmussen's company Where2 and its mapping software; and in 2005 Google introduced Google Maps. MapQuest, the pioneering Web-based mapping service acquired by AOL in 2000, lost to Google Maps because the latter allowed third-party developers to add information to the map and embed the map in their own software. The time-consuming process of scaling a web application was more easily done by "exploiting" the Internet community of software developers.

Much was being said of Google's ethos, which allowed employees vast freedom to be creative. However, almost all of Google's business was driven by the acquisition of other people's ideas. Gmail, developed internally by former Intel employee Paul Buchheit and launched by invitation only in April 2004, was not much more than Google's version of Hotmail: what made it popular was the hype caused by the "invitation-only" theatrics. The only clear difference over Hotmail was that Gmail offered a gigabyte of storage versus Hotmail's two megabytes, i.e. 500 times more storage. Google Checkout, introduced in June 2006, was a poor man's version of PayPal. Google Streetview, introduced in 2007, was so similar to Vederi's ScoutTool, launched in 2000 and later renamed the StreetBrowser, that Vederi sued. Google's Android operating system for smartphones, acquired in 2005 from the namesake startup and introduced in 2007, was widely believed (not only by Steve Jobs) to be a diligent but uninspired imitation of the iPhone's operating system. The "semantic" improvement to the Google search engine of 2009 was due to the Orion search engine, developed in Australia by Israeli-born Ori Allon and acquired in 2006. Google replicated Microsoft's model: its own research labs were remarkably inept at inventing anything original, given the huge amount of cash poured into them. That cash mostly bought them patents on countless trivial features of their products, a tactic meant to prevent innovation by the competition. What drove Google's astronomical growth was (just like in Microsoft's case) the business strategy, not the inventions.

Google's real innovation was in the field of advertising. In June 2003 Google introduced AdSense, designed by Paul Buchheit right after Google acquired the technology from Applied Semantics (which had been founded in 1998 by Caltech graduates Gilad Elbaz and Adam Weissman in Los Angeles). It was a vast technological improvement over AdWords: content-targeted advertising. AdSense was capable of "understanding" the topic of a webpage and therefore of automatically assigning to it the relevant ads among all the ads provided by paying advertisers. By systematically monitoring the behavior of its search engine's users, Google had invented (or, at least, perfected) an automated system with three aims: first, for advertisers to create more effective ads; second, for Google itself to display more relevant ads; and third, for users to view the most relevant ads. The traditional ad in a newspaper had mostly been a one-sided decision, based on what an advertiser wanted to print and how much it was willing to pay, mediated by one of the newspaper's salespeople. In Google's world the ad became a computer-mediated deal among three entities: the advertiser, Google's AdSense and the user. Basically, AdSense created an infinite feedback loop that allowed advertisers to continuously improve their adverts, and at the same time promoted a race among advertisers to develop the "fittest" ad in a sort of Darwinian process. If previous progress in search-based advertising had lowered the barrier for small businesses to compete with large corporations, AdSense enabled any content provider (from established news media to the smallest fan-run website about a rock star) to monetize its content. Of course, this also led to an alienating process in which very serious texts were used to publicize trivial products (famously, AdSense associated ads about plastic bags with the news of a murderer who had stuffed his victim's body parts in a plastic bag).
The new landscape for advertisers was the whole behavior of the user, which Google monitored as much as possible through the user's searches. Thus, for example, someone dying of cancer and desperately searching the Web for medicines and devices would automatically be turned by AdSense into a golden business opportunity for any company advertising those medicines and those medical devices.
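The content-targeting idea can be illustrated with a toy sketch; Google's actual (and far more sophisticated) algorithm is not public, and the ad names and keywords below are invented. The sketch scores each candidate ad by the overlap between its advertiser-supplied keywords and the words on the page, and picks the best match:

```python
# Hypothetical ad inventory: each ad comes with advertiser-chosen keywords.
ads = {
    "plastic-bags-ad": {"plastic", "bag", "storage"},
    "camera-ad": {"camera", "photo", "lens"},
}

def match_ads(page_text: str, top_n: int = 1) -> list[str]:
    """Rank ads by keyword overlap with the page and return the top ones."""
    words = set(page_text.lower().split())
    ranked = sorted(ads, key=lambda a: len(ads[a] & words), reverse=True)
    return ranked[:top_n]

# The grim mismatch described above: the page mentions "plastic bag",
# so the plastic-bag ad wins regardless of the context of the story.
print(match_ads("body parts found in a plastic bag"))  # ['plastic-bags-ad']
```

Pure word overlap, as here, is exactly what produces the alienating mismatches the text describes: the matcher sees topics, not tone.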

Yahoo had lost part of its sheen, but still generated yearly revenues of $1.6 billion in 2003 (up from $953 million in 2002), with astronomical yearly growth and market value. In 2003 it acquired Overture/GoTo, nurtured by the Los Angeles-based incubator Idealab, and introduced the "pay per click" business model for advertisers, instead of the traditional "per view" model. GoTo had also introduced the idea of letting advertisers bid to show up higher in the results of a search (the "pay-for-placement" model). In 2006 revenues would reach $6.4 billion. Note that these dotcom companies were mainly selling ads. The initial dotcom business plan of simply becoming popular had eventually worked out: all you needed was a large audience, and then the advertisers would flock to your website. What was missing in the 1990s was... the advertisers.

Social Networking

Initially, instead, the idea behind the dotcoms had been to transfer commerce to the Web; hence e-commerce. This was a more than viable business, but, in hindsight, it lacked imagination (and it soon proved to be viable mostly for the already established "brick and mortar" corporations).

It took a while for the dotcoms to imagine what one could "sell" to one billion people spread all over the world: social networking. For the first time in history it was possible for one billion strangers to assemble, discuss and organize.

Social networking was another practical implementation of Metcalfe's law: the value of a network grows with the square of the number of its users, because each new user adds connections to all the existing ones.
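Stated as a formula, Metcalfe's law values a network of $n$ users by the number of possible pairwise connections among them:

```latex
V(n) \;\propto\; \binom{n}{2} \;=\; \frac{n(n-1)}{2} \;\approx\; \frac{n^2}{2}
```

So doubling the user base roughly quadruples the network's value, which is why audience size mattered so disproportionately to these businesses.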

One idea came from the Far East. In Japan the idea of the Unix bulletin board had morphed into the "textboard" for anonymous posting (no registration required). The first major anonymous textboard in Japan, Ayashii World, had been created in 1996 by Masayuki Shiba. In 1999 Hiroyuki Nishimura had launched the most influential textboard, 2channel. When graphics became commonplace on cheap computers, the textboard evolved into the imageboard, again starting in Japan, notably Futaba Channel (2001). The first English-language imageboard debuted on World2ch, originally created in early 2003 by a 16-year-old Japanese kid, code-named RIR7, as an English version of 2channel. Meanwhile, in 1999 Richard Kyanka had launched Something Awful, an online comedy platform, which soon added an anime subforum called ADTRW (Anime Death Tentacle Rape Whorehouse). In late 2003 a 15-year-old ADTRW user living in New York, Christopher Poole (code-named "m00t"), launched the 4chan imageboard platform, originally to discuss manga and anime, i.e. as a US version of Futaba Channel. 4chan quickly became the mother of all (English-language) imageboards. But, by then, the Internet was ready to move beyond imageboards.

Friendster had already been imitated by a few websites, notably tribe.net, founded in 2003 in San Francisco by Mark Pincus, Paul Martino and Valerie Syme.

The first truly successful social-networking site was MySpace, launched in 2003 in Los Angeles and purchased in July 2005 for $580 million by Rupert Murdoch's News Corp.

In February 2004 Harvard student Mark Zuckerberg launched the social-networking service Facebook. It soon spread from college to college. Weeks later Zuckerberg and friends relocated to Silicon Valley and obtained funding from Peter Thiel of PayPal. Somehow this one took off the way that previous ones had not. Facebook started growing at a ridiculously fast pace, having signed up 100 million users by August 2008 on its way to becoming the second website by traffic after Google by the end of the decade.

In 2005 Gina Bianchini and Netscape's founder Marc Andreessen launched Ning, a meta social-networking software: it allowed people to create and customize their own social networks.

Inktomi's founders Brian Totty and Paul Gauthier formed Ludic Labs in San Mateo in 2006, a venture devoted to social media software for consumers and businesses that launched offerfoundry.com, talkfilter.com and diddit.com.

Three factors contributed to the birth of "podcasting". First, the adoption in 1995 of the mp3 standard for audio files. Second, in 1999 Netscape launched RSS (Rich Site Summary, but mostly known as "Really Simple Syndication"), a system that allowed users to subscribe to the articles published on a blog. Developed by Dan Libby and Ramanathan Guha (a former A.I. scientist and Apple engineer), it incorporated Dave Winer's own system of syndication. Third, in 2001 Apple introduced the iPod, in theory a music player but de facto a general-purpose platform to play digital audio files. The iPod wasn't necessary, but it popularized the idea of the digital audio file that one could download to a personal device. Podcasting is "audioblogging" in the age of the mp3 and RSS: mp3 audio files distributed via RSS. It was pioneered by Kevin Marks in 2003 in Britain, and by serial Internet entrepreneur and reality-show producer Adam Curry in 2004 in the Netherlands. In 2005 Evan Williams (of Blogger fame) and Noah Glass launched the podcasting platform Odeo.
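The mechanics behind "mp3 files distributed via RSS" are simple: a podcast is an RSS feed whose items carry "enclosure" elements pointing at audio files, which a subscriber's client downloads automatically. A minimal, entirely hypothetical feed (the URLs and titles are invented), parsed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed of the kind a podcast publishes: each <item>
# carries an <enclosure> pointing at an mp3 file.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Audioblog</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="http://example.com/ep1.mp3" type="audio/mpeg" length="123"/>
    </item>
  </channel>
</rss>"""

# A "podcatcher" walks the items and collects the audio URLs to fetch.
root = ET.fromstring(feed)
for item in root.iter("item"):
    title = item.findtext("title")
    url = item.find("enclosure").get("url")
    print(title, "->", url)  # Episode 1 -> http://example.com/ep1.mp3
```

The subscription model is the whole trick: instead of visiting a site to check for new audio, the client polls the feed and new episodes arrive on their own.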

Last but not least, in 2006 Jack Dorsey created the social-networking service Twitter (initially just a component of Evan Williams' Odeo), where people could post short live updates of what was going on in their life. It soon became popular for current events the way CNN had become popular during the first Gulf War. Dorsey, a New York University dropout who had moved to California in 1999, sent the first tweet from @Jack in March 2006: "just setting up my twttr." In 2004 an MIT scientist, Tad Hirsch, had developed TXTmob, a platform that enabled individuals to send anonymous text messages to large groups of people. It had been designed for political activists, just like Indymedia, except that it was even more oriented towards the "mob". One of Twitter's developers was Evan Henshaw-Plath, who was precisely one of those political activists with a background in Indymedia and TXTmob, and he took TXTmob as the foundation for Twitter's technology.

The Unix (and in particular Linux) world had been the first example of a social networking platform. It was used to refine the platform itself. Facebook and the likes simply adopted the concept and transported it to the sphere of private life.

Facebook's sociological impact was colossal. For example, Facebook offered a "Like" button for people to applaud a friend's statement or picture, but did not offer a "Dislike" button. Facebook was creating a society in which it was not only rude but even physically impossible to be negative. The profile picture of the Facebook user was supposed to be a smiling face. The whole Facebook society was just one big collective smile. The Web's libertarian society was turning into a global exercise in faking happiness. After all, the French historian Alexis de Tocqueville had warned in 1840 (in his study "Democracy in America") that absolute freedom would make people lonely and desperate. In a sense, social networking universes like Facebook were testing the possibility of introducing a metalevel of behavioral control to limit the absolute freedom enabled by the Web.

Google, eBay, Facebook and Twitter shared one feature that made them such incredible success stories: simplicity. Initially, they all had a humble, text-only "look and feel" in the age of graphic design, banners, chat rooms, etc. All that Twitter had needed to change the world was 140 characters.

In Asia the most successful of these sites was still Friendster.

By this time instant messaging had become very popular, but there were countless different providers, each with its own protocol. Meebo, founded in 2005 in Mountain View by Sandy Jen, Elaine Wherry and Seth Sternberg, was an instant-messaging service that integrated all the most popular instant-messaging services, such as AIM, Windows Live Messenger, MySpaceIM, Google Talk, ICQ and Yahoo Messenger. (Meebo was acquired by Google in 2012.)

The story of "crowdfunding" began in New York: in 2001 Brian Camelio launched ArtistShare, the first music crowdfunding site. In 2005 Kiva, founded by Matt Flannery and Jessica Jackley in San Francisco, made waves with a new form of idealistic capitalism, "microlending" (typically for small businesses in third-world countries).

Your Life Online

In November 2005 a group of former Paypal employees, all still in their twenties, got together to launch a new website, YouTube: Steve Chen (a Taiwanese-born software engineer), Chad Hurley (an art designer) and Jawed Karim (a German-born Stanford student working part-time). Based in San Mateo, they were funded by Roelof Botha of Sequoia Capital, another PayPal alumnus. The concept sounded innocent enough: just a way for ordinary people with an ordinary digital videocamera to upload their videos to the Web. It turned out to be the perfect Internet video application. By July 2006 more than 65,000 new videos were being uploaded every day, and more than 100 million videos were viewed by users worldwide every day. In October 2006 Google bought YouTube for $1.65 billion.

YouTube did more than simply help people distribute their videos worldwide: it ushered in the age of "streaming" media. "Streaming" means watching a video or listening to a recording in real time directly from its Web location, as opposed to downloading it from the Web onto one's computer. YouTube's videos were "streamed" to the viewer's browser. YouTube did not invent streaming, but it demonstrated its power over cable television, movie theaters, and any previous form of broadcasting videos to the masses.
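The difference between the two models can be sketched abstractly: a streaming client consumes the media in small chunks as they arrive, so playback can begin almost immediately, whereas a downloader must wait for the entire file. A toy Python illustration (not a real media protocol; the chunk size is arbitrary):

```python
def stream(source: bytes, chunk_size: int = 4):
    """Yield the media in small chunks, as a streaming server would."""
    for i in range(0, len(source), chunk_size):
        yield source[i:i + chunk_size]

video = b"0123456789abcdef"

# Streaming: each chunk can be "played" the moment it arrives.
played = b"".join(chunk for chunk in stream(video))

# Downloading: nothing can be played until the whole file is present.
downloaded = video

assert played == downloaded  # same content, very different waiting time
```

For a multi-gigabyte film the distinction is decisive, which is why streaming, not downloading, became the mass-market model.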

Another idea that matured in the 2000s was Internet-based telephony. Skype was founded in Europe in 2003 by Niklas Zennstroem and Janus Friis to market a system invented by Kazaa's founders Ahti Heinla, Priit Kasesalu and Jaan Tallinn. Internet users were now able to make free phone calls to any other Internet user, as long as both parties had a microphone and a loudspeaker on their computer. The lesson learned in this case was that telephony over the Internet was a major innovation for ordinary consumers, not companies, but ordinary consumers could not afford suitable computers until the 2000s. Skype was not charging anything for the service, so, again, the business model was just to become very popular all over the world. Microsoft would purchase Skype in 2011, thus de facto ending the era of its MSN Messenger.

Another service that matured at this time was Internet-based music streaming. This had been pioneered by Listen.com, founded in 1999 in San Francisco by Silicon Graphics veteran Rob Reid, but became Rhapsody, charging a flat monthly fee for streaming music, with the 2001 acquisition of the streaming engine of TuneTo.com. In 2005 David Hyman launched MOG in Berkeley, and in 2006 Daniel Marhely launched Deezer in France.

Expanding the concept of the female-oriented portal iVillage, Indian-born former Apple scientist and NetObjects founder Samir Arora set up Glam Media in 2003 in Brisbane, near South San Francisco, staffing it with the old NetObjects team. They initially focused on the fashion/lifestyle website Glam.com targeting the female audience.

Yelp, founded in 2004 in San Francisco by "Paypal mafia" members Jeremy Stoppelman and Russel Simmons, joined Amazon and social media in letting customers recommend, judge and rate products and services, i.e. do the marketing that really matters. Yelp embodied the new philosophy of crowd-marketing that would come to be called "Likeonomics". Sites like Yelp would rapidly displace CRM as the most effective way to market a product/service.


The net economy was, however, recovering from the dotcom bust. For example, Amazon lost a staggering $2.8 billion between 1995 and 2001. Its first profit was posted at the end of 2001, and it was a mere $5 million. But in 2005 it posted revenues of $8.5 billion and a hefty profit, placing it inside the exclusive club of the "Fortune 500", and in 2006 its revenues would top $10.7 billion. In 2007 sales would increase a stunning 34.5% over the previous year. eBay's revenues for 2006 reached $6 billion. Netflix's revenues were up 48% from the previous year, just short of one billion dollars, and it had almost six million subscribers.

It took a while before the business world understood the benefits of selling online: it makes it easier to track customers' behavior and fine-tune marketing to attract more customers or more advertisers. The amount of data generated worldwide had been increasing exponentially for years, and those data were mostly ending up on the Internet. Enterprise software was ridiculously inadequate for dealing with that avalanche of data. A new kind of application, spearheaded by Splunk, launched by serial entrepreneurs Rob Das and Erik Swan in 2002 in San Francisco, filled that niche: analyzing customer behavior in real time and churning out business metrics.
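The kind of processing this niche involved can be sketched in a toy example; this is not Splunk's actual engine, and the log format below is invented. The idea is simply to scan raw machine logs and aggregate them into a business metric in a single pass:

```python
import collections
import re

# Invented web-server log lines: client, method, path, HTTP status.
logs = [
    "10.0.0.1 GET /checkout 200",
    "10.0.0.2 GET /checkout 500",
    "10.0.0.1 GET /home 200",
]

# One pass over the raw logs, counting responses by status code --
# e.g. to watch the rate of failed checkouts in real time.
status_counts = collections.Counter()
for line in logs:
    m = re.match(r"(\S+) (\S+) (\S+) (\d+)", line)
    if m:
        status_counts[m.group(4)] += 1

print(status_counts)  # counts of each status code, e.g. 200 -> 2, 500 -> 1
```

The real products ran this kind of extraction continuously over torrents of unstructured machine data, which is exactly what traditional enterprise databases, built for clean tabular records, handled poorly.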

Digital Entertainment

A lesson was creeping underneath the colossal amount of music downloaded both legally and illegally from the Internet. In 2003 the file-sharing system Rapidshare was founded in Germany, the file-sharing system TorrentSpy went live in the USA, and a BitTorrent-based website named "The Pirate Bay" opened in Sweden. In 2005 Megaupload was founded in Hong Kong. In 2006 Mediafire was founded in the USA. These websites allowed people to upload the music that they had ripped from CDs, and allowed the entire Internet population to download it for free. The "fraud" was so extensive that in 2006 the music industry in the USA (represented by the RIAA) filed a lawsuit against the Russian-based Internet download service AllOfMP3.com for $1.65 trillion. Needless to say, it proved impossible to stop half a billion people from using free services that were so easy to use. Music downloading became a pervasive phenomenon. Apple's iTunes store (opened in April 2003) was the legal way to go for those who were afraid of the law, and by the end of 2006 a hefty 48% of Apple's revenues was coming from sales of the iPod, one of the most successful devices in history. Digital videos came next, although the sheer size of the video files discouraged many from storing them on their home computers. The lesson to be learned was twofold. One lesson was for the media companies: it was virtually impossible to enforce copyrights on digital files. The other lesson was for the consumer: it was wishful thinking that one could digitize a huge library of songs and films, because they required just too much storage. A different system was needed (streaming).

The phenomenon of music digital download was a premonition of an important transformation in computing. From the viewpoint of the "downloader" the whole Web was becoming just one huge repository of music. Its geographical location was irrelevant. It was in the "cloud" created by multiple distributed servers around the world.

The situation was quite different in the field of books. In the late 1990s companies such as SoftBook Press and NuvoMedia had pioneered the concept of the e-book reader. Microsoft and Amazon had introduced software to read ebooks on personal computers (Amazon simply purchased the technology in 2005 from the French company Mobipocket, which had introduced it in 2000). That was at a time when there were virtually no ebooks to read. This changed in 2002 when two major publishers, Random House and HarperCollins, started selling digital versions of their titles. Amazon became and remained the main selling point for ebooks, but "ebookstores" began to appear elsewhere too, notably BooksOnBoard in Austin (Texas), which opened in 2006. In October 2004 Amazon had hired two former Apple and Palm executives, Gregg Zehr (hardware) and Thomas Ryan (software), who in turn hired mostly Apple and Palm engineers, and had started a company in Cupertino called Lab126 to develop a proprietary $400 hand-held e-book reader, the Kindle, eventually introduced in November 2007. The Kindle was not just a software application but a custom device for reading books. That device, conceptually a descendant of the Palm Pilot, was the device that tilted the balance towards the ebook.

The company that democratized video was based in San Francisco: Pure Digital Technologies, originally founded in 2001 by Jonathan Kaplan to make disposable digital cameras. In May 2006 it launched the Flip videocamera, sold in popular department stores at an affordable price. Designed for direct conversion to digital media and particularly for Internet video sharing, it helped countless unskilled users of the Internet to become amateur filmmakers. In just 18 months PDT sold 1.5 million of its one-button camcorders and became the leader of that market. The lesson to be learned here was that the smartphone was going to threaten the existence of entire lines of products, well beyond voice communication; but that would come later.

The Bay Area had a mixed record in the field of photography, with only the Flip gaining momentum, and only for a few years. Lytro, founded in 2006 by Stanford computational mathematician Ren Ng and based in Mountain View, aimed at designing more than a cheaper, better camera: it went for the light-field camera, a camera capable of capturing much more information and therefore of creating a much richer digital representation of the scene. The most obvious benefit was the ability to refocus an image after the picture had been taken. The technology had originally been invented at the MIT Media Lab in 1992 by John Wang and Edward Adelson, but it was Mark Levoy's team at Stanford University that perfected it and made it fit for the consumer market.

Woodman Labs, founded in 2003 in San Mateo by Nick Woodman and later renamed GoPro, would become one of the success stories of the 2010s. It introduced a wrist-worn camera, the GoPro Hero, that let users snap photos on 35mm color film. This rapidly evolved into the Digital Hero (2006) and finally into the HD HERO (2009), which offered audio and high-definition video.

In 2002 the United States government found five DRAM manufacturers guilty of manipulating the market: Samsung, Hynix (spun off by Hyundai in 2001), Infineon (spun off by Siemens in 1999), Elpida (a 2000 joint venture among NEC, Mitsubishi and Hitachi), and Micron (founded in Idaho in 1978 by former Mostek engineers to make DRAM chips). In reality the price of memory had been collapsing after the dotcom bust and, despite a spike in 2002, it continued to do so. In 1980 a gigabyte of memory cost several million dollars; by 2001 that price had plunged to about $1,000, and within a few years it would fall below $100, thus enabling high-resolution digital photography and video.

The Aging Internet

The dramatic success in the 2000s of the new business models of Netflix (videos), YouTube (videos), Apple (music), Facebook (news), Google (news) and Twitter (news) was beginning to bring out a fundamental problem of the Internet. All these services depended on the ability to distribute content over the Internet Protocol (IP). In other words, the Internet was increasingly being used as a media distribution network, i.e. to access data. Unfortunately, it had been designed to be a (host-to-host) communications network. The Internet was becoming simultaneously the world's greatest distribution network and one of the world's worst distribution networks. Nobody was proposing to dump the Internet yet, but it was obvious that the system needed to be tweaked. In particular, the router had to be reinvented. In 2006 Xerox PARC came up with Content Centric Networking (CCN), a project under the direction of Van Jacobson, a veteran of Cisco and the Lawrence Berkeley Laboratory. CCN was an idea already pioneered by Ted Nelson in 1979 and developed by David Cheriton at Stanford in 1999. It aimed at redesigning the Internet around data access.

An even more powerful kind of criticism was leveled at the Web: it did not contain enough "semantic" information about its own data. According to Danny Hillis of Connection Machine fame, it contained information, not knowledge, and what was needed now was a knowledge web.

In July 2005 he established Metaweb in San Francisco, a company that proceeded to develop what in March 2007 became Freebase, an open, free and collaborative knowledge base. For all practical purposes it worked just like Wikipedia, except that its output was a set of structured data or, better, "meta-data". In July 2010 Metaweb would be acquired by Google.

The program of rewriting the Internet as a safe and anonymous network was continued by MaidSafe, invented in 2006 in Britain by David Irvine. MAID (Massive Array of Independent Disks) SAFE (Secure Access For Everyone) removed the central servers and the central databases from the Internet, and instead added lots of encryption to protect the data. The goal was to build "a safe Internet". Irvine used concepts of volunteer computing to decentralize the Internet: the storage came from hard-disk space "donated" by volunteers on the Internet connected via peer-to-peer protocols. Data stored on MaidSafe were broken down into tiny chunks, heavily encrypted, and then randomly distributed around the world. Only the owner had the power to reassemble and decrypt these chunks. The transactions were not stored anywhere: literally no traces were left of any operation performed with MaidSafe. MaidSafe's network was based on SafeNet: a super-secure platform that aimed at "decentralizing" all the services available on the Internet, such as messaging, email, social networks, data storage, video conferencing, etc. SafeNet rewrote the Internet without any need for servers and databases. A user could log into any computer of the network and the computer would instantly become "her" computer, showing her data, her applications and her profile. When she logged out, no trace of her work was left behind.
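The chunk-and-encrypt scheme described above can be illustrated with a toy sketch (this is only an illustration of the general idea, not MaidSafe's actual "self-encryption" algorithm; the tiny chunk size, the hash-based key derivation and the XOR "cipher" are simplifications chosen for brevity and are not real cryptography):

```python
import hashlib

CHUNK_SIZE = 32  # toy size; a real system would use chunks closer to 1 MB

def split_encrypt(data):
    """Split 'data' into chunks, encrypt each chunk with a key derived
    from its plaintext, and name each encrypted chunk by the hash of its
    ciphertext, so the network stores anonymous, content-addressed blobs."""
    data_map = []   # (chunk name, key) pairs: kept only by the owner
    stored = {}     # name -> encrypted chunk: all the network ever sees
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).digest()              # toy key derivation
        enc = bytes(b ^ k for b, k in zip(chunk, key))    # toy XOR "cipher"
        name = hashlib.sha256(enc).hexdigest()            # content-addressed name
        stored[name] = enc
        data_map.append((name, key))
    return data_map, stored

def reassemble(data_map, stored):
    """Only the holder of the data map can fetch and decrypt the chunks."""
    return b"".join(
        bytes(b ^ k for b, k in zip(stored[name], key))
        for name, key in data_map
    )
```

The key property mimicked here is that the stored chunks reveal nothing useful on their own: reconstruction requires the owner's private "data map" of names and keys.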

Serving the Old Economy

The situation was quite different in the field of books. In the late 1990s companies such as SoftBook Press and NuvoMedia had pioneered the concept of the e-book reader. Microsoft and Amazon had introduced software to read ebooks on personal computers (Amazon simply purchased the technology in 2005 from the French company Mobipocket, which had introduced it in 2000). That was at a time when there were virtually no ebooks to read. This changed in 2002 when two major publishers, Random House and HarperCollins, started selling digital versions of their titles. Amazon became and remained the main selling point for ebooks, but "ebookstores" began to appear elsewhere too, notably BooksOnBoard in Austin (Texas), which opened in 2006. In October 2004 Amazon had hired two former Apple and Palm executives, Gregg Zehr (hardware) and Thomas Ryan (software), who in turn hired mostly Apple and Palm engineers, and had started a company in Cupertino called Lab126 to develop a proprietary $400 hand-held e-book reader, the Kindle, eventually introduced in November 2007. The Kindle was not just a software application but a custom device for reading books. That device, conceptually a descendant of the Palm Pilot, was the one that tilted the balance towards the ebook.

On the other hand, there was very little that Oracle had to learn from the dotcom revolution. In the 2000s Oracle represented an old-fashioned business model, the one that targeted "brick and mortar" companies. However, the Web had not slowed down the growth of software demand by the traditional companies that manufactured real products: it had increased it. They all needed to offer online shops, backed by the fastest and most reliable database servers. The escalating transaction volumes of e-business were good news for Oracle. Oracle was the undisputed leader in providing database management solutions, but these companies also demanded ERP systems and CRM systems. Oracle proceeded to acquire two Bay Area companies that had been successful in those fields: PeopleSoft (2004) and Siebel (2005). Oracle could now connect the plant of a company to its corner offices and even to its traveling salesmen. In 2005 the total revenues of ERP software were $25.5 billion, with SAP making $10.5 billion and Oracle $5.1 billion. Oracle's boss Ellison was estimated to be worth $18.7 billion in 2004, making him one of the richest people in the world.

A new generation of database management software, designed to improve data analytics by employing Teradata's "shared nothing" architecture, emerged with Vertica, founded in 2005 by Michael Stonebraker of Ingres fame (and acquired by HP in 2011); Greenplum, founded by Scott Yara and Luke Lonergan in 2003 in San Mateo (and acquired in 2010 by EMC), whose Bizgres shipped in 2005; and ParAccel (San Diego, 2005). At the same time, the database-management appliance pioneered by Teradata spawned a new generation, notably Boston's Netezza (founded in 1999 as Intelligent Data Engines by Foster Hinshaw and acquired by IBM in 2010), which launched its first "data warehouse appliance" in 2003.

High-density flash memory drives of the kind invented by M-Systems (acquired in 2006 by SanDisk) were improving and replacing hard drives in high-performance servers. The new star in this market was, however, not a Silicon Valley company but Fusion-io, founded by David Flynn in 2006 in Utah.

Robots and Avatars

Recovering from the dotcom crash, Silicon Valley was more awash in futuristic ideas than ever. In 2005 Yan-Tak "Andrew" Ng at Stanford launched the STAIR (Stanford Artificial Intelligence Robot) project to build robots for home and office automation by integrating decade-old research in several different fields. In 2006 early Google architect Scott Hassan founded Willow Garage to manufacture robots for domestic use.

Insert willow1.jpg

Willow Garage would spawn a new generation of robotic companies: Savioke, OSRF, Suitable Technologies, hiDof, Unbounded Robotics (later Fetch Robotics), Simbe, as well as Industrial Perception and Redwood Robotics (both later acquired by Google). Willow Garage's PR2 robot of 2010 (mostly designed by Eric Berger), as well as the Robot Operating System (ROS), laid the foundations for the robotic developments of the following decade. Willow Garage also contributed to the rise of open-source robotics via the Point Cloud Library (PCL) for three-dimensional perception, spun off by Willow Garage in 2011 and based on original work done by Radu Rusu in Germany before he joined Willow Garage; and via the Open Source Robotics Foundation, opened in 2012 in Mountain View by a Willow Garage team (Brian Gerkey, Tully Foote, Nate Koenig, Steffi Paepcke) with ROS' architect Morgan Quigley from Stanford. The Open Source Computer Vision Library (OpenCV) had been started at Intel in 1999 by Gary Bradski. Another influential startup was Meka Robotics, an MIT spin-off founded in 2006 by Aaron Edsinger and Jeff Weber but soon relocated to San Francisco (and acquired in 2013 by Google), which built robotic limbs providing a high degree of dexterity.

Insert picture of Morgan Quigley at the Open Source Robotics Foundation

The old school of industrial robots that preexisted Willow Garage was well represented by Precise Automation, founded in 2004 in Fremont by Brian Carlisle and Bruce Shimano (the founders of Adept).

The field of Artificial Intelligence came out of its long winter when in 2006 Geoffrey Hinton at the University of Toronto introduced a new technique for neural networks, "deep belief networks", thereby launching a new branch of Artificial Intelligence: "deep learning". Thanks to the contributions of scientists such as Yann LeCun and Yoshua Bengio (all of them on the East Coast), deep learning made dramatic progress in very few years. If one doesn't count the various search engines (namely Xerox PARC spinoff Outride, Stanford spinoff Kaltix, Brazil-based Akwan, and Australia-based Orion), in 2006 Google made its first move in the world of A.I. by acquiring Neven Vision, founded by the German scientist Hartmut Neven and specializing in face and object recognition. Google would go on to use it to tag images on the Internet, and in 2010 it would introduce a mobile application to search for images, Google Goggles.

The emphasis on virtual worlds had a positive effect on the USA videogame industry. After losing the leadership to Japan in the mid-1980s, the USA recovered it in the 2000s thanks to the fact that Japanese videogames were not as "immersive" as the ones made by their USA competitors. For example, the simulation game "The Sims", created by SimCity's creator Will Wright for Maxis in February 2000, had become the best-selling PC game of all time within two years of its release. With the exception of Nintendo, which successfully introduced the Wii home console in 2006, Japanese game manufacturers were losing market share for the first time ever. The Wii popularized hand-held motion-sensitive controllers, which led to a new generation of videogame consoles controlled by gestures and spoken commands.

However, the next big thing in videogames was online virtual worlds in which users created "second-life" avatars and interacted with each other. In 1999 Philip Rosedale had founded Linden Lab to develop virtual-reality hardware. In 2003 Linden Lab launched "Second Life", a virtual world accessible via the Internet in which a user could adopt a new identity and live a second life. Hotelli Kultakala (later renamed Habbo Hotel), launched in August 2000 by Aapo Kyrola and Sampo Karjalainen in Finland, and Go-Gaia (later renamed Gaia Online), launched in February 2003 by Derek Liu in San Jose, were among the pioneers. In 2003 Will Harvey (from Stanford) and Jeffrey Ventrella (from the MIT Media Lab) launched another virtual world, There.com, in San Mateo. One year later, in 2004, Eric Ries and Will Harvey founded IMVU in Mountain View, an early attempt to merge social networking and virtual reality. These worlds became extremely popular, involving millions of users spread all over the world. In February 2006 Alyssa Picariello even established a website to chronicle life in Gaia Online: the Gaiapedia.

Gamers were correct in claiming that the videogame community came first, as the Korean game "Baramue Nara" or "Baram" (1996) and "Ultima Online" (1997), developed by Electronic Arts' game designer Richard Garriott (who also coined the term MMORPG), predated Second Life. And massively multiplayer games such as "EVE Online", introduced in 2003 by Simon & Schuster Interactive (a MMORPG that could be used by tens of thousands of players at the same time), and "World of Warcraft" (WoW), launched in 2004 by Blizzard Entertainment (still the largest MMORPG in the world in 2015), basically transposed the concept of Second Life back to the gaming market.

Mobile Payment

The "electronic wallet" was another application of the smartphone. Nokia pioneered the combination of smartphone and RFID with the Nokia 5140 in 2004, the first GSM phone integrated with RFID reading capability. The Japanese used mainly Sony's FeliCa chip for their electronic wallets (by the end of 2009 Sony had shipped more than 400 million FeliCa chips). In 2004 Sony and Philips Semiconductors, the semiconductor arm of Philips (spun off in 2006 as NXP), developed Near Field Communication (NFC), and Nokia joined them in founding the NFC Forum. Like RFID, NFC was a wireless technology for short-range communications between electronic devices, but NFC chips allowed for two-way communication instead of only one way. The main advantage was that NFC was cheaper and easier to implement, and therefore likely to foster mobile payments via smartphone. In 2007 Nokia unveiled the first fully integrated NFC phone, the Nokia 6131 NFC, while Sony and NXP were still holding on to their proprietary standards, FeliCa and Mifare. At the same time, in 2004 Wal-Mart began forcing RFID on its suppliers, a fact that contributed to the boom of interest in RFID. This is when the Bay Area entered the fray: Blaze Mobile, founded by telecom veteran Michelle Fisher in 2005 in Berkeley, invented the NFC payment sticker in 2006.

Those were also the days when "fintech" became a reality. Wesabe, launched at the end of 2006 by Jason Knight and Marc Hedlund in San Francisco, was a web-based finance management tool, one of many web-based competitors of market leader Quicken, a crowded field that, in 2006, also included Nikhil Roy's Houston-based SpendView (later renamed Rudder) and Connecticut-based Geezeo. In 2007 Shashank Pandit and Ashwin Bharambe founded Buxfer in Mountain View. They all succumbed to Mint, a similar service launched in 2007 by Aaron Patzer in Mountain View and acquired by Intuit two years later.

Nonetheless, it was in Africa that mobile payment took off in earnest. Given the poor financial infrastructure but the broad availability of cellular phones, people spontaneously started "trading" airtime credit as a replacement for monetary transactions. In 2004 Mozambique's cellular-service provider Mcel institutionalized the practice by introducing a full-fledged airtime credit swapping service, which effectively allowed users to substitute airtime credit for cash in transactions throughout Mozambique. Using a similar principle, in 2007 Safaricom, de facto an East African subsidiary of the British multinational Vodafone, introduced a money transfer system in Kenya called M-Pesa through which mobile phone users could "store" money on their phone and "text" money to others (via traditional SMS technology). M-Pesa remained the largest mobile payment system in the world well into the 2010s. By 2012 Africa was awash in popular mobile payment services: EcoCash in Zimbabwe (2011), mKesh in Mozambique (2011), Easywallet in Nigeria (2012), as well as several M-Pesa spinoffs.


Wearables, while still a tiny "novelty" market, continued to make progress. In 2004 German chipmaker Infineon introduced a smart jacket with a mobile phone and an MP3 player. In 2003 Susumu Tachi at the University of Tokyo started working on "transparent clothes" using a trick called "retro-reflective projection" (thanks to a built-in camera in the back of the "cloak of invisibility"). In 2003 Burton introduced the Amp Jacket, a snowboarding jacket incorporating an iPod. In 2004 Los Angeles-based Vivometrics launched its smart shirt "LifeShirt". Adidas demonstrated its "self-adapting" shoes and North Face unveiled a "self-heating" jacket. In 2006 Eleksen (England) launched a wireless fabric keyboard. In 2006 Sensatex (Maryland), borrowing technology from Georgia Tech that had been funded by DARPA for the 21st Century Land Warrior program, debuted its SmartShirt, a shirt with fiber-optic wires seamlessly knit into the clothing that was even fully washable. This smart shirt monitored the wearer's movement, heart rate, and respiration rate in real time.

Engineering the Future

The Web was only a decade old but high-profile critics were already complaining that it was inadequate. Berners-Lee in person had written an article in 2001 explaining the need for a "Semantic Web" in which a webpage would be able to declare the meaning of its content. In 2004 the first "Web 2.0" conference was held in San Francisco to promote the idea that the Web had to become an open platform for application development, with such development increasingly decentralized and delegated to the users themselves. The term had originally been coined in 1999 by San Francisco-based writer Darcy DiNucci. In the beginning of the Web one could only be either a producer or a consumer of webpages: the user of a browser was a passive viewer of webpages. Web 2.0 aimed for "active" viewers of webpages. A Web 2.0 webpage is a collaborative effort in which the viewers of the page can modify it and can interact with each other. Wikipedia was an example of a Web 2.0 application; and so, indirectly, was Google's search, since it relied on a "page ranking" algorithm based on what was linked by millions of webpages all over the world. The first widely publicized example of Web 2.0 was Flickr, a photo-sharing service that allowed users to "tag" photos (both their own and other people's), founded by Caterina Fake and Stewart Butterfield in February 2004 in Vancouver. Yahoo was the first major dotcom to invest in Web 2.0: it acquired Flickr in March 2005, it introduced in June its own My Web service, which allowed webpage viewers to tag and share bookmarks, and then in December it bought the most popular website for social bookmarking and tagging, del.icio.us (originally started by Wall Street financial analyst Joshua Schachter in 2003). Basically, Yahoo wanted to present itself as a "social search" that was fine-tuned by humans as they browsed the web, as opposed to Google's impersonal algorithmic search.
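The "page ranking" idea can be illustrated with a toy sketch: a page is important if important pages link to it. (This is a simplified textbook version of the concept, not Google's production algorithm; the damping factor and the fixed iteration count are conventional illustrative choices.)

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy link-based ranking. 'links' maps each page to the list of
    pages it links to; the result maps each page to a score, and all
    scores sum to 1."""
    pages = set(links) | {t for ts in links.values() for t in ts}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page keeps a small base score, plus shares from its in-links
        new = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # a dangling page spreads its weight evenly over all pages
                for t in pages:
                    new[t] += damping * rank[page] / n
        rank = new
    return rank
```

On a tiny web such as `{"a": ["b"], "b": ["a", "c"], "c": ["a"]}`, the page with the most incoming weight ("a") ends up with the highest score.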
The technical underpinning of Web 2.0 consisted of free tools such as Ajax (Asynchronous JavaScript and XML), a concept invented in 2003 by Greg Aldridge in Indiana (but the term was coined only in 2005 by San Francisco-based writer Jesse James Garrett). Ajax was a platform for website developers to create interactive web-based applications (essentially HTML, XML and JavaScript). In a nutshell, the goal was simple: to allow the viewer to make changes to a webpage on a browser without reloading the whole page. This, obviously, had been done before: JavaScript, among other tools, had been available in Netscape's browser since 1996, and web-based applications were already pervasive during the first dotcom boom, although most of them went out of business quickly. Amazon had allowed users to post reviews of books since the very beginning. However, Web 2.0 had a more ambitious vision: that the Web could be viewed as a platform for creating applications, a platform that would eventually replace the individual computer. Web 2.0 was another example of a software revolution enabled by open-source software. The reason that so many startups emerged in this space was that the cost of starting one was minimal.

Meanwhile, in 2003 Matt Mullenweg in San Francisco introduced WordPress, a new platform for people to create their own website or blog, which quickly became popular. The reason it spread like wildfire was that it was maintained as "open source" by a growing community of volunteers.

In June 2003 Mark Fletcher, already the founder of ONElist (acquired by Yahoo in 2000), launched Bloglines, the first Web-based news aggregator. It was soon followed by Topix, launched in 2004 by Rick Skrenta of Open Directory fame.

Digg, founded in November 2004 by serial entrepreneur Jay Adelson in San Francisco, pioneered the idea of letting visitors vote stories up or down (i.e., "digging" or "burying"), thus bridging the world of news aggregators and social networking.

By 2012 Digg, initially a darling of the Internet, was worth very little, due both to a more complicated user interface and to the emergence of powerful competitors with very similar concepts for people to share news. San Francisco-based Reddit had been founded in Boston in June 2005 by Steve Huffman and Alexis Ohanian, two graduates of the University of Virginia, a few months after Digg. In 2012 its traffic passed Digg's, and Reddit kept growing. It was simply easier for people to submit a batch of stories quickly to Reddit than to Digg. Because of its own algorithm, Digg had become a sort of elitist club in which about 25% of the stories that went popular were submitted by the same few dozen people. Reddit was powered by a more democratic algorithm that, indirectly, distributed "glory" among all its users, including beginners. Digg then started posting "sponsored links" prominently, which definitely turned off its loyal users. Reddit, meanwhile, introduced "subreddits" (categories) in 2008. One particularly popular subreddit, dating from May 2009, was IAmA ("I Am A"), where users could post "AMAs" ("Ask Me Anything"), a way for users to interact. It was used by many celebrities.

TechCrunch was founded in June 2005 by Michael Arrington out of his home in Atherton to publish high-tech news and gossip about Internet startups.


Biotechnology was becoming mainstream and synthetic biology was the new frontier. In 2004 MIT scientist Drew Endy, who had just co-founded a public repository called "the Registry of Standard Biological Parts", organized the first synthetic biology conference and started the first company to commercialize synthetic biology, Codon Devices. In 2005 Endy published an influential article in Nature titled "Foundations for engineering biology". In 2006 Jay Keasling inaugurated the world's first Synthetic Biology department at the Lawrence Berkeley National Laboratory. U.C. San Francisco became a major center of biological research: in 2003 Christopher Voigt founded a lab to program cells like robots to perform complex tasks, and in 2005 the university opened an Institute for Human Genetics. In 2006 Voigt's team engineered a bacterium to target cancer cells in the human body. In 2007 Craig Venter's team in Maryland carried out a full-genome transplant: they transplanted the genome of a bacterium (Mycoplasma mycoides) into the cytoplasm of a different bacterium (Mycoplasma capricolum). The goal of synthetic biology was not clear, but the business behind it envisioned the possibility of building new living species (initially just bacteria) that would perform useful industrial or domestic functions, just like electronics had led to devices that performed useful industrial and domestic functions (the word "military" was carefully omitted). Synthetic biology was therefore not too interested in cloning existing species (why not simply use the existing species?); it was interested in modifying existing organisms to create organisms that do not exist in nature. Genetic engineering is about replacing one gene, whereas synthetic biology is about replacing entire genomes to generate "reprogrammed organisms" whose functions are different from the original ones (because the DNA instructions have changed).

A story that captured the attention of the media in April 2005 was the announcement that Keasling at Amyris had successfully converted yeast into a chemical factory by mixing bacterial, yeast, and wormwood genes. This "factory" was capable of turning simple sugar into artemisinic acid, the preliminary step to making an extremely expensive anti-malarial drug, artemisinin. Synthetic biology exploited the power of microbes to catalyze a sequence of biological reactions that transform one chemical compound into another. Artemisinin was commonly extracted from a plant, but science was now able to manufacture the anti-malarial drug in the laboratory, pretty much at will. (The world would, however, have to wait until 2013 for the French multinational Sanofi to introduce a commercial pill, designed by Keasling's Amyris.) The goal of synthetic biology was now to create "designer microbes" by selecting genes based on which protein they encode and the path they follow. Some day synthetic biology could even replace industrial chemistry (which relies on a sequence of chemical reactions to manufacture materials).

Venter's saga continued. After disagreements with Celera's main investor Tony White, in January 2002 Venter left Celera, taking Hamilton Smith with him. In 2003 they synthesized the genome of a virus (just eleven genes) which, unlike the artificial polio virus at Stony Brook, truly behaved like a virus. With a keen sixth sense for money and publicity, in September 2004 Craig Venter started his own non-profit institute in both Maryland and California (San Diego) to conduct research in synthetic biology and biofuels. In particular, they worked on building the genome of a bacterium from scratch and on inserting the genome of one bacterium into another. Bacteria are the simplest living organisms, made of just one cell.

The first international conference of Synthetic Biology was held in 2004 at MIT.

Bioinformatics continued to thrive. Two former Silicon Genetics executives, Saeid Akhtari and Ilya Kupershmidt, started NextBio in Cupertino in 2004 to create a platform to perform data mining on public and private genomic data.

In 2003 the biochemist Joseph DeRisi at UCSF used a microarray (a gene chip) to identify the coronavirus causing SARS (Severe Acute Respiratory Syndrome), a method that inaugurated a new way to identify pathogens (by analyzing DNA).

In 2004 the first commercial microarrays with the whole human genome became available from Affymetrix (whose GeneChip was still dominating the market for microarrays), Agilent (that still relied on the technique based on inkjet printing), Applied Biosystems and Illumina (all based in California, the first three in the Bay Area). Technically speaking, the first company to offer a whole-human-genome microarray was probably Wisconsin-based NimbleGen Systems in 2003. Then the battle began for lower prices and better "annotation" of the genes (in 2009 Arrayit, founded in 1993 as TeleChem International in Sunnyvale by Rene Schena and Todd Martinsky, would introduce the H25K).

US citizens were spending billions of dollars on blood tests. Theranos, founded in 2003 in Palo Alto by Stanford dropout Elizabeth Holmes, offered a cheaper, simpler and faster way to test blood: no needle, just a few drops of blood. The company would skyrocket to stardom and Holmes would be known as the world's youngest self-made female billionaire, while boasting one of the weirdest boards of directors ever seen in Silicon Valley, which at one point would include two former secretaries of state (Henry Kissinger and George Shultz), two former senators (Sam Nunn and Bill Frist), a former secretary of defense (William Perry) and even a retired admiral. This would become one of the biggest scandals in the history of Silicon Valley when in 2015 an article in the Wall Street Journal would reveal that it was all a hoax.

In 1998 James Thomson at the University of Wisconsin had isolated human embryonic stem cells. This discovery made it possible for scientists to generate all the building blocks of our body in a laboratory, because stem cells are the cells that can become any other cell in the body. Scientists could see a near future in which medicine would be able to grow replacement tissues (for example, to replace burned skin) and body organs. Regenerative medicine suddenly became a reality. Several companies were born all over the world: Cellectis (France, 1999), Mesoblast (Australia, 2004), Capricor Therapeutics (Los Angeles, 2005), Pharmicell (Germany, 2006), etc. In 2004 the state of California launched the California Institute for Regenerative Medicine. Research in the USA was slowed down by ethical considerations until 2007, when Shinya Yamanaka at Kyoto University in Japan created embryonic-like stem cells ("induced pluripotent stem cells") in his laboratory (his technology was immediately licensed by Cellectis).


Nanotechnology was still a mystery. While returns were low and nano start-ups routinely switched to more traditional manufacturing processes, during both 2006 and 2007 venture-capital firms invested more than $700 million in nanotechnology start-ups.

A promising avenue was to wed "nano" and "green", a mission particularly nurtured in Berkeley. NanoSolar engineers formed the core of Solexant, founded in 2006 in San Jose by Indian-born chemist Damoder Reddy and by Paul Alivisatos, professor of Chemistry and Materials Science at U.C. Berkeley and director (since 2009) of the Lawrence Berkeley National Laboratory, to manufacture printable thin-film "quantum dot" photovoltaic cells using a technology developed at the LBNL. This was held to be the next generation of solar technology: flexible, low-cost and high-yield.

Meanwhile, Michael Crommie, a scientist at the Materials Sciences Division at the Lawrence Berkeley National Laboratory and a professor of Physics at U.C. Berkeley, was working on solar cells the size of a single molecule.

The Canadian nanotechnology specialist Ted Sargent of the University of Toronto developed a "quantum film" capable of a light-capturing efficiency of 90%, as opposed to 25% for the CMOS image sensors employed in digital cameras. In October 2006 he founded InVisage in Menlo Park to make quantum film for camera phones.

The biggest news in the world of nanotech came from England: in 2004 Andre Geim and Konstantin Novoselov at the University of Manchester isolated graphene, a one-atom-thick layer of pure carbon that is both the lightest and the strongest material known (200 times stronger than steel), as well as the best conductor of heat at room temperature and the best known conductor of electricity (capable of carrying electricity at a speed of 1 million meters per second).


Skyrocketing oil prices and concerns about climate change opened a whole new range of opportunities for environmentally friendly energy generation, nicknamed "greentech" or "cleantech". Of the traditional kinds of renewable energy (wind power, solar power, biomass, hydropower, biofuels) solar and biofuel emerged as the most promising. At the same time, the USA started investing in fuel-cell companies in 2005 with the goal of fostering commercial fuel-cell vehicles by 2020. By 2008 it had spent $1 billion. California embarked on a project to set up a chain of stations to refuel hydrogen-driven vehicles (despite the fact that the state had only 179 fuel-cell vehicles in service in 2007).

Silicon Valley entrepreneurs and investors delved into projects to produce clean, reliable and affordable energy. One startup that focused on renewable fuels was LS9, founded in 2005 in South San Francisco by Harvard professor George Church and by Chris Somerville, the director of Berkeley's Energy Biosciences Institute, to create alkanes (a constituent of gasoline) from sugar; it was financed by Vinod Khosla and Boston-based Flagship Ventures.

After selling their e-book company NuvoMedia in 2000, Martin Eberhard and Marc Tarpenning founded Tesla in Palo Alto in 2003 to build electric cars, and in 2006 they introduced the Tesla Roadster, the first production automobile to use lithium-ion battery cells. In 2004 SUN's co-founder Vinod Khosla, who had joined the venture capital firm Kleiner Perkins Caufield & Byers, founded Khosla Ventures to invest in green-technology companies. One year later another SUN co-founder, Bill Joy, replaced him at Kleiner Perkins Caufield & Byers to invest in green technology.

Another legendary "serial entrepreneur" of Silicon Valley, Marc Porat of General Magic fame, turned to building materials for the "green" economy, founding three startups to develop materials that reduce energy consumption and carbon emissions: Serious Materials (2002) in Sunnyvale for eco-friendly building materials; CalStar Cement (2007), a spin-off of the University of Missouri based in the East Bay (Newark) to manufacture eco-friendly bricks; and Zeta Communities (2007) in San Francisco for pre-assembled homes that operate at net-zero energy.

Meanwhile, U.C. Berkeley and the Lawrence Berkeley National Laboratory launched a joint "Helios Project" for artificial photosynthesis, i.e. to convert sunlight into fuel.

Sebastian Thrun at Stanford built the robotic car that in 2005 won a "race" sponsored by the Pentagon in a California desert. Thrun was then hired by Google to work on autonomous vehicles that, in following years, would be seen driving over California highways with only one person in the car: the passenger.

In 2006 Elon Musk (of Paypal, Tesla and SpaceX fame) co-founded with his cousins Peter and Lyndon Rive what would become California's main provider of residential solar power: Solar City, based in San Mateo.

Solar City, Clearview Way, San Mateo

Culture and Society

The arts mirrored progress in the high-tech industry. The 2000s were the decade of interactive digital art, practiced by the likes of Camille Utterback, David Small and Ken Goldberg. Andy Cunningham and Beau Takahara formed Zero1 in San Jose to promote the marriage of art and technology. In 2004 Michael Sturtz, who in 1999 had founded the alternative art school The Crucible in Berkeley, began staging spectacular fire-themed shows, including operas and ballets. In 2005 the Letterman Digital Arts Center opened in San Francisco to house Lucasfilm's lab. The first Zer01 Festival for "art and technology in the digital age" was held in San Jose in 2006, sponsored by San Jose State University's CADRE. Stephanie Syjuco's counterfeit sculptures, Lee Walton's web happenings, and Amy Balkin's ecological projects referenced the issues of the era. In 2006 San Mateo held the first "Maker Faire". In 2006 Josette Melchor and Peter Hirshberg formed the Gray Area Art Foundation. In 2000 Fecalface.com was launched to support the alternative art scene (later also a physical gallery, the Fecal Face Dot Gallery). The Adobe Books Backroom Gallery, another epicenter of new art, opened in 2001. The Mission School's mission of mural paintings and found-object sculpture was being continued by Andrew Schoultz and Sirron Norris, with Dave Warnke focusing on stickers and hand-painted posters, Sandro "Misk" Tchikovani specializing in three-dimensional letters, and Damon Soule exploring mixed media on found wood.

It was widely believed (although poorly documented) that San Francisco boasted the highest per capita consumption of books. In 1999 Jack Boulware and Jane Ganahl started a literary festival called Litstock that later mutated into Litquake.

Hacker parties had always been popular in Silicon Valley but during the 2000s they reached new heights, both in terms of size and enthusiasm. In May 2005 a group of high-tech geeks convened at the Hillsborough house of David Weekly, a Stanford graduate who was working on his startup (later incorporated as PBwiki). That was the first "SuperHappyDevHouse", a concept that soon became popular in Silicon Valley: a casual meeting in a casual environment of creative engineers to work in the same building on their pet projects. Unlike the many networking events, the goal was not necessarily to publicize one's idea nor to meet other people: it was to go home having written some actual software, or at least having come up with ideas for some software. Unlike hacker competitions, it was not about showing one's dexterity at coding. And, unlike the raves of San Francisco, it was not a wild party of drinking and drugs; quite the opposite, in fact. It was a way to create a more stimulating environment than the cubicles of an office, more similar, in fact, to the dormitory of a university campus. (The idea would spread internationally within a few years.) The ambition was to emulate the success of the Homebrew Computer Club of the 1970s, although the similarity was mostly superficial.

In 2002 Bram Cohen and Len Sassaman founded the CodeCon conference for hackers at the DNA Lounge in San Francisco. This would become an annual event and spread all over the world.

The Bay Area, already home to the Search for Extraterrestrial Intelligence Institute (or SETI Institute), became the birthplace of another project aimed at discovering life in the universe, one that yet again resurrected the social-utopian spirit of the Bay Area: SETI@Home, launched in 1999 by U.C. Berkeley astronomers. It harnessed the power of millions of home computers, provided by volunteers and distributed around the globe (5 million in 2014), to try to detect radio transmissions from extraterrestrial civilizations picked up by a telescope based in Puerto Rico.

More importantly, this was the first major example of "volunteer computing", in which the "crowd" provided more computational power than any supercomputer could. In 2000 Vijay Pande at Stanford launched Folding@Home, a volunteer-computing project for research on how proteins "fold". Most of these volunteer-computing projects used the Berkeley Open Infrastructure for Network Computing (BOINC), developed since 2002 by David Anderson for SETI@Home.
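The principle behind these projects can be illustrated with a small sketch: a server splits a large job into independent "work units", volunteer machines each compute one, and the server aggregates the partial results. The code below is a toy simulation of that model (the function names are illustrative, not BOINC's actual API).

```python
# Toy sketch of the volunteer-computing ("work unit") model used by
# projects such as SETI@Home and Folding@Home: split a big job into
# small independent chunks, farm them out, and aggregate the results.
# All names here are illustrative, not part of any real BOINC API.

from concurrent.futures import ThreadPoolExecutor

def make_work_units(signal, chunk_size):
    """Split a long data stream into independent chunks (work units)."""
    return [signal[i:i + chunk_size] for i in range(0, len(signal), chunk_size)]

def volunteer_compute(unit):
    """What each volunteer machine does with its chunk; here, just a sum."""
    return sum(unit)

signal = list(range(1000))            # stand-in for telescope data
units = make_work_units(signal, 100)  # 10 independent work units

# Simulate many volunteer machines processing work units in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = list(pool.map(volunteer_compute, units))

total = sum(partial_results)  # the server aggregates the partial results
print(total)                  # same answer as processing the whole signal at once
```

Because the work units are independent, adding more volunteers scales the computation almost linearly, which is why millions of home computers could outperform a single supercomputer.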

In 2000 the computer scientist Bill Joy, one of SUN's founders, had written an article titled "Why the Future Doesn't Need Us", warning against the threats posed by robotics, genetics and nanotech. Meanwhile, Jaron Lanier was ranting against the "digital Maoism" of Silicon Valley. The very founders of the high-tech world started a debate about the pros and cons of the new technology.

Anthropology of Pan-ethnic Materialism

The cultural diversity of the Bay Area continued to debilitate religious certainties. A person's loyalty to her/his religious group was undermined by the proximity of so many other religious groups (in the workplace, at shared homes, in sport activities). This led to an increasingly high degree of flexibility in choosing one's faith. The new-age movement, with its syncretic, non-dogmatic view of spirituality, had left its influence on the region, even though its message was now being interpreted in a more materialistic manner: for many people, religion was to be shaped by how one wanted to behave. For example, greed and promiscuity were definitely "in" for the vast majority of independently religious people. Religious axioms that constrained one's lifestyle were not particularly popular; religious practices that were perceived as beneficial to one's mind and body were. Hence the popularity of Zen retreats and yoga classes even among people who did not believe in Buddhism.

Santa Clara Valley had traditionally been a Catholic region. It had become a unique experiment within the Catholic world: a Catholic region with sizeable minorities of other religious groups that were not poor, segregated immigrants (as they were in Italy or France) but lived on an equal footing with the original Catholic families. Both the percentage and the level of integration were unique among Catholic regions.

The time and attention usually devoted to religious functions were transferred to the high-tech world. The public rituals of religion were replaced by the public rituals of lectures and high-tech symposia. Mass in a church was replaced by a business or technology forum.

The technology being produced downplayed the cultural differences. People tended to recognize themselves more strongly as workers of a company than as members of a religious or ethnic group.


Those who had predicted the demise of Silicon Valley had completely missed the point. In 2005 Silicon Valley accounted for 14% of the world's venture capital. San Jose's population of 912,332 had just surpassed San Francisco's, making San Jose the tenth largest city in the USA. The Bay Area as a whole was the largest high-tech center in the world, with 386,000 high-tech jobs in 2006.

(Copyright © 2010 Piero Scaruffi)
