Humankind 2.0
a book in progress...
Meditations on the future of technology and society...
...to be published in China in 2016
These are raw notes taken during and after conversations between piero scaruffi and Jinxia Niu of Shezhang Magazine (Hangzhou, China). Jinxia will publish the full interviews in Chinese in her magazine. I thought of posting on my website the English notes that, while incomplete, contain most of the ideas that we discussed.
(Copyright © 2016 Piero Scaruffi)
Social Media and Sharing Economy: History, Trends and Future (See also the slide presentation)
Narnia: According to a 2015 study by the Pew Research Center 65% of adults in the USA now use social networking sites. The number is probably very similar in Europe and China. So there are probably 2.5 billion social-networking users in the world. The Internet has changed the way we socialize: for better or for worse?
piero:
Second, their popularity is due to a form of addiction. Alcohol and drug
addictions are not the only forms of addiction.
Social media have discovered a new form of
addiction, which is a mix of gossip addiction, vanity addiction and
voyeur addiction. So the story of social networking media is really a story
of addictions, not of socializing; and then this addiction gets "monetized" by selling advertising space.
Social media have been a big disappointment for the sociologists and activists
who were hoping that social media would connect the whole world and generate
worldwide collaboration to solve the problems of the world. In reality,
social media are mostly used for two things: 1. as a form of entertainment,
and 2. as a marketing tool to advertise products (and 2. is really a consequence
of 1.)
As far as politics go, social media have contributed more to the rise of
extremism (from radical Islam to right-wing movements in Europe) than to
good causes.
I don't think that social media have done much for technological and
scientific innovation. The vast majority of technological and scientific
exchanges still happen at conferences. Most scientists are not even on
Facebook.
There are books that discuss how social media have made us less
social. People spend a lot more time on Facebook or WeChat than on personal
interactions with their friends, neighbors, colleagues, and even family.
There are also books that show how important social media are
for marketing. In fact, if you ask me about the future of social media,
you are basically asking me about the future of marketing.
For example, i don't know if Facebook's M, a virtual assistant introduced in 2015, improves
anybody's social life but i know that it improves sales:
it looks like
Apple's Siri, Google Now and Microsoft's Cortana, but the difference is that
it is part of a "social network": it knows more about you and it knows more
about everybody else around you.
Note that M is not autonomous: there are actual "customer support"
people physically present at Facebook's offices answering the requests from users.
The industry of social networking has become
the science of making people want things (mostly things that people don't
need and would not normally want to buy).
Business used to be about making things that people want, and the industrial
revolution turned this business into a science. The radio and
television created a new kind of business, the advertising business, that
is about making people want things. Social networking represents the
equivalent of the industrial revolution for advertising. Today,
instead of advertising products, we "productise" adverts.
In the end, social media are an addictive habit that has become a great
advertising platform for goods and therefore makes people spend money.
Until 1998 the tobacco industry was deliberately making people addicted
to cigarettes in order to make money. The method has changed but it is
essentially the same: social media make people addicted so that they
can make money out of their addiction.
Ten years ago we were hoping that social media would provide better
information than the mainstream media. We have been bitterly disappointed.
Most "information" on social media is just "gossip", but gossip that "goes
viral" in a few minutes. The quality of information has declined, not
increased. The Internet has killed many of the good newspapers and magazines
of the world, and they have not been replaced by comparable online blogs.
My favorite example is Wikipedia, that was born out of an
idealistic aspiration, but has become a danger for civilization. This free encyclopedia
has killed the traditional printed encyclopedias, which means that very soon
you will not be able to doublecheck what Wikipedia says. If Wikipedia says
that Tokyo is the capital of China, all the people who live outside Japan and
China will think that Tokyo is the capital of China because there will be no
other encyclopedia to compare. Secondly, now that Wikipedia has become the
#1 source of information the powerful entities of the world are competing
to control it. Most articles on Wikipedia are now edited by government agencies
(that want to promote their view of the facts), by corporations (that want
to promote their business), by celebrities (who want to promote their image)
and by special-interest groups (that want to promote special interests like
their religion or their political views). Ten years ago we were discussing
whether Wikipedia was more or less accurate than the best printed encyclopedias.
The issue turned out to be different: the real issue was and is whether
Wikipedia can be manipulated more or less easily compared with
the printed encyclopedias, and the answer is obviously: "much much much more easily".
We hoped that social media would provide an ocean of independent information:
instead we got an ocean of professional agencies that specialize in distributing and controlling
information on behalf of the rich and the powerful.
Modern western culture was born with the invention of the
encyclopedia in France by the likes of Diderot and Voltaire. The Internet
replaced it with an anonymous Wikipedia that is a colossal repository
of grotesque dis-information but is the #1 source of information for most
people. The crowd has not exactly created a better culture.
Ten years ago we were hoping
that social media would "democratize" education, but
there is very little education on Facebook and WeChat, and, in fact,
these social media are distractions that hamper education.
The "attention span" has been further eroded by the mass use of social media.
For example, i now have
5,000 Facebook "friends", but this means that these days i miss the notifications
of many birthday events, including the ones that i really care about.
In 2015 Forrester estimated that only 2% of Facebook posts are read by your
"friends", whereas the rate is 90% for email; but email is dying.
The greatest minds of my generation are dying unheard because young people
don't go to their talks/lectures, and, if they go, they spend all the time
on Facebook or WeChat.
The best engineers work day and night
to figure out ways to make you click on ads.
Carlo Sequin at UC Berkeley told me that most students used a 3D printer to print a flat surface (a 2D surface).
Obviously something is wrong with those students.
To judge the value of a social networking product we routinely employ
"vanity metrics": we measure not what matters but what "flatters".
Ten years ago we were hoping the social networks would create the "global
village", a better and larger community. Instead the social networks are
becoming "asocial" networks.
A social network is a place where you don't know whether people truly exist.
The social networks are anarchic lands roamed by hordes of trolls, bullies, phreaks, spammers, and robots. We have invented a whole new vocabulary for these
"societies".
The "trolls" plant inflammatory speech in discussions,
the "bullies" harass users, the "phreaks" hijack accounts,
the "spammers" bombard you with publicity,
and the "robots" steal your privacy.
These days social networking is not about building a community but
about destroying the existing physical communities.
I am worried that it will get worse.
What happens when there is no physical interaction? Look at what happens when
people become drivers.
People in cars tend to be more aggressive and hateful.
The nicest, kindest, most peaceful citizens can become really angry at
another driver when they are driving a car. The separation created by
the car is enough to turn a gentle person into a ferocious person.
Will that also happen to people separated by social media?
We also have to face the fact that the #1 attraction on the Internet is pornography.
Sociologists used to think that the obsession with sex was due to the restrictions on sex life imposed by
the traditional lifestyle. The obsession with sex has not declined, it has increased, even if those restrictions
have been lifted in the lifestyle of the young single professionals.
The statistics on today's porn use are staggering.
In 2013 the Huffington Post published a study according to which
"Porn sites get more visitors each month than Netflix, Amazon and Twitter combined".
In 2015 Pornhub claimed that it received 21.2 billion visits and streamed 75 gigabytes of data per second.
Gail Dines, a professor of sociology in Boston, wrote "Pornland: How Porn Has Hijacked Our Sexuality" (2010).
Paul Wright and Ashley Kraus of Indiana University have published a study titled
"A Meta-Analysis of Pornography Consumption and Actual Acts of Sexual Aggression in General Population Studies" (2015).
Both show the damage caused to individuals and society by the addiction to pornography, which has multiplied
thanks to social media.
But we have to be careful not to blame the Internet for everything that is happening. Mostly, it was
already happening. When people get wealthier, they tend to like more privacy. They like to live in an isolated
house and drive their own car. Poor people live in crowded buildings and take the bus. The trend towards
a more selfish lifestyle, more superficial friendships, and, yes, pornography was there before the boom of the Internet.
The Internet is just a tool that shows us what people want.
Narnia: Can you recommend some studies about the effects of social media on... social life?
piero:
This discipline remained outside the mainstream of neuroscience until a study that was published in China in 2012
about Chinese teenagers titled
"Abnormal White Matter Integrity in Adolescents with Internet Addiction Disorder".
According to this study, and other studies published in the following years,
Internet addiction seems to cause brain changes that are similar to the ones
found in the brains of alcoholics and drug addicts.
In 2014 a famous British scientist, Susan Greenfield, published a book titled "Mind Change" that warned against the danger of creating a whole new mind;
not creating intelligent machines but creating stupid people.
A 2015 study by Susan Snyder from the University of North Carolina at Chapel Hill showed that almost 50% of US
students were addicted to the Internet, and that many young Internet addicts suffered from mental health problems
such as depression, insomnia, attention-deficit disorder, even suicidal tendencies and alcoholism.
Joseph Reagle's "Reading the Comments" (2015) argues
that Internet addiction even reduces people's ability to empathize with others.
It is particularly worrying that social media indirectly help
anonymous behavior. When people feel that their actions cannot be tracked
back to them, people always tend to do things that normally they wouldn't do,
and not only illegal things. For example, "bullying" is much easier when nobody
knows that you are the one doing it. Same for gossiping: people used to
send anonymous letters, not signed letters, to expose someone's private life.
The fact that it is so easy to be "anonymous" on the Internet brings out new
forms of cruelty. We tend to forget that torture is one of the traits that
have been common to all eras and all civilizations, from prehistory to today,
from Europe to China. There is no civilization, no region of the world, no era,
when torture was not practiced. We have to teach children not to torture:
we are born with the impulse to torture things, pets and even other children.
So it is not surprising that people "torture" other people on the Internet.
Sherry Turkle's "Reclaiming Conversation" (2015) is one of the many books
that lament the death of "quality time".
Robert Tokunaga's paper "Perspectives on Internet addiction, problematic Internet use" (2015)
is a good summary.
This book that we are writing is about Humankind 2.0: a humankind enhanced by the technologies that it is
inventing. I hope that my next book will not be "Humankind 3.0" about the decline of humanity.
It is important to remember that social media could be very useful to solve real problems: they can mobilize thousands of people
for good causes. For example, every time there is a natural disaster someone creates a Facebook group to collect
money and help the victims.
Nonetheless, we have to face the unpleasant reality: today's social media exist because
they are addictive, otherwise they would die. The lifespan of a social networking platform is directly
proportional to how addictive it is. If it is addictive, it will go "viral"
and its users will use it many times a day, which means that advertisers will pay to advertise on it,
which means that the platform will survive. If it is not addictive enough, it will be killed by the competition.
So we have created an entire industry that is simply looking for ways to make you "addicted"
instead of looking for solutions to the problems of the world.
There are thousands of young brilliant researchers and entrepreneurs in the
world who spend all their time trying to invent a new "addiction".
Stanford has a laboratory called Persuasive Technology Lab, founded by BJ Fogg,
that studies how computers can create new habits in people.
You need to have a sense of humor.
Social networking has been a failed social experiment.
But a failed experiment that generates billions of dollars of revenues.
It is fascinating that in 2016 i saw the statistics about the exponential
growth of social media at the same time that i saw the statistics on
suicide published by the CDC (Centers for Disease Control and Prevention):
the number of suicides in the USA has been rising since 1999 in every age
group and for both sexes.
The rate of suicide has increased from 10.5 per 100,000 in 1999 to 13 per 100,000 in 2014.
The coincidence is interesting:
1999 is the year that Silicon Valley launched the first social network, Steamtunnels.
Narnia: So we are not a social animal after all?
piero:
Now let's look at the age of the Internet. The great tool
introduced by the World-wide Web in 1991 was the web-browser.
That revolutionized how we access information (Google, Wikipedia, etc)
and interact with people (Facebook, Twitter, etc):
we turned information and communication into webpages, and we used browsers
to play with those webpages.
One of the most important inventions of the last 20 years was the "tab" of
the browser. We tend to forget the inventions that truly changed our lives.
In 1998 a 25-year-old software engineer named Adam Stiles introduced a new web-browser, NetCaptor. When Mozilla launched its very influential browser
(first released in 2002 as Phoenix and later renamed Firefox), it had "tabs" like NetCaptor to open multiple webpages at the same
time. That feature became a big success and every browser that exists today
has multiple tabs: you open one webpage in a tab, then open another webpage
in another tab, etc.
It was shocking how quickly people started using and abusing the multiple tabs of their browser.
Sometimes i have more than ten pages open in my browser.
The "tab" changed the way we experience the Web: browsing became a
"multi-tasking" process, like flipping through multiple books at the same time
or flipping through multiple tv channels.
Why do humans like to multi-task?
Which other animal wants to do multiple things at the same time?
We are the only multitasking animal on the planet.
We are not good at most of the things we do, but we do many things at the same time.
I'd say that humans are curious animals, animals who want to know and do as much
as possible.
We were "social" when it was convenient to be part of a group: we needed others
to feel safe; we needed others to help us; we needed others to take care of us
when we get old; etc.
But now that the police protect us, that we can buy what we need, and that
the state takes care of us when we get old, etc, we are becoming a lot less social. We still socialize but in a superficial manner, while we are doing many
other things, and we do it more and more often in a distant manner, over the Internet, not in person.
People like to be alone, undisturbed,
and instead of socializing in person they like to use their devices to multi-task.
We are discovering that we humans are not social animals but
curious multi-tasking animals.
Narnia: Then there's also the problem of privacy...
piero:
As a consequence, since 2013 the "dark nets" have become more popular with Internet users who want
to remain anonymous and escape network surveillance. We now have
the TOR browser (introduced in 2008),
the DuckDuckGo search engine (2008) and the Wickr instant messenger (2012).
Bitcoin was just one network in a long genealogy of "dark nets".
In October 2014 Facebook added support for the TOR browser and in April 2016
Facebook announced that every month more than one million people
use TOR to access Facebook.
Narnia: Nothing good came out of social networking?
piero:
The other democratizing feature is that the magic moment of social media
is when something "goes viral"; but there is no science to explain what
"goes viral" and no algorithm to predict it. The best painting of all time
might be seen by only 10 people whereas the drawing of a child may be
shared over and over and over again and become a world sensation.
The "going viral" still eludes the complex algorithms of modern marketing.
It is difficult to defend social media. I think they just represent a
transition from a social world to a post-social world. So it is more
interesting to discuss what socializing will be in the post-social world.
It sounds like a contradiction in terms, but the word "social" has different
meanings before and after the Internet. I don't know how to express it in
words but i can point at some movements that are "social" in a new way.
The first one is the Makers Movement. In theory, the "makers" are just people
who make things. In practice, they have always felt the need to create a
sense of community around their experiments. When you "make" physical objects
(as opposed to just writing texts or posting pictures), you want to share
your experience, you want to learn from others and you want to teach others.
It is natural for a maker to interact with other makers.
It is not something new. The Bay Area has always been famous for the Do-It-Yourself (DIY) culture. For example, the evolution of personal computers was largely determined by communities of DIY kids like the ones who formed the Homebrew Computer Club from which Apple was born. Today there is also a growing DIY movement
in biotech. The Makers Movement is interesting because it marks a return to
simpler objects: not computers and not DNA, but just regular objects made of
wood, metal, plastic, etc. The Maker Faire, that started in 2006, is, first
and foremost, a social event, and it is spreading all over the world. Makers
need expensive tools. Makers need to learn how to use tools.
TechShop, that opened in 2006 in Menlo Park, pioneered the concept of the shared workshop, offering both classes and tools.
And ten years later there is so much open-source software and hardware that
independent makers can make even smart objects for the Internet of Things,
and can create startups in incubators like Hacker Dojo.
More and more spaces offer even 3D printing.
The second movement is the hackerspace movement. There are hackerspaces that are
simply coworking spaces, but the original hackerspaces were actually formed
by hackers determined to create communities with an ideological purpose.
The first one was probably created in West Berlin during the Cold War:
the Chaos Computer Club (CCC). In 1984 this club started organizing a conference for hackers called the Chaos Communication Congress (C3).
At the same time that Facebook and Twitter were starting the revolution in
social networking, these hackers started a physical form of networking,
a sort of
"hackers counterculture", in places like the Metalab in Vienna (opened in 2006)
and Noisebridge in San Francisco (2007). Noisebridge was created by
Jacob Appelbaum of the TOR project and by Mitch Altman,
a veteran of Jaron Lanier's virtual-reality startup VPL Research,
after they attended a C3 hacker conference in Germany.
I have seen estimates that there are now 2600 hackerspaces worldwide.
Most of them have lost the original spirit, but this growing phenomenon,
just like the Makers Movement, is a sign that people still want to socialize.
They don't want to write software at home or in their garage. They want
to write software in a place where they can talk about their project,
learn about other projects, find collaborators, and create bigger and
bigger ideas. It is amazing that the hackerspace seems to be spreading
especially in small towns. Small towns have little entertainment for young
people. Small towns are the places famous for getting drunk because there is
nothing else to do for young people.
Now there is something to do: write software, build
hardware, and share your experience with a community of like-minded young
people.
Any kid in a small town can be a "hacker" like in Silicon Valley.
Social media like Facebook and WeChat are displacing friendship
into cyberspace, but hackerspaces are creating real friendships
in the real world, and a new way to express yourself to your friends.
The Makers Movement and the hackerspaces are also important because they
remove the fear of failing. Their communities are communities in which it
is perfectly ok to fail. People are always interested in what you did,
even if your project failed. They know that the best learning comes from
mistakes. It is the exploration that matters. Someone else will reach the
destination, but that will happen because so many pioneers did the
exploration first.
These movements also represent the original spirit of the Bay Area, that today many employees of Apple, Google
and Facebook have forgotten: the spirit of cooperation and helping good causes that has always been part of
the Bay Area counterculture.
Another form of online socializing is the phenomenon of "volunteer computing": people who offer their computers
to help someone achieve a goal that requires a lot of computational power. Most of the time our computers
are sitting idle. Why not let others use them? Individuals who participate in a project of "volunteer computing"
simply leave the computer on instead of turning it off, and allow access to it by the community.
The first such project to achieve "critical mass" was
SETI@Home, launched in 1999 by UC Berkeley astronomers, which harnessed the power of millions of home computers provided by volunteers distributed around the globe (more than 5 million by now) with the goal of trying to detect radio transmissions from extraterrestrial civilizations.
The "crowd" is providing a computational power bigger than any supercomputer could provide.
In 2000 Vijay Pande at Stanford launched Folding@Home, a project of volunteer computing for research on how proteins
"fold".
Most of these projects of volunteer computing use the Berkeley Open Infrastructure for Network Computing (BOINC), developed since 2002 by David Anderson for SETI@Home.
In 2004 IBM helped set up the World Community Grid (WCG), which is a general application of this concept: anybody
can donate the "idle time" of their computer so that independent scientists can conduct scientific research that benefits the world.
This platform is helping thousands of independent scientists study the biggest problems of our era that traditional
universities, corporations and governments have not been able to solve, like cancer and Ebola.
In 2015 Australia's Garvan Institute launched "DreamLab", a project to use the idle time of smartphones to help research on cancer. If millions of people donate smartphone time, DreamLab can become a "smartphone supercomputer" for cancer research.
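To make the mechanism concrete, here is a minimal sketch in Python of what a volunteer-computing client does: fetch a chunk of work, compute, and report the result back while the machine would otherwise sit idle. The project URL and the JSON work-unit format are hypothetical; real projects rely on the BOINC client and its own scheduling protocol.

```python
# Minimal sketch of a volunteer-computing client loop.
# The server URL and the work-unit format are hypothetical;
# real projects use the BOINC client and its scheduling protocol.
import json
import time
import urllib.request

PROJECT_URL = "https://example.org/volunteer"   # hypothetical project server

def fetch_work_unit():
    # Ask the project server for a chunk of work (a "work unit").
    with urllib.request.urlopen(PROJECT_URL + "/work") as resp:
        return json.load(resp)

def compute(work_unit):
    # Placeholder for the real science code (protein folding, signal analysis...).
    return {"id": work_unit["id"], "result": sum(work_unit["numbers"])}

def report(result):
    # Send the result back so the project can aggregate it with everyone else's.
    data = json.dumps(result).encode("utf-8")
    req = urllib.request.Request(PROJECT_URL + "/result", data=data,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

while True:
    report(compute(fetch_work_unit()))
    time.sleep(60)   # a real client would also check that the machine is idle
```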
Volunteer computing is a form of crowdsourcing: the crowd can solve a problem
better and faster than the traditional research laboratory. We have many
community-based applications, and Waze, the app acquired by Google that helps
drivers navigate in heavy traffic, is perhaps the most famous: drivers who
accept to use Waze become providers of information to Waze, and Waze uses
this information to improve the global information that it sends to all the
drivers. In 2006 Thomas Malone at
MIT opened the Center for Collective Intelligence that studies how we can use
the resources of the community in the age of the Internet.
The "crowd" is not only useful to bypass the traditional business: traditional
corporations, and especially the big ones, can benefit from the "crowd".
The obvious benefit for them is to
accelerate the feedback loop: every business relies on feedback from its
customers to improve its business, and should encourage customers to contribute
ideas for the future. In 2003 Henry Chesbrough published a famous book titled
"Open Innovation" in which he explained how valuable the "external" inputs can
be to companies that traditionally relied only on "internal" discussions.
When Dell created the website Ideastorm.com (in 2007) and Lego created
the website Ideas (2008) and
Philips created Simplyinnovate (and the Open Innovation Challenge), they
all applied the "open innovation" principle: those websites allow
customers to propose, share and rate ideas, and, ultimately, to
collaborate with the company. The customers don't socialize in a physical way,
but they work together towards improving the products that they use. There is
still a sense of community, although it is "community" in the
"post-social" world of websites.
Waze is the vanguard of a new kind of "sharing economy": the invisible kind of sharing. Most users of Waze don't realize that they are sending information about traffic to Waze that Waze then uses to send information about traffic conditions to all other users. The future of sharing will be largely invisible. In 2016 Here (originally developed by Nokia but acquired in 2015 by BMW, Audi and Daimler) announced that it will acquire information from the sensors of your car and use it to automatically provide information to all other drivers (and, eventually, to driver-less vehicles).
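Here is a toy Python sketch of the server-side aggregation that this invisible sharing implies: each phone or car anonymously reports its speed on a road segment, and the service averages the reports to estimate traffic for everybody else. The segment names and numbers are invented; the real pipelines of Waze and Here are of course far more sophisticated.

```python
# Toy aggregation of anonymous speed reports into per-segment traffic estimates.
# Segment names and reports are invented for illustration.
from collections import defaultdict
from statistics import mean

reports = [
    {"segment": "US-101 N, exit 25-26", "speed_kmh": 32},
    {"segment": "US-101 N, exit 25-26", "speed_kmh": 28},
    {"segment": "I-280 S, exit 12-13",  "speed_kmh": 95},
]

by_segment = defaultdict(list)
for r in reports:
    by_segment[r["segment"]].append(r["speed_kmh"])

for segment, speeds in by_segment.items():
    avg = mean(speeds)
    status = "congested" if avg < 40 else "flowing"
    print(f"{segment}: {avg:.0f} km/h ({status}), based on {len(speeds)} drivers")
```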
Another innovation that came from online socializing is "crowdfunding"
(Kickstarter, IndieGoGo, GoFundMe, etc).
This is also changing the way projects of art and music get funded. The traditional approach to art and music
has been to consider them industrial products: you make them, you market them, and then you sell them to
the consumers. But art and music are intuitively different from toothpaste and shoes. I rarely hear a
person say "I wish this shoemaker made a new line of shoes" or "I can't wait for this furniture manufacturer
to introduce a new table". On the other hand, the "fans" of a musician can't wait for the musician to make
a new album. That may become the very definition of "art". Art is something useless (for practical applications)
that people really crave. Instead of paying for it when it is produced, the "consumers" are willing to pay
for it to be produced. We will have fans pay for their favorite musicians to make a new album, and then the
album will be available for free on the Web. Fans will pay proportionally to how much money they have and
to how much they desire this new album. Of course, charity events and benefit dinners have always existed,
but crowdfunding creates a worldwide community of people interested in "sponsoring" an artifact.
In April 2016 the total raised by Kickstarter was
$2.3 billion, for 105,000 projects funded by
11 million investors.
Gofundme had similar numbers ($2 billion from 12 million investors).
Indiegogo $800 million.
So crowdfunding websites have already raised more than 5 billion dollars.
This amount is enough to compete with serious venture capitalists.
In 2015 the total amount of venture capital invested across the USA was
$58.8 billion. That's ALL venture capitalists combined.
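Just to make the comparison explicit, the arithmetic behind those figures (as quoted above) is:

```python
# Crowdfunding totals quoted above, in billions of dollars.
kickstarter = 2.3
gofundme = 2.0
indiegogo = 0.8
crowdfunding_total = kickstarter + gofundme + indiegogo   # 5.1, i.e. "more than 5 billion"

us_venture_capital_2015 = 58.8   # all US venture capital in 2015, in billions
print(f"Crowdfunding raised so far: ${crowdfunding_total:.1f} billion")
print(f"Compared to one year of US venture capital: "
      f"{crowdfunding_total / us_venture_capital_2015:.0%}")   # roughly 9%
```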
Another example of online socializing that we should admire takes place among the open-source community.
Open-source software has always been important for the development of Internet business. Linux and the Apache
project are examples of open-source software that were crucial in developing the online services that today
we take for granted.
GitHub, founded in 2008 in San Francisco by Tom Preston-Werner, a former Powerset engineer, provides a sort of social networking platform for open-source software.
In 2015 it was used by 1.2 million software developers around the world. These engineers believe that it is
important to share software ideas. Obviously most of them will never become rich, but some of this open-source software
will become the foundation of the "next big thing".
GitHub is the place where the real experiments are being carried out. The big corporations constrain the experiments
that their employees can carry out. GitHub is the place where frustrated developers can carry out their experiments
without constraints. Even if their experiments fail, i think that these are the engineers who
really matter, the ones who have a
vision and are willing to sacrifice a regular salary for it. Secondly, a lot
of startups allow their engineers to make their software open-source: if
the startup fails, the software is available for all the others or for the
employees to continue their own research (in the old days the software developed
by failed startups was simply thrown away). Thirdly, it provides a platform
that can be used in general for writing "text", not only software.
All the tools that help these engineers contribute to software
development can be used equally well to create and manage "content" (just like a blogging platform)
and to allow people to collaborate on it (just like a wiki). My friend Joshua Levy, who used to work for the search
engine Cuil and for the virtual assistant Viv Labs, has a vision of using GitHub
to build a sort of Wikipedia, with the powerful difference that people
will have to sign their contributions with their real names (it is mandated
by GitHub) and that users will be able to rank the experts. Today's Wikipedia is dangerous because it is edited by
anonymous contributors and there is no way to rank the contributors.
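As a sketch of why a git-based encyclopedia would make contributors accountable, here is a small Python example that uses git's own history to list who edited a given article and how often, the kind of information that an anonymous wiki does not expose in a usable way. The repository path and the article filename are hypothetical.

```python
# Count how many commits each author made to one "article" in a git repository.
# The repository path and the article filename are hypothetical.
import subprocess
from collections import Counter

repo = "/path/to/encyclopedia-repo"
article = "articles/general-relativity.md"

log = subprocess.run(
    ["git", "-C", repo, "log", "--pretty=format:%an", "--", article],
    capture_output=True, text=True, check=True,
)
authors = Counter(line for line in log.stdout.splitlines() if line)

for author, edits in authors.most_common():
    print(f"{author}: {edits} edits")
```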
Open-source software was a reaction by the "counterculture" to the "big money"
that started changing the software industry in the 1980s. Until 1970 there
was basically no software industry. Software was always free. It was only in
the 1980s that software became a "corporate product" for which corporations
are willing to pay a lot of money; and it also became a "consumer good" that millions of
computer users are willing to buy.
There were idealistic people who disliked the commercial success of software.
They organized projects to create free software. In those days it was not
easy to share and distribute software. It was mainly done on Unix machines
that were connected on the Internet, and this was before the World-wide Web,
with limited connections between computers.
The "social media" of the Unix world
were the "newsgroups" of the Usenet, invented in 1980 (the Usenet originally ran over dial-up UUCP links between Unix machines, a sort of poor man's Arpanet, and later migrated to the Internet). For example, the most influential "scripting" language of the Unix world, Perl, was released by its creator Larry Wall in 1987 on the comp.sources.misc newsgroup.
In 1991 Guido van Rossum released Python, soon to become a very famous programming language, on the alt.sources newsgroup.
Another example of free software that became
very popular before the age of the World-wide Web was the X Window System,
developed in 1984 at MIT by Jim Gettys and Bob Scheifler. It was
the most popular windowing system on Unix machines at the time when Apple
and Microsoft were beginning to make "windows" popular.
In 1983 Richard Stallman launched
the GNU Project, and in 1985 he founded the Free Software Foundation. The first GNU was
released in 1989 and contained a complete set of software development tools
that were completely free for anybody. The first famous GNU success was
Linus Torvalds' Linux, a variant of the Unix operating system, that was
released in 1991 and relicensed under the GNU General Public License in 1992.
Much of the impulse towards free software came from UC Berkeley (better known as the place for left-wing activism),
where in 1977 Bill Joy had created a version of Unix called BSD (Berkeley Software Distribution).
That version of Unix became very popular on the workstations (big desktop computers) of the 1980s.
As part of BSD, in 1979 Eric Allman created the program "delivermail" that became "sendmail" and that managed most of the email traffic on the Internet throughout the 1980s.
This was mostly done by students.
In 1992 William and Lynne Jolitz created a version
of BSD called 386BSD that was completely free. They named it that way because
they wanted a Unix for the personal computers that in those days ran the
Intel microprocessor 80386 that was faster than
the traditional (and much more expensive) computers that ran BSD Unix.
Later their 386BSD evolved into FreeBSD, NetBSD, and OpenBSD.
A former Berkeley student, Brian Behlendorf, was one of the software engineers
who wrote the Apache HTTP Server in 1995. They were known as the
"Apache Group" that later became the Apache Software Foundation.
The Apache software was extremely important for the
success of the World-wide Web.
By the way, when he was not working on software, Behlendorf was helping the
Burning Man festival. In 2006 the same Behlendorf was invited to speak at the World Economic Forum in Davos. His story is typical of how people in the Bay Area can mix counterculture, art, business and technology.
In 1997 Eric Raymond wrote an influential article titled "The Cathedral and the Bazaar".
In 1998 Netscape created Mozilla, the open source version of its Internet
software. Jamie Zawinski was one of the leaders of that project.
The term "open source" was probably invented in 1998 by nanotech guru
Christine Peterson, co-founder of the Foresight Institute in Palo Alto.
The first conference on free software, the "Freeware Summit", organized in 1998 in Palo Alto by the publisher Tim O'Reilly, became known as the "Open Source Summit". Participants included all the
heroes that i mentioned: Linus Torvalds, Brian Behlendorf, Jamie Zawinski, Guido van Rossum, and Eric Raymond.
Open-source software is often better than the software developed by big
corporations. Big corporations want to maximize their profit, not necessarily
create the best technology.
There are many recent successes of
open-source software: MongoDB, released by New York-based 10gen in 2009,
is preferred by millions of developers over
the famous relational databases and it now has more than 10 million downloads.
OpenStack, that originated at NASA, is often preferred over the main virtualization products.
Cassandra, released by Facebook, is also becoming very popular.
In 2005 Ton Roosendaal, who in 1995 had started the highly influential open-source project Blender, led the
project to make the first "open movie", titled "Elephants Dream", a movie entirely created using open-source tools.
Progress in Artificial Intelligence is always hyped by the press, but most of "deep learning" is done on open-source
software such as Torch (New York University), Caffe (UC Berkeley), Theano (University of Montreal, Canada), and TensorFlow (Google). Robotic startups are often using the open-source Robot Operating System.
Big Data is another area in which progress depends on open-source software.
The Internet of Things relies on open-source hardware such as Arduino and on the open-source platform
OpenHAB.
In the last few years we have also seen venture capitalists invest in open-source projects.
Usually, the first success story of open-source software is considered to be Red Hat (founded in 1993), that made money out of the Linux operating system and made Linux the most widely adopted operating system for server machines.
More recently, Cloudera has capitalized on the popularity of Hadoop in the world of big data.
Now that the open-source community has become so big we will see many more success stories.
In 2016 the MIT Media Lab announced that all its future software will be released as FLOSS (Free/Libre/Open-Source Software),
and in 2015 Apple turned its programming language Swift into an open-source project.
The Makers movement, the open-source movement, the hackerspace movement, the crowdfunding and the crowdsourcing phenomena, and the volunteer-computing movement
are all happening in the middle of the boom of social media. They are all examples of what i mean by
"socializing in the post-social world".
Narnia: Is the culture of social media infiltrating the office now with new tools that value collaboration?
piero:
Narnia: What is the future for the social networking platforms like Facebook and WeChat?
piero:
There can be two lines of progress. The first one is self-edited video, 3D photography
and virtual-reality content: beyond the old-fashioned photos and videos of smartphones.
The second one is the interaction with the social networking tool, which will increasingly be controlled not
by the user but by the software.
In terms of content (of what you can do on social media),
live video streaming has become so easy and cheap that anybody can turn her or his life into the equivalent of a television program: just broadcast live what
you do and hope that someone is interested.
In fact, the first "lifecast", Justin.tv by Justin Kan that lasted for eight months of 2007 (streaming his life nonstop 24/7 over the Internet via a webcam attached to his head), evolved into Twitch.tv.
After all, social media are mostly a "vanity show": live streaming is the
ultimate, unrestrained, vanity show.
4G cell phone networks have enabled a generation of live-streaming apps like Periscope and Meerkat.
Live-streaming your life can also serve more practical purposes. For example,
security: you can check what your children are doing if they are live-streaming.
I am not sure what more we can do when 5G comes around.
The age of the selfie has quickly turned into the age of the short video. Everybody has become a filmmaker.
Facebook has passed 8 billion daily video views.
Snapchat passed 6 billion daily video views in 2015, just three years after the introduction of its video service.
Google's YouTube has over a billion users who add 300 hours of video every minute.
And this "video mania" is creating a huge demand for video editing tools:
in 2014 Shutterstock debuted an in-browser video-editing tool, Sequence;
in 2015 Google acquired Fly Labs, creator of immensely popular video-editing apps for the iPhone;
Cinematique provides a platform for making interactive online videos;
Flipagram offers an app that allows users to quickly produce short video clips combining photos, videos, text and music;
and
in 2016 GoPro acquired Stupeflix and Vemory to improve its video editing tools.
At the same time some apps are pushing social media beyond panoramic snapshots and towards immersive 3D photography,
for example Fyuse, developed by Fyusion.
And others are venturing into virtual reality, for example
New Zealand-based 8i, that allows users to capture scenes with an array of videocameras and to produce 3D videos that can be viewed from different angles in virtual reality. 8i opened a studio in Los Angeles where content creators can play with the software and create their own 3D videos.
The problem is that a video is many times bigger than a picture.
The video generation is rapidly saturating the capacity of existing cellular
technology, and the big telecommunications companies don't really have a solution, so already in crowded areas like Manhattan people cannot watch videos.
The other line of progress is the way we interact with social media. In theory we interact with other people,
but the truth is that most of the interaction takes place with algorithms.
I always joke that now we should be more interested in the social life of algorithms than of people.
Gartner's study "Top Strategic Predictions for 2016 and Beyond" predicts that
by 2018 about 20% of all business content will be created by machines
and there will be 6 billion connected things, and that
by 2020 virtual assistants will constitute 40% of mobile interactions.
The human role will be reduced to clicking "yes" to what the virtual assistant
proposes.
Today we have to install so many apps on our smartphones, but we will soon
have only one or two "intelligent" apps that will take care of everything.
Our social life in the post-app era is difficult to imagine because
it will be largely controlled by virtual assistants running on our mobile
devices. Maybe we will be able to set the degree of socializing that we
desire, just like today we can set how much energy we want to save
on our laptop; and then the virtual assistant will advise accordingly to
which parties we should attend and which friends we should invite for dinner.
This is not in the distant future: it is actually happening to us and we
are happy to let it happen because it simplifies our lives.
We are already surrounded by thousands of algorithms that tell us where to
eat, which movie to watch, what to buy, how much to exercise, and whom to date.
And we mostly obey. How many people scroll down to find a different restaurant
than the first 2 or 3 recommended by Yelp?
Then our virtual assistants will also interact with smart things around the house,
the office and the city. So i also joke that the social life of machines will be more interesting
than the social life of people.
And Gartner forgot to analyze how robots are going to "socialize" via the cloud.
Facebook also allows developers to build their own chatbots that
will live within Facebook Messenger. And so Messenger, that Facebook spun off in 2014 as a separate app,
is becoming a platform for chatbots, much more than an instant messaging service.
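A minimal sketch of such a chatbot, assuming Flask for the webhook and the general shape of the 2016 Messenger Send API (the page access token is a placeholder, and the exact endpoint version and payload format should be checked against Facebook's documentation):

```python
# Minimal echo chatbot for Facebook Messenger, sketched with Flask.
# PAGE_ACCESS_TOKEN is a placeholder; the endpoint and payload follow
# the general shape of the 2016 Messenger Platform, not a verified spec.
import requests
from flask import Flask, request

app = Flask(__name__)
PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"
SEND_API = "https://graph.facebook.com/v2.6/me/messages"

def reply(recipient_id, text):
    # Call the Send API to answer inside Messenger.
    requests.post(
        SEND_API,
        params={"access_token": PAGE_ACCESS_TOKEN},
        json={"recipient": {"id": recipient_id}, "message": {"text": text}},
    )

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json()
    for entry in payload.get("entry", []):
        for event in entry.get("messaging", []):
            if "message" in event and "text" in event["message"]:
                reply(event["sender"]["id"], "You said: " + event["message"]["text"])
    return "ok"

if __name__ == "__main__":
    app.run(port=5000)
```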
There is no question that eventually almost everybody will be on social media. A few individuals will refuse
to be on social media and they will be the digital-age equivalent of the ancient Buddhist monks who lived in a cave
refusing the comforts of civilization. Facebook and Google have plans to bring the Internet to the poor, rural,
underdeveloped places that don't have access, which is about five billion people.
They present it as a humanitarian mission but their problem is quite
simple: they already have almost 100% of the users that they can get today, so the only way to expand is to
increase the number of people who have access to the Internet.
Facebook, which in 2013 launched the project Internet.org with Samsung, Nokia and others,
wants to beam the Internet from solar-powered drones.
In 2013 Google launched Project Loon, and originally planned to use a world-wide network of high-altitude balloons to beam the Internet to the regions that have no Internet access yet, and
in 2014 it announced that it wanted to build a system of 180 satellites to beam Internet around the planet.
In New York an alliance of Control Group, Titan, Qualcomm and Comark created LinkNYC: 10,000 communications hubs that provide city residents and visitors with free public gigabit Wi-Fi, access to communications, information and municipal services. In 2015 Control Group and Titan merged to form Intersection with the goal of expanding the LinkNYC model of free Wi-Fi to cities around the globe, and were acquired by Sidewalk Labs, and Google invested in the idea.
Years ago we believed that the poor people of the world didn't matter for ecommerce, but China has proven that
the opposite is true: Alibaba and JD became multibillion-dollar companies by targeting a vast population that,
by Western standards, is poor. If you make one dollar out of one billion people, you make one billion dollars,
which is not bad.
Silicon Valley is always good at disguising its corporate self-interest as humanitarian action.
Narnia: Is there any solution to overcome the current limitations of cellular technology?
piero:
Narnia: is there hope for traditional, physical socializing?
piero:
Museum attendance is increasing everywhere. Art sales are skyrocketing.
I get invited to speak in China and the Chinese organizations spend a lot of money to physically fly me from Silicon Valley to Hangzhou or Beijing.
Nobody invites me to speak remotely via Skype or WeChat.
In the 1990s, during the "dotcom boom", everybody in Silicon Valley was convinced that soon offices would be
abandoned and employees would be allowed to work from home. Surprise: 20 years later very few employees
are allowed to work from home (and, if your boss allows it, you are probably on the list of people
who will be the first ones to lose their job at the first economic crisis).
So there is a counter-trend to the digitalization of life, to the disappearance of books, magazines,
newspapers, physical shops, letters, etc. People still crave the physical presence and the physical object.
I am particularly hopeful about art.
The fact that more people go to museums and buy art is an important signal.
Visual art is particularly valuable because it transcends linguistic barriers.
Visual art is a
sort of "lingua franca" that everybody speaks and understands (although two people may disagree on how good
a specific work of art is). You don't need to learn English in order to appreciate British art and
you don't need to speak Chinese in order to appreciate Chinese art.
The art magazines speak of the globalization of art commerce, but art has always been globalized.
Artists were never limited to their nation. An Italian painter can paint in Germany or Russia,
and a Dutch painter can paint in France or Spain.
Narnia: What is the future of art and culture then?
piero:
It is important to realize that we often blame the Internet for trends that
preexisted the Internet and the Internet simply amplified. For example,
friendship was becoming more and more superficial before Facebook was born,
and Facebook simply accelerated that trend. Ditto for the trend that i just
described in cultural artifacts. We already lived in the age of mass
consumerism, of pop stars and Hollywood blockbusters. There were already
marketing agencies specializing in creating "viral" phenomena.
We already lived in a society in which the crowd chose the stars of music,
not the critics/historians.
The Internet simply accelerated those trends.
It is debatable whether the Internet is shaping modern culture, or culture
has shaped the Internet.
Remove the Internet and those trends would still be there,
just moving more slowly. I think it was electricity that really changed the
way culture is created, evaluated and propagated: the gramophone, radio,
television, and, yes, eventually the Internet.
Personally, i cannot complain. If i am a bit of a celebrity (a very minor one),
i owe it entirely to the Internet. I started publishing my writings on
the Internet not by design but simply because of convenience. I had no idea
that in 2016 i would have readers other than close friends.
Narnia: Can social media be useful to historians? What can you learn from the use of social media?
piero:
At the same time,
the Einstein Archive Online contains 80,000 documents written by Einstein or to Einstein.
Those documents cover everything from politics to simple friendship.
And, of course, a lot of science. For example, the 45-page "Zurich notebook" was written in 1912 when Einstein was learning differential geometry. His first idea of "general relativity" was wrong but for more than two years he tried to convince everybody and himself that he was right.
Then in November 1915 he published 4 papers, one after the other, that, one after the other, corrected his mistakes until he got it right.
In 1916 he finally wrote a very nice manuscript (with crossed-out lines and annotations) that summarizes his new theory.
Why did he publish four papers, one after the other, in a hurry? Because he knew that Hilbert, a famous German
mathematician, was working on the same problem and was about to reach the same conclusions.
In fact, Hilbert submitted a paper with the same equation 5 days before Einstein, but it was published after Einstein's final paper.
Everybody who studied general relativity knows about the curvature of spacetime. Well, Einstein didn't mention it in any document until 1921. It was his friend Weyl who first introduced the concept in letters written to Einstein. In 1922 Friedmann showed that Einstein's equations predict an expanding universe. Einstein ignored it, convinced that the universe was static. In February 1916 Einstein wrote in a letter that gravitational waves don't exist. Four months later he published a paper that they do exist. In 1936 he wrote a paper that they don't exist, and the paper was rejected and never published. Einstein didn't believe in black holes either. So, at one point or another, Einstein didn't believe in any of the three pillars of modern cosmology: the Big Bang (the expansion of the universe), black holes, and gravitational waves.
This is just a small piece of the story that fascinated me. The archive contains dozens of letters that Einstein exchanged with his friends: Hendrik Lorentz, Michele Besso, Willem de Sitter, Felix Klein, Hermann Weyl, Max Abraham, Gunnar Nordstrom and many others. General Relativity was really the work of a community of scientists, not just one.
And the amazing thing is that this was happening in the middle of World War I and there is almost no mention of it in these 80,000 documents. These scientists lived in the middle of a horrible carnage that killed more than 10 million people but they were completely focused on understanding the universe, not the human race.
Why am I telling you this story? Because studying Einstein's brain will not tell you who he was. He was stubborn, he made many mistakes, he learned from his friends. You will not find this in his brain. You will find it in his "social life",
which is preserved in that archive. His letters show that he always used a "principle" to guide his research: the principle of relativity, the principle of equivalence, etc. He was frustrated at the end of his life because he couldn't find a "principle" for the unification of Relativity and Quantum Mechanics. One week before dying he still wrote in his notebooks about this.
Well, this is Einstein. His brain will not tell you what I just told you. His archive will. His archive contains all
of his "social networking", and that tells me a lot more about him than the best neuroscientist can tell me.
Narnia: Is the sharing economy another example of people coming together in different ways and does it have a future? Why haven't we had any other major success story after the initial ones?
piero:
But now the sharing apps are also showing us a different aspect of human coexistence: things may be valuable
for others even if they are not valuable to you. Sharing apps are enabling us to think about the valuables that
we own and to share those valuables.
Two of my friends started
MonkeyParking in San Francisco to share parking spaces. San Francisco is notoriously a difficult city for
parking. But there are thousands of people who drive away from their house every morning. These people
never realized that the place where their car is parked is actually a business opportunity because many
other drivers need to park their cars.
Most people today live in cities. The number one asset in cities is space.
So whichever space you own or are using is a valuable resource.
In most cases the "valuable resource" is not obvious because you didn't pay explicitly for it: the car
that is sitting in the garage or the room that you never use or the driveway that you don't use during the
day when you are at work. And then these apps prompt us to change the way we think and our habits.
If we own a valuable resource that we didn't realize is valuable, we can rethink our business.
For example, who has a lot of parking space available? Hotels, restaurants, shopping malls, etc.
Sometimes their parking lots are full, but most of the time they are half empty or completely empty.
The parking lot of a hotel is mostly full at night, but mostly empty during the day.
The parking lot of a shopping mall is mostly full during the day, but almost completely empty at night.
Maybe there is a day of the week or a season when the hotel can make more money renting out parking spaces
than renting rooms.
Schools complain that they don't have enough funding from the government, but they have huge parking lots
that are empty after 3pm.
People need to discover what is hidden behind their routine.
The most difficult thing for a startup in the sharing economy is always to change
the lifestyle of people, to make people accept different habits. The startup needs to convince some people
that they have a supply of resources, and then they need to convince some other people that they need
those resources. The matchmaker/middleman has a difficult life in the sharing economy because it is not
"trading" the usual resources in the usual places. Usually you trade goods at a shopping mall or at
the vegetable market, not online. The second obstacle is usually the
city or even national rules and regulations, that sometimes restrict what people can do.
Going viral for a sharing startup is not easy because it is difficult to
convince the people on the supply side, who are typically not computerized, to take on the new
task. It is easier with the people on the demand side because the app simply requires a smartphone.
The poorer people should be the ones to think about sharing what they own and don't use.
There have been very few success stories in the sharing economy because there are at least two major problems
to growth: first of all, a sharing idea can be easily copied, so if you succeed there will immediately
be many copycats - you cannot patent sharing; and second, sharing is "localized", e.g. i can share my car
only with the people who live or work in my town, not with someone who lives and works in India; so it is
a lot of work for a sharing app to go global the way Airbnb and Uber did.
Narnia: Can social networking be used to improve education, to make students smarter, not dumber?
piero:
The Internet can help bridge the gap between rich areas and poor areas. The best teachers tend to be in
the rich cities and in the rich neighborhoods, but MOOCs and these social networks can create a more
uniform environment. Of course, the skills of the teacher remain the main factor in education.
Nothing can completely replace the master-apprentice model, i.e. the physical interaction between a human being
who "knows" and a human being who "learns". But the Internet can create better "masters" everywhere who can then create
better "apprentices" everywhere.
I personally think that hackerspaces are also a great addition to the traditional school. Every high school
and maybe every elementary school should provide a "hackerspace" for students to
explore the most important scientific disciplines. The hackerspace is a place where you are free to experiment
and you don't have to be afraid of failing. In fact, you never fail: you are rewarded for trying. It is a place
for crazy ideas, the exact opposite of the learning/memorizing methods of the traditional school.
It is the "library" of the digital age, where kids can learn, experiment and socialize.
Mitch Altman estimates that there are 2,600 hackerspaces all over the world, and the number is growing very fast.
If we don't solve them, the problems that we have today in education will escalate dramatically because we live in
an age in which the skills required by the society change rapidly. They already changed a lot since i graduated.
I recently met my old manager of the 1980s who still remembers me as the fastest programmer of the company,
but today any 12-year-old child can do what i was doing then. If i had not learned a new job every 4 or 5 years,
today i would be begging for money in the street. Change will happen ever more rapidly for the generation
that is entering school today. We literally don't know which kinds of job will exist when they graduate 10 or
20 years from now. The vast amount of courses and slide presentations that i mentioned will not help them unless
they learn to learn. That's the fundamental skill that we need to teach children: learn to learn.
Learning a specific skill is not bad, but that skill might not exist by the time you finish school: you need
to learn to learn new skills, and not only once but many times over the course of your life.
Many jobs will be replaced by machines. The society of machines will create new jobs that today are hard to
predict. My favorite example: who predicted that computer automation (that killed so many jobs) would create
millions of jobs in telecommunications? Your smartphone (and all the people who have a job related to it) is
the direct consequence of progress in electronic computers, precisely the machines that were accused of stealing
jobs from people. We are very good at predicting the kinds of job that will disappear but very bad at imagining the
kinds of job that will be created. So first of all the schools need to train students to learn (not only a specific
skill, but to learn in general). Secondly, schools need to provide students with a broad knowledge that will allow
them to learn new skills. Stanford and other universities are switching from education for the "I" person to
education for the "T" person. The "I" person is a person whose knowledge is totally vertical: the "I" person only
learns one skill and becomes a great specialist in that skill. The "T" person is a person who learns one skill
very well but also many other things not closely related to that skill. For example, an electronic engineer
may also study biology and physics, and, why not, European classical music and Chinese classics.
Think of it:
1,000 years ago China was inventing everything and the West was copying, now
the West is inventing everything and China is copying. What was special about China during the Tang and Song
dynasties? China has to rediscover the spirit of those ages, and that was
a very interdisciplinary spirit.
The scholar-official of the Song dynasty was a "universal" man, combining the qualities of scholar, poet, painter,
statesman, and sometimes scientist.
Westerners are rediscovering something that China already knew one thousand years ago, as usual.
How social media can help create the "T" person is not clear to anyone. We have just realized that it is the
correct approach to give young people a chance to survive in the chaotic future that is arriving very rapidly.
There is also another aspect of today's education that has to change.
In 1717 Prussia made primary education compulsory, then Britain did the same, and eventually all the countries in
the world have done the same. The model has remained the original one: children go to school for a number of
years, and eventually graduate and then join the workforce. End of education. Some people never read a book again
after graduating from school. I don't think this is the kind of educational system that we need in the future.
We need lifelong education. School should not end at the age of 18 or 22 or 27. I suspect that we are asking people
to study too much when they are children and too little when they are adults. We damage their childhood and
we will soon damage their adulthood.
Social networks can create awareness that there are new skills emerging in the economy, and that there are tools
on the Internet to acquire those skills.
In the old days all the new knowledge that you needed in your adult life could be found in the encyclopedia:
for example, i am traveling to Russia and want to find out which are the big cities of Russia. But in the
future the new knowledge will require studying a slide presentation on Slideshare or watching videos on YouTube.
Ordinary people will need a way to navigate this vast (and confusing) territory of knowledge.
Narnia: How will our online social life influence our physical social life in the smart city of the future?
piero:
Dan Kottke (Steve Jobs' college friend)
Antonio Forenza, Cofounder of Artemis
John Law, founder of the Burning Man Festival
Josh Levy, former engineer at SRI, Cuil and Viv Labs
Irina Pesterean and Paolo Dobrowolny, Founders of Monkey Parking