Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one’s avatar in immersive virtual reality transforms self-esteem and social self-perception. Technologies are extensions of ourselves, and, like the avatars in Jeremy’s lab, our identities can be shifted by the quirks of gadgets. It is impossible to work with information technology without also engaging in social engineering.
When developers of digital technologies design a program that requires you to interact with a computer as if it were a person, they ask you to accept in some corner of your brain that you might also be conceived of as a program. When they design an internet service that is edited by a vast anonymous crowd, they are suggesting that a random crowd of humans is an organism with a legitimate point of view.
Different media designs stimulate different potentials in human nature. We shouldn’t seek to make the pack mentality as efficient as possible. We should instead seek to inspire the phenomenon of individual intelligence.
Before MIDI, a musical note was a bottomless idea that transcended absolute definition. It was a way for a musician to think, or a way to teach and document music. It was a mental tool distinguishable from the music itself. Different people could make transcriptions of the same musical recording, for instance, and come up with slightly different scores.
After MIDI, a musical note was no longer just an idea, but a rigid, mandatory structure you couldn’t avoid in the aspects of life that had gone digital. The process of lock-in is like a wave gradually washing over the rulebook of life, culling the ambiguities of flexible thoughts as more and more thought structures are solidified into effectively permanent reality.
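The rigidity being described here is visible in MIDI's wire format itself. The byte layout below follows the standard MIDI 1.0 "note on" message; the helper function and names are my own illustrative sketch:

```python
# A MIDI "note on" event reduces a musical note to three small integers:
# a status byte (0x90 means "note on", low bits carry the channel), a pitch
# number, and a velocity. Each data value must fit in 7 bits (0-127).
# A bent note, a slide, a quarter tone -- anything between the 128 allowed
# pitches -- has no direct representation in the message.

def note_on(pitch: int, velocity: int, channel: int = 0) -> bytes:
    """Encode a note as the three bytes of a standard MIDI note-on message."""
    if not (0 <= pitch <= 127 and 0 <= velocity <= 127 and 0 <= channel <= 15):
        raise ValueError("MIDI forces every value into a fixed 7- or 4-bit range")
    return bytes([0x90 | channel, pitch, velocity])

middle_c = note_on(60, 100)  # pitch 60 is middle C by convention
```

The point of the sketch is the `ValueError`: the format does not merely encourage discrete pitches, it makes anything else unrepresentable.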
The file is a set of philosophical ideas made into eternal flesh. The ideas expressed by the file include the notion that human expression comes in severable chunks that can be organized as leaves on an abstract tree—and that the chunks have versions and need to be matched to compatible applications.
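Those ideas can be made concrete in a few lines. This is a minimal sketch of the abstraction, with invented names and metadata, not any particular filesystem:

```python
# The file abstraction in miniature: expression chopped into severable
# chunks that sit as leaves on an abstract tree, each chunk carrying a
# format and version that must match a compatible application.

tree = {                      # the abstract tree: directories contain...
    "music": {
        "song.mid": {"format": "MIDI", "version": "1.0"},  # ...chunks as leaves
    },
    "text": {
        "essay.txt": {"format": "plain text", "version": None},
    },
}

def leaves(node, path=""):
    """Walk the tree, yielding (path, metadata) for every chunk (leaf)."""
    for name, child in node.items():
        if "format" in child:          # a leaf: a chunk with its metadata
            yield path + "/" + name, child
        else:                          # an interior node: recurse
            yield from leaves(child, path + "/" + name)

for chunk_path, meta in leaves(tree):
    print(chunk_path, meta["format"])
```

Nothing about human expression requires this shape; the tree-of-versioned-chunks is one design decision among many, now locked in.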
The way the internet has gone sour since then is truly perverse. The central faith of the web’s early design has been superseded by a different faith in the centrality of imaginary entities epitomized by the idea that the internet as a whole is coming alive and turning into a superhuman creature.
The way we got here is that one subculture of technologists has recently become more influential than the others. The winning subculture doesn’t have a formal name, but I’ve sometimes called the members “cybernetic totalists” or “digital Maoists.”

The central mistake of recent digital culture is to chop up a network of individuals so finely that you end up with a mush. You then start to care about the abstraction of the network more than the real people who are networked, even though the network by itself is meaningless. Only the people were ever meaningful.
The same thing is happening again. A self-proclaimed materialist movement that attempts to base itself on science starts to look like a religion rather quickly. It soon presents its own eschatology and its own revelations about what is really going on—portentous events that no one but the initiated can appreciate. The Singularity and the noosphere, the idea that a collective consciousness emerges from all the users on the web, echo Marxist social determinism and Freud’s calculus of perversions. We rush ahead of skeptical, scientific inquiry at our peril, just like the Marxists and Freudians.
Every save-the-world cause has a list of suggestions for “what each of us can do”: bike to work, recycle, and so on.
I can propose such a list related to the problems I’m talking about:
- Don’t post anonymously unless you really might be in danger.
- If you put effort into Wikipedia articles, put even more effort into using your personal voice and expression outside of the wiki to help attract people who don’t yet realize that they are interested in the topics you contributed to.
- Post a video once in a while that took you one hundred times more time to create than it takes to view.
- Write a blog post that took weeks of reflection before you heard the inner voice that needed to come out.
- If you are twittering, innovate in order to find a way to describe your internal state instead of trivial external events, to avoid the creeping danger of believing that objectively described events define you, as they would define a machine.
The approach to digital culture I abhor would indeed turn all the world’s books into one book, just as Kevin suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what’s important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don’t know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. A continuation of the present trend will make us like various medieval religious empires, or like North Korea, a society with a single book.
The most effective young Facebook users, however—the ones who will probably be winners if Facebook turns out to be a model of the future they will inhabit as adults—are the ones who create successful online fictions about themselves.
They tend their doppelgängers fastidiously. They must manage offhand remarks and track candid snapshots at parties as carefully as a politician. Insincerity is rewarded, while sincerity creates a lifelong taint. Certainly, some version of this principle existed in the lives of teenagers before the web came along, but not with such unyielding, clinical precision.
The Bible can serve as a prototypical example. Like Wikipedia, the Bible’s authorship was shared, largely anonymous, and cumulative, and the obscurity of the individual authors served to create an oracle-like ambience for the document as “the literal word of God.” If we take a nonmetaphysical view of the Bible, it serves as a link to our ancestors, a window into human nature and our cultural origins, and can be used as a source of solace and inspiration. Someone who believes in a personal God can felicitously believe that the Bible reflects that God indirectly, through the people who wrote it. But when people buy into the oracle illusion, the Bible just turns into a tool to help religious leaders and politicians manipulate them.
If you want to know what’s really going on in a society or ideology, follow the money. If money is flowing to advertising instead of musicians, journalists, and artists, then a society is more concerned with manipulation than truth or beauty. If content is worthless, then people will start to become empty-headed and contentless.
The combination of hive mind and advertising has resulted in a new kind of social contract. The basic idea of this contract is that authors, journalists, musicians, and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind. Reciprocity takes the form of self-promotion. Culture is to become precisely nothing but advertising.
So is there any way to bring money and capitalism into an era of technological abundance without impoverishing almost everyone? One smart idea came from Ted Nelson.
Nelson is perhaps the most formative figure in the development of online culture. He invented the digital media link and other core ideas of connected online media back in the 1960s. He called it “hypermedia.”
Nelson’s ambitions for the economics of linking were more profound than those in vogue today. He proposed that instead of copying digital media, we should effectively keep only one copy of each cultural expression—as with a book or a song—and pay the author of that expression a small, affordable amount whenever it is accessed. (Of course, as a matter of engineering practice, there would have to be many copies in order for the system to function efficiently, but that would be an internal detail, unrelated to a user’s experience.)
As a result, anyone might be able to get rich from creative work. The people who make a momentarily popular prank video clip might earn a lot of money in a single day, but an obscure scholar might eventually earn as much over many years as her work is repeatedly referenced. But note that this is a very different idea from the long tail, because it rewards individuals instead of cloud owners.
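Nelson's scheme can be sketched as a toy accounting system: one canonical copy of each work, and a tiny royalty credited to its author on every access. All names and the per-access price below are illustrative assumptions, not Nelson's actual design:

```python
# A hypothetical sketch of Nelson-style hypermedia economics: the system
# keeps a single copy of each cultural expression and pays its author a
# small, affordable amount each time that copy is accessed.

ACCESS_FEE = 0.001  # an "affordable amount" per access; chosen arbitrarily

class Repository:
    def __init__(self):
        self.works = {}     # work_id -> (author, content): the one copy
        self.earnings = {}  # author -> accumulated royalties

    def publish(self, work_id, author, content):
        self.works[work_id] = (author, content)
        self.earnings.setdefault(author, 0.0)

    def access(self, work_id):
        author, content = self.works[work_id]
        self.earnings[author] += ACCESS_FEE  # the author is paid per access
        return content

repo = Repository()
repo.publish("monograph-1", "obscure_scholar", "a patiently argued monograph")
for _ in range(1000):       # many small accesses accumulating over the years
    repo.access("monograph-1")
```

The contrast with the long tail is visible in where the money lands: `earnings` accrues to the individual author, not to whoever owns the cloud the copies sit in.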
The scarcity of money, as we know it today, is artificial, but everything about information is artificial. Without a degree of imposed scarcity, money would be valueless.
Let’s take money—the original abstract information system for managing human affairs—as an example. It might be tempting to print your own money, or, if you’re the government, to print an excessive amount of it. And yet smart people choose not to do either of these things. It is a common assertion that if you copy a digital music file, you haven’t destroyed the original, so nothing was stolen. The same thing could be said if you hacked into a bank and just added money to your online account. (Or, for that matter, when traders in exotic securities made bets on stupendous transactions of arbitrary magnitudes, leading to the global economic meltdown in 2008.) The problem in each case is not that you stole from a specific person but that you undermined the artificial scarcities that allow the economy to function. In the same way, creative expression on the internet will benefit from a social contract that imposes a modest degree of artificial scarcity on information.
At the time that the web was born, in the early 1990s, a popular trope was that a new generation of teenagers, reared in the conservative Reagan years, had turned out exceptionally bland. The members of “Generation X” were characterized as blank and inert. The anthropologist Steve Barnett compared them to cultures suffering from “pattern exhaustion,” a phenomenon in which a culture runs out of variations of traditional designs in its pottery and becomes less creative.
Here is a claim I wish I weren’t making, and that I would prefer to be wrong about: popular music created in the industrialized world in the decade from the late 1990s to the late 2000s doesn’t have a distinct style—that is, one that would provide an identity for the young people who grew up with it. The process of the reinvention of life through music appears to have stopped.
I realize the whole point is to get a lot of free content out there, especially content that can be mashed up, but why won’t Creative Commons provide an option along the lines of this: Write to me and tell me what you want to do with my music. If I like it, you can do so immediately. If I don’t like what you want to do, you can still do it, but you will have to wait six months. Or, perhaps, you will have to go through six rounds of arguing back and forth with me about it, but then you can do whatever you want. Or you might have to always include a notice in the mashup stating that I didn’t like the idea, with my reasons.
If a fuzzy crowd of anonymous people is making uninformed mash-ups with my recorded music, then when I present my music myself the context becomes one in which my presentation fits into a statistical distribution of other presentations. It is no longer an expression of my life.
The quintessential example of the open ideal showed up in Freeman Dyson’s otherwise wonderful piece about the future of synthetic biology in the New York Review of Books. MIT bioengineer Drew Endy, one of the enfants terribles of synthetic biology, opened his spectacular talk at Sci Foo with a slide of Dyson’s article. I can’t express the degree to which I admire Freeman, but in this case, we see things differently.
Dyson equates the beginnings of life on Earth with the Eden of Linux. Back when life first took hold, genes flowed around freely; genetic sequences skipped from organism to organism in much the way they may soon be able to on the internet. In his article, Freeman derides the first organism that hoarded its genes behind a protective membrane as “evil,” just like the nemesis of the open-software movement, Bill Gates.
Once organisms became encapsulated, they isolated themselves into distinct species, trading genes only with others of their kind. Freeman suggests that the coming era of synthetic biology will be a return to Eden.
I suppose amateurs, robots, and an aggregation of amateurs and robots might someday hack genes in the global garage and tweet DNA sequences around the globe at light speed. Or there might be a slightly more sober process that takes place between institutions like high schools and start-up companies.
However it happens, species boundaries will become defunct, and genes will fly about, resulting in an orgy of creativity. Untraceable multitudes of new biological organisms will appear as frequently as new videos do on YouTube today.
One common response to suggestions that this might happen is fear. After all, it might take only one doomsday virus produced in one garage to bring the entire human story to a close. I will not focus directly on that concern, but, instead, on whether the proposed style of openness would even bring about the creation of innovative creatures.
The alternative to wide-open development is not necessarily evil. My guess is that a poorly encapsulated communal gloop of organisms lost out to closely guarded species on the primordial Earth for the same reason that the Linux community didn’t come up with the iPhone: encapsulation serves a purpose.
The cuttlefish is mostly soft-bodied; the crab is all armor. As the cuttlefish approaches, the medieval-looking crab snaps into a macho posture, waving its sharp claws at its foe’s vulnerable body.
The cuttlefish responds with a bizarre and ingenious psychedelic performance. Weird images, luxuriant colors, and successive waves of what look like undulating lightning bolts and filigree swim across its skin. The sight is so unbelievable that even the crab seems disoriented; its menacing gesture is replaced for an instant by another that seems to say, “Huh?” In that moment the cuttlefish strikes between cracks in the armor. It uses art to hunt!
There is no way to interpolate between two smell molecules. True, odors can be mixed together to form millions of scents. But the world’s smells can’t be broken down into just a few numbers on a gradient; there is no “smell pixel.” Think of it this way: colors and sounds can be measured with rulers, but odors must be looked up in a dictionary.
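The "ruler" point can be made literal in code. Because colors live on numeric gradients, any two of them can be interpolated; a color "between" red and blue is just arithmetic on three numbers. This small sketch (my own helper) is exactly the operation that has no analogue for two smell molecules:

```python
# Colors are points on measurable gradients (here, 8-bit RGB), so
# interpolation between any two colors is simple arithmetic.
# There is no corresponding arithmetic for smells: no "smell pixel"
# whose values could be averaged to find the odor halfway between two others.

def lerp_color(c1, c2, t):
    """Linearly interpolate between two RGB triples, with t in [0, 1]."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

red, blue = (255, 0, 0), (0, 0, 255)
purple = lerp_color(red, blue, 0.5)  # the color halfway between them
```

A dictionary of named odors could be looked up, as the text says, but `lerp_color`'s continuous in-between values would be meaningless for its entries.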