Steven Johnson is the author of Mind Wide Open, Emergence, Everything Bad Is Good For You and The Ghost Map. Johnson’s writing has appeared in The New Yorker, Harper’s, The Guardian, The New York Times and The Wall Street Journal. He also writes for Discover magazine and Wired.com, and was co-founder of the award-winning websites FEED and Plastic.com.
He teaches at New York University’s Interactive Telecommunications Program and also hosts a blog at www.stevenberlinjohnson.com.
Emergence, Steven Johnson’s groundbreaking book, offers a dazzling new perspective on the cities we inhabit, the media frenzies we suffer and the games we play.
What is the relationship between the disparate elements of consciousness, communities and computer games? The answer can be found in emergence, change that occurs from the bottom up. In an exclusive interview, we asked Steven to explain more.
So, what exactly is emergence?
Emergence is something magical that happens in certain systems: systems that involve lots of relatively low-level, simple agents interacting with each other and following simple rules. Somehow, through those interactions, a higher-level order is created. The best way to think about it is to think about something like an ant colony, which is a wonderful example of emergence at work. Ant colonies involve thousands and thousands of ants, none of which has any real global information about what the colony needs to do, or where the food is, or where the nest is beyond its bare-bones location. Somehow colonies are capable of solving incredibly complicated problems, like figuring out how many ants should be going out for food, how many ants should be doing garbage collection, how many ants should be building nests. They'll switch the number of ants working on any of those given tasks on the fly, adjusting to changes in their environment with the kind of accuracy that looks almost like a five-year plan in a socialist system, except that no ant is in charge. No individual ant is sitting there assessing the entire state of the colony and making those changes; somehow all the ants, each of which has incredibly meagre intelligence, manage to collectively solve the problem of finding food, or of maintaining the nest, or of doing all the other incredible things that ants do. By studying ants, or other biological systems like slime mould or termites, we've started to figure out the laws that underlie these emergent systems, and we've started to apply those laws to systems that we build ourselves.
What are the characteristics of an 'emergent' system?
If you look at all the systems that I talk about in the book, there are a couple of basic principles, one of which is 'pay attention to your neighbours'. Each little ant in the colony looks around at its neighbours, at other ants that it happens to stumble across, and evaluates the overall state of the colony by making statistical observations about the numbers of different types of ants that it encounters. Another principle is to encourage randomness. City neighbourhoods form in a kind of emergent way, without anybody being in control of them; you don't have master planners who decide: OK, the artists go here, the people who sell buttons go in this neighbourhood and the tailors go in that neighbourhood - that just happens spontaneously. Cities encourage random movement through their streets, and with it lots of low-level interactions. If there is enough randomness, and there are enough people engaging in those interactions, a higher-level shape like a city neighbourhood will form. Another important thing is to look for patterns in those encounters. An ant assessing whether it should be looking for food or working on the nest will take note of how many other foraging ants it encounters over the course of an hour. Sometimes it finds that it keeps encountering a lot of foragers, so it'll make the decision to change what it does based on those statistical patterns. Those are some of the rules [of an emergent system].
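The encounter-counting rule Johnson describes can be sketched in a few lines of Python. This is an illustrative toy, not a model of real ant behaviour: the task names and the threshold value are assumptions made up for the example.

```python
def choose_task(encounters, forager_threshold=0.6):
    """Decide an ant's next task from the tasks of the ants it has met.

    encounters: list of task names ("forage" or "nest") observed over,
    say, the last hour. The threshold is purely illustrative.
    """
    if not encounters:
        return "forage"  # no information yet, so go look for food
    forager_share = encounters.count("forage") / len(encounters)
    # If foragers are over-represented among recent encounters,
    # switch to nest maintenance; otherwise keep foraging.
    return "nest" if forager_share > forager_threshold else "forage"

# Meeting mostly foragers -> switch to nest work
print(choose_task(["forage"] * 8 + ["nest"] * 2))  # nest
# Meeting mostly nest-builders -> go foraging
print(choose_task(["forage"] * 3 + ["nest"] * 7))  # forage
```

No ant sees the whole colony; each one applies this local rule to its own sample of encounters, and the colony-wide division of labour emerges from thousands of such decisions.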
The book takes a long look at software development. How does this link in?
The web was originally designed in a way that didn't enable it to grow more organised as it grew bigger. One of the properties that you see in all emergent systems is that the more units they get, and the larger in scale they get, the more organised they get - that's one of the miracles of it. As a city grows larger it forms neighbourhoods that give the city definition and make it intelligible to the people living in it. You know where to go for a certain type of shopping district, and you know where to go to encounter a certain ethnic group. The web was originally designed in such a way that as it grew it only became more complicated, more complex and more overwhelming. There was no structure that would emerge as the web grew in size; it was like a city that had no neighbourhoods. What has started to happen recently, and it's a very interesting development, is that we're seeing software that tracks neighbourhoods of websites. There's software that looks at thousands and thousands of people surfing various different sites over the course of a day or a year, and it starts noticing that these ten thousand people went to these six sites, and these other ten thousand people went to these ten sites over here, and they have these ten sites in common. What the software is starting to pick up is the sense that there is something in common between these ten sites, and these six sites, and these five sites, and it takes that information and makes it visible to an ordinary web surfer.
How can we use this information?
If you come to a site, you can pull down a little menu and it'll say 'hey, these other sites are related to that site'; that happens in a city already. When I go out to an arty neighbourhood and there are a bunch of funky clothing boutiques, I know I'm going to pretty much like all the shops that are there. It's very hard to do that naturally on the web but this new software, Alexa, enables you to see neighbourhoods of websites based on patterns of usage.
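The usage-pattern clustering described above can be sketched as a simple co-visitation count. This is a minimal illustration in the spirit of what Johnson describes, not Alexa's actual algorithm; the function name, data shape and overlap threshold are all assumptions for the example.

```python
from collections import defaultdict

def related_sites(visits, site, min_overlap=2):
    """Return sites whose audience overlaps with the given site.

    visits: list of (user, site) pairs, e.g. from surfing logs.
    Two sites are treated as 'neighbours' when at least min_overlap
    of the same people visited both (threshold is illustrative).
    """
    audiences = defaultdict(set)
    for user, s in visits:
        audiences[s].add(user)
    target = audiences[site]
    return sorted(other for other, users in audiences.items()
                  if other != site and len(users & target) >= min_overlap)

visits = [("u1", "a.com"), ("u1", "b.com"), ("u2", "a.com"),
          ("u2", "b.com"), ("u3", "c.com")]
print(related_sites(visits, "a.com"))  # b.com shares two visitors
```

Nobody labels the neighbourhoods by hand; the grouping falls out of the aggregate behaviour of many individual surfers, which is exactly the bottom-up pattern the book is about.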
How can we use the ideas in Emergence to change the way we use the web?
One of the problems online communities had in their early days was that if they got too big they ran into trouble. You'd go to a website and there would be a thousand comments by a thousand users, and 90% of those comments weren't really that interesting. It was very hard to separate the signal from the noise; there was a lot of interesting stuff in there, but you didn't want to have to wade through the nine hundred comments that were of no interest to you. There are two ways to solve this problem: one is a top-down, hierarchical approach, and one is a bottom-up, emergent approach. The top-down approach is to hire a lot of people to wade through all of those comments and sort them out, saying, 'this one is really useful, let's keep it; this one is not very useful, let's discard it'. That'll work, but it costs a lot of money and it takes a lot of manpower. The other approach is to follow a much more distributed, much more emergent approach and say, 'let's let every member of the community rate each comment on this community board'. Everybody has the ability to go through and say, 'hey, I liked this' or 'I didn't like this'. Over time, if you have enough people, you'll be able to sort through all those comments and effectively create a quality filter, a system that ends up regulating itself.
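The bottom-up quality filter described above can be sketched very simply: pool everyone's ratings and show only what the community, in aggregate, scores highly. The function name, rating scale and cutoff below are illustrative assumptions, not a real site's implementation.

```python
def filter_comments(comments, ratings, min_score=3.0):
    """Keep comments whose average community rating meets min_score.

    comments: list of (comment_id, text) pairs.
    ratings: dict mapping comment_id -> list of 1-5 scores left by
    individual readers. Unrated comments are hidden by default
    (a design choice for this sketch).
    """
    visible = []
    for cid, text in comments:
        scores = ratings.get(cid, [])
        if scores and sum(scores) / len(scores) >= min_score:
            visible.append(text)
    return visible

comments = [(1, "insightful reply"), (2, "spam"), (3, "unrated")]
ratings = {1: [5, 4, 4], 2: [1, 2, 1]}
print(filter_comments(comments, ratings))  # only the well-rated comment
```

No editor decides which comments survive; the filter is just the sum of many small, local judgements, the same shape of solution as the ant colony.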
The idea of a self-regulating system is quite controversial; some people might find it alarming that no single person is in charge. Do you think people may not like this idea?
My argument is that it's not as though the computer is sitting there learning how to read or the computer has become self-aware in some sense. The computer is simply looking at patterns of human behaviour, analysing a hundred book purchasing decisions, or a million CD purchasing decisions, or decisions about whether to go to one site or another - decisions that are made by humans. The computer is just looking at those decisions in aggregate and looking for patterns in that big soup of data that gets created every time you or I go and buy something online. So the computer is not becoming smart, the computer is not becoming like HAL, the computer from 2001. It's not talking to us and acting like a human being, all the computer is doing is looking at our collective intelligence and deducing a few things from the data that it surveys.
When did you first recognise the power of emergence?
I moved to the West Village in New York City and sat down to read this wonderful book, The Death and Life of Great American Cities by Jane Jacobs, which turned out to be a big influence on my book; it's a classic study of how cities work. As I started reading it I realised that Jacobs had written it about three blocks from where I had just moved, so there was this initial coincidence. But as I was reading it I was spending a lot of time walking around my new neighbourhood, and I noticed this very subtle but fascinating little thing about the West Village: you can walk all the way through the entire area and you will never see a national chain store of any kind, even at the level of a drug store. You'll find these weird, quirky little stores, but you won't find a McDonald's; you'll just find crazy little restaurants and coffee shops - you won't even see a Starbucks! There's no anti-chain zoning ordinance at work in the West Village; it's not as if chains are unable to get space there by law. For some reason it just doesn't attract chain stores, and I thought, 'where does that identity come from?'