Human Nature Forecast: Fail, or fail harder

On a weekday afternoon, I meet Tim Harford, TED Global 2011 speaker and senior columnist for the Financial Times, at his publisher’s headquarters in Amsterdam. He is here to explain why we urgently need to embrace experimental research, and why failure is the proving ground for success.

While setting up the video equipment, Tim tries, and fails, to assemble the book stand on the table that displays his newest book, Adapt. Neither he nor I manage to make it work. Instead, we make do with an improvised book stand, using one book to prop up the others. Trial, error and an experiment.

Tim Harford. Photo credit: Jorrit Monné

Making the right mistake

In his TED Talk, Tim speaks about making ‘good mistakes’: mistakes based on trial and error. What was his latest ‘good mistake’? “Well, we don’t know what my most recent mistake has been… I recently moved from London to Oxford with my young family. We have yet to see whether this is a good decision or not. What I try to do, talking with my wife, is to recognize that we don’t know whether this is going to be a good mistake or not. Instead of asking whether this is the right decision, we ask: what are we going to learn, and can we reverse the decision? So we structured the whole decision around how this may be a mistake. But it is a mistake we can reverse, if we need to.”

Does that mean reversibility is a prerequisite for successful trial and error? “It certainly helps if you can get out of your mistakes. One of the points that I keep making is that the world is very unpredictable and very complicated, so mistakes will happen. And if you design a system, a business plan, or plan your personal life, and that system does not recognize the mistakes that are possible – I can’t help but think of the Euro crisis: you can’t leave, you can’t be bailed out, there’s no provision for default. In other words, you can’t make a mistake. And if you do make a mistake, there will be a problem.”

Why complexity matters

Tim’s call for trial and error applies particularly to social systems. He assesses these systems by their complexity and by whether they are tightly or loosely coupled. A tightly coupled system (banking) works like a game of dominoes: if you flick one, they all topple over, making it difficult to intervene. A loosely coupled system (education) responds unpredictably to changes in syllabuses or immigration laws, but it does so slowly. Much of what Tim talks about concerns complex yet loosely coupled systems, “where you can experiment, and adjust and adapt, and flexibility is everything and correcting your mistakes is everything”.

The role of humans in these systems seems secondary, leaving them victims of a system’s unpredictable dynamics. What drives human behavior in such large systems? “The theory of homo economicus [the human as a rational actor pursuing self-interest] is a useful theory. Asking what incentives and risks people respond to gets you a long way. People respond to incentives; it’s not the only thing, but it is an important thing.” But incentives can also help change a system, for instance when you allow whistle-blowing. Tim cites fascinating research into who speaks up first about fraud in a company – it turns out it’s the employees. “In the United States the average whistle-blower in the healthcare system gets 15 million dollars in reward money. Not surprisingly, at least to an economist, these financial incentives make a huge difference to how willing people are to speak up when they see a problem.”

Too big to fail

Tim told me how astonished he was when, in New York, he saw an announcement for a new TV series about the financial crisis titled “Too Big To Fail”. Failure should be encouraged, rather than avoided. He himself embodies the embrace of failure, as illustrated by examples from his personal life: his daughters’ upbringing and his own career change.

Don’t design

What Tim actually teaches us is that we shouldn’t design a system; instead, he says, “we should be experimenting, even when it’s really important. In fact, the more important the system, the more important it is to experiment.” With this knack for experiments also comes humility. Of experts such as heart surgeons, Tim says: “we don’t really understand all the factors, but they will experiment in a controlled, ethical, transparent way until they know the right treatments to use – well, I think that’s a perfectly responsible way to practice medicine. Doctors have gotten over this idea of understanding everything perfectly in a theoretical world and they’ve embraced experiments. And I think economists should, too.”

Daily life

But then, if the world is too complicated to be grasped by any bright mind, what can we do? “I think, just examine an everyday object. I was fascinated by the project of a design student in London. He reverse-engineered a toaster, spending nine months and 1,000 euros on it, and in the end he just got this lump of useless stuff – though it did get warm if you plugged it into a car battery. And we think of a toaster as a totally simple object: you can buy one whenever you want, it’s totally reliable, totally ordinary, and still it has all this hidden complexity. That doesn’t even begin to scratch the surface of how complicated our economy is. It’s actually rather dizzying when you come to think about it.”

Thanks, Tim. You blew my mind by saying that sometimes the best answer is: I don’t know.

Further reading:

• Tim’s TED talk “Trial, error and the God complex” from TED Global 2011.
• His most recent book, Adapt, has been translated into Dutch by Business Contact.