At the FB Heron Foundation we have a vision of a society where everyone has the opportunity to succeed, and no one succeeds at the expense of others or of the planet. Our approach is to change how the economy operates by changing how people think about investing money, so that people consider the full impact of their investments rather than just the financial one.
My background is mostly in the corporate world, and I studied a fair bit of economics when I was younger. During the long discussions we have had on strategy, I couldn't help thinking that the way we in society invest money, and think about risk and return, is just like the classic prisoner's dilemma problem from game theory.
In the classic problem, two parties must each choose whether to remain loyal to each other (and accept some consequences) or to escape those consequences by throwing the other party under the bus. If the parties cooperate with each other, they are both better off than if they both act selfishly. But if one party cooperates and the other doesn't, the cooperator suffers the consequences alone.
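The dilemma is easiest to see with concrete numbers. Here is a minimal sketch using the conventional payoff values from the game theory literature (the 3/5/1/0 figures are the standard ones, an assumption on my part rather than anything specific to this discussion):

```python
# Classic prisoner's dilemma payoffs: (my points, their points).
# C = cooperate (stay loyal), D = defect (throw the other under the bus).
PAYOFFS = {
    ("C", "C"): (3, 3),  # reward: both cooperate
    ("C", "D"): (0, 5),  # sucker's payoff vs. temptation
    ("D", "C"): (5, 0),  # temptation vs. sucker's payoff
    ("D", "D"): (1, 1),  # punishment: both defect
}

# The dilemma in a nutshell: defecting is individually tempting
# (5 > 3 and 1 > 0), yet mutual cooperation beats mutual defection.
both_cooperate = PAYOFFS[("C", "C")]
both_defect = PAYOFFS[("D", "D")]
print(both_cooperate, both_defect)  # (3, 3) (1, 1)
```

Whatever the other party does, defecting earns you more in that single encounter, which is exactly why both parties end up at the worst joint outcome.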
This strikes me as very similar to how we traditionally approach investing money in our society. Traditionally, investors are out to maximize their financial return at an acceptable level of financial risk. They're not considering non-financial impacts such as social and sustainability outcomes, even though (especially in more complex financial products) those non-financial impacts will often affect both the expected financial return and the risk. Meanwhile, to extend the game theory analogy: if investors were to cooperate more with each other, and with the organizations they invest in, would all parties be better off?
In my old economics notes, I found an article describing a series of computer simulations that tested this theory and yielded a number of insights. In the 1980s, Robert Axelrod hosted several tournaments of computer programs to see what happens when various prisoner's dilemma strategies encounter each other within a simulated society, and to determine which types of strategies would ultimately be most successful. Should you always cooperate? Or cooperate at first, say, until you're cheated on?
The article describes in detail the first two rounds of the tournament, and then goes on to describe “ecological” tournaments, which were long series of rounds in which the entrants in each round depended on their success in previous rounds. Kind of like an evolutionary, survival-of-the-fittest model. They conducted thousands of generations of interactions between dozens of types of strategies.
In many ways, the ecological tournaments functioned and evolved a lot like investors:
The types of strategies were very diverse. Some tried to be cooperative, some to exploit the perceived “weaknesses” of others.
No one strategy was consistently successful; success depended on the “environment” — for instance, what other strategies were encountered in the tournament.
In the ecological tournaments, relative success changed over time between the same strategies. For example, if a strategy benefited from exploiting weaker ones, it did well until those that sustained it died out. Then it too would die out if it couldn’t find a way to cooperate.
Over time, success only bred success when it was mutually beneficial.
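The tournament dynamics described above can be sketched in code. This is a minimal illustration, not a reconstruction of Axelrod's actual entries: the three strategies, the 200-round matches, and the replicator-style population update are all my assumptions.

```python
# An Axelrod-style "ecological" tournament: each generation, every
# strategy plays every strategy, and the next generation's population
# share is proportional to each strategy's total score.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_cooperate(mine, theirs):
    return "C"

def always_defect(mine, theirs):
    return "D"

def tit_for_tat(mine, theirs):
    # Cooperate first, then mirror the opponent's previous move.
    return theirs[-1] if theirs else "C"

def play_match(s1, s2, rounds=200):
    """Play an iterated prisoner's dilemma; return both totals."""
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1)
        h2.append(m2)
        score1 += p1
        score2 += p2
    return score1, score2

def ecological_tournament(strategies, population, generations=50):
    """Evolve population shares: opponents are weighted by how common
    they are, so a strategy's 'food supply' can die out underneath it."""
    for _ in range(generations):
        fitness = {}
        for name, strat in strategies.items():
            per_unit = sum(play_match(strat, other)[0] * population[o_name]
                           for o_name, other in strategies.items())
            fitness[name] = per_unit * population[name]
        total = sum(fitness.values())
        population = {name: f / total for name, f in fitness.items()}
    return population

strategies = {"cooperate": always_cooperate,
              "defect": always_defect,
              "tit_for_tat": tit_for_tat}
final = ecological_tournament(strategies, {n: 1 / 3 for n in strategies})
```

In this setup the exploiter starts out slightly ahead (there are plenty of unconditional cooperators to prey on), then collapses once its victims dwindle, while the reciprocating strategy ends up dominating the population — the very pattern the article describes.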
Of the more aggressive strategies, Axelrod wrote, “Not being nice may look promising at first, but in the long run it can destroy the very environment it needs for its own success.” Also, programs that tried to be very clever and complex ran the risk of appearing to others as merely chaotic.
Interestingly, in the ecological tournaments, the most successful program didn’t actually “defeat” any other programs; by design, its best outcome was a tie. It won by eliciting behavior from the other programs it encountered that allowed both to do well. Axelrod wrote:
In a non-zero sum world you do not have to do better than the other player to do well for yourself. This is especially true when you are interacting with many different players. Letting each of them do the same as or a little better than you is fine, as long as you tend to do well yourself. There is no point in being envious of the success of the other player, since in an iterated prisoner’s dilemma of long duration the other’s success is virtually a prerequisite of your doing well yourself.
In his book on this topic, The Evolution of Cooperation, Axelrod answers a number of fundamental questions about the evolution of cooperation in a world of raw egoism. One is: How can cooperation get started in a world of unconditional defection — when there is nothing but a primordial sea of organisms all trying to survive at each other's expense? The simulations showed that an invasion by small clusters of cooperating organisms, even if they form a tiny minority, is enough to give cooperation a foothold. One cooperator alone will die, but small clusters of cooperators can arrive (by means of migration or mutation) and propagate even in a hostile environment.
One of the most striking results was that cooperation, once established, is remarkably robust. The simulations showed that an environment of "meanies" (strategies that simply try to take advantage of each other) can be penetrated by cooperators in clusters, but a world of cooperators cannot be penetrated by meanies. Once cooperation has established itself, it is permanent.
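A back-of-envelope calculation shows why the asymmetry holds. Assuming 200-round matches with the standard 3/5/1/0 payoffs (both figures are my illustrative assumptions), tit-for-tat scores 600 against itself and 199 against always-defect (which scores 204 against it), while always-defect scores 200 against itself:

```python
# Per-match totals over 200 rounds, classic 3/5/1/0 payoffs:
TFT_TFT = 600  # tit-for-tat vs. tit-for-tat
TFT_D = 199    # tit-for-tat's score vs. always-defect
D_TFT = 204    # always-defect's score vs. tit-for-tat
D_D = 200      # always-defect vs. always-defect

def invader_score(own_kind, vs_native, p):
    """Expected score of an invader who meets its own cluster
    with probability p and the native population otherwise."""
    return p * own_kind + (1 - p) * vs_native

# Even a tiny cluster of cooperators (1% of their interactions
# are with each other) out-earns the defecting natives:
p = 0.01
print(invader_score(TFT_TFT, TFT_D, p) > D_D)  # True (203.01 > 200)

# But defectors can never invade a world of tit-for-tat:
# even a fully clustered defector scores at most 204 < 600.
print(invader_score(D_D, D_TFT, p) > TFT_TFT)  # False
```

The asymmetry is that cooperators gain enormously from meeting their own kind (600 vs. 199), while defectors gain almost nothing from clustering (200 vs. 204) — so cooperation can get a foothold but exploitation cannot.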
I feel like this study could so easily have been about investing: investors, investees, asset managers, traders and so on. The conclusion seems so obvious: if we worked together and didn't try to take advantage of each other, we'd all be better off.
All investing is impact investing, and I think these tournaments illustrate that one day all investors will think like impact investors, taking into account the full spectrum of impact of their investments, and cooperating to create value.
And I’d like to think that means that we as a society will end poverty, do the right thing by the planet and be able to focus (without guilt) on the really important things such as exploring the universe. I think the recent financial crisis may have been the catalyst we needed to make this happen.
As investors and asset managers, we can all take a page from the advice Axelrod gives to players of the prisoner's dilemma: don't be envious, don't be the first to defect, reciprocate both cooperation and defection, and don't be too clever. And take heart: even if our cooperation with social and environmental stakeholders is small at first, we can permeate this system, and we are already starting to.