Here is a 3-sentence crash course in systems thinking:
Good systems encourage good behavior.
Bad systems encourage bad behavior.
Every system is perfectly designed to get the result that it produces.
Each of us encounters systems that reward suboptimal outcomes on a daily basis: doctors getting kickbacks for unnecessary drug prescriptions, salesmen earning commissions by grifting customers, social platforms trading dopamine for compulsive behavior, and so on. This is not a new phenomenon.
According to legend, the British Empire once offered one of its colonies a generous bounty on cobra skins in order to try to reduce the area’s snake population. In response, the locals bred more cobras, grew the snake population, and profited heavily from the surplus. And why wouldn’t they? The system was practically begging them. Sure — this may not have unfolded as the British planned, but it certainly unfolded as it was designed.
This is good news for people looking to catalyze meaningful change, because it means that not all behavior stems from soul-level mechanisms or deeply held ideologies. Sometimes it is the system that needs adjusting. Sometimes it works better to change the payoff rather than the person. Whether you’re trying to change individuals or industries, take time to understand which outcome is actually being encouraged.
It turns out that the road to hell is not paved with good intentions. The road to hell — and maybe heaven — is paved with good incentives. Let’s choose those incentives wisely.