I had Forrester as a professor. I don't think he ever said the forecast outcome was correct. No student was taught to believe the output of the modeling technique is precise. The purpose of the models is to understand scenarios and nonlinear systems. It is notable that the MIT students who took his courses went on to create the technology that avoided the worst case of the forecast, which was really the purpose: to train minds to make the world better. I believe at the time he said you could not model the price signal properly, which was ultimately the conclusion of the economists who rebutted Ehrlich.
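For anyone who hasn't seen system dynamics in practice, a rough stock-and-flow sketch makes the point. This is my own toy model, not Forrester's actual World2 equations: population grows while drawing down a nonrenewable resource, and the birth rate falls as the resource depletes. You run it under several assumptions and compare the scenario shapes (overshoot versus plateau) rather than trusting any single trajectory.

```python
# A minimal stock-and-flow sketch in the spirit of system dynamics
# (a toy, not Forrester's World2): two stocks (population, resource),
# with flows coupled by a scarcity feedback. Integrated by simple
# yearly steps.

def run(resource0=1000.0, use_per_capita=0.5, years=200):
    population, resource = 10.0, resource0
    history = []
    for _ in range(years):
        scarcity = resource / resource0            # 1.0 = plentiful, 0.0 = gone
        births = 0.05 * population * scarcity      # birth rate damped by scarcity
        deaths = 0.02 * population
        population += births - deaths
        resource = max(0.0, resource - use_per_capita * population)
        history.append((population, resource))
    return history

# Compare scenarios under different resource assumptions: the point is
# the shape of each trajectory, not the numbers.
scenarios = {r: run(resource0=r) for r in (500.0, 1000.0, 2000.0)}
```

Every trajectory here eventually overshoots and declines; what changes across scenarios is when and how high, which is exactly the kind of qualitative insight the technique is meant to give.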
Forrester was also key to early radar work, led the first national air defense system in the 1950s, invented magnetic-core memory, helped start DEC, and demonstrated why massive public housing projects increase poverty.
Certainly the world model looks at lots of different scenarios, and (as the piece notes) Forrester is much more thoughtful and precise in his language than many of his less sophisticated imitators. But all of these scenarios are (much?) more pessimistic than the actual outcome, and in Forrester's writing (e.g. in the preface to the second edition) it's clear that he thinks he's discovered something that means the world will need to fundamentally shift course when it comes to growth (even while he allows that the details of how that happens carry a lot of uncertainty). So I think it's too generous to describe it as a hypothetical exercise that could never be falsified by subsequent events.
That said, as I hope comes across in the piece, I remain a fan of Forrester, not least for the reasons you list. Even though it failed, the world model was a worthy experiment, and still has value in showing the limits of this approach. Where I think the culpability lies is in the wider, modern use of 'systems thinking', often by people much, much less careful and well trained than Forrester and his students, but who have responsibility for systems much, much more complicated and consequential for the rest of us. It's a bad combination!
Quite: the Club of Rome's predictions have not yet come true for us precisely because legislators across North America and Europe reacted to them. We planted more trees to replace those felled, although our biodiversity decreased. We phased out domestic coal fires. Deforestation and pollution became worse elsewhere as industrial production shifted away from the wealthy world.
By the way, Britain had radar in the 1930s, while Forrester was still in high school. My grandfather was a glassblowing technician on the early systems while working for television developer EMI at Hayes, Middlesex, which had won the BBC's broadcast competition against the Baird system in 1936. No doubt Forrester made improvements to American implementations.
Forrester worked on the radar tracking, not the core transmission. But yes, he was a research assistant on an effort to tie the sweep to the display. Worth noting only because he was a bit of a controls person.
He did have us guess how much memory the ICBM tracking code consumed: the part of Whirlwind that tracked an incoming missile and its terminal point. We all guessed 1 MB, 512 KB, etc. The answer was 16 KB.
Also noteworthy: he was accused of being a racist by Tip O'Neill when he did the public housing model, which basically showed that large public housing projects reinforced poverty because they lacked the organic community mentoring and social influence structures that motivate economic growth. He pointed out that it's more like something a racist would do to build these concentrated projects. The net result is that we tore them all down.
My sense is there is a little bit of Jay Forrester in everyone who graduates from MIT, more so if they took System Dynamics. It's also notable how much of the world's startups and innovation flow from those graduates.
In Scotland they built the same style housing projects for white people. The problem was modernist theory.
Great read! And reassuring to learn that the lazy instinct to start over with something simple is actually the wise and effective thing.
Although software developers will often argue for starting from scratch when the real problem is that they don't understand their predecessor's code.
Excellent piece! Very much aligned with my own thinking on how complex systems actually work (and fail).
Might I offer Gerald Gaus's "The Tyranny of the Ideal" as a philosophical and moral counterpart to your practical and operational argument? Your analysis of systems "kicking back" against top-down design is the practical outcome of the kind of moral hubris Gaus describes.
I was particularly struck by your mention of Australia's NDIS, which is a prime example of getting almost everything wrong by violating these principles. I made this connection explicitly in my own writing, where I used the NDIS to illustrate the central ideas from Gaus's book.
Thanks for a great read.
You can find the review at https://peter.evans-greenwood.com/2025/08/27/the-tyranny-of-the-ideal/ should you be interested.
Thank you very much - added to my reading list!
Wow. That pulls together a lot of analyses to illustrate this critical understanding. It does show why freedom usually trumps government control. It's satisfying to see that the moral method is also the practical method. I'm saving this essay as an antidote to big government hubris.
Thank you. The fact that private companies can go bust, get shut down, and get replaced by simpler, newer entrants is such an important safety release valve for complexity.
All systems resist change; complex systems resist change in complex ways.
Great read!
Thank you!
I like the implication that when a civilization (which is just another complex system) is in a doom loop, there will come a time when it's much easier to simply rebuild from scratch.
It's definitely interesting to look at metrics like state spending as a share of GDP: many historical problems were solved (if that's the word) by a big expansion of the state, cf. Britain's triumph in the Napoleonic Wars, or the US after the Great Depression. Even if other parts of the state were moribund, adding a whole load of new spending and institutions lets you create some new, simple, working systems. But today there's not a lot of headroom left to repeat that trick!
The Chinese mastered systems thinking centuries ago and today they are the masters of implementation, too.
Is there anything you'd recommend on this topic, Godfree? Perhaps it's something that Dan Wang's new book touches on? It would be interesting to understand whether having a leadership class steeped in engineering means that you take on the same complex systems challenges as the West and just execute them with more skill, or whether it teaches the humility to steer clear of the complexity in the first place.
Lots to agree with here. I'm slightly unconvinced that many people prefer, as you put it, the idea that these systems are the result of deliberate analysis. I tend to see this as a question of whether we are lumpers or splitters in the way we view things, with systems thinking being very much a lumper's perspective. As lumpers we see complexities with lots of interactions and unexpected and unintended consequences: in the version of the beer game that I know, the whole system gets destabilised because a local brand of beer that sells in small quantities gets name-checked in a music video and suddenly becomes popular. As splitters we look at individual components, which may indeed be carefully designed and work as intended. If anything, I would expect the individual components to be more likely to fight back than the complete systems.
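That destabilisation is easy to reproduce in a few lines. A rough sketch follows, using my own simplified ordering rules rather than the official MIT Beer Game mechanics: four stages in a supply chain, each ordering enough to cover observed demand plus half its inventory shortfall, with a two-week delivery delay. A single week of spiked demand (the "music video" moment) gets amplified as it travels upstream.

```python
# Toy bullwhip-effect simulation (simplified, not the official Beer Game):
# each stage ships what the downstream stage asks for, then places an
# order covering that demand plus half the gap to its target inventory.
# Orders arrive after a two-week delay (always filled, for simplicity).

def simulate(weeks=30, spike_week=5, base_demand=4, spike_demand=12):
    stages = 4                          # retailer, wholesaler, distributor, factory
    inventory = [12] * stages           # starting stock at each stage
    pipeline = [[base_demand, base_demand] for _ in range(stages)]  # in-transit goods
    orders_placed = [[] for _ in range(stages)]
    target = 12                         # desired inventory level

    for week in range(weeks):
        demand = spike_demand if week == spike_week else base_demand
        for s in range(stages):
            inventory[s] += pipeline[s].pop(0)          # receive delayed delivery
            shipped = min(demand, inventory[s])         # ship what we can
            inventory[s] -= shipped
            order = max(0, demand + (target - inventory[s]) // 2)
            orders_placed[s].append(order)
            pipeline[s].append(order)                   # arrives in two weeks
            demand = order              # this stage's order is upstream's demand
    return orders_placed

orders = simulate()
peaks = [max(o) for o in orders]        # peak order at each stage
```

With these numbers the one-week jump from 4 to 12 units at the retailer produces a far larger spike by the time it reaches the factory, purely from locally sensible ordering rules, which is the lumper's point: no component misbehaves, yet the whole destabilises.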
I like the lumpers vs. splitters distinction! Is there anything further out there I could read about it?
As far as I can tell, the terms lumper and splitter go back to Darwin, writing about biological classification. Where I use them, I mean something closer to what would be called holistic versus reductionist thinking, and systems thinking tends to be strongly holistic. Jay Forrester's perspective is highly holistic, as, for example, is that of Norbert Wiener, who pioneered the idea of cybernetics. I don't see either as arguing that it's always necessary to think holistically, but that a holistic approach is essential for dealing with big, knotty problems.
Peter Senge's 'The Fifth Discipline' is an interesting business text from the 1990s based on System Dynamics. From earlier, the magnificently creative and eccentric Stafford Beer, a British thinker who also worked in Chile during the Allende government, wrote 'Brain of the Firm', his take on organisations as complex systems.
Fantastic work! Thanks for sharing!