Model Thinking
I developed a natural affinity for “models” early on.
It started with algorithm design—abstracting reality, extracting variables, identifying patterns. Later, when I read books on management and psychology, like Thinking in Systems and The Fifth Discipline, I realized that machines, organizations, and people all follow similar underlying logic. That moment struck me: models aren’t just the language of algorithms—they’re a way of thinking about the world.
Because of this ingrained habit, when I face complex problems, I instinctively “model” them: I ask, what are the inputs? What are the outputs? What are the feedback loops in this system?—whether it’s code, a team, or a decision, I want to see it as a system that can be simulated. This isn’t deliberate “rationality”; it’s a habit—I’ve always felt that if you can draw the structure, chaos becomes visible.
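The questions above — what are the inputs, what are the outputs, where is the feedback loop? — can be made concrete with a toy simulation. The following sketch is purely illustrative (the names and coefficients are my assumptions, not a real model): a single stock with a constant inflow and an outflow that grows with the stock itself, which is the simplest balancing feedback loop in systems-thinking terms.

```python
# A minimal sketch of "drawing the structure": a hypothetical one-stock
# system with a balancing feedback loop. All names and coefficients are
# illustrative assumptions, not a model from the essay.

def simulate(steps, inflow, drain_rate, start=0.0):
    """Simulate a stock with a constant inflow and a proportional
    outflow. The feedback: the fuller the stock, the faster it drains,
    so the system self-corrects toward an equilibrium."""
    stock = start
    history = []
    for _ in range(steps):
        outflow = drain_rate * stock  # feedback: outflow depends on current state
        stock += inflow - outflow
        history.append(stock)
    return history

# The stock converges to inflow / drain_rate regardless of where it starts.
levels = simulate(steps=200, inflow=10.0, drain_rate=0.2)
print(round(levels[-1], 2))  # settles near 50.0
```

Once the structure is drawn this way, "structural deviation" becomes something you can observe directly: change `drain_rate` and the equilibrium moves, which is exactly the kind of leverage point the essay describes later.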
That’s the beauty of model thinking: it’s not about “finding answers,” but about “finding order.” When most people encounter a complex problem, their first questions are “Who’s right and who’s wrong?” or “What should I do?” But I care more about “Why is this happening?” Models let me replace “events” with “variables” and “emotions” with “relationships.” When you can see the relationships, the problem shifts from “right vs. wrong” to “structural deviation.” It feels like being lost in a fog and suddenly seeing a topographical map—the problem is still there, but it’s no longer intimidating.
Writing the article on “Productivity Determines Relations of Production” was an externalization of this way of thinking. When productivity accelerates while the relations of production remain unchanged, friction is inevitable. This model isn’t just theory—it’s a pattern I’ve repeatedly observed in management. I wasn’t “quoting philosophy”; I was simply using a different lens to see the structure of reality.
If I had to distill my deepest understanding of model thinking, it would be this: models aren’t meant to explain the world—they’re meant to make the world simulable. They let you “sense the inevitability of change before it happens.” They also let you “see the tension between variables when conflict arises.” That’s the true power of model thinking—not cold analysis, but a sense of control.
I later discovered an interesting pattern: the better someone is at modeling, the less prone they are to emotional reactions, because models help you shift from “whose problem is this?” to “what’s the system’s problem?” The better you are at modeling, the easier it is to find leverage points—you start to know “which line to adjust to change the system.” Models free you from passivity; they replace anxiety with the ability to “tweak the structure.”
Someone once asked me how to cultivate model thinking. My answer: don’t rush to read books—first, learn to draw. Map out the problem, label the variables, connect the logical chains. Often, it’s not that you don’t understand, but that you haven’t “visualized the chaos.” Models aren’t knowledge—they’re a form of training. You have to push yourself to “abstract” again and again, until you can see order in the chaos.
The biggest change model thinking brought me isn’t becoming smarter—it’s becoming more humble. Because when you use models to see the world, you realize that everyone is just a part of the system. Instead of blaming individuals, it’s better to fix the structure. Instead of complaining about change, it’s better to optimize the variables. It makes you more objective, and also more gentle.
That’s why I now prefer this description of model thinking: it’s a rational shell wrapped around the warmth of understanding. It doesn’t merely teach you to abstract—it teaches you to use structure to understand the world. It’s not meant to replace human nature, but to help you see the system behind it. The meaning of model thinking has never been about fitting the world into logic—it’s about helping you find, in a complex world, the one line along which you can act with grace.
A model isn’t a way to see the world clearly—it’s a way to see how you see the world.
Originally written in Chinese, translated by AI. Some nuances may differ from the original.
