Mental Models of Great Engineers: Focus, Friction, Feedback
“The purpose of abstraction is not to be vague, but to create a new semantic level in which one can be absolutely precise.” — Edsger Dijkstra
There’s a kind of engineering mind you encounter rarely. Not necessarily the loudest, nor always the fastest to answer. But when they speak, everything slows down. You feel less fog, more structure. Their words feel inevitable — like they’ve seen around a corner you didn’t know existed.
What distinguishes these engineers — the senior ones in spirit, not just in title — isn’t a fixed set of knowledge, tools, or even experience in years. It’s how they see. The lens they use to model the complexity of systems, tradeoffs, and people. If you could look inside their head, you’d find three dominant forces shaping their mental architecture: focus, friction and feedback.
These are not vague virtues. They are constructs. Lenses. Each enables a kind of clarity that accumulates and compounds over time. Together, they form the cognitive foundation of engineers who can both build robust systems and reason clearly under pressure.
Let’s dissect each.
I. Focus: The Physics of Attention
“The skill of deep work is becoming rare at exactly the same time it is becoming more valuable.” — Cal Newport
The Scarcity of Depth
We begin with focus, because it governs everything that follows. Without focus, there is no attention. Without attention, there is no modeling. Without modeling, there is no clarity.
Cal Newport calls this deep work — the ability to work deeply on hard problems, while resisting distraction. But in real engineering environments, this isn’t just a productivity technique. It’s survival logic. Systems thinking demands stack-depth. You must trace behaviors across abstraction layers — from process scheduling to API guarantees to team incentives. You can't do this between meetings or in 12-minute pomodoros.
Senior engineers protect cognitive continuity. They architect their days, communication habits, and toolchains to enable extended states of reasoning. This isn’t hustle culture or monk-mode extremism — it’s a systemic reaction to the complexity gradient. The deeper you go into a problem, the more expensive context-switching becomes.
They also have an internal radar for signal. Ask a junior developer to describe a bug, and you get a wall of logs. Ask a senior, and you get a model: “This seems like a distributed lock starvation issue — I suspect contention is spiking in the leader election code.” Focus reveals itself as selectivity — the ability to suppress noise and home in on what matters.
Paul Graham wrote that great hackers are able to "tune out everything outside their own heads". But I think it’s more precise to say they have an appetite for epistemic solitude — a state where ambiguity is metabolized in peace, without the clutter of cheap opinions. Focus gives them the bandwidth to build models, not just solutions.
Their bandwidth is finite — and they treat it as capital, not charity.
Working Memory, Mental Caching, and State
Cognitively, focus is bounded by working memory. You cannot hold more than a few layers of abstraction in your head without degrading your judgment. Great engineers know this, and so they architect both code and team environments to preserve mental state. They favor:
Stateless tooling: tools that don’t leak state between runs.
Defensive architecture: systems that fail loudly and early instead of rotting silently (a minimal sketch follows this list).
Interrupt-resilient workflows: think commit discipline, modular branches, codified deployment paths.
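To make “fail loudly and early” concrete, here is a minimal Python sketch, assuming a small service configured through environment variables (the variable names are hypothetical). The point is that the process refuses to start in an ambiguous state, rather than degrading quietly and leaving a future debugging session to reconstruct what went wrong.

```python
# A minimal sketch of "fail loudly and early": validate assumptions at startup
# instead of letting a misconfigured service rot silently in production.
# The environment variable names here are hypothetical, chosen for illustration.
import os
import sys


def load_config() -> dict:
    """Read required settings from the environment and refuse to start without them."""
    required = ["DATABASE_URL", "QUEUE_URL", "SERVICE_TIMEOUT_SECONDS"]
    missing = [key for key in required if not os.environ.get(key)]
    if missing:
        # Crash at boot with a precise message, not hours later with a vague one.
        sys.exit(f"refusing to start, missing config: {', '.join(missing)}")

    timeout = float(os.environ["SERVICE_TIMEOUT_SECONDS"])  # a bad value fails here, loudly
    if timeout <= 0:
        sys.exit(f"SERVICE_TIMEOUT_SECONDS must be positive, got {timeout}")

    return {
        "database_url": os.environ["DATABASE_URL"],
        "queue_url": os.environ["QUEUE_URL"],
        "timeout_seconds": timeout,
    }


if __name__ == "__main__":
    config = load_config()
    print("starting with validated config:", sorted(config))
```

The same instinct scales up: schema checks at deserialization boundaries, assertions at module seams, health checks that verify dependencies before a service accepts traffic.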
In a world where “10x engineering” is largely a myth, clarity retention across sessions becomes the real multiplier.
II. Friction: The Feel for Resistance
Friction is not the enemy. It’s where the system reveals its structure.
Most Engineers Fight Friction; Great Ones Listen to It
Most engineering organizations think about velocity. Great engineers think about friction.
Friction is the felt resistance between intent and outcome. It’s the drag coefficient of the system, in code and in process alike. You try to build X, but spend 70% of your time wrestling with Y. You attempt to ship a fix, but the CI pipeline burns 15 minutes before failing without a useful error. You try to coordinate with two teams and realize they each use a different definition of “done.”
Where junior engineers feel frustration, great engineers detect texture. They learn to sense structural resistance. They know when an abstraction leaks too often. When a codebase punishes exploration. When an interface is semantically brittle, even if the tests pass. This friction is not a bug — it’s a signal.
A standout trait among senior engineers is how quickly they stop blaming themselves when things “feel wrong.” Instead, they probe: Why does this workflow create cognitive dead-ends? Why is this bug so hard to isolate? Often, the answer lies not in one line of code, but in a design misfit — a place where assumptions silently diverged from reality.
There’s a passage in Eliezer Yudkowsky’s writing on rationality where he describes “noticing confusion.” Most people experience confusion as discomfort and move on. A rationalist treats it like a fire alarm. Senior engineers operate the same way: friction is not something to tolerate — it’s something to model.
One example: in distributed systems, retry logic often hides failure modes. The system appears “resilient,” but in reality it is quietly swallowing errors. Great engineers develop a taste for invisible friction: systems that “mostly work” until they don’t. They know that debuggability is not an afterthought; it is a first-class design constraint.
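As a sketch of that distinction, assuming a flaky upstream call named fetch_balance (invented for the example), here is retry logic that keeps the friction visible: every failed attempt is logged, and once the budget is exhausted the original error is re-raised instead of being swallowed.

```python
# A sketch of retry logic that keeps failure visible instead of hiding it.
# fetch_balance and its failure rate are hypothetical stand-ins for a real dependency.
import logging
import random
import time

logger = logging.getLogger("payments.client")


def call_with_visible_retries(func, *, attempts: int = 3, base_delay: float = 0.2):
    """Retry a flaky call, but emit a signal for every failure along the way."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except Exception as exc:
            # The difference between "resilient" and "quietly rotting":
            # the failure is recorded even when a later retry succeeds.
            logger.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise  # surface the real error with its original traceback
            time.sleep(base_delay * (2 ** (attempt - 1)) * (1 + random.random()))


def fetch_balance():
    # Hypothetical flaky dependency, standing in for a network call.
    if random.random() < 0.5:
        raise TimeoutError("upstream ledger timed out")
    return 42


if __name__ == "__main__":
    logging.basicConfig(level=logging.WARNING)
    # On an unlucky run all attempts fail and the error surfaces; that is the point.
    print("balance:", call_with_visible_retries(fetch_balance))
```

The counterexample is the wrapper that catches everything, returns a default, and tells no one; it looks resilient on a dashboard and is invisible in an incident.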
Imagine a payments microservice that’s become the bottleneck for a multi-product company. Every new product line wants to hook into it. Suddenly, latency balloons, on-call burns out, and cross-team PRs become a negotiation minefield.
An average engineer might start optimizing queries.
A good one might suggest sharding by tenant or product.
A great engineer also asks: Why did this boundary absorb so many responsibilities in the first place?
They go upstream:
Was the original product boundary defined around code or business value?
Did shared ownership evolve, or was it defaulted into?
What friction signals did we ignore 6 months ago?
This engineer isn’t just fixing the bottleneck; they’re repairing the decision process that let it form.
III. Feedback: Epistemic Humility in Action
If you can’t tell when you’re wrong, you’ll keep getting better at being wrong.
Software Is a Belief System Under Test
No model is perfect. But some are calibrated. That’s where feedback comes in.
Engineering is applied epistemology. You’re making bets on how a system will behave under real-world constraints: load, failure, misuse, entropy. And like any map, your internal model must be updated regularly against the territory. Great engineers practice tight feedback-loop hygiene: they actively seek out the deltas between belief and behavior.
David Perell talks about the concept of idea sex, the combinatorial creativity that comes from crossing domains. But feedback is how ideas meet resistance, and therefore reality. A tight feedback loop is what turns intuition into informed intuition.
Great engineers don’t just ship and forget. They instrument, observe, and revisit. Not because they don’t trust their work — but because they do trust their curiosity. Feedback enables something subtle: regret minimization. When a decision proves wrong, they want to understand why — so the next model has fewer blind spots.
They also build systems with explainability in mind. Not AI explainability in the fashionable sense, but causal explainability — being able to answer: Why did this behave this way? Feedback isn't just external (metrics, bugs, failures), but also internal: the system gives off affordances that make it intelligible to future readers.
This reflects a deep shift in mindset: from output to iteration. From “Did it work?” to “How does it evolve?” Feedback makes the system legible to itself.
This shows up as:
Writing postmortems that critique thinking patterns, not just root causes.
Building feedback-rich tools: tests that cover failure modes, dashboards that narrate system health (a sketch follows this list).
Favoring instrumentation over guesswork — not just metrics, but diagnostic observability.
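Here is a small illustration of the first two points, under stated assumptions: the charge_card function, the event names, and the in-memory Counter standing in for a metrics client are all invented for the sketch. The handler emits an event for every outcome, and the test exercises the failure mode and asserts on the diagnostic signal, not just the exception.

```python
# A sketch of feedback-rich code: the handler narrates its own outcomes,
# and the test covers the failure mode, not only the happy path.
# charge_card and the event names are hypothetical, for illustration only.
import unittest
from collections import Counter

events = Counter()  # stand-in for a real metrics client


def charge_card(amount_cents: int) -> str:
    """Charge a card, emitting an event for every outcome so a dashboard can narrate health."""
    if amount_cents <= 0:
        events["charge.rejected.invalid_amount"] += 1
        raise ValueError(f"amount must be positive, got {amount_cents}")
    events["charge.accepted"] += 1
    return "ok"


class ChargeCardFailureModes(unittest.TestCase):
    def test_rejects_invalid_amount_and_records_why(self):
        with self.assertRaises(ValueError):
            charge_card(-500)
        # The second assertion is about the feedback signal, not the exception:
        # a future reader can see why charges were rejected, not only that they were.
        self.assertEqual(events["charge.rejected.invalid_amount"], 1)


if __name__ == "__main__":
    unittest.main()
```

The interesting assertion is the last one: it encodes the claim that the system should be able to explain why it rejected a charge, which is the causal explainability described above.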
IV. Organizational Inheritance: Scaling These Models
While individual engineers can internalize these mental models, the real leverage comes when teams and orgs absorb them. That means:
Creating onboarding that teaches reasoning patterns, not just stack knowledge.
Promoting engineers who model clarity under ambiguity, not just throughput.
Codifying systems design reviews that reward epistemic humility, not architectural ego.
A team’s culture is downstream of what it optimizes attention for, what it treats as normal friction, and how it processes failure. Teams that model focus, friction, and feedback at the system level don’t just scale better — they decay slower.
Closing Thought: The Compass, Not the Map
When these three mental models are stacked — Focus → Friction → Feedback — something larger emerges: a self-improving system. A kind of internal DevOps loop for cognition.
Focus lets you perceive deeply. Friction lets you perceive honestly. Feedback lets you perceive accurately.
The best engineers I know aren’t infallible. They just recover faster.
They don’t guess better. They observe sooner.
They don’t over-architect. They zoom out just long enough to see what’s really going on — before it hurts.
And then they build from that place — grounded, systemic, and clear-eyed.
As you grow in your own practice, don’t just chase knowledge. Develop taste. Taste for what focus feels like when it clicks. Taste for friction that’s not accidental. Taste for feedback that sharpens, not flatters.
Because in the end, software engineering is not just about building things. It’s about building systems that hold up under pressure, uncertainty, and time. And that requires mental models that do the same.