Game UI Lessons

Nobody needs to play a videogame. There’s no utility to fall back on. No bank account, no tax filing, no work email that forces you to stay. The interface is the entire product. A player who doesn’t enjoy the first five minutes alt-tabs and never comes back. No sunk cost, no data lock-in, no switching cost. The only moat is how the game feels.

This makes game UI the most brutally competitive design environment that exists. Apps compete for a purchase. Games compete for every second of attention against other games that are also pure interface. Bad game UI doesn’t survive long enough to ship a sequel.

Games solved onboarding, conveyance, feedback, and novice-to-expert scaling decades ago because they couldn’t afford not to. Apps lean on utility and never have to.

What the pressure produced

WoW’s quest system solved onboarding. Every quest has an unambiguous goal, calibrated effort, visible progress, and a chained reward that pulls you into the next one. Early quests teach one mechanic at a time, not through text but through play. App onboarding dumps everything at once and hopes you figure it out.
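The quest pattern the paragraph describes is small enough to sketch as a data structure. This is a minimal illustration, not Blizzard's implementation; all names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Quest:
    """One link in a quest chain: unambiguous goal, visible progress, chained reward."""
    goal: str        # unambiguous: "Kill 8 wolves", never "explore the area"
    target: int      # calibrated effort: small and countable
    reward: str      # the pull into the next link
    next_quest: Optional["Quest"] = None
    progress: int = 0

    def advance(self, amount: int = 1) -> Optional["Quest"]:
        """Record progress; on completion, hand the player the next link."""
        self.progress = min(self.target, self.progress + amount)
        return self.next_quest if self.progress >= self.target else None
```

Every field maps to one of the four properties: `goal` is the unambiguous objective, `progress`/`target` is the visible progress bar, `reward` plus `next_quest` is the chained pull. App onboarding rarely has an equivalent of `next_quest`.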

Slay the Spire solved novice-to-expert scaling. The first combat has no tutorial. You draw cards, drag one onto an enemy, a damage number flies off. The interface teaches through animation, pointer highlights, and immediate feedback at the moment you encounter each mechanic. Not before. A beginner and an expert see the same screen; the expert just reads it deeper. No “simple mode” vs “advanced mode.” One interface, layered information.

Dark Souls solved tutorialization. It teaches through level design, not text. A weak enemy near a bonfire teaches combat safely. A narrow bridge teaches the cost of falling. The environment is the tutorial.

These aren’t cherry-picked. The full pattern catalog has more: cooldown indicators, HUD minimalism, progress visualization, inventory systems. They all emerged from the same pressure.
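One entry in that catalog, the cooldown indicator, is compact enough to sketch. The core of it is a single clamped fraction that drives the radial “clock wipe” drawn over an ability icon; function and parameter names here are illustrative, not from any particular engine.

```python
def cooldown_fraction(started_at: float, duration: float, now: float) -> float:
    """Fraction of a cooldown still remaining, clamped to [0, 1].

    1.0 means the ability was just used (icon fully shaded);
    0.0 means it is ready. The renderer sweeps the shaded wedge
    through an angle of fraction * 360 degrees.
    """
    if duration <= 0:
        return 0.0
    remaining = duration - (now - started_at)
    return max(0.0, min(1.0, remaining / duration))
```

The clamp matters: without it, a frame rendered slightly after the cooldown ends draws a negative wedge. The pattern conveys state without text, which is why it migrated from MMO hotbars to nearly every action game.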

Provenance

These ideas have a literature that doesn’t get read outside game design.

Raph Koster’s A Theory of Fun (2004) made the foundational argument: fun is learning. Games are pattern-recognition machines. When the brain stops finding new patterns, it gets bored. This is the cognitive basis for progressive disclosure. Not a UX heuristic, but how brains work.

Jesse Schell’s The Art of Game Design (2008) introduced over 100 “lenses” for evaluating a design, drawn from psychology, architecture, music, and film. None are game-specific.

Steve Swink’s Game Feel (2008) named “juiciness”: the deliberately excessive feedback that makes actions feel significant. This is why a button click in a well-made app feels different from the same button in a bad one. Swink formalized what the transitions post calls temporal coherence.
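A concrete unit of juice is the easing curve. The widely used “ease-out-back” curve overshoots its target before settling, which is what reads as a pop rather than a flat resize. The formula below is the standard one (overshoot constant 1.70158); `pop_scale` is a hypothetical helper showing how it might drive a button's scale on click.

```python
def ease_out_back(t: float, overshoot: float = 1.70158) -> float:
    """Standard 'back' easing: climbs past 1.0, then settles back to 1.0 at t=1."""
    c3 = overshoot + 1.0
    t -= 1.0
    return 1.0 + c3 * t**3 + overshoot * t**2

def pop_scale(t: float, amount: float = 0.15) -> float:
    """Scale factor for a pressed button at normalized time t in [0, 1]:
    starts slightly squashed, overshoots full size, settles at 1.0."""
    return 1.0 + amount * (ease_out_back(t) - 1.0)
```

A linear interpolation would reach the same end state; the overshoot is pure juice, carrying no information except “this action registered,” which is exactly Swink's point.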

Celia Hodent’s The Gamer’s Brain (2017) is the direct bridge. Hodent was UX director at Epic Games during Fortnite’s development and holds a PhD in cognitive psychology. She maps game UX to neuroscience and treats game interfaces as applied cognitive science.

The corruption

“Gamification” took the surface and missed the depth. Points, badges, leaderboards — the dopamine loop stripped of everything that made it work. No conveyance. No progressive disclosure. No effort calibration. No teaching through play. Just extrinsic rewards bolted onto interfaces that were already bad.

The lesson was never “add game mechanics to apps.” It was: games solved these problems first because players leave immediately if the interface fails. The discipline came from the constraint, not from the points.

The takeaway

Take the best lessons from the harshest design environment and transplant them onto everything else that can use them. Games can’t survive on utility. Everything else can — which is exactly why everything else settles for worse UI.