Psychology & Behavior
Section 1
The Core Idea
We perceive what we expect or want to see. The same data supports opposing conclusions. Confirmation bias is selective perception in action — the downstream effect of a filter that operates before conscious analysis begins.
In 1999, Christopher Chabris and Daniel Simons ran an experiment that should have ended every argument about objectivity. They asked subjects to watch a video of six people passing basketballs and count the passes made by the team in white. Partway through, a person in a gorilla suit walked to the centre of the frame, beat their chest, and walked off. Roughly half of the subjects didn't see the gorilla. Their visual systems, focused on counting passes, literally filtered out a chest-beating gorilla standing in plain sight for nine full seconds.
The brain doesn't passively receive reality. It actively constructs it, selecting what gets through based on expectations, goals, and prior beliefs. What you're looking for determines what you see. What you're not looking for becomes invisible.
In product: users see what they're primed to see. A founder who built the product cannot perceive the friction new users experience. The founder's expectation filter knows where every button is and interprets every error message correctly. New users don't share that filter. They see a confusing interface. The founder watches a usability test and is genuinely surprised — not because they ignored the problems, but because their perceptual system literally couldn't register them.
In strategy: executives see threats that fit their narrative and miss those that don't. An incumbent dismisses a new market entrant because "they're not competing with us." The incumbent's filter is calibrated to existing competitors — same customers, same channels, same value proposition. A disruptor serving different customers through a different channel doesn't trigger the competitor-recognition filter. The incumbent literally doesn't see the threat until it's too late.
In investing: bulls and bears examine identical data — the same earnings report, the same macroeconomic indicators, the same competitive landscape — and reach opposite conclusions. The bull's perceptual filter amplifies signals of growth and discounts signals of risk. The bear's filter does the inverse. Neither is lying. Both are seeing a version of reality that has been edited by their expectations before conscious analysis begins. The perception precedes the reasoning. The reasoning then rationalises what perception already selected.
Amazon's "working backwards" process (start with the customer, not with your own assumptions) is a structural defence against selective perception. The mechanism: attention is limited; we filter by relevance; relevance is shaped by beliefs. The antidote: seek disconfirming evidence, and build systems that force data contradicting your expectations into the decision process.
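The "build systems" part can be made concrete as a process rule. A minimal sketch in Python (the `DecisionMemo` class and its fields are hypothetical illustrations, not anything Amazon actually uses): a decision record that refuses to finalise until at least one piece of disconfirming evidence has been entered, so the contradicting data cannot be filtered out of the process.

```python
from dataclasses import dataclass, field

@dataclass
class DecisionMemo:
    """A decision record that cannot be finalised without disconfirming evidence."""
    question: str
    supporting: list = field(default_factory=list)
    disconfirming: list = field(default_factory=list)

    def finalise(self) -> str:
        # Force the filter open: refuse a verdict until contradicting
        # data is on the record alongside the confirming data.
        if not self.disconfirming:
            raise ValueError("List at least one piece of disconfirming evidence first.")
        return (f"Decision on '{self.question}': "
                f"{len(self.supporting)} for, {len(self.disconfirming)} against.")

memo = DecisionMemo("Should we dismiss the new market entrant?")
memo.supporting.append("They serve a different customer segment today.")
try:
    memo.finalise()  # blocked: no disconfirming evidence recorded yet
except ValueError:
    memo.disconfirming.append("Their segment is growing faster than ours.")
print(memo.finalise())
```

The design choice mirrors the argument: the check does not make anyone smarter, it simply makes the disconfirming column a precondition rather than an afterthought, so perception's edit cannot silently become the whole input.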
The information age made the problem worse, not better. More data does not produce less selective perception. It produces more material for the filter to work on. The investor with access to Bloomberg terminals, alternative data feeds, and real-time market sentiment has more data than any investor in history — and the same filtering brain that a 1950s stock picker had. The additional data doesn't overcome the filter. It gives the filter more raw material from which to selectively extract confirming evidence. Selective perception scales with the volume of available information.