Shared Stories and Illusions

February 1, 2026

By Stephen Stofka

This post was not available earlier this morning because of a technical glitch.

As a child I enjoyed the puzzles in the Highlights for Children magazine. One of these was an optical illusion that we could share with our friends. There was a drawing of a cube. Did it face left or right? The lines of the cube do not move but different people see it facing different directions. I was surprised when I looked away from the drawing, then looked back and the cube faced in the other direction. Another illusion featured a debate over the number of rods in the picture. Four rods or three rods? It depends on our perspective and which part of the drawing we instinctively look at. This week I want to explore the axis of the whole and its parts.

Take the cube example. The cube faces left, and the leftmost vertical line is its front left edge. Or the cube faces right, and that same line forms its rear left edge. The lines of the cube do not move, but the role that each line plays can change depending on our immediate interpretation of the whole figure. Drawing just a hint of shadow along one line can fix the position of the cube.

In 1892, an anonymous German illustrator drew the Duck-Rabbit Ambiguous Figure (Source). The duck-rabbit illusion led philosopher Ludwig Wittgenstein (1889–1951) to the insight that we often perceive the whole of something before noticing its parts. That interpretation of the whole then provides the context in which we see the parts. “We see a face as a face, not as a sum of eyes, nose, cheeks, and so on,” Wittgenstein (1953) wrote in Philosophical Investigations.

Wittgenstein did not say that we always perceive the whole first but that it is one type of perception. Our ability to rapidly identify faces supports the argument by Nancy Kanwisher and Galit Yovel (2006) that the fusiform face area (FFA) is a part of the brain dedicated to face recognition. Some research has shown that the FFA may not be dedicated solely to faces but to any repeated visual identification task, like identifying the make and model of cars (Source).

Daniel Kahneman and Amos Tversky (1974) argued that we notice a few details and build an interpretation of the whole from those parts (Source). These shortcuts, or heuristics, explain our biases in interpreting what we experience. In his 2011 book Thinking, Fast and Slow, Kahneman explained his dual system model of cognition. System 1 is fast and intuitive, able to build a coherent impression of some experience from limited information. We don’t need to look at all the jigsaw puzzle pieces to guess what the whole puzzle looks like. In Kahneman’s model, System 2 is slow and analytical, patiently putting all the pieces together to form a detailed impression.

In his book Determined: A science of life without free will, neuroscientist Robert Sapolsky (2023, p. 407) notes that our nervous system must react in two extremely different time frames. The first is the lightning-fast reaction to an immediate threat. The second is a considered judgment about an important transition in our lives.

Kahneman’s key insight was that we don’t need a lot of evidence to construct an impression. The quality of our evidence may be faulty as well. What determines our level of confidence in our interpretation of events is the coherence of the story that we build from that evidence. That leads me to the second axis I want to look at this week, levels of confidence.

Two parties dominate our politics. One party points to the beak on the left side of the duck-rabbit illusion and says it’s a duck. The other party insists that those are the ears of a bunny, not the beak of a duck. Each party expects their followers to express belief in the party’s impression of the illusion. They must show confidence in the story and the party.

A jigsaw puzzle has a known and limited number of pieces. It says right on the box how many pieces there are. In life, how do we know that we have all the pieces? We don’t. We might argue that we don’t need all the facts to assess a particular situation. In the TV sitcom All in the Family, Archie Bunker often insisted to his son-in-law, Mike “Meathead” Stivic, that he knew the truth when he saw it, that his common sense was superior to Mike’s academic knowledge.

In her 2016 book Weapons of Math Destruction, Cathy O’Neil drew a key distinction between understanding bounded phenomena like sports games and unbounded events like trading in the financial markets. The mathematics of Sabermetrics is used to analyze sports games like baseball. The key feature of games is that there is an end and a definite result. The game is over and one team won the game.

In finance, like life in general, the “game” is never over and there is no definitive win. An investment firm might make a winning trade one period only to find out the next period that it had misjudged the risk it had taken on. The winning trade in Period 1 can become a losing trade in Period 2 that threatens to “blow up” the firm’s portfolio and wipe out its investing capital, as happened during the 2008 financial crisis. AIG and Goldman Sachs were among the many firms that exposed themselves to a lot of risk based on faulty risk assessment. They had looked at several puzzle pieces and were confident they knew what the whole jigsaw puzzle looked like. They did not account for the “simultaneity” risk when a number of firms made similar bets. They were wrong, and the taxpayers had to bail them out.

How do we confirm our assessment of events? We can reach out to friends, family or co-workers for their impression. In this age of social media, we tend to connect with those who share our perspective and value system. Social media makes it easy to find people with biases similar to our own. We may only need a smidgen of detail to confirm what we already believe to be true. During the 2024 campaign, Trump, Vance and a number of Republicans accused Haitian immigrants of eating dogs and cats in Springfield, Ohio (Source). The story was complete horse hockey, of course, and to many of us, the story lacked coherence. Why did it spread? Prejudice can infuse an incoherent story with credibility.

Rumors of Jews sacrificing Christian children often spread through communities in medieval Europe. What fueled this prejudice? The Catholic Church had long portrayed Jews as the killers of Christ and many Christians knew little about Judaism or its religious practices. As an outgroup, the Jews were sometimes assigned the unpopular task of tax collector. Because of the Christian prohibition on usury, i.e., charging interest on loans, Jews became a community’s moneylender. Both roles provoked economic tensions when times were hard. Like Jews, immigrants are convenient scapegoats. Got problems? Blame it on the Jews. Blame it on the immigrants.

Kahneman coined the term What You See Is All There Is, or WYSIATI, to describe the phenomenon where the story becomes the reality. Kahneman was probably inspired by the software acronym WYSIWYG, meaning that what a user sees on the screen is what they will see on the printed page. Our bias is that we think that what we see on the “screen” inside our minds represents an accurate picture of what is going on outside us. That bias enables us to take action, to form coalitions of other believers. Reality be damned. The reality is the shared story.

The best stories offer little evidence that could contradict them. The Haitian rumor was one of those. The worst stories are built on a lot of video evidence. Each viewing of the evidence punches another hole in the story. The story is modified to repair its damaged coherence. Then more holes. Rarely does a political party abandon a shared story. Regardless of their relationship to reality, shared stories build party loyalty. How long before this administration abandons its justification of the killing of Minneapolis protester Alex Pretti?

I hope to see you next week.

////////////////////

Image Credit: Donaldson, J. (2016, July). “The Duck-Rabbit Ambiguous Figure” in F. Macpherson (Ed.), The Illusions Index. Retrieved from https://www.illusionsindex.org/i/duck-rabbit. Creative Commons License

Sapolsky, R. M. (2023). Determined: A science of life without free will. Penguin Press.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124

Wittgenstein, L. (1953/2009). Philosophical investigations (P. M. S. Hacker & J. Schulte, Eds. & Trans.). Wiley-Blackwell.