A cognitive bias cheat sheet

Cognitive biases are systematic ways in which people deviate from rationality in making judgements. Wikipedia maintains a list of such biases; one example is survivorship bias, the tendency to focus on the things or people that succeed in an endeavor and discount the experiences of those that did not.

A commonly held opinion in many populations is that machinery, equipment, and goods manufactured in previous generations often are better built and last longer than similar contemporary items. (This perception is reflected in the common expression “They don’t make ‘em like they used to.”) Because of the selective pressures of time and use, it is inevitable that only those items which were built to last will have survived into the present day. Therefore, most of the old machinery still seen functioning well in the present day must necessarily have been built to a standard of quality necessary to survive. All of the machinery, equipment, and goods that have failed over the intervening years are no longer visible to the general population, as they have been junked, scrapped, recycled, or otherwise disposed of.

Buster Benson recently went through the list of biases and tried to simplify them into some sort of structure. What he came up with is a list of four conundrums (“4 qualities of the universe that limit our own intelligence and the intelligence of every other person, collective, organism, machine, alien, or imaginable god”) that lead to all biases. They are:

1. There’s too much information.
2. There’s not enough meaning.
3. There’s not enough time and resources.
4. There’s not enough memory.

The 2nd conundrum is that the process of turning raw information into something meaningful requires connecting the dots between the limited information that’s made it to you and the catalog of mental models, beliefs, symbols, and associations that you’ve stored from previous experiences. Connecting dots is an imprecise and subjective process, resulting in a story that’s a blend of new and old information. Your new stories are being built out of the bricks of your old stories, and so will always have a hint of past qualities and textures that may not have actually been there.

For each conundrum in Benson’s scheme, there are categories of bias, 20 in all. For example, the categories related to the “not enough meaning” conundrum are:

1. We find stories and patterns even in sparse data.
2. We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information.
3. We imagine things and people we’re familiar with or fond of as better than things and people we aren’t familiar with or fond of.
4. We simplify probabilities and numbers to make them easier to think about.
5. We project our current mindset and assumptions onto the past and future.

Benson’s whole piece is worth a read, but if you spend too much time with it, you might become unable to function because you’ll start to see cognitive biases everywhere.