
Against Stupidity, the Gods Themselves Contend in Vain

Richard Fall
May 20, 2022

Stupidity.

It’s all around us, or so it seems at times.

Nick Fewings, Unsplash.com

The silly things people believe based on what they’ve seen on social media. The poor decisions otherwise intelligent people (including ourselves!) make in a business setting. “How can they/we be so stupid?!?”

And what is stupidity, anyway? Is it just beliefs or actions that we don’t agree with, or is it something more? And even if we can define it, is there anything meaningful we can do with such a definition, such as taking steps to reduce the likelihood of making “stupid” decisions?

Let’s start by understanding that “stupid” is not the same as “dumb”. According to an interesting article in Psyche magazine, Why Some of the Smartest People Can Be So Very Stupid:

Stupidity is a very specific cognitive failing. Crudely put, it occurs when you don’t have the right conceptual tools for the job. The result is an inability to make sense of what is happening and a resulting tendency to force phenomena into crude, distorting pigeonholes.

Psyche

In other words, stupidity is not a mere lack of intellectual processing power–what one might refer to as “dumb”. In fact, as the Psyche article points out:

Stupidity [is] something very different and much more dangerous: dangerous precisely because some of the smartest people, the least dumb, were often the most stupid.

Now, other than providing some insight as to why your uncle believes that aliens from Alpha Centauri visited him while he was watching Third Rock from the Sun re-runs, of what use could this definition be?

I want to argue that within the work environment, understanding what stupidity is and how to counter it is extremely relevant. And I hasten to add that this isn’t about countering perceived stupidity in others–it’s about how we each can take on the task of making the smartest decisions possible ourselves. Doing so will make us more productive–and less stressed–employees, and provide an example to others.

And since, as the referenced article points out, stupidity can be a much more serious problem for smart people (read: educated professionals), it is perhaps worth our while to understand what we can do to escape from its grasp.

As an example of the sort of stupidity we’re examining here, I have a story of my own to tell.

As a young electrical engineer working on the development of a militarized mini-computer, I was faced with a problem reported by our manufacturing team.

The problem: during post-assembly testing, some of the computers were experiencing intermittent failures. The maintenance team had already–as was their charter–tried to isolate the source of the problem before reporting it to my team for resolution. They had swapped out a number of modules and made various changes only to have to report–somewhat sheepishly–that the failure appeared to be correlated with the color of the power cord.

Incredulous, I went to the manufacturing floor to see the problem for myself. They demonstrated the problem using two different colored power cords and, much to my amazement, the black cord caused the failure while the orange one didn’t.

With that problem statement, I went to work trying to understand what was going on. Clearly, the power flowing through the cord had no knowledge of the color of plastic covering it. So, what could it be?

As I had been trained to do, once I had this piece of information, I worked out a number of possible root causes, intending to create test scenarios for each one.

And as a young, smart, Stanford-educated engineer–boy, could I come up with the most amazing explanations for what was going on.

Perhaps the wire gauge–the diameter–of the wires in the cord was different between the two, and this was causing a voltage drop.

Perhaps the difference in length between the two cords was having the same effect.

Perhaps there was a manufacturing flaw in joining the wires to the plug prongs.

And on and on. I really thought up some out-there scenarios.
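Looking back, a quick back-of-the-envelope calculation could have bounded the first two of those hypotheses in minutes. Here’s a rough sketch of that sanity check in Python–the per-gauge resistances are standard figures for copper wire, while the cord length, load current, and line voltage are invented for illustration:

```python
# Quick sanity check on the wire-gauge and cord-length hypotheses.
# The AWG resistances are standard values for copper conductors;
# the cord length and load current are assumed for illustration.

OHMS_PER_1000_FT = {18: 6.385, 16: 4.016}  # AWG -> ohms per 1000 ft

def cord_voltage_drop(awg: int, cord_ft: float, amps: float) -> float:
    """Voltage drop across a power cord, counting both conductors."""
    resistance = OHMS_PER_1000_FT[awg] * (2 * cord_ft) / 1000.0
    return amps * resistance

for awg in sorted(OHMS_PER_1000_FT):
    drop = cord_voltage_drop(awg, cord_ft=6.0, amps=5.0)
    print(f"{awg} AWG, 6 ft cord at 5 A: {drop:.2f} V drop")

# 16 AWG: ~0.24 V; 18 AWG: ~0.38 V. The difference is roughly a tenth
# of a volt on a 120 V line, far too small to explain the failures.
```

A few minutes of arithmetic like this would have shown that neither gauge nor length could plausibly matter–and might have pushed me to question the “it’s the cord” framing much sooner.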

And more than a few cords gave up their all to a dissection process that, in the end, yielded nothing. None of the hypotheses panned out.

Now what? Clearly there was a reproducible problem that needed fixing, but I had no course of action in front of me.

I went to the manufacturing floor to review with the team the testing process, thinking that perhaps they could provide some additional insight.

While we chatted, something caught my eye that I had not noticed before: the black power cord was plugged into a different power strip–one connected to a different power circuit–than the orange cord.

“Let’s do a quick check. Swap the strips and outlets the two cords are plugged into and test again, please.”

And, lo and behold, the problem was now with the orange cord–the black cord worked just fine!

In the end, the problem was traced to a fault on one of the power circuits on the manufacturing floor and once that was repaired the problem went away.

How stupid could I have been to think the color of the power cord could have made a difference? And how clever was I in coming up with increasingly complex explanations that–in the end–simply took time away from finding the real source of the problem?

Ultimately, I was stupid by the definition of “using the wrong conceptual tools”. I was sure that the problem must be with the power cord, and I stayed with that view for far too long.

How could I have avoided this–and, by extension, what can we all do to prevent ourselves from using the wrong conceptual model in dealing with work situations?

In the case I described here, I had fallen prey to what is known as “anchoring bias”.

Anchoring bias is a cognitive bias that causes us to rely too heavily on the first piece of information we are given about a topic. When we are setting plans or making estimates about something, we interpret newer information from the reference point of our anchor, instead of seeing it objectively. This can skew our judgment, and prevent us from updating our plans or predictions as much as we should.

The Decision Lab
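To make that mechanism concrete, here’s a toy numeric sketch–every value in it is invented–of what “not updating as much as we should” looks like:

```python
# Toy model of anchoring bias: the first number we hear drags every
# later judgment toward it. All values are invented for illustration.

anchor = 40                    # the first estimate we were handed
evidence = [90, 95, 100, 105]  # later observations pointing much higher

# An unbiased judgment would simply follow the evidence.
unbiased = sum(evidence) / len(evidence)              # 97.5

# An anchored judgment adjusts away from the anchor, but not nearly
# enough: the "insufficient adjustment" the quote above describes.
adjustment = 0.4               # fraction of the gap we actually close
anchored = anchor + adjustment * (unbiased - anchor)  # 63.0

print(f"Evidence says ~{unbiased:.0f}; the anchored judgment says {anchored:.0f}.")
```

The bug in my power-cord investigation wasn’t the arithmetic–it was that every hypothesis I generated started from the anchor “the cord is the problem.”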

Being aware of the types of cognitive shortcomings that we, as humans, are capable of makes it possible to find strategies to counter them.

There are many good articles online on dealing with cognitive biases in the workplace–I can recommend the one at Zapier as a good starting point.

Mostly, though, simply being aware that these biases exist–even if you’re very, very not stupid–is enough to start the journey toward more effective problem-solving and decision-making.

About the author

National Solutions Architect | United States
Richard has been a practice lead in the Digital Transformations (formerly Mobility) practice at Sogeti for 2-1/2 years, originally in the Des Moines office and now in the Minneapolis office. In that role, he has led major architecture efforts at a number of Sogeti clients, resulting in creative solutions to difficult problems and winning client loyalty and business.
