Feature / Summer 2009
Why Smart People Do Stupid Things

Intelligence by itself doesn’t make you rational. Thinking rationally demands mental skills that some of us don’t have and many of us don’t use



Illustration by James Joyce

How can someone so smart be so stupid? We’ve all asked this question after watching a perfectly intelligent friend or relative pull a boneheaded move.

People buy high and sell low. They believe their horoscope. They figure it can’t happen to them. They bet it all on black because black is due. They supersize their fries and order the Diet Coke. They talk on a cellphone while driving. They throw good money after bad. They bet that a financial bubble will never burst.

You’ve done something similarly stupid. So have I. Professor Keith Stanovich should know better, but he’s made stupid mistakes, too.

“I lost $30,000 on a house once,” he laughs. “Probably we overpaid for it. All of the books tell you, ‘Don’t fall in love with one house; fall in love with four houses.’ We violated that rule.” Stanovich is an adjunct professor of human development and applied psychology at the University of Toronto who studies intelligence and rationality. The reason smart people can sometimes be stupid, he says, is that intelligence and rationality are different.


“There is a narrow set of cognitive skills that we track and that we call intelligence. But that’s not the same as intelligent behaviour in the real world,” Stanovich says.

He’s even coined a term to describe the failure to act rationally despite adequate intelligence: “dysrationalia.”

How we define and measure intelligence has been controversial since at least 1904, when Charles Spearman proposed that a “general intelligence factor” underlies all cognitive function. Others argue that intelligence is made up of many different cognitive abilities. Some want to broaden the definition of intelligence to include emotional and social intelligence.

Stanovich believes that the intelligence that IQ tests measure is a meaningful and useful construct. He’s not interested in expanding our definition of intelligence. He’s happy to stick with the cognitive kind. What he argues is that intelligence by itself can’t guarantee rational behaviour.

Earlier this year, Yale University Press published Stanovich’s book What Intelligence Tests Miss: The Psychology of Rational Thought. In it, he proposes a whole range of cognitive abilities and dispositions independent of intelligence that have at least as much to do with whether we think and behave rationally. In other words, you can be intelligent without being rational. And you can be a rational thinker without being especially intelligent.

Time for a pop quiz. Try to solve this problem before reading on. Jack is looking at Anne, but Anne is looking at George. Jack is married but George is not. Is a married person looking at an unmarried person?

Yes      No      Cannot be determined

More than 80 per cent of people answer this question incorrectly. If you concluded that the answer cannot be determined, you’re one of them. (So was I.) The correct answer is, yes, a married person is looking at an unmarried person.

Most of us believe that we need to know if Anne is married to answer the question. But think about all of the possibilities. If Anne is unmarried, then a married person (Jack) is looking at an unmarried person (Anne). If Anne is married, then a married person (Anne) is looking at an unmarried person (George). Either way, the answer is yes.
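The two cases can even be checked mechanically. Here is a minimal sketch in Python; the list of who is looking at whom and the function name are simply my encoding of the puzzle, not anything from Stanovich’s book:

```python
# Who is looking at whom, per the puzzle.
looking_at = [("Jack", "Anne"), ("Anne", "George")]

def married_looks_at_unmarried(anne_married):
    # Jack is married, George is not; Anne's status is the unknown.
    married = {"Jack": True, "George": False, "Anne": anne_married}
    return any(married[a] and not married[b] for a, b in looking_at)

# The answer is "yes" whether Anne is married or not.
print(all(married_looks_at_unmarried(m) for m in (True, False)))  # True
```

Enumerating both values of the one unknown is exactly the “consider all the possibilities” strategy the puzzle rewards.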

Most people have the intelligence to figure this out, if you prompt them with something like “think logically” or “consider all the possibilities.” But unprompted, they won’t bring their full mental faculties to bear on the problem.

And that’s a major source of dysrationalia, Stanovich says. We are all “cognitive misers” who try to avoid thinking too much. This makes sense from an evolutionary point of view. Thinking is time-consuming, resource intensive and sometimes counterproductive. If the problem at hand is avoiding the charging sabre-toothed tiger, you don’t want to spend more than a split second deciding whether to jump into the river or climb a tree.

So we’ve developed a whole set of heuristics and biases that limit the amount of brainpower we bring to bear on a problem. These shortcuts provide rough-and-ready answers that are right much of the time – but not always.

For instance, in one experiment, a researcher offered subjects a dollar if, in a blind draw, they picked a red jelly bean out of a bowl of mostly white jelly beans. The subjects could choose between two bowls. One bowl contained nine white jelly beans and one red one. The other contained 92 white and eight red ones. Thirty to 40 per cent of the test subjects chose to draw from the larger bowl, even though most understood that an eight per cent chance of winning was worse than a 10 per cent chance. The visual allure of the extra red jelly beans overcame their understanding of the odds.
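The arithmetic behind the choice is trivial, which is what makes the result striking. A quick sketch using the article’s numbers (the variable names are mine):

```python
small_bowl = 1 / 10    # one red bean among ten: a 10% chance
large_bowl = 8 / 100   # eight red beans among a hundred: an 8% chance

# The smaller bowl is the better bet, extra red beans notwithstanding.
print(small_bowl > large_bowl)  # True
```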

Or consider this problem. A disease outbreak is expected to kill 600 people if no action is taken. There are two treatment options. Option A will save 200 people. Option B gives a one-third probability that 600 people will be saved, and a two-thirds probability that no one will be saved. Most people choose A. It’s better to guarantee that 200 people will be saved than to risk everyone dying.

But ask the question this way – Option A means 400 people will die. Option B gives a one-third probability that no one will die and two-thirds probability that 600 will die – and most people choose B. They’ll risk killing everyone on the lesser chance of saving everyone.

The trouble, from a rational standpoint, is that the two scenarios are identical. All that’s different is that the question is restated to emphasize the 400 certain deaths under Option A, rather than the 200 lives saved. This is called the “framing effect.” It shows that how a question is asked dramatically affects the answer, and can even lead to contradictory answers.
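The equivalence of the two framings is easy to verify with the numbers given. A minimal sketch, with expected values computed the obvious way (the variable names are mine):

```python
total = 600

# "Lives saved" framing
a_saved = 200
b_saved = (1 / 3) * 600 + (2 / 3) * 0   # expected lives saved under Option B

# "Deaths" framing
a_dead = 400
b_dead = (1 / 3) * 0 + (2 / 3) * 600    # expected deaths under Option B

# Each option describes exactly the same outcome under either framing.
print(a_saved == total - a_dead)  # True
print(b_saved == total - b_dead)  # True
```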

Then there’s the “anchoring effect.” In one experiment, researchers spun a wheel that was rigged to stop at either number 10 or 65. When the wheel stopped, the researchers asked their subjects if the percentage of African countries in the United Nations is higher or lower than that number. Then the researchers asked the subjects to estimate the actual percentage of African countries in the UN. The people who saw the larger number guessed significantly higher than those who saw the lower number. The number “anchored” their answers, even though they thought the number was completely arbitrary and meaningless.

The list goes on. We look for evidence that confirms our beliefs and discount evidence that contradicts them (confirmation bias). We evaluate situations from our own perspective without considering the other side (“myside” bias). We’re influenced more by a vivid anecdote than by statistics. We are overconfident about how much we know. We think we’re above average. We’re certain that we’re not affected by biases the way others are.

Finally, Stanovich identifies another source of dysrationalia – what he calls “mindware gaps.” Mindware, he says, is made up of learned cognitive rules, strategies and belief systems. It includes our understanding of probabilities and statistics, as well as our willingness to consider alternative hypotheses when trying to solve a problem. Mindware is related to intelligence in that it’s learned. However, some highly intelligent, educated people never acquire the appropriate mindware. People can also suffer from “contaminated mindware,” such as superstition, which leads to irrational decisions.

Stanovich argues that dysrationalities have important real-world consequences. They can affect the financial decisions you make, the government policies you support, the politicians you elect and, in general, your ability to build the life you want. For example, Stanovich and his colleagues found that problem gamblers score lower than most people on a number of rational thinking tests. They make more impulsive decisions, are less likely to consider the future consequences of their actions and are more likely to believe in lucky and unlucky numbers. They also score poorly in understanding probability and statistics. For instance, they’re less likely to understand that when tossing a coin, five heads in a row does not make tails more likely to come up on the next toss. Their dysrationalia likely makes them not just bad gamblers, but problem gamblers – people who keep gambling despite hurting themselves, their family and their livelihood.
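The coin-toss point is simply the independence of successive tosses. A sketch with exact fractions, using my own encoding rather than any test Stanovich administers:

```python
from fractions import Fraction

p_heads = Fraction(1, 2)  # a fair coin
p_five_heads = p_heads ** 5
p_five_heads_then_tails = p_five_heads * (1 - p_heads)

# The conditional probability of tails, given five heads in a row,
# is still exactly one half: the coin has no memory.
print(p_five_heads_then_tails / p_five_heads)  # 1/2
```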

From early in his career, Stanovich has followed the pioneering heuristics and biases work of Daniel Kahneman, who won a Nobel Prize in economics, and his colleague Amos Tversky. In 1994, Stanovich began comparing people’s scores on rationality tests with their scores on conventional intelligence tests. What he found is that they don’t have a lot to do with one another. On some tasks, there is almost a complete dissociation between rational thinking and intelligence.

You might, for example, think more rationally than someone much smarter than you. Likewise, a person with dysrationalia is almost as likely to have higher than average intelligence as he or she is to have lower than average intelligence.

To understand where the rationality differences between people come from, Stanovich suggests thinking of the mind as having three parts. First is the “autonomous mind” that engages in problematic cognitive shortcuts. Stanovich calls this “Type 1 processing.” It happens quickly, automatically and without conscious control.

The second part is the algorithmic mind. It engages in Type 2 processing, the slow, laborious, logical thinking that intelligence tests measure.

The third part is the reflective mind. It decides when to make do with the judgments of the autonomous mind, and when to call in the heavy machinery of the algorithmic mind. The reflective mind seems to determine how rational you are. Your algorithmic mind can be ready to fire on all cylinders, but it can’t help you if you never engage it.

When and how your reflective mind springs into action is related to a number of personality traits, including how dogmatic, flexible, open-minded, tolerant of ambiguity and conscientious you are.

“The inflexible person, for instance, has trouble assimilating new knowledge,” Stanovich says. “People with a high need for closure shut down at the first adequate solution. Coming to a better solution would require more cognitive effort.”

Fortunately, rational thinking can be taught, and Stanovich thinks the school system should expend more effort on it. Teaching basic statistical and scientific thinking helps. And so does teaching more general thinking strategies. Studies show that a good way to improve critical thinking is to think of the opposite. Once this habit becomes ingrained, it helps you not only to consider alternative hypotheses, but also to avoid traps such as anchoring, confirmation bias and myside bias.

Stanovich argues that psychologists should perhaps develop tests to determine a rationality quotient (RQ) to complement IQ tests. “I’m not necessarily an advocate of pushing tests on everyone,” he says. “But if you are going to test for cognitive function, why restrict testing to just an IQ test, which only measures a restricted domain of cognitive function?”

Kurt Kleiner is a writer in Toronto.


Reader Comments

# 1
Posted by Dave on June 11th, 2009 @ 3:08 pm

I can’t believe there is no mention of religion in this article. I can think of no greater example of the surrender of rational thought to superstition by so many otherwise intelligent people. Still taboo to mention?

# 2
Posted by David (MMedSci 1987) on June 11th, 2009 @ 8:54 pm

Dave, it should not be taboo to discuss this at all but your comment begs the question, did you really understand this article? It seems to me that you are not aware of your own presuppositional biases and “myside blindness.” There is no such thing as a bias-free position. It is irrational to think that the evolutionary materialist view (which I am assuming you hold) is the only valid one when it starts with the fundamental assumption that there can be allowed no non-material explanation for the Universe. Fail at Logic 101 level.

I am also not going to debate the enormous, heterogeneous and muddy field of religion or superstition. I may agree with you in part that a significant proportion of superstition and religious claims are ill-founded. However, I also do not accept the widely stated view that all claims and beliefs in this context are equally valid (or invalid, depending on the commentator), as that is a philosophical rather than a rational position. “Test everything. Hold on to the good.” 1 Thessalonians 5:21

I enjoyed this article and the admonition to examine problems more carefully and broadly. I like the idea of teaching people how to think rationally rather than what to remember. It seems to me that the reason it is less done is that “facts” and techniques are relatively easy to teach and assess whereas the critical tools to interpret “facts” are harder so less “efficient” in the short term. (I tend to overuse inverted commas but use them around facts because a significant proportion of what are believed to be facts, aren’t.)

# 3
Posted by Paul (BA Philosophy and Religion, BEd 2007) on June 12th, 2009 @ 12:16 pm

If you are raised and educated in this secular and atheistic age and end up in adulthood still honestly believing in a religious tradition, then you are likely the most rational of people since you would have had to come to terms with a culture that opposes your beliefs, that forces you to hear the other side of the argument, and that would force you to examine your presuppositional biases, “confirmation bias,” and “myside blindness.”

# 4
Posted by Lloyd Christmas (2007) on June 18th, 2009 @ 11:57 am

Paul (above) says: “If you are raised and educated in this secular and atheistic age and end up in adulthood still honestly believing in a religious tradition, then you are likely the most rational of people…”

And if you’ve been exposed to religion for a significant portion of your life and choose to be secular, are you most rational too?

# 5
Posted by John Lovas (DDS1975) on June 30th, 2009 @ 7:25 am

Robert J. Sternberg’s writings on “wisdom” may be appreciated by our readers as common ground. In particular: “How wise is it to teach for wisdom? A reply to five critiques” in Educational Psychologist 2001; 36(4): 269-272.

# 6
Posted by Keith Falkner (BSc 1963) on July 4th, 2009 @ 2:18 pm

The author says, “Jack is looking at Anne, but Anne is looking at George”. He adds that Jack is married, and George is unmarried. Then he asks if a married person is looking at an unmarried person.

The reader is asked to choose among: Yes, No, Cannot be determined.

The author suggests we “think about all of the possibilities”, then tells us that in BOTH of the possible cases, the answer is yes, because either married Jack is looking at unmarried Anne, or married Anne is looking at unmarried George.

The author’s conclusion: “Either way, the answer is yes”.

I suggest the author did not “think about all the possibilities”.

In the first place, there are no grounds for assuming that only one “Anne” is present. Jack might be looking at a married woman named Anne, while an unmarried Anne looks at George. If that is the situation, the answer is No.

In the second place, even if only one Anne is present, the problem as stated does not define Anne as a person. Indeed, George, being unmarried, need not be a person either. I consider Jack, being “married”, must be a person, but we have no grounds to assume either of the other named entities to be a person. If Anne or George is not a person, the answer is again No.

I believe I have proven the answer, from only the words of the problem as stated, to be “Cannot be determined”.

Yes, I understand the problem, but we ALL wear blinders, perhaps varying from time to time in size and opacity.

# 7
Posted by Leo Næsager (BEd 1971) on August 15th, 2009 @ 12:20 am

This interesting article made me immediately think back on a quote by the late Sydney Harris: “The fatal mistake that most intelligent people make is assuming that a high degree of intelligence confers an equally high degree of judgment, when actually the correspondence between these is quite accidental. However, there is a high correlation between prejudice and ignorance.”

# 8
Posted by Tyler on November 7th, 2009 @ 5:40 am

The second commenter states: “Dave, it should not be taboo to discuss this at all but your comment begs the question, did you really understand this article? It seems to me that you are not aware of your own presuppositional biases and “myside blindness.” There is no such thing as a bias-free position. It is irrational to think that the evolutionary materialist view (which I am assuming you hold) is the only valid one when it starts with the fundamental assumption that there can be allowed no non-material explanation for the universe. Fail at Logic 101 level.”

Are you serious? The entire article is about how suppositional belief patterns are irrational. Religion is, by definition, irrational (it admittedly eschews rationality in favor of “faith” – abductive reasoning at its worst).

Dave is absolutely correct. Religion is the most prominent example of people inexplicably disowning reason and fact (in this case represented by science and empirical evidence) in favor of a “hunch.”

Also, please attempt to explain, in terms that actually make sense, what you mean by “non-material explanation for the universe.”

# 9
Posted by Wolter on November 7th, 2009 @ 4:32 pm

The problem with the married/unmarried question is not with the people answering the question, but rather with the question itself.

This question has been specifically designed to take advantage of low-risk assumptions in order to foster an incorrect answer (also known as a trick question). The information is there, but in the form of a purloined letter.

Trick questions do not test intelligence; they test whether the subject can see through your subterfuge or not (much akin to a magician daring you to figure out his trick).

So no, we’re not “cognitive misers.” We’re rational beings who cannot live by daily questioning all of our assumptions. Imagine trying to live, wondering if the house might catch fire the moment you turn on the toaster? Or worrying that an earthquake might swallow you as you walk on the sidewalk? Or that your chair will suddenly break and impale you? Or that all the wheels will suddenly disappear from your car while you’re driving? You simply could not live at all.

Do you check the washing machine every three minutes to make sure it’s actually cleaning your clothes? Do you check the refrigerator every 10 minutes to make sure it’s still keeping things cold?

We make assumptions as a result of our daily experiences, and then turn our attention to things that are not so safe to assume.

This is not a case of “cognitive miserliness,” but rather of “cognitive priorities.”

People who buy high and sell low are either prone to panic, or unable to defer gratification. Neither is a measure of cognitive initiative. People who follow horoscopes are those who are less able to understand what “evidence” means. I’ve had a number of discussions with horoscope readers, and each one related a story about the times that their horoscope was correct (never mind that they’re so general as to be “correct” quite often). The same thing goes for religious people. People who gamble are controlled by their greed. This is an emotional fault, not one of intelligence. And so on, and so on.

“Dysrationalia” encompasses all of these things, each with their own cause. The “cognitive miser” doesn’t enter into it, except in cases of genuine laziness (knowing the right thing to do, but can’t be bothered to do it the proper way) which are not as common as the others. Therefore, Stanovich is incorrect in fingering lack of cognitive action as a major source of this problem.

The jelly bean experiment tested people’s understanding of probability (or their capability of understanding), not their cognitive activity. These are the same kinds of people who play the lottery.

The disease question is “framing”, as you’ve said. It’s just another form of the “trick question.” In the first example, the question states that the entire population will be wiped out. The second example gives no such indication. This gives a different impression of the stakes involved. You’re merely tricking people, not measuring cognitive action.

You may think that the scenarios are identical, but because you conceal critical information, you actually create a different scenario for the listener. The fault is yours, not theirs.

The “anchoring effect” argument is absurd. You’re asking people to estimate something based on virtually no information at all. Of course they’re going to choose a useless number within the range of something they were recently exposed to! This has NOTHING to do with cognitive action, since the subjects were well aware of their lack of information. All this shows is that you can influence what random things they will choose by exposing them to something tangentially related.

Stanovich’s “mindware gaps” coinage demonstrates a fundamental lack of understanding in how new knowledge is discovered.

The basic premise of new information is “extraordinary claims require extraordinary evidence.” Sticking to the current understanding until someone can come up with a compelling alternative IS the rational way to go. The idea of “highly intelligent, educated people” with “mindware gaps” is pure wishful thinking. Intelligent people are simply more likely to notice a different pattern, and run with it. If it leads to a discovery of new information, they must then demonstrate that their new theory is sound. If we just accepted every new theory without resistance, science would degenerate into superstition. Superstition is the result of accepting theories without objective proof. Usually the “evidence” offered is anecdotal.

I think that Stanovich should do some bias checking of his own before pushing his theories any further.

# 10
Posted by David D Short (BA 2001) on November 8th, 2009 @ 3:18 am

@David MMedSci 1987
That’s a very long winded way of saying “Yes, it is still taboo.”

# 11
Posted by misanthropope on November 8th, 2009 @ 4:47 am

Intelligence is the hardware and software needed to solve abstract problems. Rationality is the emotional inclination to frame a situation fraught with real consequences, as an abstract problem.

I think there are very few people in the world for whom the “expectation” operator is as weighted with significance as the words “some” “all” “none”, and particularly “dead.” At a certain level, rationality can reasonably be described as taking very important problems lightly.

That said, the extent to which you can concentrate over the noise produced by your endocrine system, is the degree to which you can fairly describe yourself as an adult human being.

# 12
Posted by Kay Bonny on November 13th, 2009 @ 1:31 am

This article describes the issues I am facing with my teenage daughter. She is brilliant and scored high on IQ tests in elementary school. However, she lacks rational thought and everyone attributes these behaviors to her age.

I think associating her behavior to her age is a cop-out. There is no way her brain is functioning on a complete level.
I need to teach her rational thought patterns. Because of her intelligence, any goofy behaviors are dismissed since she more than makes up for it with her academic performance.

She’s smart, but will she be able to make it in the world? Properly relate? Thank you for a new take on how she can be so smart and stupid at the same time.

# 13
Posted by PatrikS on November 15th, 2009 @ 4:46 am

@Keith: The use of the word “but” in the “married/unmarried problem” indicates that we are to assume that the author speaks of the same Anne. Therefore, the answer can only be “yes.” Anything else would overcomplicate a rational test.

# 14
Posted by Lesley on November 16th, 2009 @ 2:15 am

Many people tend to worry too much about doing the “right” thing or the “smart” thing. Who cares if you “lose” a theoretical sum of money (which itself is only a concept) on a house deal, if you liked the house and it sheltered you and your family well, both physically and psychically, for a substantial part of the very short time you will be alive? Reason is a useful tool but not the ultimate measure of whether or not a particular choice or course of action is best. Humans evolved language and reason, but that doesn’t mean language (i.e., categorizations and representations of reality, which themselves are only tools and have no substance) and reason should rule the way we live. I think it’s good that the research discussed in this article supports a more holistic view of healthy human functioning.

# 15
Posted by James on November 28th, 2009 @ 8:30 pm

To the question “Is a married person looking at an unmarried person?” the answer is actually “cannot be determined” because we are not told whether Anne is in fact a person. For all we know she could be Jack’s pet dog. Therefore, based on the information given we can reach no conclusion, other than one based purely on assumptions. To make assumptions in a problem like this based on no real evidence is an incorrect way of solving the problem.

# 16
Posted by Edgar (M Sexology 1999) on December 27th, 2009 @ 12:03 pm

#9 “The information is there, but in the form of a purloined letter.”

Purloin is another word for steal. Your sentence does not make sense. How does the information resemble a stolen letter?!?

# 17
Posted by Tyler Litman (2013) on February 7th, 2010 @ 1:51 pm

@Wolter: Is rational thought not having the ability to recognize the “trick” in a “trick question”? Recognizing the hidden information seems to me like a rational skill of a higher level.

# 18
Posted by Laura on May 26th, 2010 @ 11:34 pm

I love how religion adds spice to any Internet discussion. To add my own insignificant opinion, I make these points.

Christianity is the dominant religion in America, and so it is wrong to assume that individuals grow up in a culture that hates Christianity. On the contrary, atheists and agnostics are some of the most hated individuals in the nation because of their lack of belief. Christians need to quit playing the martyr card in America when they make up 80 per cent of our population.

In Christianity’s defense, it is rational for some to have belief. It may not be rational in a strict, empirical sense of the word, but it is a form of rationality. In light of the evidence for and against the existence of God, individuals must come to their own conclusions, both of which are rational and valid. The tenuous evidence for and against religion means that one must weigh the evidence and come to his or her own conclusion. Atheists need to stop accusing Christians of being stupid and just accept that different opinions may be just as valid.

Can’t we just all get along?

# 19
Posted by Courtney on August 24th, 2010 @ 6:17 pm

I enjoyed Wolter’s & Laura’s responses immensely. I agree with both.

# 20
Posted by William (MSc 2009) on August 31st, 2010 @ 4:37 pm

A previous poster stated: “If you are raised and educated in this secular and atheistic age and end up in adulthood still honestly believing in a religious tradition, then you are likely the most rational of people since you would have had to come to terms with a culture that opposes your beliefs”

So members of the Flat Earth Society are the most rational people because everything in our culture says the Earth is round, and they oppose that? Denying is not equivalent to reasoning. Simply denying something does not make it untrue; nor does it make something else true. VonDaniken denies the rationally held common belief that the pyramids were built by humans. He says they were built by aliens. So I assume you believe VonDaniken because he denies what everyone else believes, and denying what everyone believes automatically means that you are more rational than anyone else?

Another poster stated: “since you would have had to come to terms with a culture that opposes your beliefs, that forces you to hear the other side of the argument, and that would force you to examine your presuppositional biases”

Really? The Flat Earth Society has listened to the other side of the argument and examined the presuppositions? What possible basis is there for the assumption that someone who disagrees with a rationally held belief does so because they hold a more rational belief? Again, denial is not equivalent to evidence. Does everyone who denies something, deny it because they have examined the evidence? Do people who deny that vaccines prevent infections do so because they have been forced to carefully examine the evidence? I don’t think so!

All of this garbage twists around the ‘fallacy of social proof’ (“Everyone agrees with me, so it’s true”) and its hideous love-child, the ‘fallacy of social dis-proof’ (“If it’s wrong to believe things just because everyone else believes in them, then it’s right to deny what everyone else believes in”).

This is combined with the “Christopher Columbus defence”: “They laughed at Christopher Columbus and he was right. They’re laughing at me; therefore I’m right.”

And the “narcissism defence”: “I disagree with everyone else, therefore I’m special, and if I’m special then I must be more important than everyone else. And if I’m more important than everyone else, then my ideas are more important than theirs and I’m right. The more people prove me wrong, the more ‘special’ I become, and so the more ‘right’. Why can’t people see how special I am?!”

What so many people don’t seem to be able to get through their skulls is: it has nothing to do with beliefs. What matters are the facts. Your opinions are meaningless.

# 21
Posted by Rosita on October 23rd, 2010 @ 12:09 pm

What a lovely collection of “William terms”:

The fallacy of social proof’: “Everyone agrees with me, so it’s true”

The fallacy of social dis-proof’: “If it’s wrong to believe things just because everyone else believes in them, then it’s right to deny what everyone else believes in.”

The Christopher Columbus defence: “They laughed at Christopher Columbus and he was right. They’re laughing at me; therefore I’m right.”

The narcissism defence: “I disagree with everyone else, therefore I’m special, and if I’m special then I must be more important than everyone else. And if I’m more important than everyone else, then my ideas are more important than theirs and I’m right. The more people prove me wrong, the more ‘special’ I become, and so the more ‘right’. Why can’t people see how special I am?”

We could add the creationist fallacy: the scientific theory of human descent by modification from other life forms is wrong, therefore the assertion that all earth’s life forms were created in an instant by my version of god is right. As William says, “Simply denying something does not make it untrue; nor does it make something else true.”

# 22
Posted by Dianne Brown on January 9th, 2011 @ 11:37 am

#16 “Purloin is another word for steal. Your sentence does not make sense. How does the information resemble a stolen letter?!”

It’s a literary reference to Edgar Allan Poe’s story “The Purloined Letter.” A “purloined letter” is something hidden in plain sight.

# 23
Posted by Chase Thomas on March 24th, 2011 @ 10:30 pm

Everyone who comments tries to find faults and loopholes in the article. Do you really think you know better than someone who researched the subject and put careful thought into how everything was worded? You are just proving the article’s point!

# 24
Posted by Claire on April 14th, 2011 @ 12:24 pm

There is nothing more entertaining about this very interesting article than reading the responses. Stop faulting the article, it’s nothing personal!

# 25
Posted by Paul on November 12th, 2011 @ 9:16 am

This article is very interesting. It touches on an extremely sensitive area of human science and challenges us to think about how we think about what we think about. I reckon I would learn from the rest of this guy’s book.

# 26
Posted by Endurance Amehson on September 7th, 2012 @ 6:01 am

I love this.

# 27
Posted by Mark G on November 19th, 2012 @ 5:07 pm

The answer to the Jack/Anne/George question, as written, is C, because we don’t know (on the information given) whether Anne is a person. Anne might be, for example, a dog.

In fact, when I originally saw this question, I assumed that this was the catch that most people were expected to miss. I wasn’t expecting the author of the article to miss it. My guess, therefore, is that the question has been misquoted. Worded differently, the answer would be A, for the reasons given.

# 28
Posted by David Vanderbyl on July 5th, 2013 @ 2:06 pm

I have to admit, I initially didn’t get what you people saying Anne might be a dog were going on about. A dog would be unmarried, so the answer is still ‘Yes’.

However, the question asks whether a married person is looking at an unmarried PERSON. So you’re right, without the question explicitly stating that we are dealing with three people, there is not enough information, and the correct answer is clearly, ‘Cannot be determined.’
