“A man’s got to know his limitations…”
Faculty development would be easier if our brains were better. Ours, however, is a species of very little mental bandwidth. According to Google, the average person receives 11 billion bits of information every moment. Of these, we can consciously process only 40. The average person can hold only about seven items in working memory. The brain can’t effectively handle more than two complex, related activities at once.
Even the brains of us academic faculty. For example…
You are asked to take part in a test of visual acuity and quantitative skills. You view a film of six players, three dressed in purple and three dressed in orange, passing two basketballs amongst themselves. Your task: count the number of times a player dressed in purple passes the basketball. You think: this should be easy! After all, I have a doctorate and am an academic faculty member. So you watch. The players move randomly, in front of and behind one another, as they pass the ball back and forth. There’s one pass by a player in purple, and another. It’s challenging to count these, but you’ve completed far more difficult tasks in your life as a faculty member. The film ends. You’ve counted 15 purple passes. You are CORRECT!! No surprise. But then you are asked: Did you see the gorilla? What gorilla???? A replay of the film reveals that, while you were busy counting the passes, an actor dressed in a gorilla suit strolled through the midst of the passing players, beat its chest, and walked off.
Fifty percent of people who take this test fail to see the gorilla. Not just random people, but academics like us. I did not see it myself, despite years of training and experience in detailed data analysis. Try this special version!
But, you object, this would never happen in one’s field of expertise! Au contraire. See http://www.npr.org/blogs/health/2013/02/11/171409656/why-even-radiologists-can-miss-a-gorilla-hiding-in-plain-sight
If half the people like ourselves, people with excellent academic credentials, can’t see an actor in a gorilla suit, what else are we missing? In particular, how might this affect the provision and reception of faculty development?
How our academic brains REALLY work
We do have a way of coping with the other 99.999996% of each moment’s information that Google says we can’t consciously process. Two brains would be better than one, but we each have only one. Within each brain, however, we have two systems. ‘System 2’ copes with the 40 bits, and ‘System 1’ with the 99.999996%. System 1 is remarkably fast and facile. Our ancestors relied on it to make near-instantaneous distinctions between predator and prey, friend or mate and foe, food and poison, danger and haven. We rely on it still. Sometimes these jumps are fantastic feats of instantaneous insight from vanishingly little data. By contrast, System 2 handles the complex thought that requires our full attention; it is slow, laborious, and has limited capacity. Nonetheless, it suffices for peer-reviewed publication and successful grant applications.
How does System 1 cope with the seemingly massive demand upon it? It takes shortcuts. It jumps to conclusions. Nothing is wrong with a jump to a conclusion that saves one from a predatory lion or a venomous snake [or, these days, an onrushing automobile or a mugging]. Indeed, a careful prolonged consideration of alternatives [What’s the lion’s motivation? When did it last eat; is it hungry? Is it actually a lion, or someone dressed up in a lion suit? Will it be able to catch me, or is it sick or wounded?] could be fatal. But what if the jump places you in the path of an unseen onrushing wildebeest (or partner lion or invisible gorilla)? Not good! Jumps to conclusions can have both positive and negative consequences.
What are these jumps [or, technically, heuristics or frames]? An exhaustive list would consume many volumes. Most, according to Kahneman, are derivatives of a simple rule:
WHAT YOU SEE IS ALL THERE IS!
That is, the point of departure for System 1’s jumps to conclusions is the information that is currently available (rather than information that is unseen or not in active memory). Our System 1 acts on what we see without considering alternative possibilities, and acts before System 2 can rein it in. As scholars we understand the value of exhaustive research and contemplation of alternative explanations, but System 1 is not scholarly. Rather, it rapidly constructs a coherent story:
- The Dean is carrying an umbrella. The Department Chair is carrying an umbrella. Your System 1 says: Better take your umbrella, for rain is likely.
- You arrive in an unfamiliar city during a foggy night. Your hotel room happens to overlook a park, which is what you see as you glance out the window in the morning. System 1 says: This is a city full of parks, and you dress accordingly. Imagine your surprise when you exit the other side of the hotel into a densely urban environment.
- You and your identical twin, with identical academic track records and abilities, compete for a job as a senior faculty member with rank open. Your application letter says ‘associate professor’; your twin’s says ‘full professor’. The System 1 of everyone who assesses the applications (yours and your twin’s included) says: your twin is more accomplished than you are. Your twin gets the job.
- You are advising a mentee on graduate programs. She tells you that her top choices are similar, except that in the first, 80% of students pass their preliminary or qualifying exams, while in the second, 20% fail theirs. System 1 says the first is better. She takes your advice, and her entire life unfolds accordingly.
Each conclusion is, in a sense, reasonable, but each is also a cognitive error that System 2 could have avoided were System 1 not so quick and effective. The umbrellas were recent gifts to the Dean and Chair on a sunny day; a glance through another window would have revealed a park-less urban landscape; you and your twin have identical records and abilities; and 80% pass = 20% fail.
Can System 1 be put in its place so that System 2 can correct these errors? Evidently only with difficulty. A classic example is the Müller-Lyer illusion, in which the horizontal lines are the same length. Even when System 2 measures the lines to verify their equivalence, System 1 persists in its error.
What’s this got to do with faculty development?
System 1’s decisions to dress for a park, carry a gratuitous umbrella, or mismeasure the lines are largely inconsequential. The other decisions are not. How System 1’s jumps to conclusions contribute to implicit bias and prejudice is well known. But they also shape faculty development in many ways, both bad and good.
The bad is easy to envision. As subsequent posts will explore, our System 1 can cause us to:
- Overlook or avoid faculty development programs
- Avoid helpful feedback
- Lose focus and motivation
- Misplace priorities
- Choose inferior career pathways
- Hold ourselves back from advancement
- Incorrectly assess ourselves, colleagues, trainees, and applicants
Unless such resistance by System 1 can be overcome, the provision of faculty development information and advice may have little impact.
What’s the good news, then? The cognitive errors that System 1 commits are highly repeatable and therefore often predictable; as Ariely puts it, we are predictably irrational. If we can foresee these errors, we can avoid them, compensate for them, or even exploit them to our benefit as we develop faculty, including ourselves. If jumps to conclusions have predictable trajectories and stimuli, we can trigger them and aim them. If we know where invisible gorillas lurk, we can convert them to allies or avoid them as need be.
PEOPLE – EVEN ACADEMIC FACULTY – HAVE TWO SYSTEMS OF THOUGHT
System 1 responds instinctively and rapidly to the information before it.
System 2 orchestrates rational but slow thought, and accommodates non-obvious information. It is often subordinate to System 1.
System 1 copes with more information than System 2 can. It does so by jumping to conclusions. These jumps have predictable rules and trajectories.
These features can be exploited to develop academic faculty.
To recap, our species’ limited mental bandwidth forces us to rely on System 1, which both jumps to conclusions and can overpower System 2’s attempts to reach conclusions judiciously. Jumps to conclusions cannot be avoided; they are part of human nature. A possible response to System 1’s drawbacks: if you can’t beat them, use them. As in the martial art jujutsu, use System 1’s overwhelming force and momentum against it and towards your own ends. In the original context, lions and other predators have done this for millennia: it’s called herding. The advertising and sales industry does so too, to great profit. So do political and military campaigns, and hedge funds. Governments and charitable organizations have begun to do this to benefit society, and the health professions to improve their practice. Can we do this to promote faculty development? Can we do this to ourselves?
One thing is certain: Any magic feather must have its contours aligned with how our minds actually work.
TO DO LIST
√ If you have never done so, take a moment to educate yourself about behavioral economics. The World Bank report is a concise introduction. Lengthier treatments are by Ariely, Kahneman, and Thaler and Sunstein.
√ And ask yourself: If these insights apply to every other form of human endeavor, shouldn’t they also apply to faculty developers and developees? And, if they do, how should your faculty development program change in response?
 After A.A. Milne, Winnie-the-Pooh (1926): “…a Bear of Very Little Brain”
 Magnum Force. Warner Bros.
 Miller, G. A. (1956). “The magical number seven, plus or minus two: Some limits on our capacity for processing information”. Psychological Review 63 (2): 81–97
 This phenomenon, known as anchoring, is also exploited by realtors and car salespeople, who know that a high asking price influences the perception of value. Or consider the following dialogue (http://www.imdb.com/character/ch0030046/quotes) from John le Carré’s Tinker Tailor Soldier Spy (http://www.imdb.com/title/tt0080297/):
George Smiley: Ever bought a fake picture, Toby?
Toby Esterhase: I sold a couple once.
George Smiley: The more you pay for it, the less inclined you are to doubt its authenticity.
 Ariely, Dan. 2008. Predictably Irrational. HarperCollins.
©Martin E. Feder 2015