The Industrial model of education that fueled the modern era and its dizzying reach did its job incredibly well. Its prime accomplishment was the expansion of our ability to abstract the world. Until then, pragmatism reigned, limiting most people’s thoughts and interactions, largely because we lacked the tools to extrapolate information, to wield it for things beyond its obvious applications.
That is, most humans couldn’t see patterns behind inputs, because doing so required us to be able to entertain abstract concepts. And our education was what led us there.
If the inability to see patterns sounds implausible to you, it did to me, too, until I read David Epstein’s fascinating book, Range: Why Generalists Triumph in a Specialized World. Early on, he introduces us to Raven’s Progressive Matrices, essentially the gold standard among general human intelligence tests because it transcends education and culture.
Using Raven’s Matrices, New Zealand psychologist James Flynn was the first person to amalgamate existing studies from around the world to test how scores on the test changed over time. His discoveries, published in 1981 and dubbed the Flynn Effect, showed that across generations and nations, whether general scholastic intelligence climbed or fell, Raven’s scores always went up.
What drove this was the education system we had formalized in response to the complexities introduced by industrial-era machines and the businesses they sponsored. STEM subjects, which those new machines and work processes required of the general workforce for the very first time, at least to some degree, allowed us to extrapolate information through abstract tools like mathematics and scientific inquiry.
We used these to make sense of increased complexity.
Modern education introduced the public at large to things that had hitherto only existed in rare imaginations, or in ivory towers. Most of these things had remained hidden from us because they required an abstract level of thinking that was never taught at large, but was now suddenly within the grasp of the masses. It wasn’t long until novel inventions began to blossom prodigiously.
That’s not to say that invention or innovation began with the Industrial Revolution. It didn’t. The Ancient Babylonians, Egyptians, Persians, Greeks, Romans, Ottomans and Anglo-Saxons were all wildly inventive. The difference is that education wasn’t formalized, standardized and democratized until the Machine Age demanded it.
With some exceptions, the new education system was aimed at one end and one end only: specialization, necessitated by new complexity.
We remain there today.
Missing the Forest for the Trees
The learning systems that made societies like the Ancient Greeks so prolific (as inventors of nearly every conceptual system that still underpins society today: medicine, ethics, philosophy, logic and democracy among them) bore no resemblance to our current educational paradigm. While Aristotle and company also excelled in abstraction, they used it to the opposite end from ours.
They generalized knowledge.
“Flynn’s greatest disappointment,” Epstein writes, “is the degree to which society, and particularly higher education, has responded to the broadening of the mind [under abstraction] by pushing specialization, rather than focusing early training on conceptual, transferable knowledge.”
In other words, the abstraction that made progress in the Modern Era so monumental also painted itself into a corner by removing the very quality that would allow us to escape the limits of our own inventions. This is because specialization is about the selective removal of variables, to enable the deepening of a singular skill or expertise. The trade-off with this kind of training is that we become less able to engage in discussions about, or participate in, things that lie outside of our chosen area(s) of focus.
Not infrequently, specialists “miss the forest for the trees.”
In his writing, Flynn illustrates study after study, spanning nations and decades, in which the modern educational gift of abstraction, a transcendent sign of intelligence, as we saw in Raven’s Matrices, has been squandered because we have aimed that broad potential at narrow ends. We’ve done so because modern society values predictable and monetizable outcomes, the kind that specialists excel at producing, over broad, often messy, critical thinking that may someday lead us somewhere.
Unfortunately, most schoolchildren are taught that the world is predictable. We teach rules in language, biology, physics, history, chemistry and mathematics, then we test for acuity in fact retention—aka rote regurgitation. Thus our education system equips us well for things we have experienced before and expect to stay that way, while leaving us extremely ill-equipped for everything else.
Our thinking is highly specialized, as the modern world has demanded of us; and it is thinking that is increasingly obsolete.
“A rapidly changing ‘wicked’ world demands conceptual reasoning skills that can connect new ideas and work across contexts.”
Ready for Change
Today, we call this critical thinking. Incredibly, most education systems, and the societies in which we are raised and taught, downplay its importance, going so far as to treat it as a distraction from what really matters: learning a skill expertly, so that we can differentiate ourselves as we compete, as adults, for a slice of the economic pie.
Post-school-age learning follows a similar trajectory, in which adults pay to specialize even further, for competitive advantage.
But, to paraphrase Jack Cecchini — one of the world’s great multi-genre musicians—there is a huge difference between creation and re-creation.
Alas, specialization turns out to be the enemy of creative thinking; and worse, once our minds are hard-wired to specialize, it is an uphill, often insurmountable, battle to grasp things that challenge our well-formed and deeply grooved beliefs.
Said another way, rules thwart creativity.
Throughout his book Epstein introduces us to world-renowned musicians, scientists, athletes and others, taking us through their life journeys. With few exceptions, their stories unfold the same way: those who dabbled the most early in life—resisting depth in favor of open-ended, self-directed play, then choosing to train their focus on something in particular only later, and even then, never exclusively—invariably reached higher than those who didn’t. Moreover, they stayed there even as the world changed around them. The hyper-specialists didn’t.
The old saw, “An expert is someone who knows more and more about less and less until they know absolutely everything about nothing,” speaks to this concept.
Importantly, those same creative dabblers were more often than not innovators within their chosen fields, as well—not just the temporary masters of a prevailing paradigm, like their expert peers.
What we need to be taught is how to be ready for change in a quickly changing world—not how to dominate a field that may disappear overnight.
The word interdisciplinary is a modern one. I’ve never liked it. Most of my colleagues use it daily, as a badge of honor for how well we work together. But look beneath the obvious benefit of sharing skillsets in a genial and collaborative manner, and the fact is that interdisciplinary was an idea born of our inability to skate easily between knowledge sets. We now require a team to solve the same problems we might have solved on our own, before we turned our backs on general learning in favor of specialization.
In my view, we’d be better served by a different goal. In lieu of seeking inter-disciplinary partnerships, we could instead aspire to reach intra-disciplinary fluency — that is, turning inward to cultivate our innate polymathic ability.
We could retool the educational paradigm to be intentionally broad, and equally resistant to specialization.
If we were intra-disciplinary, we could tackle “wicked” problems directly, without regard for prevailing orthodoxies or rule sets of any kind. That’s because knowledge sets are resources to draw upon, and the intersection between broad ones is where insight and innovation are found.
Whether we do this solo or collaborate with others isn’t the point. The point is that every person contributing to a creative problem would bring better contextual tools to the undertaking, allowing us, individually and collectively, to reach farther than we could by subdividing the tasks and applying discrete lenses to the bits.
In the Industrial era, we excelled in the creation of new objects. In Ancient Greece, they excelled in the creation of new systems of thought.
My field is utterly complex. Some buildings whose design I’ve led have required teams of 100 or more, sustained over five-plus years, simply to draw them before they reached a construction site, where hundreds more might build them. What makes me a good leader [if I can say that] is not a singular expertise. Rather, it is the ability to know enough about each collaborator’s expertise to understand how their specialized contributions could best fit into the larger context, in order to inform better—and holistically considered—decisions and outcomes.
The ability to tackle problems from a variety of angles, through broad learning and the playful trial and error that it precipitates, seems to show up consistently among the most creative people in any field.
One of my favorite words is polymathy. Polymaths are people “whose knowledge spans a substantial number of subjects, known to draw on complex bodies of knowledge to solve specific problems.”
As I’ve written before, polymaths created the world. In that article, I started with what has to be one of the funniest highbrow jokes I’ve heard:
The Austrian satirist Karl Kraus once quipped, “I had a terrible vision: I saw an encyclopedia walk up to a polymath and open him up.”
It’s funny precisely because polymaths are encyclopedic in their reach, without being burdened by the rigid conclusions or circular thinking that dog the public at large. And the only reason those things dog us is that many of us lack the tools to see beyond what we have been taught.
Our education has not prepared us for it.
In the modern era, we have fallen out of love with polymathy, so much so that we frequently use its synonyms as insults. “Amateur” and “jack of all trades (master of none)” are wielded largely to disparage or impugn people for lacking sufficient expertise to warrant our attention, or to speak with authority.
When was the last time sportscasters brought in an engineer to comment on plays; or news anchors invited a mathematician to opine on the latest political scandal; or the CDC brought in a philosopher to share her thoughts about COVID-19?
Yeah. Didn’t think so.
And yet: from Aristotle to Da Vinci to Newton to Darwin to Edison and beyond, these polymaths’ outsized contributions to humanity (engineering, math and philosophy included) emerged from lives spent resisting depth in favor of indulging their curiosity wherever it took them.
And in the process, they invented things in fields in which they had no standing, and which they changed, nonetheless.
Self-Directed and Test-Free
Self-directed learning is another subject frequently touched upon in Epstein’s book. Most of the stories he tells about various fields’ brightest luminaries start in the very same place: with children who resisted being boxed in or told what to do, and actively rejected things that didn’t interest them in favor of indulging their casual curiosity.
There’s even a term for that: interleaving.
Interleaving “involves mixing together different topics or forms of practice, in order to facilitate learning”. It has been shown to improve inductive reasoning. While deduction is inference based on widely accepted facts, induction is inference based on observation alone, often just a sample.
Said another way, deduction draws truths from facts. Induction questions the facts themselves.
While we may not want to build a rocket ship on induction alone, this type of “ballpark learning” allows broad threads to be pulled together into a gestalt whose end goal is contextual understanding.
Interleaving is how Darwin, an amateur naturalist, discovered the principles of evolution. It’s how Newton, an amateur scientist, uncovered the laws of motion and universal gravitation. It’s how Da Vinci invented scuba diving, the airplane and war machinery, while sneaking into graveyards to disinter bodies so that he could dissect them and understand how the body’s mechanics functioned.
On and on.
Imagine if, instead of teaching “Bobby and Suzy” that 2+2=4, we told them to go home, gather groups of objects together, and work with their classmates to invent games with the goal of exchanging them. I think they’d learn math pretty quickly.
Whether or not that’s a good illustration, I’m proposing a focus on investigation rather than conclusion. To teach question-seeking rather than answer-finding. To encourage children to explore their worlds without regard for being wrong, and to share the questions and thoughts that emerge; then to gather the collected reflections of the entire class, as a pretext to structured learning.
The Japanese do just that. It’s called bansho.
Bansho celebrates “the class’s collective intellectual voyage, dead ends and all.”
Taken together, these proposals would encourage curiosity, listening skills, collaboration, inductive reasoning, dialectics, agency, tolerance, neuroplasticity, comfort with change and courage. And probably some other things, too.
Instead, we teach and test for “right and wrong”, rather than for the gray in between. We lead children (and adults) toward efficient answers, rather than effective questions. We focus on monetizing adulthood, rather than exploring childhood. We dream of guiding our children’s mastery over something that will differentiate them, in lieu of encouraging an amateurism that may have no clear application, but would seed their long-term resilience.
As Epstein’s book cautions, the tide of rapid change is rising, and the one thing AI cannot yet do—and may never do—is draw illogical connections between seemingly unrelated (non-linear) things in ways that lead to novel strategies and solutions.
Learning, it turns out, benefits more from self-directed mistakes than it does from correct answers.
How? Cognitive psychologists Nate Kornell and Janet Metcalfe separately tested sixth-graders and Ivy League university students on vocabulary. They alternated between giving the students a definition and word together, or giving only the definition and asking the students to come up with the word themselves.
What they found was that “being forced to generate answers [on our own] improves subsequent learning even if the generated answer is wrong.” This is due to the hypercorrection effect: the more wrong the answer, the bigger the correction, and the bigger the impact when the right answer is learned. They concluded that tolerating big mistakes can create the best learning opportunities.
A gentleman named Edison comes to mind. Questioned about his missteps in creating the electric light, he said, “I have not failed 10,000 times — I’ve successfully found 10,000 ways that will not work.”
I think we know how that one ended.
What Am I?
The prevailing pedagogy does us a great disservice by blinding us to our own acts. We are too precious about the conclusions we draw from what we’re taught or otherwise learn, because our socio-economic value system demands it from us.
It demands answers and certainty as the defining measures of human value.
If I’m not an expert, then what good am I to an employer—or the world?
What is my purpose?
Abstraction has led us to fantastic things: the creation of machines and systems our ancestors could only imagine. At the same time, our drive to specialize—as children and adults—has cheated us out of our full potential.
In a 21st century world, where AI and machines will increasingly outperform people on rote tasks, critical thinking will add value and longevity to our productive lives. Like the Ancient Greeks before us, and the polymaths who engaged their worlds with broadly applied curiosity, rooted in ideas but free of preconceptions, we would be well-served by guiding children and adults through a life of open-ended co-exploration.
Summing up his thoughts on the matter, Epstein cites Shinichi Suzuki, a grittily self-taught musician who nonetheless became a renowned music educator:
“Children do not practice exercises to learn to talk… children learn to read after their ability to talk has been well established.”
He’s saying that humans are hard-wired to act first, then learn from what we’ve already done. And yet: in schools, we do the opposite. The exercises come first, in the hopes that they result in learning.
“In totality,” Epstein writes, “breadth of training predicts breadth of transfer. That is, the more contexts in which something is learned, the more the learner creates abstract models, and the less they rely on any particular example. Learners become better at applying their knowledge to a situation they’ve never seen before, which is the essence of creativity.”
It’s time for learning to prepare us for the unknown.
It’s time for an education reboot.