For thirty years I have moved as a professor from school to school, trying to find one that was really changing the way higher education is delivered. I have been disappointed at every turn. I explained this away as the natural outcome of bureaucracy, tenure systems, or our guild-like accreditation apparatus. But I have come to believe that the real problem is mindset…and the start of a bigger fix may be as simple as adopting a different mental model.
Education, through most of human history, was one-on-one, highly personalized, and focused on behavior. It looked a lot like what we call apprenticeships today. Of course, there was variation—some groups had some formalized education for millennia. But if you were to represent it with a curve, it would have looked something like this:
The formal education introduced in religious and moral training in many parts of the world pushed the curve to the other side of the spectrum. In the West, these were called “lectures” (from the Latin “to read”) and literally involved one person reading from holy books while others copied word for word. These lectures were standardized, mass, low-touch affairs.
Push me, pull you
The modern university had its precursor in uniform, rote, church education in Europe. In the mid-19th century, under pressure to bring students into the industrial age, Harvard College was the first to introduce the “elective” system (allowing students—and professors—greater personalization of their coursework). Universities today allow some freedom and specialization (a move to the right), but remain firmly fixed in the industrial-age concepts of mass markets and standardization (in subject matter, course structure, and grading criteria), with an emphasis on conveying and certifying knowledge rather than molding behavior.
Computers and internet technologies have promised to push the curve further to the left; but MOOCs and most online courses have been little more than glorified correspondence courses. Professors and teachers—who run the accrediting agencies—rightly fear a loss of control (and jobs) if a less instructor-centric model of education were to emerge. And studies of the learning outcomes for online or hybrid education models have been less than stellar.
Rather than continue to try to achieve the “best of both worlds”—and eventually arrive at something stuck in the middle and ultimately dissatisfying to all—why not just split educational modalities in two, and strive to do each extremely well? But when to do what?
Several years ago, I developed these postulates for bifurcation—which I have come to believe in as rules:
- Postulate one: What can be tested electronically should be taught electronically.
- Postulate two: Almost all of what we currently consider formal education (even into graduate-level education) can be automated without relying on teachers or professors, as we have known them.
- Postulate three: When formal education as we know it today has been replaced by machines, schools will become more important than ever—not as places to go for “book knowledge,” but as institutions where older humans teach younger humans what it really is to be better humans.
For decades, educators who believe in technology solutions have known there is a holy grail—globally accessible, low-cost, engaging, memorable, and self-paced—the video game. And there are already excellent educational video games for elementary and even some for secondary school students: alphabets, math, geography, grammar, history, basic principles in science, etc. But when it comes to educating students in high school and beyond—in more complex, conceptual knowledge and the application of that knowledge—the currently available game products are limited. All that is available are simulations: usually relatively short, blunt programs for teaching one or two concepts, requiring significant instructor involvement to drive the points home.
To effect a move to the technology-driven end of this educational spectrum in higher education, a small team of really smart folks and I have successfully tested a prototype and are now developing a full-fledged 45-hour video game-like experience to replace a capstone course for thousands of undergraduates. In this initial Interactive Learning eXperience (or I-L-X), students learn how to think through problems and give advice to the CEO of an airline – but in future I-L-Xs, students will be working through a problem like building a new car engine or conquering outer space. In all of these, the “players” apply theory, gather data, test hypotheses, and offer solutions. There are hundreds of thousands of paths through these games and millions of possible variants on the final solutions that students can offer. About one-quarter of these are viable and would result in a good grade. This experience is 100% online; students never interact with a professor. Players’ scores on the game are their grades in the course – and there is no reason that students should not have the opportunity to play as many times as necessary to master the content and theory—and pass the course.
On the human, behavioral side, courses are being created that look much more like a piano lesson than a university lecture. Professors ask students to prepare and practice assignments before the class period, which is then used as a time for group and individual “performance” or “show and tell”—with the instructor (and classmates) giving constant frank, honest feedback. Criticism, praise, and even the grades themselves are shared with everyone in the course. The students offer feedback to each other as well (for example, online ratings of written assignments are completed before the class session), and their feedback is factored into the course grades. One class, taught by a single professor, runs a year and a half to allow time for that professor to watch, evaluate, and encourage the progress of students over time. Some have called this development-based assessment—it is very much like an apprenticeship. Grades matter much less to the students than the growth they see in themselves and their classmates. And employers who have been introduced to this course methodology are clamoring for graduates.
The double hump
In adopting the double-humped form of education, there will be the classic, bureaucratic opposition to anything potentially frame-breaking. Current instructors will doubt their ability to become more engaging in the classroom – to offer constant mentoring to their young charges; and it is true that some will be entirely unsuited to this form of education. So there will be mighty opposition. But with a clear potential path to the “far left” technology-based education, the focus of universities needs to be increasingly on the “right.”
The beauty of the two extreme methods of teaching is that they resolve the age-old conflict between personalization and standardization by letting machines do most of the basic, normalized teaching; humans can then be freed to educate with emotion, sophistication, and often messiness—all the things that make us human. There is no need to squeeze both learning modalities into a single “stuck in the middle” course. They are different for a reason—and those differences in teaching method and purpose are the very things that create more and better learning.
I have seen some attitudes changing, which makes me cautiously optimistic that double-humped innovation may be able to work even in the largest, slowest-moving, most bureaucratic organizations we know as universities.