Liberal Magazine: “Colleges Are Lying to Their Students”

If ever there were a sign that academia’s shtick is getting a bit stale, it’s when even a liberal magazine starts scoffing at its norms. This is what has happened, too, with the left-wing Atlantic warning that “colleges are lying to their students.”

No, the outlet isn’t complaining about the various leftist isms pervading the average campus, but about something more fundamental. That is, states The Atlantic’s Caitlin Flanagan, every college tour today seems to have something in common. The guide will at some point say, “What’s different about College X … is that our professors don’t teach you what to think,” Flanagan writes. “They teach you how to think.”

It’s wash, rinse, repeat, Flanagan laments, at College Y — and all up and down the alphabet. She then questions the very idea that anyone can teach us “how to think.” She also writes:

Each of the guides seems to think this is a point of difference about his … college, which is itself a sign that they have spent a lot more time in the “what to think” school of higher education than in the “how to think” one. When you’re visiting a college, walk through the corridors of some of the humanities departments. Look at the posters advertising upcoming events and speakers, read the course listings, or just stand silent in front of the semiotic overload of the instructors’ office doors, where wild declarations of what they think and what they plan to make you think will be valorously displayed.

Does this look like a department that is going to teach you how to think?

Of course, that kids should be taught “how to think” has become reflexively uttered dogma, much like “diversity” and “equality.” The difference is that it’s not really questioned from the Left, Right, or center. But it should be.

Thinking About Thinking

Flanagan certainly questions it. “The truth of the matter is that no one can teach you how to think,” she writes. But “what they can do is teach you how to think for yourself.” That certainly is better, too, than just mindlessly following the herd. (Mindfully following the herd is another matter; sometimes the herd is correct.)

Should even this, however, really be the focus? Is it sufficient? After all, whether it’s “how to think” or “how to think for yourself,” something is missing. What are we to think about?

Or, more precisely, what should be our goal when thinking “for ourselves”?

Goals shape actions — and thoughts. If a student aims to make his school tennis team, he’ll take actions (lessons, training, etc.) directed toward that end. If you want to reach a geographic destination, you figure out the route (or, today, turn on the GPS!). And what of thoughts?

The Goal

Let’s analogize this. Imagine someone insisted on fervently advocating Lysenkoism, the belief in the heritability of acquired traits. (E.g., if a man loses an arm, his children will be born minus an arm. This belief was officially upheld in the Soviet Union until, essentially, 1964.) Now imagine that you combat this by saying, “Okay, let’s talk about ‘how to think.’” To this the person replies, “I think just fine. Chinese people all have black hair, so their children all have black hair; offspring reflect their parents. So a one-armed man will have a one-armed baby.” (It’s logical; the problem is that the premise is flawed.) So now, getting exasperated, you counter, “Sheesh! My good man, think for yourself!”

The problem, however, is that the individual may be thinking for himself. The issue is that he’s not sincerely looking outside himself for what really matters.

That is, science is about seeking Truth. Doing this, and finding and accepting Truth, shapes (and reshapes) our thinking; it aligns it with reality. And doing this habitually results in being in touch with reality. As an example, when French chemist Louis Pasteur proved germ theory, man’s thinking on disease aligned further with Truth. In turn, this led to better health outcomes and other discoveries, such as antibiotics for combating pathogens. (The search continues, too, with alternatives to antibiotics being developed.)

It is likewise, of course, in the philosophical arena, except in that Truth isn’t discovered via the scientific method. Rather, reason and logical analysis are employed, using axioms as a starting point.

We can then follow the bread crumbs, so to speak. The discovery of one aspect of Truth can lead to another, just as the understanding of pathogens led to the creation of antibiotics.

Knowledge’s Prerequisite

Returning to Flanagan and thinking, what’s clear is that she had a father who inspired it. As to this, she related what would happen when she, as a young and spunky lass, would expound away in his presence upon some position, with complete certitude. Upon her conclusion, he would gently derail her with a simple question. “And what is the best argument of the other side?” he’d ask.

If we want to actually be right (as opposed to, perhaps, just popular), we must become our own worst critics. We must learn to argue against our positions inside our minds better than any debate opponent could outside them. It’s only when we examine our beliefs from every angle that we can become complete and thorough thinkers.

Yet there is, again, that prerequisite for this. And a bit more analogizing is in order. Imagine a society was firmly convinced that disease was punishment visited upon man by capricious gods. Imagine each of its members wholly believed there were no “laws” of nature, no such thing as scientific Truth. What chance would they have of discovering illness’ true causes? Why would they even engage in such inquiry? It would be like searching the cold depths for a valuable treasure — while convinced no such treasure could exist.

Likewise, will people seek answers to moral/philosophical questions if they’re convinced those answers don’t exist? Will they seek Truth in the moral realm if they believe Truth doesn’t exist? Another way of saying this is:

Will they seek those answers if they believe “everything is relative”?

From Nothing Comes Nothing

This belief, after all, moral relativism, boils down to moral nihilism. For if everything is relative, who is to say what’s right or wrong (as the modernist refrain goes)? This belief has pervaded our civilization, too, as studies have shown. The result?

As those studies also reveal, people are then most likely to make “moral” decisions based on feelings. And this makes sense. After all, if that (metaphysical) treasure doesn’t exist — if there are no moral “answers” — why search for them? What is there to think about? You may as well go with what feels right.

This, do note, is precisely why emotionalism reigns in our time. It’s why virtually all of us have had that common experience when debating someone (often of a certain ideological bent): You present reasoned arguments, and the individual responds with emotion-driven non sequiturs. And it is then that you discover why Anglo-Irish satirist Jonathan Swift observed long ago (I’m paraphrasing), “You cannot reason a man out of a position he has not reasoned himself into.”

Put simply and broadly, if youths believe in Truth, they’ll naturally search for it. If they don’t believe in Truth, they naturally won’t. Stressing “what to think,” “how to think,” or “how to think for yourself” becomes irrelevant when the very reason for thinking itself is gone.