ChatGPT Can Get Off My Lawn

February 27, 2024


Will artificial intelligence become the greatest boon to higher education since online learning? (This assumes that online learning was a boon, which is a topic for another day.) Or will it mean the utter destruction of academia as we know it? Those are the two views I see expressed most often these days, with various individuals I respect taking opposite sides.

As someone who is naturally skeptical of this kind of over-the-top rhetoric, I believe the answer lies somewhere in the middle. Despite the forceful yet mixed messages surrounding AI and its applications to higher ed, my own work has so far been affected by it very little. Although I could be wrong, I don’t expect to be much affected by it in the future.

So: Should I change the way I do everything to accommodate this latest “latest thing”? Or should I run for the hills and pray for the mountains to fall on me? Perhaps I should do neither, confident that the more attention a new toy receives, the less it probably deserves.

The suddenness with which AI arrived on campus last winter, in the form of ChatGPT, and the speed with which it became, overnight, all anyone was talking about, are reminiscent of other much-ballyhooed events of the not-too-distant past. Remember Y2K? Our computers would all stop working. Airplanes would fall from the sky. Civilization would be thrust back into the Stone Age. Yet, as I strongly suspected would be the case, none of that happened. It turned out to be a big “nothingburger,” as they say.

Or how about the introduction of the Segway scooter back in the early 2000s? Does anybody else remember the hype surrounding that? It was supposed to “fundamentally change” the way we all live. Spoiler alert: It didn’t.

More recently, I could point (with some trepidation) to the Covid panic of spring 2020, when we were treated to scenes of Chinese people dropping dead in the streets, shots of freezer trucks outside New York hospitals, and running death counts on the nightly news. The implication was clear: This respiratory disease was on par with Ebola or the Bubonic Plague. Yet none of that, or at least very little of it, was real.

It is now apparent that, if we subtract from the highly publicized totals those who died with the virus as opposed to from the virus—as well as those whose deaths were actually caused by the treatments they received (or failed to receive) and those who died due to other “mitigation” measures such as lockdowns—the Covid “pandemic” amounted to little more than a couple of bad flu seasons, if that.

In other words, the pandemic, too, was mostly hype. It was never as bad as the government and public-health officials told us it was. But we bought into it anyway. This has become a primary feature of modern society, the so-called “information age,” in which relatively minor events are regularly blown out of all proportion by the potent combination of “expert” opinion and media, especially social media.

The current obsession with all things AI seems to me to be just the latest iteration of this trend. I don’t think it will turn out to be a complete bust, like the Segway, but I do think it will soon become endemic, just part of the landscape, like Covid and flu. I may be wrong; time will tell. Perhaps a year or two from now I will be embracing AI enthusiastically and penning a giant mea culpa. But I doubt it.

Meanwhile, how should those of us who teach in non-computer-related fields respond to the existence of AI and all the hype surrounding it? As someone who teaches primarily college writing, I have colleagues who are enthusiastically embracing AI, changing all their assignments, and encouraging students to “work with it.” Although I like and respect many of those individuals, I take issue with their approach. As teachers of the humanities, in particular, we have a different job.

I was taught that the “humanities” encompass all that makes us uniquely human: art, literature, philosophy, and religion. The purpose of offering humanities courses is to help students more fully embrace their humanity—to think for themselves, expand their minds, explore and come to terms with their deepest hopes, dreams, and fears. Artificial intelligence, it seems to me, is the antithesis of all that, as even the very name suggests.

What, after all, is the reason for allowing students to use AI in the humanities classroom, much less encouraging them to do so and teaching them how? Because they will probably be using it at some point in their professional lives and maybe even in other courses? Fine. Let them learn how to use it elsewhere (if indeed they really need to be taught). Because it “makes things easier for them”? What exactly are we making easier? Thinking? Why in the world would we want to do that?

Every humanities teacher knows that thinking well is hard work, that it does not come naturally to most people, that they therefore must discipline themselves to do it consistently, and that becoming a clear thinker is nevertheless a worthwhile pursuit because it brings great personal and professional rewards. For the life of me, I don’t understand why we would want students to do something that requires them to think less or suggests that turning their thinking over to a machine is a good idea.

And what about writing? One of the things I keep hearing from AI enthusiasts is that we can still teach thinking but allow students to use AI to help them express their thoughts. No, I’m sorry, it doesn’t work that way. Every writer understands, or ought to understand, that, in a very real sense, writing is thinking. They are not two separate activities. They are inextricably linked.

Indeed, one of the main ways we teach students to think is by teaching them to write—in their own words, in their own voice, engaging their own brains. Personally, I see no need to teach my students how to write like robots. They get enough of that in their high-school AP classes. Teaching them to write like real human beings—that is the challenge.

I alluded above to the fact that the swift and sudden advent of ChatGPT on college campuses was met with numerous pronouncements from on high. One of those, for me, came in the form of an email from my department chair, no doubt instigated by the dean and probably by the provost, informing us we were to include a “Statement on AI” in our syllabi. To their credit, those administrators didn’t tell us what the statement had to say or how we should approach the topic, just that we needed to let students know what we planned to do.

Fair enough. After giving the matter some thought, I wrote the following, which is now part of the syllabus for all my writing courses:

The main purpose of this course is to help you learn to express yourself, clearly and cogently, in your own unique voice: your thoughts and ideas, your emotions (where appropriate), your words. There is great value in that kind of authenticity, both personally and professionally. AI may be a useful tool for many things, but it cannot help you sound like the best version of yourself. It is also bad at following directions and tends to make things up, both of which can be grade-killers. For all these reasons, you MAY NOT use AI on any of your assignments in this course.

I try my best to structure the writing assignments so you can’t simply turn them over to ChatGPT. But of course I don’t always succeed, and clever students can often find a work-around. (Why they don’t just apply that cleverness to the assignments, I’ll never understand.) If I can prove that you used AI—and there are programs to help with that—you will receive a zero on that assignment. If I can’t prove it, but the writing sounds robotic—whether or not you actually used AI—you will almost certainly receive a lower grade than if you were writing in your own voice. (I’ve been reading essays that sound like they were written by robots since long before AI came along. I refer to that as “AP Syndrome.”) A big part of what I’m trying to teach you is how to write in such a way that you sound like an actual, intelligent, unique human being, with personality, experiences, passions, and opinions, and not like some soulless computer program.

Can I actually prevent students from using ChatGPT or any other form of AI? Probably not. But through a carefully curated combination of teaching, encouraging, cajoling, a little bit of bluffing, and continually fine-tuning my assignments, I can at least make it more difficult for them to simply outsource their writing or thinking to the hive brain.

If that makes me old-fashioned, outmoded, shortsighted, hidebound, intransigent, uncool, or a stereotypical “Boomer,” so be it. I will always believe that my job is to help students learn to cultivate their own intelligence, not rely on the artificial kind.

So, hey, ChatGPT? Get off my lawn.

This article appeared first on Brownstone Institute under a Creative Commons License (CC BY 4.0).

Image credit: Unsplash
