Educators worry about students using artificial intelligence to cheat

Earlier this month, New York City public schools blocked access to the popular artificial intelligence tool ChatGPT. Educators are concerned that students could use this technology to write papers – the tool wasn't even a month old when a college professor in South Carolina caught a student using it to write an essay in philosophy class. Darren Hick of Furman University joins John Yang to discuss.


  • John Yang:

    When the New York Public School District blocked access to the popular artificial intelligence tool ChatGPT earlier this month, it was the latest response to concerns over how rapidly changing technology is affecting our lives. Educators worry that students are using this technology to write papers and that they'll never have to learn how to write on their own.

    The tool wasn't even a month old when a college professor caught a student using it to write an essay for her philosophy class. Darren Hick of Furman University wrote about that on his Facebook page. And he's with us now for our periodic series, the AI Frontier.

    Professor Hick, this was an essay, for the record, about the Scottish philosopher David Hume and the paradox of horror. As you were reading this essay, what were the red flags that this might have been other than her own product?

  • Darren Hick, Furman University:

    There are several red flags that come up in any case of plagiarism. They just sort of build up until you have to screech the grading to a halt and look into the problem. In this case, it got some basic issues exactly right, and other things fundamentally wrong. It talks about things that the student wouldn't have learned about in class, which is always something of a flag, and connected things together in a way that was just thoroughly wrong. But it was beautifully written. Well, beautifully for a college take-home exam, anyway. So, it was a weird collection of flags.

  • John Yang:

    Were they different from the red flags you'd get from what we'd call, I guess, old-fashioned plagiarism?

  • Darren Hick:

    Well, normally when a student plagiarizes, there's a sort of cost-benefit analysis that goes on. But usually it's a panic, right? At the end of the semester, they realize they don't have enough time or enough knowledge to put this thing together properly. And so, they cobble pieces together that don't really fit. What was odd about this is there was some of that. There were things that they didn't understand, or didn't seem to understand. But it was just so well composed, which is not something you would see with an essay that's cobbled together at the last second. It was nicely written. Sentence by sentence, it was nicely structured. It was just really odd.

  • John Yang:

    And you actually used the ChatGPT product, or part of this tool, to confirm the plagiarism?

  • Darren Hick:

    That's right. There was a detector that was designed by the same lab that had created the GPT generator in the first place. And so, I knew that this thing was around. So, when I had these suspicions, I thought, well, let's plug it in and see what it has to say. At that point, I had never done anything with it. So, this was a new investigation for me.

  • John Yang:

    What did you take away from this experience? What does it lead you to think about this technology, about the uses and the misuses of it? Do you have a policy about it in your class now? What do you walk away from this with?

  • Darren Hick:

    I have a really ambivalent view on this. On the one hand, it's fascinating. It's a great toy. I've been playing with it a lot. I can do all kinds of things with it. I had it write Christmas stories for me. It's just a lot of fun. On the other hand, it's terrifying. What's terrifying about it is that it's learning software. It's designed to get better.

    So, when I caught the student using it, it was maybe three weeks old at that point, not even, and it was an infant. But a month from now, it's going to be better. A year from now, it's going to be better. Five years from now, the genie's out of the bottle. So, my worry is mostly about how do we keep up with this thing? How do we prepare for this thing?

    Plagiarism isn't anything new. I don't expect a new flood of plagiarists, but in that cost-benefit analysis I was talking about, this changes the analysis for students. This is a tool that makes things easier, faster. And so, my worry is that we'll get more students who are using this method, and we need to be prepared for that. So, what I have to do in the classroom is change that analysis again.

  • John Yang:

    How are you going about that? I mean, you talk about sort of having to keep up with this. How are you doing it in your classroom?

  • Darren Hick:

    Well, I have to rethink every assignment that I give. You think about plagiarism every time you give an assignment anyway, so that's not new. You have to think of new methods, though. Currently, the newest change in my syllabus is that I tell students if I catch a whiff of AI in their essays, I'm going to throw out their essay and give them an impromptu oral exam on the spot, with the hope that that's enough to make them say, well, I'd better understand what I'm putting in that essay.

  • John Yang:

    You say that it's a mixed bag, that there are some upsides to this tool. What are the upsides in your view?

  • Darren Hick:

    I have seen a lot of people trying to think creatively about how to use this in a classroom. We can't ignore new technology. There are fun things to be had here. I think one of the best suggestions I saw is somebody said, assign a prompt to your students. Have your students put that into ChatGPT. See what essay it produces, and then write a new essay that analyzes that essay, says what it gets right, what it gets wrong. That's creative, that's interesting. That's getting a little bit ahead of the bar. Of course, I would ask what stops ChatGPT from analyzing its own essay?

  • John Yang:

    That's a good point. You could have a sort of circular argument here.

  • Darren Hick:

    You have to stay a step ahead if you can.

  • John Yang:

    Yeah, but could you see using it as a teaching tool?

  • Darren Hick:

    Sure. I teach about the ethics of AI in my intro philosophy class. We're going to be poking around at this thing later this semester. So, it's absolutely raising questions that are worth asking. But at the same time, it is a potentially dangerous tool.

  • John Yang:

    Darren Hick of Furman University, thank you very much.

  • Darren Hick:

    Thanks so much for having me.
