Ask Your Students Why They Use AI (opinion)


I once had a student who used AI to generate several of his assignments. The first assignment in my class is an ethnographic paper in which students discuss the culture with which they most identify and the language of that community. It’s an assignment that asks students to write about the things that matter most to them, so it was a little surprising to see this student use AI to generate his work. At first, I told him that he couldn’t use AI to write his entire essay or I would have to fail the assignment. In my experience, when other students have been caught using AI unethically, they stop almost immediately.

However, what surprised me most was that this student continued to use it, even after my warning. It got to the point where I was about to file an academic dishonesty report, but I was curious as to why he kept using AI to complete his work, even though he knew the consequences could include failing the course. When I asked him about it, he replied that he was not a “good writer” (which he had mentioned during our first conversation) and that he had never written an essay longer than one page. He even went so far as to say that his high school “didn’t teach him anything.” As a result, he was very withdrawn: he rarely spoke in class, did not participate in group discussions, and did not feel at all confident submitting what he himself had written.

I had another student, this time in an asynchronous online course, who also used AI to generate multiple assignments. When I emailed her to ask why she relied so much on AI, she responded with the following:

“I had a lot [of people] leave me at college. I didn’t even try in high school – it was a struggle for me. It was easier for me to use AI… I know I’m not smart enough.”

As touching as this may be, I don’t think it’s a rare experience, especially for students from marginalized communities. (Both students I spoke with were first-generation Hispanic students from small, rural towns.)

I currently teach English at a community college in central California, and one of the reasons I love teaching here is the diversity of my students. They come from very different backgrounds and experiences: some are fresh out of high school, while others are returning to college after a short (or long) hiatus. Many of my students are also the first in their families to go to college.

For the rhetorical analysis unit, I show my students Sir Ken Robinson’s famous 2007 TED Talk, “Do schools kill creativity?” It’s still a great talk for understanding how a speaker can effectively use ethos, pathos, and logos in oral communication, but more than that, Robinson’s message resonates deeply with students. When he jokes about how schools gradually educate children “from the waist up,” then focus on their heads, “and slightly to one side,” my students laugh — not just because it’s funny, but because they know it’s true. Many students have shared with me over the years how they often felt like “idiots” in middle and high school when they didn’t know what to do for an assignment or essay, and how that still affects their self-esteem in college.

Just as I encourage my students to use Wikipedia wisely, I also encourage them to use AI ethically. After my students submit an essay, I have them fill out a form indicating whether they used AI and, if so, for what purposes. Usually, about half the class admits to using AI, while the other half says they have not. Those who do use it report turning to AI primarily for brainstorming ideas, assisting with citations, proofreading and suggestions, and rephrasing or paraphrasing.

In my experience so far, it’s actually rare for students to use AI to generate entire drafts. Rather, they might use AI to generate part of an essay, such as an introduction or a few body paragraphs. However, as we continue to explore ways to implement AI in the classroom, we must also remind our students of an often-forgotten truth about learning: it’s messy, and it’s okay to make mistakes. When Robinson says, back in 2007, that education stigmatizes mistakes, my students of 2025 nod in wholehearted agreement.

When I talk with students who have been caught using AI, they usually do two things: They apologize, and they almost always say, “I’m just not a good writer.” I have always made it a point to let my students know that there is no such thing as a good or bad writer, a statement that sometimes raises eyebrows. The real distinction, I tell them, is between experienced and inexperienced writers, because writing is a skill that can be developed. This is not something I address only on the first day of class; it’s a point I emphasize often. It’s now in my syllabus, in a section I call the “Effort Formula.” When I tell my students that effort and consistency, not talent, are what make someone an experienced writer, I want them to recognize this as a statement of fact. I even have assignments centered on the themes of failure and creativity, and I encourage rewrites and revisions.

In my experience, it’s often the students who didn’t get the teaching they needed before college, and who have the least experience with the writing process, who turn to AI. They feel unprepared and insecure about their abilities — pointing to a larger problem in which mistakes are still stigmatized and completion, not competence, is often the norm.

There is currently much discussion about the exciting possibilities that AI can offer, and chatbots can certainly be valuable tools that can support learning and streamline the process of gathering and evaluating information. However, as crucial as it is to discuss the substantial downsides of AI, such as its environmental costs, we must also examine the long-standing factors that sometimes encourage students to use AI in unethical ways. If we don’t start these conversations now, much of our dialogue about the possibilities of AI in the classroom will mean very little.

Ernesto Reyes is a professor of English at Fresno City College.
