What this means for learning


Faculty and staff gather to discuss the risks and community impact of artificial intelligence (Sarah Peters | Sarah Peters Photography)

According to a 2024 investigation, approximately 85% of Tulane University students regularly use artificial intelligence tools for schoolwork. The growing prevalence of AI has changed the academic landscape and the way students learn.

Tulane students recognize this changing learning environment, and many have integrated AI into their study habits. Gavin Mack, a freshman at Tulane, said his attitude toward AI has changed.

“I always thought AI was a little sketchy,” Mack said. “But recently it’s been getting a lot better, and I’ve been using it mostly to create practice questions for myself in preparation for my midterms. And it’s been working pretty well.”

Tulane student Grace Dunning said that while she doesn’t use AI often and prefers to stick to her own methods, she uses ChatGPT primarily for academic purposes, such as help with homework, studying or planning.

Beyond his own experience integrating AI into his studies, Mack said he is concerned about the misuse of AI in the classroom.

“I feel like when it comes to trying to understand the material, I don’t care, as long as you understand what you’re doing. I think that’s perfectly fine,” Mack said. “My problem comes with doing class work with it, because it requires little effort. You don’t learn anything from it, and it kind of undermines the work that everyone puts into it.”

The prevalence of student AI use has led many professors to adapt their courses.

Julia Lang, professor and director of the Phyllis M. Taylor Center for Social Innovation and Design Thinking, said students’ use of AI has changed the way she designs assignments and courses.

“I no longer assign things where I say, ‘Read this article and write a discussion post about it,’ because while I can hope that students will do it, it’s such an easy thing to assign to AI,” Lang said.

Marc Shealy, an English professor, also noticed how AI removes research and critical thinking from students’ learning process.

“I said to my students today, ‘A third of your answers are all the same. Maybe you’re using different models … [and] you go back and tweak it here and there, but [you’re] doing the same standard thinking,’” Shealy said. If he asked students “to stand in front of the class and tell us what you wrote and explain it, I don’t think they could do it.”

Professors are increasingly aware of the negative effects of AI on student learning and are reluctant to assign work that can easily be done with AI.

Lang and Shealy agree that AI will be an integral part of students’ lives after graduation. Faculty and university programs are collaborating to help students use AI safely and in ways that support research, originality and society.

Tulane has already seen such developments, including formal policy changes and the creation of several faculty committees to explore AI in the classroom. The university now requires faculty to include a statement on AI in their syllabi.

“This is actually the first semester where faculty are required to have a statement on AI in their syllabus,” Lang said. “Before, it was a sort of ‘don’t ask, don’t tell’ policy, which is not sustainable.”

According to the AI committee report, promoting more transparent and adaptive use of AI is essential to students’ ethical use of the technology.

“I think students need to be the humans in the loop … [who] use AI to do some of their research, but then constantly refine and provide more feedback to the AI to get better answers. So I think that kind of critical analysis is really important,” Lang said.

According to Lang, data literacy will also encourage the ethical use of AI.

The Connolly Alexander Institute for Data Science offers courses on AI tools and the evolution of AI. The Center for Community Engaged Artificial Intelligence is a multidisciplinary team of scientists, students and community members dedicated to promoting human-centered uses of AI that benefit society.

Several challenges will come with helping students become more familiar with AI and use it more responsibly, including time, environmental concerns and fear of using AI in the first place.

Dunning agreed that the university should foster a culture of responsible AI use but should also consider its environmental impact.

“Rather than banning it outright, [we should be] educating people on how to use it appropriately,” Dunning said. “But I also think there’s a very big question mark about its environmental impacts, particularly with the energy center being built in Louisiana.”


