The incident reflects growing concerns among students about faculty misusing AI — an interesting reversal from the past, when faculty themselves often worried that students would use AI to cheat in their studies.
Specifically, according to The New York Times, Ella Stapleton, a student who graduated from Northeastern University (USA) this year, began noticing odd signs in her business lecturer's course materials: a stray "ChatGPT" citation in the reference list, spelling errors typical of machine-generated text, and distorted illustrations, such as figures with extra arms or legs.
"He told us not to use it, but he used it himself," Stapleton shared.

After making the discovery, Stapleton filed a formal complaint with Northeastern's business school, focusing on the instructor's undisclosed use of AI and raising broader concerns about his teaching style. She also requested a refund of the course fees, which came to more than $8,000.
After several internal meetings, Northeastern University rejected her request for a tuition refund.
The instructor involved, Rick Arrowood, later admitted in an interview that he used multiple AI tools, including ChatGPT, the Perplexity AI search engine, and the Gamma presentation builder.
“In retrospect… I wish I had checked more carefully. I think faculty need to think seriously about integrating AI and be transparent with students about when and how to use this technology,” he said.
"If my experience can help others learn a lesson, I feel relieved," added the part-time lecturer with more than 15 years of teaching experience at many universities.
Renata Nyul, Northeastern University's Vice President of Communications, told Fortune: "Northeastern welcomes the use of artificial intelligence to enhance teaching, research, and operations. The university provides ample resources to support the appropriate application of AI and continuously updates and enforces relevant policies across the institution."
When students turn the tables and question instructors' use of AI
Since ChatGPT launched in late 2022, students have been among its most eager adopters, using it to churn out essays and assignments in seconds. That has set off a "silent war" between professors and students, with faculty scrambling to detect and prevent cheating.
But the tables have turned somewhat. More and more students are taking to forums like Rate My Professors to complain about professors' misuse of AI in their teaching. Many argue that this undermines the real value of tuition: students pay to learn from humans, not from technology they can access for free.
Under Northeastern University's AI policy, any faculty member or student who uses AI to produce academic content must cite the source in full and check the output for accuracy and appropriateness before using it.
According to Newsweek, the case illustrates how widespread AI has become in higher education. A 2023 survey by consulting firm Tyton Partners found that 22% of university lecturers said they regularly used generative AI in their teaching. By 2024, that figure had nearly doubled, to close to 40%.

Source: https://vietnamnet.vn/sinh-vien-doi-truong-tra-lai-hoc-phi-vi-giao-su-gian-lan-khi-soan-bai-2405175.html