The rise of generative AI in society has also given rise to new forms of cheating in the classroom.
Generative AI is known not for its skill with numbers but with words, which makes it an unfortunately ideal cheating tool for humanities courses that rely on written essays as major components of their programs. While accounting is not exactly 19th century Romantic literature, language and writing are not irrelevant to the field. An accounting student may never need to analyze the major themes of Ulysses, but may well be called upon to interpret an accounting standard, tax regulation or audit document, which can be just as dense and confusing. So while there are fewer opportunities for AI-guided cheating than in other fields, students are finding places where bots can do their work for them, much to the chagrin of their professors.
"This is definitely something I have heard quite a bit about from my colleagues in the humanities and other fields, but is becoming an issue for accounting/finance classes as well," said Sean Stein Smith, a professor at Lehman College who teaches intermediate accounting, cost accounting, advanced accounting and forensic accounting, and leads Lehman's development of AI business courses, as well as its crypto/blockchain content. "Students still need to understand the implications of ASUs, disclosures, etc., and if they rely entirely on AI for assignment completion, that knowledge will fade away."
![AI cheating](https://arizent.brightspotcdn.com/dims4/default/a7dbc3f/2147483647/strip/true/crop/6016x4016+0+0/resize/740x494!/quality/90/?url=https%3A%2F%2Fsource-media-brightspot.s3.us-east-1.amazonaws.com%2F7c%2Fb4%2F428d7fce462da525181c399cc75e%2Fadobestock-604536856-editorial-use-only.jpeg)
He has seen AI-guided cheating first-hand, especially in short-form essay assignments as well as when he requires students to perform financial analyses using specific ratios.
Douglas Carmichael, former chief auditor of the PCAOB and currently a professor at Baruch College, where he teaches auditing, said he does not give any writing assignments that students could use generative AI to cheat on, but this doesn't mean they're not using AI to undermine the purpose of an assignment. Once students realized he had caught on to them, however, it became less of an issue.
"I do ask students to submit at least one question before class on something in the text or recorded lectures they found difficult to understand or want additional information about," said Carmichael. "My experience in prior semesters was that about half of the students submitted a question that seemed suspicious to me given the language used and generality of the issue. The lack of specific reference to the topic in the text or recorded lecture was also apparent. These kinds of questions did not earn any credit and, as word got out about that, use of ChatGPT is infrequent."
Even when students are not outright cheating, some educators have observed an unhealthy reliance on generative AI starting to form. Jack Castonguay, vice president of learning and development at Surgent as well as a Hofstra University professor who teaches advanced courses in accounting and auditing theory, has seen students struggle to understand and communicate core concepts, due in part to their reliance on generative AI.
"We see the reliance significantly when they have to give a presentation or take an in-person exam. It's clear they have gotten to that point by using AI and can't apply the logic on their own," said Castonguay. "Maybe in three to 10 years (given the speed of the improvement in LLMs) they won't have to do it on their own, but it's a large problem now for client relationships and having conversations with this in practice. They need to look up everything and use AI as a crutch. Seminar discussions are like pulling teeth oftentimes for me."
With this in mind, accounting educators, much like those in other fields, are discussing how to respond. Richard C. Jones, a Hofstra University accounting professor and former technical staff member at the Financial Accounting Standards Board, said this is a major topic of debate among college faculty and administrators, noting that it seems to come up in nearly every meeting. It's obvious, he said, that students will use LLMs on assignments, so the challenge for faculty is to design projects and papers that require students to actually demonstrate their knowledge rather than just hand in a paper or presentation.
"Fortunately, I teach classes that require the application of accounting rather than accounting theory," said Jones. "Therefore, my exams and other assessments are specific to case information provided and application of the accounting rules in providing the journal entries and the related disclosure information. So, my students do not have as much of an opportunity to use LLMs to answer the questions."
Educators are trying to find ways to work AI into their assignments, considering how quickly accounting firms themselves have taken to it, he added.
Tracey Niemotko — a Marist University professor who teaches accounting and auditing as well as sustainability, taxation and forensic accounting — views AI as more of a tool than a cheating mechanism, pointing out how models can be used to expedite audit procedures or clear away the busy work that eats up the day of many professionals. Consequently, she is more sanguine about AI-guided cheating, noting that even if students do use AI in their assignments, the nature of the work makes cheating difficult.
"Even with electronic testing in the classroom, I do not see cheating as a concern overall. I think the accounting students are perhaps a bit more disciplined than most students, so I don't think they have the mindset to cheat. Even for writing assignments in my upper-level accounting courses, students may use AI to assist them, but they are required to write 'in their own words.' Overall, the majority do their own written work but may use AI as a tool to help them develop an outline or get them started," she said.
Abigail Zhang Parker, a University of Texas at San Antonio professor whose research specialty is AI in accounting, has also directly worked AI into her classes. For example, her courses on accounting information systems include hands-on workshops where students learn to operate different accounting software solutions. She noted that AI can be a useful tool for finding relevant information and understanding difficult concepts.
Her overall philosophy is that students can use generative AI to help with assignments but not on exams, as that is when they're tested on their actual understanding of the topic. So long as it is used only for assignments and not exams, she does not consider using AI to be cheating. It would be impractical to prevent the use of AI entirely anyway, she added, so it's better for educators to find ways to use it too. However, she noted that teaching students the proper use of AI can itself present a challenge.
"Perhaps we need to guide them how to use it properly," said Parker. "This is not easy. One method that came to my mind is to make the parts that demonstrate students' own skills take a greater portion in the grading components. … For example, there are three exams throughout the semester, and they take 60% of the total grade, while assignments take 10%. For classes where students need to submit a report and make a presentation, maybe the report itself will not take up a high portion of the grade, but the in-person presentation will, as it better reflects students' true understanding of the subject. And once students know that they will be mainly graded on their own performance, they are more incentivized to think through the problem than simply over-relying on AI."
Another reason to learn AI in the classroom is that, once students are working as professional accountants, clients will likely be using AI as well, and they will need to understand and explain what is missing from the AI's answers. However, Castonguay, from Hofstra, voiced concerns that over-reliance on AI is eroding the critical thinking and reasoning skills needed to properly evaluate those answers in the first place. He does an exercise in class where students have ChatGPT summarize a FASB Accounting Standards Update and review its findings. Some, he said, don't even know where to start, as they have evidently been relying on ChatGPT to understand it at all.
"My bigger concern is [that] by such a reliance on AI they will lack the critical thinking and synthesizing skills that are still valued even with AI. To use a sports analogy, they are only bowling with gutter guards — what happens when those aren't there?" he said.
These kinds of concerns underscore the need to teach responsible AI usage in a way that does not degrade the human skills that accountants will be relying on in the professional world, but this could be an uphill battle.
"I do think that as AI becomes more integrated into the classroom and profession, we are going to have to really double down on making sure students still have the ability to think critically," said Smith, from Lehman. "Especially in cases where questions or data may change on the fly, students are seeming to have a harder time pivoting and adapting to analyze data on the spot. It's a growing problem with no cookie-cutter or easy solution, but is definitely something I know is being talked about in pretty much every accounting department/school of business."