The more efficient the AI, the more complex its implementation

The most efficient way for accounting firms to integrate generative AI into their workflow is through robotic process automation that interfaces directly with the model's application programming interface, though this method also requires the most expertise to implement and maintain.

This is the conclusion of a recent paper published in the American Accounting Association's Journal of Emerging Technologies in Accounting, authored by Rutgers University professors Huaxia Li and Miklos A. Vasarhelyi. The paper presented a general analysis of how accounting firms deploy large language models (e.g., ChatGPT, Claude or Gemini) and the pros and cons of each approach. Overall, it appears that more complex tasks are best performed by more complex deployment methods, which tend to be more difficult to use; conversely, simpler deployments are better suited to simpler tasks but are much less efficient.

The paper specifically named four different ways firms deploy generative AI. 

The most straightforward way to do so is through a user interface with visual and interactive elements, such as ChatGPT's web interface. The paper said this method is the most accessible for accounting researchers and practitioners seeking to implement LLMs, as it requires only an internet-connected computer. It is also the cheapest in terms of access cost. At the same time, it is the least scalable and customizable of all the options, and the slowest as well due to token limitations. With this in mind, the study's authors said this method is best used for client engagement and consultation, basic financial analysis and reporting, and basic compliance checking.

The second is connecting to the model directly via an API, a type of software interface that lets computer programs communicate with each other and pass data directly. Firms can leverage an API to establish connections between their local applications and systems and the LLM service, enabling data interaction between them. This approach can be integrated into existing workflows without significantly altering their structure, is well suited for scalable processing, and allows for a greater degree of parameter setting and customization. At the same time, deployment is more complex, requiring skilled personnel to pull it off, and some existing workflows may simply be incompatible with API connections. The authors said accounting tasks that benefit most from the API approach include basic financial data extraction, transaction classification and verification, and basic fraud detection.
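To illustrate, here is a minimal sketch of what such an API connection might look like using the OpenAI Python SDK (a reasonable assumption given the paper's GPT-4 benchmarks); the model name, prompt and the classify_transaction helper are illustrative placeholders rather than details from the paper.

```python
# Hypothetical sketch: classifying one transaction description through an LLM API.
# The model name, prompt and helper function are illustrative, not from the paper.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

def classify_transaction(description: str) -> str:
    """Ask the model to assign an expense category to a single transaction."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are an accounting assistant. "
                                          "Reply with a single expense category."},
            {"role": "user", "content": f"Classify this transaction: {description}"},
        ],
    )
    return response.choices[0].message.content.strip()

print(classify_transaction("UBER TRIP 7342 SAN FRANCISCO CA $23.18"))
```

A script like this can be looped over an entire batch of records, which is what makes the API route far faster than typing each query into a chat window by hand.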

The third is using RPA to interact directly with a traditional user interface. This allows for batch querying that the user interface method alone cannot accommodate, and it is easier to integrate than the API method because RPA mimics human interactions: even if the existing system does not support programming-level integration, RPA can still connect it to the model's user interface and automate the querying. The UI-RPA method can also be combined with manual steps that require human judgment. However, the setup is even more complex than the API method alone, and maintenance requires skilled personnel who can update the bots whenever the user interface or the working process changes. Further, not every system integrates with RPA, and introducing new software may create additional privacy and cybersecurity issues, especially for accounting tasks. The authors said UI-RPA is suitable for tasks that require interaction between the LLM and local systems, such as expense management and auditing, asset management and depreciation scheduling, and budgeting and forecasting.
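Dedicated RPA platforms such as UiPath or Power Automate are the typical tools here, but the idea can be sketched with a browser-automation library like Selenium; the URL, element selectors and prompts below are hypothetical placeholders, not details from the paper.

```python
# Hypothetical sketch: an RPA-style bot driving a chat UI the way a person would.
# The URL and selectors are placeholders; a real deployment would target the firm's
# actual chat interface and add authentication, waits and error handling.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

driver = webdriver.Chrome()
driver.get("https://chat.example.com")  # placeholder for the LLM's web interface

prompts = [
    "Summarize this invoice: ...",
    "Flag any unusual items in this expense report: ...",
]

for prompt in prompts:
    box = driver.find_element(By.CSS_SELECTOR, "textarea")  # the chat input box
    box.send_keys(prompt + Keys.ENTER)                      # submit like a human would
    time.sleep(10)                                          # crude wait for the reply to render
    reply = driver.find_elements(By.CSS_SELECTOR, ".message")[-1].text
    print(reply)

driver.quit()
```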

The fourth is using RPA to interact with the API connected to the large language model. This is the deepest integration a firm could have with its existing workflows, and the paper said this method maximizes the efficiency of implementing LLMs in the accounting domain. It is more efficient than even the UI-RPA method because the bot can collect raw data from existing systems by recognizing graphical-level elements, pass that data to the LLM through the API for efficient querying, and then automatically retrieve the output and transmit it back to the internal systems. However, this method has all the same problems as the UI-RPA method and is even more difficult to set up and maintain. In general, the authors said the best uses for this method are systematic financial data extraction and analysis, regulatory compliance and reporting, and audit trail analysis and fraud detection.
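Put together, an API-RPA pipeline looks roughly like the sketch below: a bot pulls records out of a local export, sends them to the model over the API, and writes the answers back for the internal system to pick up. The file names, columns and prompt are illustrative assumptions, and the model call reuses the same SDK pattern shown earlier.

```python
# Hypothetical sketch of an API-RPA pipeline: read records from a local export,
# query the LLM over the API for each one, and write the results back.
# File names, column names and the prompt are placeholders, not from the paper.
import csv
from openai import OpenAI

client = OpenAI()

def extract_fields(statement_text: str) -> str:
    """Ask the model to pull key figures out of one unstructured statement."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Extract total revenue, net income and total "
                                          "assets. Reply as 'revenue, net_income, assets'."},
            {"role": "user", "content": statement_text},
        ],
    )
    return response.choices[0].message.content.strip()

with open("statements.csv", newline="") as src, open("extracted.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.writer(dst)
    writer.writerow(["statement_id", "extracted"])
    for row in reader:
        writer.writerow([row["statement_id"], extract_fields(row["text"])])
```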

The paper found the API plus RPA method is the most efficient in terms of the time it takes to extract data from 500 unstructured financial statements. The user interface method alone took 1,800 minutes; the API method alone took 142 minutes; the combination of user interface plus RPA took 67 minutes; and the API plus RPA approach took 42 minutes.

In terms of pure access costs, processing those 500 financial statements cost just 83 cents through either the user interface or the user interface plus RPA method, versus $18 for the API and API plus RPA methods. However, given the time the task takes, the pure user interface method wound up being the most expensive: the researchers added $52 in labor costs to those 83 cents. The API method alone, when accounting for labor costs, was the second most expensive, as the $18 access cost was combined with $31.25 in labor.
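A quick combination of the two figures the article gives for each of those methods shows how much labor dominates the comparison; the article does not report labor costs for the UI-RPA or API-RPA variants, so they are omitted here.

```python
# Combining the access and labor costs quoted above for the 500-statement task.
# Labor costs for the UI-RPA and API-RPA variants are not given in the article.
ui_total = 0.83 + 52.00     # user interface alone: $52.83, the most expensive overall
api_total = 18.00 + 31.25   # API alone: $49.25, the second most expensive
print(f"UI total:  ${ui_total:.2f}")
print(f"API total: ${api_total:.2f}")
```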

With all this in mind, the researchers concluded that the API plus RPA method was the most efficient in terms of both time and money.

"The study finds that currently, the API-RPA is the most efficient method for large-scale accounting tasks. On the other hand, the API and API-RPA approaches are the most expensive methods to apply under the current price rate of GPT4 API," said the paper. 

However, the researchers cautioned that their assessment of each method reflects the current level of technological development and current costs.

"Some limitations might be overcome in the future with the adoption of new models. Additionally, the costs associated with each approach might change based on computing costs and market demand. Further research is needed to discuss additional application methods and cost-benefit models based on future developments of LLMs," said the paper. 
