Some firms, the ones that constantly grab headlines, are spending billions of dollars to create bespoke artificial intelligence systems for Fortune 500 clients whose global footprints demand complex compliance and advisory work.
In contrast, the vast majority of firms are spending maybe a few thousand dollars to license commercially available models, and perhaps a few thousand more to train staff and integrate systems. Overall, unless a firm is building billion-vector custom models housed on a massive server farm, AI is actually quite cheap, especially considering the capacity gains it can deliver.
Firms of one to 200 people are mostly engaging with subscription-model products right now, according to TJ Lewis, innovation strategist with Rightworks, an accounting-specialized cloud provider. "They're not building out their own models," he added. "They're not securing a bunch of server time or things like that to spin up their own things by and large."
Part of this is because smaller firms don't have the resources to construct their own custom AI models: the oceans of data needed to feed them, and the technical experts to bring it all together, according to Donny Shimamoto, head of IntrapriseTechKnowlogies, a tech advisory practice specializing in CPA firms.
"It kind of makes sense," he added. "In order for AI to work to a good extent, you need a high volume of data, and smaller firms just don't have that volume. They need to teach the AI. And they also simply don't have the teams to be able to build that out cost-effectively."
Another reason is they really don't need to, he added. Huge, sophisticated AI models are generally used for highly complex tasks for highly complex companies, which is why the international-scale firms tend to invest in them. Conversely, the tasks most local accountants are handling for their clients are simpler by many orders of magnitude. In the majority of cases, said Shimamoto, a commercially available model will work fine for their purposes.
"There's personal AI or personal LLMs that, if a practitioner had a decent amount of content that they wanted to have readily searchable, they could use those LLMs like [Google's] NotebookLM or something. Those personal LLMs are designed to run off of a laptop, so you won't see these huge incremental costs coming along," he said. The cost of commercial AI solutions has also been going down over the years, he added, and many of the ongoing costs of these products are now at the vendor level.
Furey Financial Services, a 38-employee firm in Hoboken, New Jersey, that was also named one of this year's
"From our perspective we're not trying to build our own LLMs or infrastructure," he said. "It was really about how do we from a low-cost perspective leverage some of these models out there and plug them into our workflow, so you can differentiate AI into that platform component. … We're going to really focus on the application layer and see where we can put these things to use while not trying to build the new AI model ourselves."
This falls in line with the general advice Shimamoto had for smaller firms looking to invest in AI. It is the same as it is for any other major tech purchase: Firms must start with the use case, then find technology to fill it. Too many firms, he believes, do it the other way around, much to their detriment.
"It's the same way we've prioritized IT spend for the last two decades at least," he added. "It comes down to where is the business value? What is the business strategy? And how will AI contribute to that? We do have to be careful of AI being a solution looking for a problem, but I have been seeing that a bunch."
Waller said that when Furey was first thinking through its approach to AI, it considered building its own proprietary model, or training one using an open-source model like Llama as a base. However, the firm calculated that this would carry not just a significant one-time development cost but also ongoing expenses such as server space. "We decided not to go that route and really just say 'Hey, let's get it plugged into our workflow but let's hold off on running our own model,'" said Waller. This has helped the firm gain efficiency and productivity benefits from AI while keeping IT costs low.
However, Furey is more than just a consumer of AI products. While it's not prepared to drop millions of dollars on custom systems, it has found significant cost savings by creating its own API access point for OpenAI's models. During development, Furey estimated its expenses would run to hundreds of dollars per month, but as time went on and OpenAI introduced new capabilities, the cost began to drop. While the cost savings are nice, Waller said the real benefit is in better-quality client service.
"That cost has gone to near zero," he added. "Once we got our whole team up and running on it, all the clients, we've got thousands of [API] calls, [but] we're in no more than 10 bucks a month. But the investment really is on our team knowing which way to go and connecting the API client to the API gateway in a secure way, doing all that dev work to plug that into our templates on a daily basis and go through that."
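Waller's numbers imply a per-call cost well under a penny. As a rough illustration of that arithmetic (not Furey's actual setup), a back-of-the-envelope estimate of monthly API spend might look like this; the per-token prices and call volumes below are placeholder assumptions, not current OpenAI rates:

```python
# Back-of-the-envelope monthly API cost estimator.
# The per-1M-token prices are illustrative placeholders, not actual OpenAI pricing.

def monthly_api_cost(calls_per_month: int,
                     input_tokens_per_call: int,
                     output_tokens_per_call: int,
                     price_in_per_1m: float = 0.50,    # assumed $ per 1M input tokens
                     price_out_per_1m: float = 1.50):  # assumed $ per 1M output tokens
    """Estimate monthly spend in dollars for a given call volume."""
    total_in = calls_per_month * input_tokens_per_call
    total_out = calls_per_month * output_tokens_per_call
    return (total_in * price_in_per_1m + total_out * price_out_per_1m) / 1_000_000

# A few thousand modest calls stays in single-digit dollars,
# consistent with the "no more than 10 bucks a month" figure.
cost = monthly_api_cost(calls_per_month=5000,
                        input_tokens_per_call=800,
                        output_tokens_per_call=400)
print(f"${cost:.2f}")  # prints $5.00 under these assumed prices
```

At pay-per-token rates, in other words, the dominant cost of an integration like Furey's is the development and security work, not the API bill itself.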
Doug Schrock, managing AI principal for Top 25 Firm Crowe, said small firms should be actively experimenting with AI beyond just buying or licensing a commercial solution, which he called "the homeowner level of AI." While the investment is much smaller, there is an upper limit on the value it can create, because it does not enable the more significant process and task redesigns that more complex solutions offer. Overall, he said, firms should be seeking to innovate and to build strategic relationships with some of the larger AI players.
While it's fine for now to stick mostly with what's on the market, he warned that smaller firms will need to increase their AI capabilities soon or else be outcompeted. Those who can't or won't make these investments, Schrock predicted, will start falling behind, and may need to hire consultants to get them to the next stage of AI development. "The market is moving and folks like us get a higher level of value allowing us to get more cost competitive and deliver value and speed they maybe can't," he added.
"In the next six to 12 months, get your people using the tools tied to your existing system," Schrock said. "If everyone is running MS Suite, turn on Copilot. It's 30 bucks a month per user. Have your people start using the AI features built into your core system, then maybe get some spot LLM tools like ChatGPT or some AI-based research tool. They need to get in the game now if they haven't already."