New technology allows accountants to do far more for their clients, but this increased capacity has also brought more ethical issues, especially with regard to objectivity, independence and competence.
This was the conclusion of a recent report to the International Ethics Standards Board for Accountants (IESBA).
The report goes into a whole host of issues, but some of the most prominent topics include competence and due care, objectivity, independence, and the need for transparency and confidentiality.
On competence, the report said that while accountants don't necessarily have to be experts, they are expected to have a certain level of familiarity with the technologies they work with, a level some stakeholders said is still too low. Accountants, it said, often lack the practical experience and knowledge of artificial intelligence, blockchain/cryptocurrencies and data governance needed to know what types of questions to ask, how to identify and mitigate specific risks and errors, and how to assess the reliability of transformational technologies.
"Where [professional accountants] are indeed involved in decision-making (for example, generally small and medium-sized organizations and practices), they might lack the relevant understanding of the technology with which they are dealing. This in turn might result in the potential misidentification of the risks and controls pertaining to such technology and a lack of professional competence to determine if the technology (or its outputs) is appropriate or reasonable. It is noted that the potential for miscommunication with software developers and technologists also increases when public accountants are not appropriately skilled," said the report.
Blockchain audits, for instance, require a certain degree of competence so that accountants can understand who all the participants in a blockchain are, as there might be business relationships and professional services provided to them that could raise auditor independence issues. The report also noted that coding a blockchain-based application programming interface for a client could be fraught: doing so requires information to be "pushed" onto the blockchain, and that information must be accurate and suitable for the purpose. This, the report said, might carry further independence implications and might also affect the client's financial reporting and internal controls.
Overall, stakeholders cited in the report said that a reasonably competent accountant should be able to:
- Ask IT professionals appropriate questions and understand their responses in the context of the system or tools being assessed;
- Have confidence in what is happening with the system or tool; and,
- Justify the use and outputs of the tool.
Inside the black box
The report also noted that technology raises challenges to accountants' objectivity. Specifically, it pointed to a trend of certain accountants trusting whatever the software outputs over their own human judgment, biasing them toward the machine. People, said the report, are "increasingly simply deciding the machine is 'correct.'" It also pointed to a "brand name" bias among certain accountants, where widely sold solutions are often trusted immediately even though the accountants have no access to things like the source code or the detailed quality assessment process underpinning their development. The report also relayed stakeholder concerns that the increasingly automated nature of accounting work has begun to erode people's knowledge of the fundamentals.
"Less experienced team members, who were never involved in creating the report and understanding its purpose, will have less ability to recognize or identify what might be unreasonable or incorrect, and likely will not be able to explain the report's basis ... if such automatic reports are generated regularly enough, even more experienced team members will stop noticing what might be incorrect or omitted," said the report.
This sort of black-box mindset can contribute to another trend mentioned in the report: the increasingly opaque nature of financial information. It noted, for instance, that it can be hard enough to explain the output of an AI algorithm; things get even more complicated when that same output is used as the input for another AI algorithm. The report acknowledged, however, that finding the right balance between transparency and confidentiality can be difficult.
"For example, if a public accountant determines that disclosure of non-compliance of laws and regulations to an appropriate authority is an appropriate course of action, they should also consider whether there would be legal protection in the particular jurisdiction if they override the confidentiality terms of their employment contract — this might warrant seeking legal advice," said the report.
Sometimes increased technological capabilities lead to questionable requests from clients. Auditors, for instance, have far more data insights now than they did when they were simply sampling receipts from a warehouse. Client management increasingly requests this information, as deeper insights enable them to ask more relevant questions and make better decisions, but providing it technically amounts to an advisory engagement rather than an audit engagement. An auditor may not even mean to cross that line.
"A regulator noted the increased risk of a firm inadvertently providing more detailed insight than is appropriate over a number of years (i.e., the potential for 'scope creep'), meaning that the firm might be unaware that it has assumed management responsibility. Other stakeholders observed that clients sometimes use audit information for purposes different from those the auditor intended, which once again can lead to an assumption of management responsibility that the firm might not be aware of, and thus not under the firm's control," said the report.
The report said that custom software tools offered by some accounting firms might present independence issues as well, particularly where data analytics are concerned. If firms offer these analytics tools to the entities they audit, or to entities that might become audit clients in the future, a conflict might arise if an entity uses the tools to analyze data that later becomes subject to the firm's audit.
The report made several recommendations for changes to the IESBA Code of Ethics to account for these factors. Among other things, it said the IESBA should:
- Achieve clarity on whether firms and organizations may use client or customer data for internal purposes, such as training AI models, and if so, the parameters of such use (e.g., prior, informed consent; anonymization);
- Develop further guidance around the importance of transparency and explainability;
- Address the ethics implications of a public accountant's custody or holding of financial or nonfinancial data belonging to clients, customers, or other third parties;
- Engage more actively with other bodies, such as IFAC's International Panel on Accountancy Education and professional accountancy organizations, to encourage them to arrange educational activities to raise awareness about the characteristics of "sufficient" competence; and,
- Continue initiatives to advocate the importance and relevance of the IESBA Code of Ethics, as well as to develop, facilitate the development of, and/or contribute to nonauthoritative resources and materials.