Training insights: artificial intelligence for internal audit
Artificial intelligence (AI) seems to be everywhere – speeding things up, automating processes, answering questions, creating new content and bringing with it a host of new concerns about data use, security, audit trails and fake news. Whether you are excited by its potential but unsure how it can support internal audit work, or concerned about the new risks it creates for your organisation and how to offer assurance on them, a new course from the Chartered IIA aims to help internal auditors understand the basics and offers pointers on where to go next.
One clear message for internal audit is that ignoring AI is not an option. At the very least, internal auditors need to have a knowledgeable, constructive response when the board asks for assurance on the organisation’s use of AI. And this is a when, not an if, says Stephen Foster, the course leader. “Someone in your organisation will probably already be using AI, even if you are not,” he points out.
Who will benefit?
Foster’s aim is that people currently unfamiliar with, and potentially daunted by, the challenges posed by AI will leave the one-day course able to write effective prompts, confident enough to explore opportunities further and aware of where to start with their strategy and approach. The subject is huge, so the course aims to support internal audit teams taking their first steps. It’s for those who want to know how to progress towards benefiting from generative AI, while understanding its limitations and being aware of the risks it creates.
It is aimed at helping those who have little or no experience with AI to narrow the gap with those leading the way and become competent and resourceful with the technology. It is not, therefore, a course for those who are already using AI extensively and have access to specialist support and an established AI architecture.
“I started my journey with AI asking on a personal level what does it mean for me and how are people using it? I then moved on to looking at what it meant for me as a chief audit executive (CAE),” explains Foster. “It’s initially overwhelming. It feels as if the possibilities are boundless, but what are the limitations and where do you start?”
Despite the well-publicised risks around bias, data, confidentiality and personal information, no internal audit team or organisation can realistically ban AI. The opportunities are potentially too great and its use is becoming increasingly widespread. Therefore, it’s vital that internal auditors are involved in establishing frameworks that set out what it can and can’t be used for, and how it should be used – particularly regarding decision-making and data, he explains.
Everyone can use it and experiment if they have the confidence and the imagination. “Tech is catching up with science fiction. You don’t need to be a programmer or code expert. Large language models (LLMs) mean you can just talk to the computer,” Foster points out. “This opens up so many opportunities for non-tech specialists to connect applications, automate processes and have a greater impact relatively cheaply. For smaller internal audit teams this is very exciting, since they have traditionally been at a disadvantage compared to those with bigger budgets. But it will also benefit teams in larger organisations that can now knowledgeably explore and require new capabilities from their IT colleagues.”
There are three key areas that internal auditors need to understand, he says. “How to use AI as a tool, how AI can benefit the organisation, and how internal audit can provide assurance on the ways it is used. I don’t believe any of this can be achieved without using the technology yourself.”
What will you gain?
A CAE in a smaller team may simply want to ensure they understand the basic opportunities and risks and how to frame conversations about these with their own teams and stakeholders, such as the board and audit committee. They may also want to make informed decisions about when to start using AI themselves.
However, Foster points out that the course is also a good development opportunity for individual internal auditors who want to develop their own knowledge and explore the potential of using more AI in their work. They could be tasked with attending the course, sharing what they’ve learnt and undertaking further experiments on their return to the office.
“CAEs need to think strategically about how they fit AI into their resource planning,” Foster says. “How do you get this knowledge into your team and develop it? Internal audit should also be an important part of the development of AI use across the business. They should ensure that the organisation understands, if it doesn’t already, the need to manage this technology in a responsible way.”
For those more worried about the risks than keen to evaluate the benefits, Foster points out that crossing a road blindly is incredibly dangerous – and pointless if you’re going in the wrong direction. “We mitigate this by looking both ways and using a map to reach our desired destination.”
This article was published in September 2024.