The FRC has issued guidance on addressing exceptions in the use of audit data analytics
This article highlights the FRC’s guidance on addressing exceptions in the use of audit data analytics.
The Financial Reporting Council (FRC) has issued guidance for auditors on addressing exceptions in the use of audit data analytics (ADA). This guidance is welcome given the increasing use of such techniques by auditors, and takes account of real-world examples both observed by the FRC’s Audit Quality Review (AQR) team and described by auditors through the FRC’s External Technology Working Group.
The guidance contains general principles for dealing with outliers when using ADA to respond to identified risks in an audit, sets out a potential approach auditors could adopt, and includes an illustrative example based on a real-world scenario. It also covers best practice, and potential pitfalls to avoid, when refining expectations developed for ADA, to help auditors undertake this process effectively. Also included is a non-exhaustive list of considerations that auditors may wish to take into account before using ADA in the evidence collection stage of the audit. The focus is on using ADA to respond to risks, as part of substantive testing and evidence collection, rather than on assessing the risk of material misstatement at the planning stage of an audit.
The guidance is non-authoritative and does not represent the only possible way to address the potential volume of exceptions generated when using ADA. Users are encouraged to apply the general principles, diagrams and specific example as a starting point, adapting and modifying them for the specific tools they use in an audit. Tools that are substantially more complex than those described in the guidance are likely to require additional consideration and more detailed analysis.
- When using ADA, a significant number of outliers may be generated if parameters are defined inappropriately at the outset, potentially due to a limited understanding of the population itself, or of the entity and its environment more widely.
- In many cases, this volume of outliers is a symptom of poorly defined parameters, which may therefore require re-calibration after initial analysis to ensure the tool appropriately identifies outliers that merit further investigation as exceptions. The term “outliers” describes results generated by an ADA that do not match the auditor’s initial expectation for the population; they are described as “exceptions” only once the auditor has analysed them and determined that they are genuine exceptions, rather than the product of inappropriate tool scoping, poorly defined initial parameters or the use of poor-quality data.
- This situation is analogous to developing an expectation when conducting analytical procedures under ISA (UK) 520 ‘Analytical Procedures’, which requires that the auditor ‘develops an expectation of recorded amounts or ratios and evaluates whether the expectation is sufficiently precise to identify a misstatement’. The first expectation built is often insufficiently precise, and subsequent refinement is required as the auditor develops a stronger understanding, before the expectation is suitable for comparison with actual results and for generating audit evidence.
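The distinction between outliers and exceptions, and the re-calibration of parameters after initial analysis, can be illustrated with a toy sketch. All data, field names and tolerance values below are hypothetical and for illustration only; they are not drawn from the guidance or its revenue example.

```python
# Hypothetical sketch: an ADA routine flags invoices whose gross margin
# deviates from the auditor's expectation by more than a tolerance, and is
# then re-run with a recalibrated tolerance. Data and thresholds are invented.

def flag_outliers(invoices, expected_margin, tolerance):
    """Return invoices whose margin deviates from the expectation by more than tolerance."""
    return [
        inv for inv in invoices
        if abs(inv["margin"] - expected_margin) > tolerance
    ]

invoices = [
    {"id": 1, "margin": 0.31},
    {"id": 2, "margin": 0.29},
    {"id": 3, "margin": 0.55},  # genuinely unusual transaction
    {"id": 4, "margin": 0.27},
]

# First run: the tolerance is set too tightly (a poorly defined parameter),
# so routine items are flagged alongside the unusual one - all four invoices.
first_pass = flag_outliers(invoices, expected_margin=0.30, tolerance=0.005)

# Recalibrated run: a tolerance informed by a stronger understanding of the
# entity leaves only the outlier that merits investigation as a possible
# exception (invoice 3).
second_pass = flag_outliers(invoices, expected_margin=0.30, tolerance=0.05)
```

As in the guidance's framing, the items flagged on the first pass are outliers, not exceptions: most reflect the poorly calibrated parameter rather than anything wrong with the underlying population. Only after re-calibration and investigation would the remaining item be treated as an exception.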
An important consideration before using ADA is ensuring that it is an appropriate tool to generate audit evidence in the specific circumstances of an audit, and considering where other tools and tests, not focused on data analytics, may be more appropriate. The guidance lists a series of factors to consider when determining whether ADA is suitable for use in responding to identified risks.
The guidance then sets out a general approach to addressing outliers generated when using ADA to test a single population, and concludes with a revenue-focused example.