Recently, "big data" has become a buzzword, or catch-phrase, for analysing massive amounts of structured and unstructured data. As I understand it, this data is so large that it is difficult to process using traditional database and software techniques.
Everyone is looking to do something with big data. But have you considered whether you use big data's little brother, "small data", effectively before you start spending money on big data?
Especially when it comes to the audit and risk functions, the answer in a large number of companies would be "no". Most audit functions use only a limited amount of data analysis in the audits they execute, typically because of a lack of skills or difficulty in obtaining the data.
Analysing small data sets can greatly reduce the time required to audit a process, increase coverage and give better comfort to the audit committee. Take, for example, the risk of duplicate payments to vendors. The only real way to address this risk is to analyse the vendor master for duplicate vendors, and the payment records for multiple payments against a single invoice. These two tests can be performed manually only on a small sample, which may or may not highlight the problem. However, if you simply load your vendor master into Excel and check for duplicates, you can run the check across over 100,000 vendors (with no real need for ACL, SQL, etc.).
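As a rough illustration of how simple such a duplicate-vendor check can be, here is a small sketch in Python. The vendor records, field names and normalisation rules are hypothetical; the idea is just to normalise the vendor name, group by name and bank account, and flag any group with more than one vendor ID:

```python
# Minimal sketch of a duplicate-vendor check on a (hypothetical)
# vendor master of (vendor_id, name, bank_account) records.
from collections import defaultdict

vendors = [
    ("V001", "Acme Supplies Ltd", "GB29-0001"),
    ("V002", "ACME SUPPLIES LTD.", "GB29-0001"),  # same vendor, re-keyed
    ("V003", "Birch Consulting", "GB29-0002"),
]

def normalise(name: str) -> str:
    # Strip punctuation, collapse whitespace and lower-case the name
    # so cosmetic differences don't hide a duplicate.
    cleaned = "".join(ch for ch in name if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.lower().split())

groups = defaultdict(list)
for vendor_id, name, bank in vendors:
    groups[(normalise(name), bank)].append(vendor_id)

# Any key with more than one vendor ID is a candidate duplicate.
duplicates = {key: ids for key, ids in groups.items() if len(ids) > 1}
print(duplicates)  # → {('acme supplies ltd', 'GB29-0001'): ['V001', 'V002']}
```

The same grouping logic works for the duplicate-payments test: group payments by (vendor, invoice number, amount) and flag groups of more than one. In Excel the equivalent is a COUNTIF on the normalised key column.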
Similar analysis can be done to find employees listed in the vendor master, or to identify ghost employees on the payroll.
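One common way to run the employee-in-vendor-master test is to cross-match bank account numbers between payroll and the vendor master; an overlap can indicate an employee set up as a vendor. A small sketch, again with hypothetical IDs and account numbers:

```python
# Hypothetical payroll and vendor-master extracts keyed by bank account.
employee_accounts = {"E100": "GB29-0007", "E101": "GB29-0008"}
vendor_accounts = {"V001": "GB29-0001", "V009": "GB29-0007"}

# Invert the payroll extract so we can look up an account's employee.
employee_by_account = {acct: emp for emp, acct in employee_accounts.items()}

# Flag every vendor whose bank account also appears on the payroll.
matches = [
    (vendor, employee_by_account[acct])
    for vendor, acct in vendor_accounts.items()
    if acct in employee_by_account
]
print(matches)  # → [('V009', 'E100')]
```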
Internal audit teams should incorporate some element of data analysis in all their audits. If they have an audit for which they cannot define a data set to analyse, I would say the audit scope is incorrect.