Simplifying Fraud Analytics

31 July 2017

‘Big Data’. Personally, I view the term as a bit of a misnomer; it is, after all, just data. Though I’ll grant that there is more of it being created and, more importantly, organisations are holding on to it for longer. There appears to be a view in some organisations that the more data you have access to, the better your chance of deriving value or insight from it.

Over the last 10 years I’ve seen it become more and more challenging for those responsible for detecting or investigating fraud, bribery and corruption to quickly derive meaningful value from the data they have access to. Access to ever more complex data sources, with records stretching back further and further in time, compounds the issue for those trying to find the facts and make sense of what has happened.

When it comes to using an organisation’s structured data systems (such as billing, financial and HR records) to assist an investigation, I often find that the solution isn’t a piece of static data analysis that answers a set of specific questions or locates a sample of potentially suspicious transactions. Instead, the investigative approach needs to be more iterative, and more aligned with modern exploratory data analytics techniques that give more immediate feedback to those analysing the data.
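To make that contrast concrete, here is a minimal sketch of what the iterative loop might look like, using Python and pandas. The payments file, column names, vendor ID and approval threshold are all hypothetical; the point is simply that each query is cheap to refine based on what the previous one revealed.

```python
import pandas as pd

# Hypothetical extract of payment records from a billing system.
payments = pd.read_csv("payments.csv", parse_dates=["payment_date"])

# First pass: a broad question - which vendors receive round-sum payments?
round_sums = payments[payments["amount"] % 1000 == 0]
print(round_sums.groupby("vendor_id")["amount"].agg(["count", "sum"]))

# The first result prompts a narrower follow-up: for a vendor that stands
# out, were payments kept just under a (hypothetical) 10,000 approval limit?
suspect = payments[
    (payments["vendor_id"] == "V-1042")          # hypothetical vendor of interest
    & (payments["amount"].between(9000, 9999))
]
print(suspect[["payment_date", "approver_id", "amount"]])
```

A static report would have answered only the first question; in an exploratory setting, the second query takes seconds to write once the first has pointed somewhere interesting.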

As an example, on one of our recent large bribery and corruption investigations, we hosted the data in a platform that allowed our forensic investigation team to query and analyse large volumes of complex structured data originating from disparate sources. The platform was designed to provide context around the data, and allowed users to extract and manipulate it at any time of day without needing us to provide them with a static Excel-based report.
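To illustrate what providing that context can mean in practice, here is a hedged sketch of joining two disparate sources so that each transaction carries the surrounding detail an investigator needs. The file names, keys and columns are assumptions for illustration, not the actual platform’s schema.

```python
import pandas as pd

# Hypothetical extracts from two disparate source systems.
payments = pd.read_csv("billing_payments.csv", parse_dates=["payment_date"])
employees = pd.read_csv("hr_employees.csv", parse_dates=["termination_date"])

# Enrich each payment with context about its approver: name, department,
# and whether they were still employed when the payment was made.
enriched = payments.merge(
    employees[["employee_id", "name", "department", "termination_date"]],
    left_on="approver_id",
    right_on="employee_id",
    how="left",  # keep payments even where the approver is unknown
)

# Flag payments approved after the approver had left the organisation.
enriched["approved_after_leaving"] = (
    enriched["payment_date"] > enriched["termination_date"]
)
print(enriched[enriched["approved_after_leaving"]])
```

In a platform like the one described, such joins can be prepared up front, so investigators query already-contextualised records rather than raw extracts.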

It’s also a scalable solution: we had over 50 investigators who could access the platform and extract information as and when they needed it, rather than creating a bottleneck of requests to our forensic data analytics team. The platform was accessed over 3,000 times during the course of that engagement. I often refer to this approach as Self-Service Analytics; it puts the capability into the hands of the investigators, and it is definitely something I expect to see more of when working on other large and complex investigations.

The other benefit of this approach is that even if we don’t know exactly what the team is looking for, the platform’s output can be designed in a way that allows an investigator to explore the data and iterate their searches based on what they discover. This helps them reach an answer about what exactly has happened in an efficient way.

By giving people the means to explore the data, we allow them to quickly find what they are looking for without having to put in multiple requests for selected extracts from various systems. On our particular investigation it also freed up our forensic data analytics team to perform deeper, more complex and customised analysis.

The next time you perform an investigation and need to make sense of the facts, I’d encourage you to take the time to think about how you’re going to analyse all of the data you have access to.

 
