Posted By Gbaf News
Posted on January 14, 2017
By Jason Monger, Senior Systems Engineer, Financial Services, Nimble Storage
Fraud is on the rise. In the first half of 2016 alone, there were more than a million cases of fraud in Britain, amounting to £399.5 million. This poses a massive threat to the financial services industry, often seriously undermining confidence in the affected institutions.
While new digital approaches to financial services have undoubtedly changed the game for committing fraud, they have also created a great opportunity to combat it. By harnessing the massive amount of financial data generated every day, financial services companies can get better at spotting fraud and at eliminating false positives.
Need for speed
Many banks and financial institutions now routinely analyse a range of aspects of clients’ behaviour to help determine whether transactions are fraudulent. By monitoring account balances, location, employment details, spending patterns, and even the speed at which customers swipe their credit cards, behavioural analysis can determine whether a card is being used by its owner.
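As a purely illustrative sketch of the idea (not any particular bank’s system), the Python snippet below scores a single transaction against a handful of hypothetical behavioural signals. Real fraud engines rely on trained models over far richer feature sets, but the principle of comparing a transaction to the cardholder’s normal behaviour is the same; every field name, threshold, and weight here is an assumption.

```python
# Purely illustrative, rule-based behavioural scoring of a card transaction.
# Field names, thresholds, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float            # transaction value in GBP
    distance_km: float       # distance from the cardholder's usual locations
    seconds_since_last: int  # time since the previous transaction on the card
    merchant_risk: float     # 0.0 (trusted) to 1.0 (high-risk category)

def behaviour_score(tx: Transaction, typical_spend: float) -> float:
    """Return a 0-1 score; higher means the transaction looks less like the owner."""
    score = 0.0
    if tx.amount > 3 * typical_spend:      # unusually large spend
        score += 0.35
    if tx.distance_km > 500:               # far from normal locations
        score += 0.30
    if tx.seconds_since_last < 60:         # implausibly rapid card use
        score += 0.20
    score += 0.15 * tx.merchant_risk       # weight by merchant category risk
    return min(score, 1.0)

tx = Transaction(amount=950.0, distance_km=1200.0, seconds_since_last=45, merchant_risk=0.8)
if behaviour_score(tx, typical_spend=80.0) > 0.6:
    print("flag for review / step-up authentication")
```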
However, this analysis is only effective if the information can be processed fast enough to make decisions in time to prevent the fraud, or at least reduce its impact. Identifying a month later that a transaction was fraudulent does little more for the bank than help verify a customer complaint.
It is for this reason that financial services organisations have always been at the forefront of the big data analytics revolution. Analytics is now, and has long been, a cornerstone of a successful firm.
But with ever-growing data sets, organisations face increasingly complex challenges around both storing data and keeping its latency low enough that it can be analysed in real time to prevent fraud.
Closing the app data gap
This process of behavioural analysis is only as fast as the slowest component in an organisation’s data centre.
Financial services institutions often have hundreds of terabytes – and in some cases even petabytes – of market data in their databases. When analysing this data, organisations often face performance issues because the applications running the analysis cannot access the relevant data fast enough. This creates an app data gap.
To understand the issue the app data gap poses, think of the application experiencing the same performance problems you face when software on your computer stutters, or struggles to open a document from your hard drive or your organisation’s server. In the same way, the application isn’t accessing its data fast enough to run at full speed.
In the case of monitoring for fraud, slow data delivery to the analytics applications reduces the speed of the analysis – and even short delays can mean missing the opportunity to block fraudulent transactions. It’s therefore crucial that financial services institutions remove the barriers to data velocity to improve the speed of their big data analytics.
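As a minimal illustration of how an app data gap might show up in practice, the sketch below simply times how long an analytics job spends waiting for data versus computing on it. The two functions are hypothetical stand-ins for a real storage query and a real analytics step, and the delay is simulated.

```python
# Minimal sketch of checking for an "app data gap": time data delivery
# separately from the analysis itself. Both functions are hypothetical
# stand-ins, and the 0.8s delay simulates a slow storage read.
import time

def load_transactions():
    time.sleep(0.8)            # stand-in for a slow storage/database read
    return list(range(100_000))

def score_transactions(batch):
    return sum(x % 97 for x in batch)  # stand-in for the analytics step

t0 = time.perf_counter()
batch = load_transactions()
t1 = time.perf_counter()
score_transactions(batch)
t2 = time.perf_counter()

fetch, compute = t1 - t0, t2 - t1
print(f"data delivery: {fetch:.2f}s, analysis: {compute:.2f}s")
if fetch > compute:
    print("the pipeline is waiting on data, not on the analytics application")
```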
Barriers to data velocity
Storage is often assumed to be the cause of application breakdowns, but Nimble Storage’s analysis of 7,500 companies found that 54 per cent of cases arise from issues with interoperability, configuration, or a failure to follow best practice in areas unrelated to storage.
One underlying reason for this is that the majority of data centre components are designed independently. As such, even ‘best of breed’ components may hinder the interoperability of the overall infrastructure.
Even buying all components from a single IT vendor’s portfolio can’t protect against this challenge, since many large vendors’ solutions are themselves assembled from the products of smaller, acquired businesses.
Optimisation through machine learning and predictive analytics
To remove the barriers to data velocity across many organisations’ increasingly complex infrastructure, companies should look to deploy solutions that incorporate machine learning and predictive analytics, addressing interoperability or capacity issues before they create an app data gap.
One advantage of adopting such solutions is that IT teams can analyse performance metrics gathered from a large number of high-performing environments to create a baseline. This helps them identify poor performance early, reducing the impact on the application.
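A minimal sketch of that baselining idea, using made-up latency telemetry: readings pooled from many healthy environments define a baseline, and a new reading well above it is flagged before users feel the impact. The three-sigma threshold and the numbers are illustrative assumptions, not a description of any vendor’s tooling.

```python
# Hedged sketch of baselining from fleet telemetry. Data and the 3-sigma
# threshold are illustrative; production tooling uses richer models/metrics.
from statistics import mean, stdev

# latency samples (ms) pooled from a fleet of well-performing environments
fleet_latency_ms = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 1.0, 1.2, 0.9]
baseline, spread = mean(fleet_latency_ms), stdev(fleet_latency_ms)

def is_degraded(sample_ms: float) -> bool:
    """Flag readings more than three standard deviations above the baseline."""
    return sample_ms > baseline + 3 * spread

for reading in [1.2, 1.4, 6.5]:   # recent readings from one environment
    if is_degraded(reading):
        print(f"{reading} ms: investigate before the app data gap widens")
```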
By using sensors to monitor the activity of multiple elements across the infrastructure at the time of an event, IT teams can also identify cause-and-effect relationships. This can help them prevent problems arising from interoperability issues between different releases of different components, by comparing results with those of other environments and providing smart recommendations on how to avoid conflicts.
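To illustrate the kind of cross-environment correlation involved, the sketch below counts which (entirely hypothetical) combinations of component releases co-occur with recorded slowdowns. A real predictive-analytics platform would do this at fleet scale with statistical rigour, but the shape of the analysis is similar.

```python
# Illustrative cross-stack correlation: count which release combinations
# co-occur with slowdowns. Field names and records are hypothetical.
from collections import Counter

incidents = [
    {"hba_driver": "4.1", "array_os": "5.2", "slow": True},
    {"hba_driver": "4.1", "array_os": "5.2", "slow": True},
    {"hba_driver": "4.0", "array_os": "5.2", "slow": False},
    {"hba_driver": "4.1", "array_os": "5.1", "slow": False},
]

combo_counts = Counter(
    (rec["hba_driver"], rec["array_os"]) for rec in incidents if rec["slow"]
)
worst_combo, hits = combo_counts.most_common(1)[0]
print(f"driver {worst_combo[0]} with array OS {worst_combo[1]} "
      f"appears in {hits} slowdown(s); recommend avoiding this pairing")
```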
Using machine learning to inform software releases also enables teams to optimise availability and performance based on correlations observed across the stack.
Time is money
With the cost of fraud high and rising, financial services companies need to reflect on how they can increase data velocity for their analytics, ensuring that their infrastructure isn’t hindering the powerful applications that identify fraudulent activity.
It’s essential that they look at their entire infrastructure stack to eliminate the diverse and complex operations that can slow down data delivery, and, in turn, analysis. Because when it comes to fraud, time truly can cost money.