Posted By Gbaf News
Posted on May 31, 2018
Dave Excell, Founder & CTO, Featurespace
Artificial intelligence (AI) is fast becoming the defining technology of our age. As with any new technology, however, bad actors are equally adept at harnessing its power for their own nefarious ends.
The power of AI has made fraudsters increasingly able to penetrate banking and payment systems. As in any conflict, an arms race is developing between defenders and attackers. In the case of card fraud, fortunately, the defenders are winning.
Financial Fraud Action UK figures for 2017, published last month, show the defenders winning the fight: the total value of unauthorised fraudulent payments on cards, remote banking and cheques fell by 5% to £731.8m, even though the number of attempts rose by 3%.
So the arms race is on. The only way to beat the crooks is to keep developing our own smart AI weapons, staying one step ahead and turning the tide on fraud.
AI is the new weapon of choice
Algorithms have long been used by fraudsters and hackers to even the odds against defenders' systems. With AI, the time required to test large quantities of data and probe defences has fallen significantly.
Fraudsters use AI in a variety of ways, but two of the most common are Credit Master and Authorisation Manipulation attacks.
Credit Master attacks
One way in which fraudsters use AI is in so-called ‘Credit Master’ attacks. Algorithms automatically generate candidate card account numbers, derived from an existing valid number, along with possible expiration dates. The criminal software then tests these combinations by attempting transactions, working sequentially through the candidates to see which ones match genuine cards. It is simple trial and error.
Sequential Credit Master attacks have existed for a long time, but fraudsters can increasingly use AI to automate the testing of mind-boggling quantities of numbers at the speed of computer processors. This fully automated process lets fraudsters test thousands upon thousands of prospective combinations at a far greater rate than any method requiring human input.
Fraudsters' attacks are also becoming more nuanced, using AI to test account numbers in random rather than sequential order (sequential probing is more easily detected by existing fraud solutions). AI can also vary the rate at which numbers are attempted: either hard and fast, to overwhelm detection, or slow and steady, so the attempts do not look like part of the same attack.
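To make the mechanics concrete, here is a minimal sketch of that kind of candidate generation, assuming only the publicly documented Luhn check digit that card numbers carry. The issuer prefix, the expiry guesses and the timing values are hypothetical illustrations; a real attack would attempt a transaction where this sketch merely prints.

```python
import random
import time

def luhn_check_digit(partial: str) -> str:
    """Compute the Luhn check digit for a partial card number (digits only)."""
    total = 0
    for i, d in enumerate(int(c) for c in reversed(partial)):
        if i % 2 == 0:       # these positions are doubled once the check digit is appended
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def candidate_numbers(bin_prefix: str, count: int):
    """Yield random card numbers sharing an issuer prefix, each with a valid check digit."""
    for _ in range(count):
        body = bin_prefix + "".join(random.choice("0123456789")
                                    for _ in range(15 - len(bin_prefix)))
        yield body + luhn_check_digit(body)

# Candidates built from a hypothetical issuer prefix, paired with guessed expiry dates
# and throttled at an irregular rate so the attempts do not look like a single burst.
for card in candidate_numbers("453201", 5):
    expiry = (random.randint(1, 12), random.randint(24, 29))
    print(card, expiry)                    # a real attack would attempt a transaction here
    time.sleep(random.uniform(0.1, 2.0))   # vary the pace, as described above
```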
When one of the tested combinations proves to match an existing card, the fraudster can then use the details in further attacks to steal money from customers.
As fraudsters' technology grows faster and more efficient, the load on bank systems to detect these fraudulent transactions keeps increasing.
Authorisation Manipulation
In an Authorisation Manipulation attack, a fraudster presents themselves as a valid merchant through access to a payment terminal, either a physical chip-and-PIN/swipe machine or an online terminal. Payment terminals can be obtained in a variety of ways: by taking over an existing account that has access to one, through collusive merchants, or by applying for one fraudulently.
When a transaction is put through a payment terminal, various pieces of information are sent to the bank, which, once it has validated them, authorises money to move from a customer's account to the merchant (or, in this case, the fraudster).
The fraudster aims to force payments through this authorisation process by supplying information that satisfies enough of the bank's criteria for the payment to be approved.
AI is now being used to push many different combinations of card numbers, customer information and purchase details through a bank's authorisation process, learning which details actually have to be valid for a fraudulent transaction to be authorised.
AI allows fraudsters to test far more combinations, at a much higher rate than would otherwise be possible, and to adjust their methods automatically based on which attempts succeed.
Much like an invader probing a wall's defences, Authorisation Manipulation attacks use a process of elimination to find exactly where the gaps are in a bank or payment provider's defences. The fewer valid details an attacker needs to get a payment authorised, the more efficient the fraud becomes.
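The probing logic can be sketched schematically. Here authorise() is a hypothetical stub standing in for a bank's authorisation decision, and the field names are illustrative; the loop simply records which combinations of valid and invalid details still get approved, which is exactly the map of weak points the attacker is after.

```python
from itertools import product

FIELDS = ["card_number", "expiry", "cvv", "postcode"]   # illustrative authorisation fields

def authorise(request: dict) -> bool:
    """Hypothetical stand-in for the bank's decision: this toy bank
    only checks the card number and expiry date."""
    return request["card_number_valid"] and request["expiry_valid"]

# Push every combination of valid/invalid details through and record what is approved.
approved = []
for pattern in product([True, False], repeat=len(FIELDS)):
    request = {f"{field}_valid": ok for field, ok in zip(FIELDS, pattern)}
    if authorise(request):
        approved.append(pattern)

# Any field that can be invalid in an approved request is a gap in the defences.
for i, field in enumerate(FIELDS):
    exploitable = any(not pattern[i] for pattern in approved)
    print(field, "not strictly checked" if exploitable else "must be valid")
```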
A new weapon in the war against fraudsters
Using cutting-edge technology, banking and payment providers are now able to foresee fraudulent transactions and stop them in their tracks. Rather than relying on static rules or models to detect fraud (which fraudsters can test and learn to defeat), advanced AI solutions now analyse customer behaviour to provide a real-time, dynamic defence.
Existing rule-based or static-model systems are being outclassed by fraudsters because they cannot learn. When a static fraud system is being exploited, it cannot detect whether or how fraudsters are getting through, and it cannot seal the gap in its defences.
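As a toy illustration of the difference, contrast a hard-coded rule with a check whose threshold follows the customer's own observed behaviour. The amounts, window size and multiplier below are invented for the example and do not describe any particular vendor's model.

```python
from collections import deque

def static_rule(amount: float) -> bool:
    """A fixed rule a fraudster can learn and stay just under."""
    return amount > 500.0

class AdaptiveCheck:
    """Flags spending that sits far outside this customer's own recent behaviour."""

    def __init__(self, window: int = 50):
        self.recent = deque(maxlen=window)   # rolling window of the customer's amounts

    def is_suspicious(self, amount: float) -> bool:
        if len(self.recent) < 10:                       # too little history: fall back to the static rule
            suspicious = static_rule(amount)
        else:
            mean = sum(self.recent) / len(self.recent)
            suspicious = amount > 3 * mean              # well above this customer's norm
        self.recent.append(amount)                      # keep learning from observed behaviour
        return suspicious

check = AdaptiveCheck()
for amount in [20, 35, 18, 42, 25, 30, 22, 28, 40, 33, 480]:
    print(amount, check.is_suspicious(amount))
# The 480 spend slips under the static 500 rule but is flagged by the adaptive check.
```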
The subtle art of spotting a fraudster in the act
Fortunately, while AI is becoming more commonplace amongst fraudsters, banks and payment providers are employing ever more advanced AI to protect customers.
Using the very latest in machine learning technologies, the defenders are trying to spot the fraudsters in the act. Instead of trying to notice things that might be ‘wrong’, smart tech can now intuitively ‘know’ a real customer from a crook.
Featurespace's real-time machine learning platform learns the behavioural traits of real customers, building a picture of each individual's characteristic behaviour and spotting fraud as variance from that norm. The signals can be as subtle as the rhythm in which a customer typically enters their password, their keystroke preferences, or how they use a mouse.
It can also learn where a person is likely to go and what they are likely to spend. This kind of intuitive learning means fraudsters armed with stolen card information can be picked off in real time.
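The general idea can be expressed as an anomaly score against a per-customer profile. The sketch below is a generic illustration of that principle, not Featurespace's actual model; the profile values and keystroke timings are invented.

```python
import statistics

# A customer's learned typing profile: the mean and spread of the gaps (in seconds)
# between keystrokes when they enter their password. Values here are invented.
profile = {"mean_gap": 0.21, "std_gap": 0.04}

def keystroke_anomaly(gaps, profile):
    """Score how far a login's typing rhythm sits from this customer's norm (a z-score)."""
    observed = statistics.mean(gaps)
    return abs(observed - profile["mean_gap"]) / profile["std_gap"]

genuine = [0.19, 0.23, 0.20, 0.22, 0.21]   # rhythm close to the learned profile
suspect = [0.08, 0.07, 0.09, 0.08, 0.07]   # someone else typing the same password

for label, gaps in [("genuine", genuine), ("suspect", suspect)]:
    score = keystroke_anomaly(gaps, profile)
    print(label, round(score, 1), "flag for review" if score > 3 else "ok")
```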
In the perpetual arms race between financial criminals and the defenders of customers, AI provides tools for both sides, but by using the latest machine learning technologies the defenders are winning the race.