Can Behavioral Analytics Slow Online Fraud?

EMV has done wonders to stop fraud via cards at the physical point of sale. In due course, the bad guys have moved off the premises and onto the web, which means separating the good transactions from the ill-intentioned ones has gotten a whole lot harder. Featurespace Chief Commercial Officer Matt Mills says in the latest Data Drivers that real-time machine learning can take the guesswork out of the equation.

Pity the poor fraudster. Fraud has no choice but to roll with the punches, and the flurries are enough to daze a boxer as skilled as Ali.

Get a toehold in a consumer’s account with a card and whammo, here comes EMV. Grab a piece of an ID and blam! Machine learning rears its mechanical head.

We’re being tongue in cheek, of course. We pity no fraudsters.

But the evolution is real, and it happens quickly. As Matt Mills, chief commercial officer at Featurespace, told Karen Webster in the latest Data Drivers, for financial institutions, the effort to blunt the bad guys is akin to a game of whack-a-mole.

The numbers are sobering, and picking only a trio of them is hard, but illuminating.

 

Data Point Number One: 81 percent

Fraudulent activity is 81 percent more likely to occur online than at the physical point of sale.

Not a surprise, the exec noted, as strong and successful efforts tied to EMV have limited the fraudster’s ability to commit theft at the physical terminal. By now, it’s widely known that chip-enabled cards have been pushing fraud to digital conduits.

The other consideration that has theft gunning for bits and bytes, he said, is the value to the fraudster. Think of it as the time value of (stolen) money.

“The reality is, online I can go from store to store to store and can test the limits of that credit card in a very aggressive manner,” Mills said. “Not only that, I can use a large number of cards in a very short period of time.”

Online fraud runs the gamut of sophistication – and moves well beyond simple, opportunistic third-party fraud, in which people run up large purchases on a card or go to a bank and claim they weren’t the ones behind a transaction.

Ah, but beyond those confines, we move toward more sophisticated pursuits. Said Mills: The card data for sale on the dark web is detailed and can be had for very little money.

Added Mills: “Whatever additional security checks the merchant or the bank puts in place to try and validate that the consumer is who they say they are … the fraudster can pass them.” And ultimately, it doesn’t matter if the attack is at the retail POS or if it targets a credit card or an alternative payments choice.

“The problem with these attacks is that they haven’t just been against organizations that hold small amounts of data … some of them have been with organizations where we really trusted them with some of the most detailed records that we have on ourselves,” he said.

Even with acquiescence, danger lurks – for even when the firm pays the demanded ransom, there is no guarantee the info won’t be shared anyway.

In the end, behavior is what matters, and separating good behavior from bad in milliseconds is what can make all the difference for FIs.

Yet, as Mills stated, the hardest thing in the world – in the ongoing battle to sniff out who commits fraud and who is legit, as they both seek to access the account and the merchant at the same time – is gaining an intelligent handle on behavior.

“The way I behave on one side [of everyday life] and the way I behave on the [internet] can be completely different,” he said. “Not only that, but the [FI or merchant] interpretation of how consumers behave can be the difference.”

Featurespace has developed a number of approaches that look at the individual’s historical behaviors and characteristics in real time and combine them with insight into the underlying transactional activity, said Mills.

The behavioral characteristics, he said, can be enriched with device-specific data, such as location and date. Plus, data can show how people move money between accounts and how they use cards.

Behavioral analytics systems can, for instance, observe how an account holder boosts checking or savings balances ahead of making large transactions.

“These things can help you understand that an anomaly is really not quite that anomalous,” said Mills, “because we are seeing history that makes that behavior look more probable … the most powerful thing about the machine learning approach is that what is unique for you is not going to be unique for me.”

Think of it as financial and transactional DNA.
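For readers who want to picture what scoring against that kind of “DNA” might look like, here is a minimal, purely illustrative sketch in Python – not Featurespace’s models or code – in which each transaction is judged against that customer’s own history rather than a global norm. The choice of features (amount and device) and the simple running-statistics math are assumptions made for the example.

```python
import math
from collections import defaultdict

# Illustrative sketch only: score each transaction against the customer's
# OWN history, so behavior that is anomalous for one person can be
# perfectly ordinary for another.

class BehaviorProfile:
    """Running mean/variance of a customer's amounts, plus devices seen."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0              # sum of squared deviations (Welford's method)
        self.devices = set()

    def update(self, amount, device_id):
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)
        self.devices.add(device_id)

    def score(self, amount, device_id):
        """Higher score = more unusual for THIS customer."""
        if self.n < 2:
            return 0.0                          # not enough history yet
        std = math.sqrt(self.m2 / (self.n - 1)) or 1.0
        z = abs(amount - self.mean) / std       # distance from personal norm
        new_device = 0.0 if device_id in self.devices else 1.0
        return z + new_device                   # toy combination of signals


profiles = defaultdict(BehaviorProfile)

def score_transaction(customer_id, amount, device_id):
    """Real-time check: score the event first, then fold it into history."""
    profile = profiles[customer_id]
    risk = profile.score(amount, device_id)
    profile.update(amount, device_id)
    return risk

# A $60 purchase from a familiar phone barely registers for a customer who
# usually spends about that much; the same amount from a never-seen device,
# or a $5,000 spike, pushes the score up for that customer alone.
```

The point of the sketch is the per-customer baseline: as Mills put it, what is unique for you is not going to be unique for me.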

Data Point Number Two: $31 Billion

This is the estimate of global card fraud losses by 2020. There are soft costs, too, which are harder to quantify, said Mills.

Companies grappling with CNP fraud incur an operational cost, said Mills, as they must decide whether to let a transaction that looks risky go through, or stop it in its tracks and risk losing a good customer – forever.

The job is made tougher by the fact that markets vary by region. Consider the U.K., where there are typically fewer cards per individual.

“What you tend to find is that if you decline a transaction, you will see another attempt,” he said. Thus it is easier to determine, at least in those markets, that this is a legitimate customer. In other markets there may be seven or eight cards in a holder’s wallet – the declined customer simply reaches for another one, and the firm never learns whether the transaction or the user was on the up and up.

Data Point Number Three: $12.2 Billion

This is the amount of online fraud that wends its way across call centers. It brings to mind the sheer number of access points across devices and geographies, which now include voice interactions and transactions made through call centers.

Mills noted that the movement has been toward social engineering to get people to say and do things when they are challenged by a bank. The banks put systems in place, but the “bad apples” have the ability to combine bits and pieces of data gleaned from other channels to get over these barriers.

FIs need to put in place a holistic view of the customer.

One size does not fit all. Some banks are primarily interested in reducing false positives, whereas others – well, what really bothers them is that one big fraud attack.

FIs may think of the value of the transaction as being a really strong indicator of fraud.

“The problem is that a high-value transaction is also an emotional transaction,” Mills noted. “It’s someone buying an engagement ring for their partner online or potentially buying a car,” or a recurring expense like an annual luxury vacation.

Banks need to strike a balance between checks that protect against fraud as much as possible and a process that allows people to price and complete their emotional transactions with an acceptable level of friction.

That doesn’t mean no friction, of course.

Many consumers don’t mind tackling a challenge for these larger, more emotional purchases. The message, as Mills said, is “we trust you, we like you, we want to protect you.” The bank should not make a challenge feel like a challenge; it should make the consumer feel confident.

In one bit of advice, Mills said companies should set their thresholds high, but should not worry about eliminating fraud entirely. Otherwise, he warned, it’s possible to get caught in the trap of making commerce virtually impossible.
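To make the threshold advice concrete, here is a toy sketch – with made-up risk scores, not real data or any vendor’s decision logic – showing how chasing zero fraud by dragging the threshold lower also starts declining legitimate customers.

```python
# Toy numbers only: each transaction carries a risk score in [0, 1];
# anything at or above the threshold is declined.

legit_scores = [0.05, 0.10, 0.15, 0.20, 0.35, 0.60]   # genuine customers
fraud_scores = [0.55, 0.70, 0.85, 0.95]               # fraudulent attempts

def outcome(threshold):
    declined_good = sum(s >= threshold for s in legit_scores)
    missed_fraud = sum(s < threshold for s in fraud_scores)
    return declined_good, missed_fraud

for threshold in (0.9, 0.5, 0.3):
    good, fraud = outcome(threshold)
    print(f"threshold={threshold}: {good} good customers declined, "
          f"{fraud} fraud attempts let through")

# A high threshold keeps friction low at the cost of a little missed fraud;
# pushing toward zero fraud starts blocking the engagement ring and the car.
```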

“There is no one thing you can do or buy or install that will resolve the issue entirely,” he said, adding that the best defense comes through adding “layers and layers and layers of security.”