When Real-Time Decisions Aren’t Fast Enough


Big Data is never at rest. The information flow is constant, and so is the analysis that must take place to generate corporate action — at the speed of machines, as DataTorrent SVP Jeff Bettencourt tells PYMNTS. One caveat: Grabbing data on the fly could use a hand from technology that helps standardize, normalize, and make sense of all that information whizzing by.

Poetry in motion? Maybe not.

Value-added data in motion? Definitely so. Ours is the Age of Information, to be sure. The information is constant — and for businesses of all stripes and sizes, it’s a deluge. How to wrap corporate arms around it all?

DataTorrent’s Jeff Bettencourt, SVP of marketing and business development, said traditional attitudes surrounding information are such that firms collect data, store it in a central location and then “extensively poke at that data to get the information you want,” which is, in turn, used by management to help make business decisions.

But with that store-and-search method, said Bettencourt, inefficiencies exist. “You go back and read through it, and you don’t know if the information that you’re looking at is one-hour old, a month old or a year old.”

The fresher the data, the faster you can get insight and take action.

DataTorrent, he told PYMNTS, focuses on capturing real-time information at the point it’s created – with an eye on the fact that, increasingly, business decisions need to be made not at the speed of humans, but at the speed of machines.

Speed is of the essence, particularly when it comes to fraud-fighting efforts by retail or financial services firms.

Picture, for example, the credit card fraud schemes in which machines, posing as valid users, attack firms and commit fraud, presenting valid (but stolen) data to make off with goods. Real-time analytics can not only detect that fraud, but prevent it from happening in the first place.
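As a rough illustration of how a real-time check differs from after-the-fact analysis, here is a minimal sketch (not DataTorrent's API; the window and threshold are assumptions) of a streaming velocity rule that flags a card when too many transactions arrive within a short window — one simple signal a live fraud engine might combine with many others:

```python
# Illustrative only: a streaming velocity check on card transactions.
# WINDOW_SECONDS and MAX_TXNS_PER_WINDOW are assumed values.
from collections import defaultdict, deque

WINDOW_SECONDS = 60      # look-back window for each card
MAX_TXNS_PER_WINDOW = 5  # more than this in the window looks suspicious

recent = defaultdict(deque)  # card_id -> timestamps of recent transactions

def score_transaction(card_id: str, timestamp: float) -> bool:
    """Return True if the transaction should be held for review."""
    window = recent[card_id]
    window.append(timestamp)
    # Drop timestamps that have aged out of the look-back window.
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_TXNS_PER_WINDOW
```

Because the check runs as each transaction arrives, the decision can be made before goods leave the door, rather than hours later in a batch report.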

But analyzing data in motion for faster insight and action is not limited to payment fraud or account takeover. Take a less sinister example: The Internet of Things has brought sensors to windmills, oil wells, jet engines and virtually anything, anywhere, delivering real-time information about how those machines are working (or not).

Humans can’t keep pace with the immense amount of information being collected in real time, but by applying Big Data analytics to data in motion, as it’s being created, that information can become actionable immediately.

Against this backdrop, DataTorrent late last month unveiled an update to its real-time Big Data analytics streaming platform, DataTorrent RTS 3.10, which it says makes it easier for customers to capture and visualize data trends as they happen, with a nod to the fact that more use cases are demanding analytics of data in motion.

Bettencourt told PYMNTS the platform is based on the company’s Apoxi framework, which leverages pre-built applications (and can integrate independent applications) for DataTorrent customers to grab data on the fly and compare it to historical trends, so that human eyes get support from machine learning and artificial intelligence.

Additionally, DataTorrent customers can record and replay data and apply it to different modeling scenarios. This release also features applications geared toward omnichannel payment fraud prevention, online account takeover prevention and product recommendations, among other initiatives.

Regardless of the vertical in which the data is being supplied, said Bettencourt, “You are getting tons of information that needs to be normalized.” Typically, a machine is logging that information, and yet that data may not be easily digested by human eyes (or brains) to form actionable insights.
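To make the normalization point concrete, here is a small sketch (hypothetical sources and field names, not a DataTorrent interface) of mapping machine-generated records from different systems into one common schema so downstream analytics can treat them uniformly:

```python
# Illustrative only: normalize raw log records into a common schema.
# Source names and fields ("pos_terminal", "web_checkout", etc.) are assumptions.
from datetime import datetime, timezone

def normalize(record: dict, source: str) -> dict:
    """Map a raw record from a given source into one shared schema."""
    if source == "pos_terminal":
        return {
            "event_time": datetime.fromtimestamp(record["ts"], tz=timezone.utc),
            "device_id": record["terminal"],
            "amount": float(record["amt"]),
        }
    if source == "web_checkout":
        return {
            "event_time": datetime.fromisoformat(record["created_at"]),
            "device_id": record["session_id"],
            "amount": float(record["total_usd"]),
        }
    raise ValueError(f"unknown source: {source}")
```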

There needs to be memory in place, he said, that determines whether pieces of information are forming a trend that is the status quo, an anomaly or even fraudulent.
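One simple way to picture that “memory” — again a sketch under assumed thresholds, not the company’s implementation — is a rolling window of recent values that new readings are judged against:

```python
# Illustrative only: label each new reading against recent "memory".
# WINDOW and Z_THRESHOLD are assumed values.
from collections import deque
from statistics import mean, stdev

WINDOW = 100        # number of recent readings to remember
Z_THRESHOLD = 3.0   # deviation (in standard deviations) treated as anomalous

history: deque = deque(maxlen=WINDOW)

def classify(value: float) -> str:
    """Return 'status quo' or 'anomaly' for a new reading."""
    if len(history) >= 2:
        mu, sigma = mean(history), stdev(history)
        label = "anomaly" if sigma and abs(value - mu) / sigma > Z_THRESHOLD else "status quo"
    else:
        label = "status quo"  # not enough memory yet to judge
    history.append(value)
    return label
```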

One caution: This is no “set it and forget it” strategy.

As Bettencourt told PYMNTS, “When people talk about machine learning and how I use my business to make quick decisions, they don’t realize that it’s just like a human being – you constantly have to teach it. The machine has to constantly learn, and it needs to see different types of information to be effective going forward.”

That fluid learning is especially valuable in a retail environment, he said, where new avenues for fraud emerge whenever consumer buying patterns change. Consumer choice in eCommerce, for example, has evolved from buying online and having goods shipped or picking them up in a store to buying items and having them shipped to any number of different locations.

DataTorrent’s Apoxi framework not only allows for building highly scalable, enterprise-hardened applications, but it can also deliver these applications significantly faster than using open-source components alone. For a typical buildout of a real-time, decision-making platform, the gestation time to implementation can be as long as 12 to 14 months, which is not exactly swift.

Bettencourt told PYMNTS that his firm’s offerings improve that “time to value” timeframe by 400 percent, a claim the company backs with its “jump-start guarantee.” Big Data projects can be production-ready in as little as 60 days, he said.

The preassembled pipelines offered by DataTorrent “allow customers to change a few things they need to [for customization]. They are fully assembled enterprise applications that they can put into place … [Once they’re up and running], we verify whether or not that machine learning knowledge is going to allow [the decisions and transactions] they want to allow,” Bettencourt said.

All industries are going through massive transformations in order to survive in the new digital economy. Fast Big Data apps that can be quickly deployed to improve customer experience, reduce costs, drive new revenue streams and more are becoming the new norm.