Wednesday, August 31, 2016

Hello Sense and Respond, Bye Bye Command and Control

Transforming the retail industry with real-time insights



Guest blog by Shalini Raghavan, Senior Director of Product Management at FICO


A few years ago, my colleague Matt Beck, General Manager for FICO’s marketing solution, and I were discussing the challenges of doing business in an omnichannel world. Our ability to interact with customers across multiple communication channels is just one reason there is now more data than ever before. Yet not all businesses can distill the essence of this data into decisions of value. And even when they can discern a valid and valuable decision that would improve the customer experience, they frequently cannot make it actionable at the point of interaction with the consumer. The question is: how do we “Sense and Respond” to what this data is telling us, so that we deliver the right value at the right time to meet customer need?

Sense and Respond isn’t just some esoteric concept that came out of control theory (although some of FICO’s technology has its roots in missile defense – but that’s a story for another day). It’s about continuously looking at behaviors to understand the who, when, where, what and why, and using that information to constantly adapt and improve decisions and outcomes. Think of it as continuous insight without the offline analyses: insight in-stream, which can be put into action while those insights are still novel and the opportunities most timely.

So how does Sense and Respond get out of the theory books and become a reality?

#1 - Break down artificial barriers when it comes to data
The first step is to break down artificial barriers when it comes to data. Treating batch and streaming sources differently only limits our ability to tap into important signals that might help us improve decisions and outcomes. This matters because both types of sources offer enormous value in generating insight and improving decisions and actions, especially when used in combination.
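As a rough illustration of what this unification can look like in practice (a sketch of ours, not FICO’s implementation), the Python below normalizes a CSV history file and a live JSON feed into the same event shape so that a single decision function serves both. All names and fields here are hypothetical.

    import csv
    import json

    def batch_events(path):
        """Replay historical records from a CSV file as a sequence of events."""
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def stream_events(lines):
        """Wrap a live feed (here, one JSON object per line) in the same event shape."""
        for line in lines:
            yield json.loads(line)

    def decide(event):
        """A single decision path for every event, batch or streaming."""
        # Illustrative rule only: react to a large purchase signal.
        return "make_offer" if float(event.get("amount", 0)) > 100 else "no_action"

    # Both sources flow through identical decision logic:
    #   for event in batch_events("history.csv"): decide(event)
    #   for event in stream_events(live_feed):    decide(event)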

#2 - Capture opportunities in each moment by instantly deriving actionable insights from data
The second step is to capture opportunities in each moment by instantly deriving actionable insights from data. Predictive and prescriptive analytics have long been used to power decisions and actions, but they are often run in batch, extracting decision value only after an event has occurred and making it impossible to respond in the moment with a decision of value. Making these analytics contextually aware, and enabling them to extract the value of each moment in near real time or real time, is key to turning our most valuable insights into customer value at the moment it will be most appreciated.
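Here is a minimal sketch of that shift from after-the-fact batch scoring to in-the-moment scoring, assuming a hypothetical pre-trained score function and made-up event and context fields:

    import time

    def score(features):
        """Stand-in for a pre-trained predictive model (hypothetical)."""
        in_store = features.get("in_store", False)
        return 0.8 if in_store and features.get("recent_views", 0) > 2 else 0.2

    def on_event(event, context):
        """Score the moment the event arrives, using current context, and act."""
        features = dict(context)
        features["in_store"] = event.get("location") == context.get("home_store")
        if score(features) > 0.5:
            # Act at the point of interaction, not hours later in a batch run.
            print(time.strftime("%X"), "-> push offer now")

    on_event({"location": "store_12"},
             {"home_store": "store_12", "recent_views": 3})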

#3 - Make systems more usable and manageable
The third step is to make systems that sense and respond more usable and manageable, so that it is possible to unleash the power they offer while maintaining a modest solution footprint and the management controls and test frameworks needed to ensure quality results. Pairing real-time decisions with response and results dashboards is not only possible but critical to ensuring the performance of your decision strategy.
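One simple way to pair decisions with results (our own sketch, not a FICO feature) is to tally every decision and response per strategy so a dashboard can surface response rates, for example across a champion and a challenger strategy:

    from collections import defaultdict

    class DecisionMonitor:
        """Track decisions and responses per strategy for a results dashboard."""
        def __init__(self):
            self.decisions = defaultdict(int)
            self.responses = defaultdict(int)

        def record_decision(self, strategy):
            self.decisions[strategy] += 1

        def record_response(self, strategy):
            self.responses[strategy] += 1

        def response_rates(self):
            return {s: self.responses[s] / n for s, n in self.decisions.items() if n}

    monitor = DecisionMonitor()
    monitor.record_decision("champion"); monitor.record_response("champion")
    monitor.record_decision("challenger")
    print(monitor.response_rates())  # e.g. {'champion': 1.0, 'challenger': 0.0}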

We put these notions to the test in a marketing example for a retailer looking to revamp its existing marketing offerings. By its own account, this retailer possessed ample and timely data on its customers, yet the results from traditional campaign-style engagement were hit or miss, and it felt too slow in responding to consumer needs and market demands. A team of internal data scientists was spending time creating analytics that were clearly predictive, but which couldn’t be operationalized in real time at the point of interaction.

How do you know which offer is the best one, at the current time, for each and every consumer?
Retailers carry a bevy of products that customers can choose from, and this makes for a very interesting analytic problem: how do you know which offer is the best one, at the current time, for each and every consumer? For several years, FICO has had offerings that use predictive and prescriptive models to look at consumer behavior across products, along with a myriad of logistical and business constraints, to optimally determine the right offer and timing. In many cases, the solution considers activity on many thousands of products and offers. For many of our customers, this offering has worked rather well as a batch solution, teeing up offers for the next customer interaction. The clear need was for FICO to transform our batch-oriented solution into one that would serve the real-time needs of this retailer and generate similarly powerful insights for the current interaction.
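In outline, that selection problem reduces to scoring each eligible offer for a customer and choosing the best one subject to business constraints. The toy sketch below captures the shape of the problem; the real solution layers optimization over far richer models and constraints, and every name and number here is illustrative.

    def best_offer(customer, offers, score, constraints):
        """Pick the highest-scoring offer the customer is eligible for right now."""
        eligible = [o for o in offers if all(c(customer, o) for c in constraints)]
        return max(eligible, key=lambda o: score(customer, o), default=None)

    # Illustrative constraints and propensity scorer:
    not_recently_offered = lambda c, o: o["id"] not in c.get("recent_offers", set())
    in_budget = lambda c, o: o["cost"] <= c.get("offer_budget", 1.0)
    propensity = lambda c, o: c.get("affinity", {}).get(o["category"], 0.0)

    offers = [{"id": 1, "category": "shoes", "cost": 0.5},
              {"id": 2, "category": "coats", "cost": 0.8}]
    customer = {"affinity": {"coats": 0.9, "shoes": 0.4},
                "offer_budget": 1.0, "recent_offers": set()}
    print(best_offer(customer, offers, propensity,
                     [not_recently_offered, in_budget]))  # -> the coats offer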

Fortunately, we have been transforming our analytics IP in this space in conjunction with key platform capabilities that enable real-time model execution at scale. Like others, we felt that “data is the fourth factor.” So very early on, we decided that our platform would be one that helps extract the value of all the moments in that data. We are data agnostic and don’t make artificial distinctions between batch and stream; we simply treat all data as a source carrying valuable information that we may need to act upon immediately.

The most exciting piece is that we’ve been able to put the smarts that help sense and respond in a box. We make that happen through something we call contextual profiles. Using these profiles, we can add in the context that is so important to how we sense and respond. Whether it is the location of the customer, changes in purchase behavior, or historical behavior, all of these types of data can be distilled into a contextual profile that can be used in combination with analytics for real-time action. We use these profiles in rules and predictive models, all of which can run on those moments to drive action in real time.
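A contextual profile can be pictured as a compact, incrementally updated summary of behavior. The sketch below is our own illustration of the concept (not FICO’s internal design): it folds each event into a decayed spend average and a last known location in constant time, so rules and models never need to re-aggregate the transaction history.

    from dataclasses import dataclass

    @dataclass
    class ContextualProfile:
        """Compact running summary of one customer's behavior."""
        avg_spend: float = 0.0   # exponentially decayed average spend
        last_location: str = ""  # most recent known location
        visits: int = 0
        decay: float = 0.9       # weight on history vs. the newest event

        def update(self, event):
            """Fold one event into the profile in O(1); no DB re-aggregation."""
            self.avg_spend = (self.decay * self.avg_spend
                              + (1 - self.decay) * event.get("amount", 0.0))
            self.last_location = event.get("location", self.last_location)
            self.visits += 1

    profile = ContextualProfile()
    profile.update({"amount": 120.0, "location": "downtown"})
    profile.update({"amount": 30.0})
    # Rules and models read the profile directly at decision time:
    big_spender = profile.avg_spend > 10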

Our visual composition capabilities allowed us to build a fully working, real-world solution in a few short weeks, with no custom code involved. Bringing in new sources of data, operationalizing a model and tuning the solution were all accomplished with a few clicks. Testing and swapping out models was as simple as exchanging the PMML with a few more clicks. And if that sounds too good to be true, consider this: we could score many thousands of predictive models for a single customer, across that bevy of products, in 250 milliseconds on a very modest server. When we compared this approach to a more traditional one, we were pleasantly surprised to find it used 200x less compute power. The use of contextual profiles reduced the number of required calculations, eliminating the need to go back to transactional databases to recalculate summarizations that had already been performed. Those of us who are close to these efforts know we’ve been pushing for efficient “Sense and Respond” machines, but a 200x improvement was more than we expected.
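For readers who want to try the model-swapping idea themselves, the sketch below uses pypmml, an open-source PMML consumer. This is our example, not necessarily the engine behind this solution; the API is shown as we understand it, and the model files and input fields are hypothetical.

    from pypmml import Model  # open-source PMML consumer: pip install pypmml

    def load_scorer(pmml_path):
        """Load any PMML model; callers never change when the model does."""
        return Model.load(pmml_path)

    scorer = load_scorer("propensity_v1.pmml")  # hypothetical model file
    print(scorer.predict({"recency_days": 3, "avg_spend": 85.0}))

    # Swapping in a retrained or entirely different model is just a new file:
    scorer = load_scorer("propensity_v2.pmml")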

Many of us see benchmarks like this in a lab setting, but seeing this succeed on a real-world problem is exciting. Using our technology, we were able to prove that Sense and Respond is real.

For all those marketers out there, consider bidding adieu to “Command and Control” style campaigns. Sense and Respond is here and you can use it to engage with your customers like never before. And in the coming months, you’ll see us transform more domains with our offering.



Tuesday, August 2, 2016

The Data Mining Group Releases PMML v4.3

Another key milestone for PMML and interoperability in data science!

The Data Mining Group (DMG), a vendor-led consortium of companies and organizations developing standards for statistical and data mining models, announced the general availability of version 4.3 of the Predictive Model Markup Language (PMML):

Chicago, IL 8/2/2016 – The Data Mining Group is proud to announce the release of PMML v4.3. PMML is an application and system independent XML interchange format for statistical and data mining models. The goal of the PMML standard is to encapsulate a model independent of applications or systems using an XML configuration file so that two different applications (the PMML Producer and Consumer) can use it.

“The PMML standard delivers true interoperability, enabling machine learning and predictive models to be deployed across IT platforms,” says Michael Zeller, CEO of Zementis, Inc. “A common standard ensures efficiency and fosters collaboration across organizational boundaries which is essential for data science to scale beyond its current use cases. With its latest release, PMML has matured to the point where it not only has extensive vendor support but also has become the backbone of many big data and streaming analytics applications.”

Read the full press release here
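To make the producer/consumer idea concrete, here is a toy PMML document (our own minimal one-variable regression, not an example from the release) embedded in Python and inspected with only the standard library; any conforming PMML consumer could score the same document.

    import xml.etree.ElementTree as ET

    PMML_DOC = """<PMML xmlns="http://www.dmg.org/PMML-4_3" version="4.3">
      <Header/>
      <DataDictionary numberOfFields="2">
        <DataField name="x" optype="continuous" dataType="double"/>
        <DataField name="y" optype="continuous" dataType="double"/>
      </DataDictionary>
      <RegressionModel modelName="toy" functionName="regression">
        <MiningSchema>
          <MiningField name="x"/>
          <MiningField name="y" usageType="target"/>
        </MiningSchema>
        <RegressionTable intercept="1.0">
          <NumericPredictor name="x" coefficient="2.0"/>
        </RegressionTable>
      </RegressionModel>
    </PMML>"""

    # Any application can read the same document; that is the interchange idea.
    ns = {"p": "http://www.dmg.org/PMML-4_3"}
    root = ET.fromstring(PMML_DOC)
    fields = [f.get("name") for f in root.findall("p:DataDictionary/p:DataField", ns)]
    print("model fields:", fields)  # -> ['x', 'y']

The producer that trained the model and the consumer that scores it need only agree on this XML, not on languages, platforms, or vendors.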




