
The Very Best Algorithm

by: Maarten Oosten | September 7, 2023

In the not-so-distant past, pricing companies competed, among other things, on algorithms that lend themselves well to pricing. Companies invested money and effort in developing their own analytic tools, with as few dependencies on third-party software as possible, to avoid overcomplicating their offerings.

Today, the ability to store data, hold large data sets in memory, and execute analytic algorithms at lightning speed using parallel computing has developed quickly. A variety of large, innovative software companies provide platforms for cloud services and rich, well-tested, high-performance analytic libraries. This is disrupting the industry of pricing vendors. Instead of developing their own analytic tools from the ground up, a new breed of pricing vendors builds on these big data platforms and their associated libraries. This way, they can run multiple algorithms concurrently, tune them automatically, and let them compete with each other for effectiveness at an amazing level of granularity. All of this is transparent to the user, who simply works with the best results and does not need to care about the details of the underlying algorithm selection.
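To make this concrete, here is a minimal sketch of that kind of algorithm competition in Python, using scikit-learn as an illustrative stand-in for the analytic libraries mentioned above. The synthetic data, the candidate algorithms, and the tiny hyperparameter grids are all assumptions chosen for brevity; a real platform would run a far richer search at a much finer granularity.

```python
# A minimal sketch: several algorithms, each with its own automatic tuning,
# competing on the same (synthetic) prediction task. Illustrative only.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

# Each candidate algorithm gets a small hyperparameter search of its own.
candidates = {
    "ridge": GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]}),
    "random_forest": GridSearchCV(
        RandomForestRegressor(random_state=0), {"n_estimators": [50, 100]}
    ),
    "gradient_boosting": GridSearchCV(
        GradientBoostingRegressor(random_state=0), {"learning_rate": [0.05, 0.1]}
    ),
}

# Cross-validated scores decide the winner; the user only ever sees the
# best-performing result, not this selection machinery.
for name, search in candidates.items():
    scores = cross_val_score(search, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```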

To illustrate the opportunities this provides, have a look at some of the recent Kaggle competitions. Kaggle is a platform for data science and a great place to practice and learn from the community. You may have heard about some of its competitions, which sometimes carry sizeable cash prizes. A well-known example is Zillow’s Home Value Prediction. A few years ago, Zillow wanted to provide a reliable value estimate of houses as a service to its customers, so it opened this task up to the data science community, offering over a million dollars to the winner. Teams of talented people compared many inventive combinations of modern data science techniques and generated exceptional results.

But is the task of determining the sale price of a house now solved? It depends, really. If you are trying to sell a house, you’re still left wanting additional information. Suppose you are moving from Las Vegas, where you own a house, to Chicago for a new job. You can’t afford to own two houses at once, so you decide to rent a place in Chicago for nine months. You give yourself six months to sell your Las Vegas house and three months to close on a house in your new hometown. You split the first six months into two parts: for the first three months, you put your house on the market at a price you are happy to sell at. If you still have not sold the house after three months, you lower your price to a lower but still reasonable level at which you are quite sure it will sell, based on the interest your house generated in the first three months.

What we just described is the price planning aspect of your Go-to-Market strategy. The Zillow Home Value Prediction may provide a useful reference point, but you’re really looking for an algorithm that predicts the likelihood of selling a house within three months as a function of price. The current Zillow prediction won’t satisfy that requirement.
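As a hedged sketch of the model the seller actually needs, the snippet below fits a logistic curve for the probability of selling within three months as a function of list price. The handful of historical observations is entirely invented for illustration; a real model would use many more sales and far more features than price alone.

```python
# Probability of a sale within 90 days as a function of list price.
# The data below is made up purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

prices = np.array([[350], [375], [400], [425], [450], [475], [500], [525]])  # $1000s
sold_within_90_days = np.array([1, 1, 1, 1, 0, 1, 0, 0])

model = LogisticRegression(max_iter=1000).fit(prices, sold_within_90_days)

# The fitted curve answers the planning question directly: at each candidate
# list price, how likely is a sale within the three-month window?
for price in [400, 450, 500]:
    prob = model.predict_proba([[price]])[0, 1]
    print(f"List at ${price}k -> P(sold within 3 months) = {prob:.2f}")
```

With a curve like this, the two-stage plan above becomes a calculation: pick the highest first-stage price whose three-month sale probability you can live with, and a fallback price whose probability is close to certain.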

When you are in the market for a pricing solution, do not worry too much about using the very best algorithm. There is no single algorithm that performs best all the time. A good solution will offer multiple algorithms and a mechanism that automatically finds the best combination of algorithms for the circumstances. Instead, focus on articulating your Go-to-Market strategy and make sure the underlying analytic process supports this strategy. If you are responsible for setting and managing prices (or discount agreements, or rebate programs, etc.), you do not have to be an algorithmic specialist. Your part in a successful solution to your pricing challenges is to articulate your Go-to-Market strategy. The next step is to mine for data that supports your strategy and to assess its quality. This quality will rapidly improve once you start defining, monitoring, maintaining, and refining key performance metrics that relate directly to your Go-to-Market strategy. This process is referred to as 'working on your data spine.' Once your data spine has been established, an experienced data science team can unleash the power of the advanced analytics libraries and lift the sophistication of your Go-to-Market strategy to a whole new level.
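As a small illustration of an early data-spine step, the sketch below defines one hypothetical key performance metric (realized price as a share of list price) together with a basic quality check on the field it depends on. The table, column names, and numbers are all invented for this example.

```python
# Hypothetical "data spine" step: define a KPI tied to the Go-to-Market
# strategy and check the quality of the data behind it.
import pandas as pd

deals = pd.DataFrame({
    "segment": ["A", "A", "B", "B", "B"],
    "list_price": [100.0, 100.0, 80.0, 80.0, None],  # a gap to be flagged
    "realized_price": [92.0, 88.0, 79.0, 75.0, 70.0],
})

# Data-quality check: how complete is the field the metric depends on?
completeness = deals["list_price"].notna().mean()
print(f"list_price completeness: {completeness:.0%}")

# The KPI itself: realized price as a share of list price, per segment.
deals["realization"] = deals["realized_price"] / deals["list_price"]
print(deals.groupby("segment")["realization"].mean())
```

Monitoring a handful of metrics like this, and chasing down the gaps they expose, is what steadily raises the quality of the data that any downstream algorithm has to work with.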

Future entries in this blog series will cover examples of Go-to-Market strategies that have been successfully enhanced with data science techniques. In addition, you may want to download Gary Adams’s eBook, "Building a Better Analytics Framework," to learn more about data spines.

Reference:
Zillow’s Home Value Prediction, Kaggle: https://www.kaggle.com/c/zillow-prize-1#description