Beyond Collaborative Filtering (Part 2)

Note: This is Part 2 of our series on recommendation systems and collaborative filtering. Please check out Part 1 for the challenges of building a retail-specific product recommendation system and an overview of collaborative filtering.

Retail Product Recommendations

Once a good collaborative filtering model has been built using matrix factorization, the individual dense latent customer and product vectors can be used in much more robust ways than the original ratings matrix. The most straightforward application is computing the estimated rating of customer \(i\) for product \(j\), which is simply the dot product of the two vectors (\(\hat{r}_{i,j}=u^T_{i,\cdot}p_{\cdot,j}\)). Repeating this across all products gives the top \(N\) products for each individual. Similarly, we can use the dot product to compare product-product and customer-customer pairs. The key point is that both sets of vectors are mapped to the same latent \(f\)-dimensional feature space, which opens up all sorts of nice applications. Some examples include:

  • One-to-one personalized product recommendations: Simple application of finding the top \(N\) products for each customer.
  • Product to customer targeting: Using the estimated ratings to find the top \(N\) customers with an affinity towards a given product.
  • Customer segmentation: Using the customer vectors, we can run any off-the-shelf clustering algorithm such as K-means to find different segments with a similar affinity towards products. In the case of K-means, we can use the centroid of each cluster to find the top \(N\) products for each cluster.
  • Brand or category recommendations: Instead of just returning a score on a per-product basis, we can aggregate the scores for an individual (or cluster) based on some mapping of products, such as brand, category, or any other product grouping. This also allows us to find the top \(N\) customers for a given grouping.
  • Features for a predictive model: The customer and product vectors can be used directly as features in a predictive model since by definition they are a representation of their respective entity.
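To make the first two applications concrete, here is a minimal NumPy sketch. It assumes the matrix factorization step has already produced customer vectors `U` and product vectors `P` (randomly generated here as stand-ins); the matrix product `U @ P.T` then gives every estimated customer-product rating at once, and sorting each row yields the top \(N\) products per customer. The same dot product in latent space also gives product-product similarities.

```python
import numpy as np

# Hypothetical latent factors from a trained matrix-factorization model:
# U holds one f-dimensional vector per customer, P one per product.
rng = np.random.default_rng(0)
num_customers, num_products, f = 5, 8, 3
U = rng.normal(size=(num_customers, f))
P = rng.normal(size=(num_products, f))

# Estimated rating of customer i for product j is the dot product u_i . p_j;
# doing all pairs at once gives a (num_customers, num_products) matrix.
R_hat = U @ P.T

# Top-N products for each customer: product indices, highest score first.
N = 3
top_n = np.argsort(-R_hat, axis=1)[:, :N]

# Product-product comparisons use the same dot product in latent space.
product_sim = P @ P.T
```

Flipping the sort to columns of `R_hat` gives the product-to-customer targeting case: the top \(N\) customers with an affinity towards a given product.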

As you can see, there is a huge advantage to mapping customer and product vectors to the same latent factor space.
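The segmentation idea can be sketched the same way. The snippet below runs a deliberately minimal hand-rolled K-means over hypothetical customer vectors (in practice an off-the-shelf implementation such as scikit-learn's `KMeans` would be used), then scores each cluster centroid against every product vector to get the top \(N\) products per segment; all sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.normal(size=(100, 4))   # hypothetical customer latent vectors
P = rng.normal(size=(50, 4))    # hypothetical product latent vectors
k, n_iter = 3, 20

# Minimal K-means on the customer vectors.
centroids = U[rng.choice(len(U), size=k, replace=False)]
for _ in range(n_iter):
    # Assign each customer to its nearest centroid (Euclidean distance).
    d = np.linalg.norm(U[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Recompute each centroid as the mean of its cluster's members.
    for c in range(k):
        if np.any(labels == c):
            centroids[c] = U[labels == c].mean(axis=0)

# Because centroids live in the same latent space as the products, the
# centroid-product dot product scores every product for each segment.
N = 5
segment_scores = centroids @ P.T                  # (k, num_products)
segment_top_n = np.argsort(-segment_scores, axis=1)[:, :N]
```

This works only because both entity types share the latent space: a cluster centroid is itself a valid "customer" vector and can be scored against products directly.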

Beyond Collaborative Filtering

Although the matrix factorization approach to collaborative filtering opens up a lot of doors for recommendation systems, it is just the beginning. It's always important to remember the age-old aphorism from statistics: "All models are wrong, but some are useful". In other words, we should understand the operating conditions as well as the limitations of our models. Collaborative filtering is an immensely useful way to find a customer's affinity towards a product. However, it does not directly optimize for purchase behavior, which is ultimately the end goal of our product recommendations.

Collaborative filtering is just one of the models we use in our personalization engine. Our approach has been to avoid the "man with a hammer" syndrome and use the right model for the right job. We've found that using an ensemble approach and combining many different models has led to both greater robustness of our system and better performance of our models on real-world problems. We have a library of models that form the building blocks of our engine. Each block deals with a different aspect of the personalization problem, such as seasonality, price, product diversity, offer, response, and uplift. Although building a joint model to optimize all these dimensions sounds nice in theory, we've found it is intractable in practice. Instead, our focus is on smart ways to combine models so that we can incrementally add new ones and improve our existing capability through the judicious use of a test-and-learn methodology.

If any of these problems sound interesting to you, we're always looking for talented people to help us build out our personalization engine. We've got a diverse team with some pretty bright people. Come join us!
