# Mixing bandits

## A recipe for improved cold-start recommendations in a social network

Stéphane Caron and Smriti Bhagat. SNAKDD 2013.

## Abstract

Recommending items to new or “cold-start” users is a challenging problem for recommender systems. Collaborative filtering approaches fail when the preference history of users is not available. A promising direction that has been explored recently [12] is to utilize the information in the social networks of users to improve the quality of cold-start recommendations. That is, given a social network in which a new user arrives with no preference history and few social links, the recommender system tries to learn the user’s tastes as fast as possible. In this work, we model the learning of preferences of cold-start users using multi-armed bandits [5] embedded in a social network. We propose two novel strategies that leverage neighborhood estimates to improve the learning rate of bandits for cold-start users. Our first strategy, MixPair, combines estimates from pairs of neighboring bandits. It extends the well-known UCB1 algorithm [5] and inherits its asymptotically optimal guarantees. Although our second strategy, MixNeigh, is a heuristic based on consensus in the neighborhood of a user, it performed the best among the evaluated strategies. Our experiments on a dataset from Last.fm show that our strategies yield significant improvements, learning 2 to 5 times faster than our baseline, UCB1.
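To make the abstract concrete, here is a minimal sketch of the UCB1 baseline together with a MixPair-style selection rule. It assumes the pair estimate is formed by simply pooling the play counts and reward sums of the user's bandit with one neighbor's bandit; this is a simplification for illustration, not the paper's exact strategy, and all names (`UCB1`, `mixpair_select`) are hypothetical.

```python
import math

class UCB1:
    """Minimal UCB1 bandit: per-arm play counts and reward sums."""
    def __init__(self, n_arms):
        self.counts = [0] * n_arms      # plays per arm
        self.rewards = [0.0] * n_arms   # cumulative reward per arm
        self.t = 0                      # total plays

    def select(self):
        # Play each arm once first, then maximize the UCB index:
        # empirical mean + sqrt(2 ln t / n_a).
        for arm, c in enumerate(self.counts):
            if c == 0:
                return arm
        return max(
            range(len(self.counts)),
            key=lambda a: self.rewards[a] / self.counts[a]
                          + math.sqrt(2 * math.log(self.t) / self.counts[a]),
        )

    def update(self, arm, reward):
        self.t += 1
        self.counts[arm] += 1
        self.rewards[arm] += reward


def mixpair_select(user, neighbor):
    """MixPair-style selection (sketch): score each arm on the pooled
    statistics of the user's bandit and one neighbor's bandit, so a
    cold-start user can exploit a neighbor's observations early on."""
    t = user.t + neighbor.t
    best, best_idx = None, float("-inf")
    for a in range(len(user.counts)):
        n = user.counts[a] + neighbor.counts[a]
        if n == 0:
            return a  # unexplored by both: try it
        mean = (user.rewards[a] + neighbor.rewards[a]) / n
        idx = mean + math.sqrt(2 * math.log(max(t, 2)) / n)
        if idx > best_idx:
            best, best_idx = a, idx
    return best
```

For a fresh user with an empty history, `mixpair_select` falls back on the neighbor's estimates, which is the intuition behind the faster cold-start learning reported above.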

## BibTeX

```bibtex
@inproceedings{caron2013snakdd,
  title = {Mixing bandits: a recipe for improved cold-start recommendations in a social network},
  author = {St{\'e}phane Caron and Smriti Bhagat},
  booktitle = {Proceedings of the 7th Workshop on Social Network Mining and Analysis},
  year = {2013},
  pages = {11},
  doi = {10.1145/2501025.2501029},
}
```
