🤖 AI Summary
This paper addresses personalized recommendation from sparse pairwise comparison data rather than explicit ratings. To tackle the data sparsity and non-convex optimization challenges that arise when each user provides only a limited number of pairwise comparisons, the authors propose a non-convex matrix factorization framework that models user preferences as ratios of latent feature vectors. Theoretically, they establish, for the first time, that the non-convex loss function satisfies restricted strong convexity in a neighborhood of the true parameters, ensuring exponential convergence of gradient descent. They further extend matrix concentration inequalities to the pairwise comparison setting, a novel theoretical contribution. Empirically, the method achieves both statistical and computational efficiency even when each user compares only $O(\log n)$ items, significantly improving recommendation accuracy and convergence speed in highly sparse regimes.
📝 Abstract
This paper provides a theoretical analysis of a new learning problem for recommender systems where users provide feedback by comparing pairs of items instead of rating them individually. We assume that comparisons stem from latent user and item features, which reduces the task of predicting preferences to learning these features from comparison data. Similar to the classical matrix factorization problem, the main challenge in this learning task is that the resulting loss function is nonconvex. Our analysis shows that the loss function exhibits (restricted) strong convexity near the true solution, which ensures gradient-based methods converge exponentially, given an appropriate warm start. Importantly, this result holds in a sparse data regime, where each user compares only a few pairs of items. Our main technical contribution is to extend certain concentration inequalities commonly used in matrix completion to our model. Our work demonstrates that learning personalized recommendations from comparison data is computationally and statistically efficient.
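To make the setting concrete, here is a minimal sketch of the kind of learning problem the abstract describes: users and items have latent feature vectors, each user reports the outcomes of only $O(\log m)$ pairwise comparisons, and plain gradient descent is run from a warm start near the true factors. This is an illustrative toy implementation, not the paper's algorithm; the logistic (Bradley-Terry-style) comparison model, the dimensions `n, m, r`, the warm-start noise level, and the step size are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n users, m items, rank-r latent factors (all assumed).
n, m, r = 50, 40, 3
U_true = rng.normal(size=(n, r)) / np.sqrt(r)  # latent user features
V_true = rng.normal(size=(m, r)) / np.sqrt(r)  # latent item features

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Sparse regime: each user compares only O(log m) item pairs.
pairs_per_user = int(np.ceil(5 * np.log(m)))
comparisons = []  # (user, item_i, item_j, y) with y=1 if i is preferred over j
for u in range(n):
    for _ in range(pairs_per_user):
        i, j = rng.choice(m, size=2, replace=False)
        p = sigmoid(U_true[u] @ (V_true[i] - V_true[j]))  # logistic comparison model
        comparisons.append((u, i, j, int(rng.random() < p)))

def loss_and_grads(U, V):
    """Mean negative log-likelihood of the logistic comparison model, with gradients."""
    gU, gV = np.zeros_like(U), np.zeros_like(V)
    total = 0.0
    for u, i, j, y in comparisons:
        d = V[i] - V[j]
        p = sigmoid(U[u] @ d)
        total -= y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12)
        g = p - y              # derivative of the loss w.r.t. the comparison score
        gU[u] += g * d
        gV[i] += g * U[u]
        gV[j] -= g * U[u]
    k = len(comparisons)
    return total / k, gU / k, gV / k

# Warm start near the truth (the theory assumes such an initialization),
# then plain gradient descent on the nonconvex loss.
U = U_true + 0.3 * rng.normal(size=U_true.shape)
V = V_true + 0.3 * rng.normal(size=V_true.shape)
step = 5.0
losses = []
for _ in range(300):
    L, gU, gV = loss_and_grads(U, V)
    losses.append(L)
    U -= step * gU
    V -= step * gV
```

Note that the factors are identifiable only up to an invertible linear transform (replacing `U` with `U A` and `V` with `V A^{-T}` leaves all comparison scores unchanged), so success is measured by the comparison loss rather than by recovering `U_true` and `V_true` entrywise.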