Mitigating propensity bias of large language models for recommender systems

Date

2025

Authors

Zhang, G.
Yuan, G.
Cheng, D.
Liu, L.
Li, J.
Zhang, S.

Type

Journal article

Citation

ACM Transactions on Information Systems, 2025; 43(6, article no. 150):1-26

Abstract

The rapid development of Large Language Models (LLMs) creates new opportunities for recommender systems, especially by exploiting the side information (e.g., descriptions and analyses of items) generated by these models. However, aligning this side information with the collaborative information in historical interactions poses significant challenges. The inherent biases within LLMs can skew recommendations, resulting in distorted and potentially unfair user experiences. In particular, propensity bias causes the side information to be aligned such that all inputs tend to be represented in a low-dimensional subspace, a phenomenon known as dimensional collapse, which severely restricts the recommender system's ability to capture user preferences and behaviors. To address these issues, we introduce a novel framework named Counterfactual LLM Recommendation (CLLMR). Specifically, we propose a spectrum-based side information encoder that implicitly embeds structural information from historical interactions into the side information representation, thereby circumventing the risk of dimensional collapse. Furthermore, our CLLMR approach explores the causal relationships inherent in LLM-based recommender systems. By leveraging counterfactual inference, we counteract the biases introduced by LLMs. Extensive experiments demonstrate that our CLLMR approach consistently enhances the performance of various recommender models.
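As an illustration of the dimensional-collapse phenomenon the abstract describes (this is a generic diagnostic sketch, not the paper's CLLMR method), one common way to detect collapse is to inspect the singular value spectrum of an embedding matrix and compute its entropy-based effective rank; the function name and thresholds below are assumptions for demonstration:

```python
import numpy as np

def effective_rank(embeddings: np.ndarray) -> float:
    """Entropy-based effective rank of an embedding matrix.

    A value far below min(n_rows, dim) indicates that the
    representations concentrate in a low-dimensional subspace,
    i.e., dimensional collapse.
    """
    # Center the embeddings, then take the singular value spectrum.
    centered = embeddings - embeddings.mean(axis=0, keepdims=True)
    s = np.linalg.svd(centered, compute_uv=False)
    p = s / s.sum()                 # normalize spectrum to a distribution
    p = p[p > 0]                    # guard against log(0)
    entropy = -(p * np.log(p)).sum()
    return float(np.exp(entropy))   # Shannon effective rank

rng = np.random.default_rng(0)
# Healthy embeddings: variance spread across all 64 dimensions.
healthy = rng.normal(size=(1000, 64))
# Collapsed embeddings: all points lie in a rank-2 subspace of R^64.
collapsed = rng.normal(size=(1000, 2)) @ rng.normal(size=(2, 64))

print(effective_rank(healthy))    # close to 64
print(effective_rank(collapsed))  # close to 2
```

A spectrum-based encoder such as the one proposed in the paper would aim to keep this effective rank high when aligning LLM-generated side information with collaborative embeddings.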

Rights

Copyright 2025. Copyright held by the owner/author(s). Access Condition Notes: Accepted manuscript available on Open Access.
