Publications

User Factor Adaptation for User Embedding via Multitask Learning

EACL 2021 Workshop on Domain Adaptation for NLP

Publication date: April 20, 2021

Xiaolei Huang, Michael J. Paul, Franck Dernoncourt, Robin Burke, Mark Dredze

Language usage varies across users and their associated attributes in social media data: words authored by users in different fields may carry different meanings or sentiments. However, most existing methods for training user embeddings ignore these semantic variations in user behavior across domains, such as the product categories of books and electronics. In this study, we empirically examine how user interests in Amazon products, IMDb movies, and Yelp business units lead to variations in user language. We propose a user embedding model that accounts for this variability in user interests via a multitask learning framework. Whereas existing work evaluates user embeddings only through extrinsic tasks, we propose both clustering and classification methods, covering intrinsic as well as extrinsic evaluation. Experiments on the three English-language social media datasets show that our proposed approach generally outperforms baselines by adapting the user factors.
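The paper's released implementation is not reproduced here, but the core multitask idea it describes (a shared user representation adapted across domains, with domain-specific prediction heads trained jointly) can be illustrated with a minimal PyTorch sketch. All names below (MultitaskUserEmbedding, num_users, train_step, and so on) are illustrative assumptions, not the authors' code.

# Minimal sketch, assuming a PyTorch setup: a shared user embedding
# adapted to multiple domains (e.g., product categories) through a
# multitask objective with one lightweight head per domain.
import torch
import torch.nn as nn

class MultitaskUserEmbedding(nn.Module):
    def __init__(self, num_users, num_domains, emb_dim=128, num_classes=2):
        super().__init__()
        # Shared user representation learned across all domains.
        self.user_emb = nn.Embedding(num_users, emb_dim)
        # One classification head per domain to capture domain-specific variation.
        self.heads = nn.ModuleList(
            [nn.Linear(emb_dim, num_classes) for _ in range(num_domains)]
        )

    def forward(self, user_ids, domain_id):
        u = self.user_emb(user_ids)         # (batch, emb_dim)
        return self.heads[domain_id](u)     # domain-specific logits

# Joint training: summing per-domain losses forces the shared embedding
# to account for language variation across domains.
model = MultitaskUserEmbedding(num_users=10_000, num_domains=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(batches_per_domain):
    """batches_per_domain: list of (user_ids, labels) tensor pairs, one per domain."""
    optimizer.zero_grad()
    total_loss = 0.0
    for domain_id, (user_ids, labels) in enumerate(batches_per_domain):
        logits = model(user_ids, domain_id)
        total_loss = total_loss + loss_fn(logits, labels)
    total_loss.backward()
    optimizer.step()
    return total_loss.item()

The learned user_emb vectors would then serve both evaluation settings mentioned in the abstract: intrinsically, by clustering the embeddings, and extrinsically, by using them as features for downstream classification.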