DOI: 10.1145/3523227.3547404
RecSys '22 Conference Proceedings · Invited talk

Imbalanced Data Sparsity as a Source of Unfair Bias in Collaborative Filtering

Published: 13 September 2022

Abstract

Collaborative Filtering (CF) is a class of methods widely used to support high-quality Recommender Systems (RSs) across several industries [6]. Studies have uncovered distinct advantages and limitations of CF in many real-world applications [5, 9]. Besides the inability to address the cold-start problem, sensitivity to data sparsity is among the main limitations recurrently associated with this class of RSs. Past work has extensively demonstrated that data sparsity critically impacts CF accuracy [2, 3, 4]. This talk revisits the relationship between data sparsity and CF from a new perspective, showing that data sparsity also affects the fairness of recommendations. In particular, sparsity may lead to unfair bias in domains where the volume of user activity strongly correlates with personal characteristics that are protected by law (i.e., protected attributes). This concern is critical for RSs deployed in domains such as recruitment, where RSs have been reported to automate or facilitate discriminatory behaviour [7]. Our work at SEEK deals with algorithms that recommend jobs to candidates via SEEK’s multiple channels. While this talk focuses on our perspective of the problem in the job-recommendation domain, the discussion is relevant to many other domains where recommenders potentially have a social or economic impact on the lives of individuals and groups.
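The abstract's central claim can be illustrated with a small synthetic sketch (not the authors' method; every parameter below is an illustrative assumption): two user groups share the same underlying tastes, but one group logs far fewer interactions, and a vanilla matrix-factorization CF model then recovers the sparse group's preferences less accurately, even though group membership appears nowhere in the model.

```python
import numpy as np

# Illustrative sketch: equal-taste groups, unequal interaction volume.
rng = np.random.default_rng(0)
n_per_group, n_items, k = 100, 50, 4

U = rng.normal(size=(2 * n_per_group, k))   # true user factors
V = rng.normal(size=(n_items, k))           # true item factors
R = U @ V.T                                 # true preference matrix

# Group A (first 100 users) is dense; group B (last 100) is sparse.
obs = np.zeros_like(R, dtype=bool)
obs[:n_per_group] = rng.random((n_per_group, n_items)) < 0.50
obs[n_per_group:] = rng.random((n_per_group, n_items)) < 0.05

# Fit plain matrix factorization by SGD on observed entries only.
P = rng.normal(scale=0.1, size=U.shape)
Q = rng.normal(scale=0.1, size=V.shape)
lr, reg = 0.02, 0.01
users, items = np.nonzero(obs)
for _ in range(100):
    for u, i in zip(users, items):
        e = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (e * Q[i] - reg * P[u])
        Q[i] += lr * (e * P[u] - reg * Q[i])

# Reconstruction error over ALL entries, split by group.
err = (R - P @ Q.T) ** 2
rmse_dense = float(np.sqrt(err[:n_per_group].mean()))
rmse_sparse = float(np.sqrt(err[n_per_group:].mean()))
print(f"dense-group RMSE:  {rmse_dense:.3f}")
print(f"sparse-group RMSE: {rmse_sparse:.3f}")
```

The sparse group's reconstruction error comes out markedly higher: its users contribute too few observations to pin down their latent factors. If group membership correlates with a protected attribute, this per-group accuracy gap is exactly the kind of unintended bias the abstract describes.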

Supplementary Material

MP4 File (ImbalancedDataSparsity-IN1070.mp4)
In this presentation, we will revisit the relationship between data sparsity and collaborative filtering (CF) from a new perspective, showing that data sparsity also affects the fairness of recommendations. In particular, data sparsity may lead to unfair bias in domains where the volume of user activity strongly correlates with personal characteristics that are protected by law (i.e., protected attributes). We will discuss the responsibility of recommender systems (RSs) to be fair by intent, present a novel source of unfair bias in CF applied to the recruitment domain, and briefly discuss measures that companies can take to increase the visibility of unintentional biases in RSs.

References

[1] Michael D. Ekstrand, John T. Riedl, and Joseph A. Konstan. 2011. Collaborative Filtering Recommender Systems. Now Publishers Inc.
[2] Miha Grčar, Dunja Mladenič, Blaž Fortuna, and Marko Grobelnik. 2005. Data sparsity issues in the collaborative filtering framework. In International Workshop on Knowledge Discovery on the Web. Springer, 58–76.
[3] Zihan Lin, Changxin Tian, Yupeng Hou, and Wayne Xin Zhao. 2022. Improving graph collaborative filtering with neighborhood-enriched contrastive learning. In Proceedings of the ACM Web Conference 2022. 2320–2329.
[4] Yashar Moshfeghi, Benjamin Piwowarski, and Joemon M. Jose. 2011. Handling data sparsity in collaborative filtering using emotion and semantic based features. In Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval. 625–634.
[5] Jérémie Rappaz, Julian McAuley, and Karl Aberer. 2021. Recommendation on live-streaming platforms: Dynamic availability and repeat consumption. In Fifteenth ACM Conference on Recommender Systems. 390–399.
[6] Yue Shi, Martha Larson, and Alan Hanjalic. 2014. Collaborative filtering beyond the user–item matrix: A survey of the state of the art and future challenges. ACM Computing Surveys 47, 1 (2014), 1–45.
[7] Shiliang Tang, Xinyi Zhang, Jenna Cryan, Miriam J. Metzger, Haitao Zheng, and Ben Y. Zhao. 2017. Gender bias in the job market: A longitudinal analysis. Proceedings of the ACM on Human-Computer Interaction 1, CSCW (2017), 1–19.
[8] Deanne Tockey and Maria Ignatova. 2019. Gender Insights Report: How women find jobs differently. LinkedIn Talent Solutions. https://business.linkedin.com/content/dam/me/business/en-us/talent-solutions-lodestone/body/pdf/Gender-Insights-Report.pdf
[9] Quentin Villermet, Jérémie Poiroux, Manuel Moussallam, Thomas Louail, and Camille Roth. 2021. Follow the guides: Disentangling human and algorithmic curation in online music consumption. In Fifteenth ACM Conference on Recommender Systems. 380–389.
[10] Sirui Yao and Bert Huang. 2017. Beyond parity: Fairness objectives for collaborative filtering. Advances in Neural Information Processing Systems 30 (2017).

Cited By

  • (2023) Graph Contrastive Learning with Hybrid Noise Augmentation for Recommendation. Advanced Data Mining and Applications, 325–339. DOI: 10.1007/978-3-031-46674-8_23. Online publication date: 27 Aug 2023.


    Published In

RecSys '22: Proceedings of the 16th ACM Conference on Recommender Systems
September 2022, 743 pages
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. Algorithmic Bias
    2. Data sparsity
    3. Fairness
    4. Recommender Systems
    5. Responsible AI

    Qualifiers

    • Invited-talk
    • Research
    • Refereed limited

Acceptance Rates

Overall Acceptance Rate: 254 of 1,295 submissions, 20%

Article Metrics

  • Downloads (last 12 months): 31
  • Downloads (last 6 weeks): 1

Reflects downloads up to 01 Oct 2024

