DOI: 10.1145/3594739.3610676
Research article · Open access

Analysing Fairness of Privacy-Utility Mobility Models

Published: 08 October 2023

Abstract

Preserving individuals’ privacy when sharing spatial-temporal datasets is critical to preventing re-identification attacks based on unique trajectories. Existing privacy techniques tend to propose ideal privacy-utility tradeoffs (PUT) but largely ignore the fairness implications of mobility models and whether such techniques perform equally well for different groups of users. The relationship between the fairness and the privacy of PUT models remains unclear, and few metrics exist for measuring fairness in the spatial-temporal context. In this work, we define a set of fairness metrics designed explicitly for human mobility, based on the structural similarity and entropy of trajectories. Under these definitions, we examine the fairness of two state-of-the-art privacy-preserving models that rely on GANs and representation learning to reduce the re-identification rate of users. Our results show that these models violate individual fairness criteria, indicating that users with highly similar trajectories receive disparate privacy gain.
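The individual-fairness criterion the abstract invokes can be read as a Lipschitz-style condition: users whose trajectories are close under some dissimilarity measure should receive correspondingly similar privacy gain from the PUT model. The sketch below is a minimal illustration of such a check, not the paper's actual method; it uses the entropy gap between location-visit distributions as a stand-in dissimilarity (the paper's metrics also draw on structural similarity), and the visit counts and per-user privacy gains are hypothetical toy values.

```python
from math import log2

def trajectory_entropy(visits):
    """Shannon entropy (bits) of a user's location-visit distribution."""
    total = sum(visits)
    probs = [c / total for c in visits if c > 0]
    return -sum(p * log2(p) for p in probs)

def individual_fairness_violations(dissim, privacy_gain, lipschitz=1.0):
    """Return user pairs (i, j) violating |gain_i - gain_j| <= L * d(i, j)."""
    n = len(privacy_gain)
    violations = []
    for i in range(n):
        for j in range(i + 1, n):
            if abs(privacy_gain[i] - privacy_gain[j]) > lipschitz * dissim[i][j]:
                violations.append((i, j))
    return violations

# Toy example: three users over four locations (hypothetical data).
visits = [[5, 3, 1, 1], [5, 3, 1, 1], [10, 0, 0, 0]]
entropies = [trajectory_entropy(v) for v in visits]

# Dissimilarity as entropy gap (an illustrative stand-in metric).
n = len(visits)
dissim = [[abs(entropies[i] - entropies[j]) for j in range(n)] for i in range(n)]

privacy_gain = [0.8, 0.2, 0.5]  # hypothetical per-user gains from a PUT model
print(individual_fairness_violations(dissim, privacy_gain))  # → [(0, 1)]
```

Users 0 and 1 have identical visit distributions (zero dissimilarity) yet receive very different privacy gains, so the pair is flagged; any such pair is a violation of individual fairness in this sense.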



Published In

UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing
October 2023
822 pages
ISBN:9798400702006
DOI:10.1145/3594739
This work is licensed under a Creative Commons Attribution International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Fairness
  2. Privacy
  3. Spatial-temporal applications

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

UbiComp/ISWC '23

Acceptance Rates

Overall Acceptance Rate 764 of 2,912 submissions, 26%

