Abstract
In this paper, we discuss several problems with current Big data practices which, we claim, seriously erode the role of informed consent as it pertains to the use of personal information. To illustrate these problems, we consider how the notion of informed consent has been understood and operationalised in the ethical regulation of biomedical research (and medical practices, more broadly) and compare this with current Big data practices. We do so by first discussing three types of problems that can impede informed consent with respect to Big data use. First, we discuss the transparency (or explanation) problem. Second, we discuss the repurposed data problem. Third, we discuss the meaningful alternatives problem. In the final section of the paper, we suggest some solutions to these problems. In particular, we propose that the use of personal data for commercial and administrative objectives could be subject to a ‘soft governance’ ethical regulation, akin to the way that all projects involving human participants (e.g., social science projects, human medical data and tissue use) are regulated in Australia through the Human Research Ethics Committees (HRECs). We also consider alternatives to the standard consent forms, and privacy policies, that could make use of some of the latest research focussed on the usability of pictorial legal contracts.
Availability of data and material
Not applicable.
Code availability
Not applicable.
Notes
One problem with the term ‘Big Data’, as Luciano Floridi (2012) points out, is that it is slightly ambiguous, since the predicate ‘big’ is vague. In other words, there is no precise point at which a dataset changes from small to big. In this paper we will use the term ‘Big Data’ in the sense that is most commonly adopted at present—namely, to describe data sets (of ever-increasing sizes) that are too big for humans to analyse for the purpose of identifying new patterns, correlations, and insights. AI algorithms become useful in these domains due to the speed and scale at which they can operate. An ethical issue arises here because AI algorithms have the potential to reveal novel forms of personal information from such data sets. Individuals may have a strong desire for such personal information not to be made public, shared with third parties, or used to modify their behaviour. In short, there is a risk that serious harm can be caused to individuals by the improper use of AI and big data. For a more precise definition of Big Data, see Levin et al. (2015). They characterise Big Data in terms of four key attributes—namely Volume, which refers to the terabytes of new data being added each day; Velocity, which refers to the real time speed at which analyses can now be performed on these data; Variety, which refers to the different types of data, and variety of sources, that are now being collected; and Veracity, which pertains to the trustworthiness of the data sources (Levin et al. 2015, pp. 1661–1662).
In a literature review on the ethics of Big Data by Mittelstadt and Floridi (2016), it was found that informed consent was one of the biggest concerns of researchers.
See also the ‘National Statement on Ethical Conduct in Human Research’ (National Health and Medical Research Council 2007 [updated 2018]). This statement provides ethical guidance for Australian researchers whose work involves human subjects.
It is also worth remembering that the concept of informed consent has not always been considered integral to medical ethics. If we look at the Hippocratic physicians of ancient Greece, we not only find a lack of concern for informed consent but also an absence of concern for the truth. The Corpus Hippocraticum (the corpus of early medical texts associated with Hippocrates), for example, for all its innovation and focus on the responsibilities of physicians, features instructions to conceal information from the patient where doing so would be useful (Faden and Beauchamp 1986, p. 61).
Interestingly, this formulation echoes the findings of the High Court of Australia in the landmark medical negligence case of Rogers v Whitaker (1992) 175 CLR 479. The issue was whether the failure to warn a patient, who was about to undergo eye surgery, of a very unlikely risk constituted negligence on the part of the surgeon. With this decision the court moved past the traditional ‘doctor knows best’ approach (whereby the decision on whether or not to warn of a certain risk fell within the discretion of the health professional) and embraced a doctrine that upholds the autonomy of the individual patient and their ability to attach significance to particular risks (Sappideen 2010).
See Australian Competition & Consumer Commission (2019), for the details of this report.
This is most evident in Article 15 of the GDPR—‘Right of access by the data subject’—where it is stated that data subjects have the right to (i) obtain information about what their data will be used for; (ii) know which parties have access to their data; and (iii) know the length of time their data will be stored for (GDPR 2018). For a recent critique of the GDPR’s capacity to ensure that a right to an explanation is secured, see Wachter et al. (2017). They argue that the GDPR does not, in its current form, give data subjects a right to an explanation, due to the fact that the document’s language is ambiguous in parts. In their article, they make recommendations about how this issue can be resolved.
In Australia, at present, this is primarily a moral problem, as the protections currently afforded by the Privacy Act 1988 (Cth) are minimal, while in Europe it is also a legal problem (see GDPR 2018). As mentioned above, the recent ‘Digital Platforms Inquiry’ conducted by the ACCC (Australian Competition and Consumer Commission 2019) found current Australian legislative and regulatory protections wanting on a number of levels and made recommendations to depart from exclusive reliance on principle-based regulation and add more rule-based protective requirements, some of which are inspired by the GDPR (2018).
A risk assessment of this kind need not be an elaborate one, of course. It is the kind we perform in our everyday lives. For example, when one gets into a car, one (should) know that there is a small chance of getting into a crash and being seriously injured. Most of us continue to travel by car, however, because it is convenient, and the probability of crashing is typically low. In circumstances where new information is presented to us, however, we may need to revise such probabilities. For example, if one learns that the driver of a car one is about to get into is inebriated, or does not possess a driver’s licence, one would typically not consent to being driven home by them. It would simply be too risky for most people—the now high probability of injury significantly outweighs the gains (in this case, convenience). The same reasoning applies to repurposed data. If repurposing data introduces new risks for data subjects, it is not fair to subject them to such risks unless they first have knowledge of them and have agreed to proceed anyway.
Furthermore, ethical approval of data-use policies by an independent HREC-like body could become the new frontier of the fast-growing field of ‘corporate ethics’, potentially leading to legislative reform. While it is worth signalling this aspect of the issue, it is beyond the scope of this paper to provide an in-depth analysis of corporate ethics in this space.
The idea of a paradigm shift is from Thomas Kuhn (1962), who applies it to scientific revolutions.
References
Andersen CB (2018) Comic contracts and other ways to make the law understandable. The Conversation. Retrieved from: https://theconversation.com/comic-contracts-and-other-ways-to-make-the-law-understandable-90313. Accessed 23 Aug 2021
Arnold MH (2021) Teasing out artificial intelligence in medicine: an ethical critique of artificial intelligence and machine learning in medicine. J Bioeth Inq 18:121–139. https://doi.org/10.1007/s11673-020-10080-1
Australian Competition and Consumer Commission (2019) ‘Digital platforms inquiry—final report.’ Retrieved from: https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf. Accessed 23 Aug 2021
Baker R (2019) The structure of moral revolutions: studies of changes in the morality of abortion, death, and the bioethics revolution. MIT Press, Cambridge
Beauchamp TL, Childress JF (1979) Principles of biomedical ethics. Oxford University Press, New York
Beauchamp TL (2011) Informed consent: its history, meaning, and present challenges. Camb Q Healthc Ethics 20:515–523. https://doi.org/10.1017/S0963180111000259
Bennett Moses L, Johns FE, Land LPW, Vaile D, Zalnieriute M, Yastreboff M, Zhao S, Nicholson K, de Sousa T, Whitty M (2021) Inquiry into the data availability and transparency bill 2020 and the data availability and transparency (consequential amendments) bill 2020. UNSW law research paper no. 21–37, Available at SSRN: https://ssrn.com/abstract=3807026 or https://doi.org/10.2139/ssrn.3807026. Accessed 23 Aug 2021
Bobek E, Tversky B (2016) Creating visual explanations improves learning. CRPI 1:27. https://doi.org/10.1186/s41235-016-0031-6
Bronskill J (2020) Malls gathered facial images of five million shoppers without consent: watchdogs. National Post. Retrieved from: https://nationalpost.com/pmn/news-pmn/canada-news-pmn/malls-gathered-facial-images-of-five-million-shoppers-without-consent-watchdogs. Accessed 23 Aug 2021
Brunschwig CR (2019) Contract comics and the visualization, audio-visualization, and multisensorization of law. Univ W Aust Law Rev 46 (2):191–217. https://www.law.uwa.edu.au/data/assets/pdf_file/0004/3459415/Brunschwig-FInal.pdf. Accessed 23 Aug 2021
Burrell J (2016) How the machine ‘thinks:’ understanding opacity in machine learning algorithms. Big Data Soc 3(1):1–12. https://doi.org/10.1177/2053951715622512
Cancer Institute NSW (2021) NSW population & health services research ethics committee. Retrieved from: https://www.cancer.nsw.gov.au/research-and-data/nsw-population-health-services-research-ethics-com. Accessed 23 Aug 2021
Cohen JE (2019) Between truth and power: the legal constructions of informational capitalism. Oxford University Press, Oxford
Cohen IG, Mello MM (2019) Big data, big tech, and protecting patient privacy. JAMA 322(12):1141–1142. https://doi.org/10.1001/jama.2019.11365
Colaner N (2021) Is explainable artificial intelligence intrinsically valuable? AI Soc. https://doi.org/10.1007/s00146-021-01184-2
Dresden GM, Levitt MA (2001) Modifying a standard industry clinical trial consent form improves patient information retention as part of the informed consent process. Acad Emerg Med 8(3):246–252. https://doi.org/10.1111/j.1553-2712.2001.tb01300.x
Duffy C (2021) Facebook approves alcohol, vaping, gambling and dating ads targeting teens, lobby group finds. ABC News. Retrieved from: https://www.abc.net.au/news/2021-04-28/facebook-instagram-teenager-tageted-advertising-alcohol-vaping/100097590. Accessed 23 Aug 2021
Eurobarometer (2015) Data protection. Special Eurobarometer 431. https://ec.europa.eu/commfrontoffice/publicopinion/archives/ebs/ebs_431_en.pdf. Accessed 23 Aug 2021
Faden RR, Beauchamp TL (1986) A history of informed consent. Oxford University Press, New York
Flack F, Adams C, Allen J (2019) Authorising the release of data without consent for health research: the role of data custodians and HRECs in Australia. J Law Med 26(3):655–680
Floridi L (2012) Big data and their epistemological challenge. Philos Technol 25:435–437
Floridi L (2019) The logic of information: a theory of philosophy as conceptual design. Oxford University Press, Oxford
French R (2019) Closing address, comic book contracts conference. Univ West Aust Law Rev 46(2):268–271. https://www.law.uwa.edu.au/data/assets/pdf_file/0011/3442655/8.-French-Closing-Address.pdf. Accessed 23 Aug 2021
GDPR (2018) General Data Protection Regulation. https://gdpr-info.eu/. Accessed 23 Aug 2021
Innerarity D (2021) Making the black box society transparent. AI Soc. https://doi.org/10.1007/s00146-020-01130-8
Isaac M, Singer N (2019) Facebook agrees to extensive new oversight as part of $5 billion settlement. The New York Times. Retrieved from https://www.nytimes.com/2019/07/24/technology/ftc-facebook-privacy-data.html. Accessed 23 Aug 2021
Kadam RA (2017) Informed consent process: a step further towards making it meaningful! Perspect Clin Res 8(3):107–112. https://doi.org/10.4103/picr.PICR_147_16
Kant I (1993) Groundwork for the metaphysics of morals, James W Ellington (trans.). Hackett Publishing Company, Indianapolis
Kaye J, Whitley E, Lund D, Morrison M, Teare H, Melham K (2015) Dynamic consent: a patient interface for twenty-first century research networks. Eur J Hum Genet 23:141–146. https://doi.org/10.1038/ejhg.2014.71
Kearns M, Roth A (2020) The ethical algorithm. Oxford University Press, Oxford
Keating A, Andersen CB (2016) A graphic contract: taking visualisation in contracting a step further. J Strateg Contract Negot 2(1–2):10–18. https://doi.org/10.1177/2055563616672375
Kemp K (2018) 94% of Australians do not read all privacy policies that apply to them—and that’s rational behaviour. The Conversation. Retrieved from https://theconversation.com/94-of-australians-do-not-read-all-privacy-policies-that-apply-to-them-and-thats-rational-behaviour-96353. Accessed 23 Aug 2021
Kemp K (2019) The ACCC is suing Google over tracking users. Here’s why it matters. The Conversation. Retrieved from: (https://theconversation.com/the-accc-is-suing-google-over-tracking-users-heres-why-it-matters-126020?utm_medium=email). Accessed 23 Aug 2021
Kosinski M, Stillwell D, Graepel T (2013) Digital records of behavior expose personal traits. Proc Natl Acad Sci USA 110(15):5802–5805. https://doi.org/10.1073/pnas.1218772110
Kuhn TS (1962) The structure of scientific revolutions. University of Chicago Press, Chicago
Levin M, Wanderer JP, Ehrenfeld JM (2015) Data, big data, and metadata in anesthesiology. Anesth Analg 121(6):1661–1667. https://doi.org/10.1213/ANE.0000000000000716
Lundgren B (2020) How software developers can fix part of GDPR’s problem of click-through consents. AI Soc 35:759–760. https://doi.org/10.1007/s00146-020-00970-8
Macnish K, Gauttier S (2020) A pre-occupation with possession: the (non-) ownership of personal data. In: Macnish K, Galliott J (eds) Big data and democracy. Edinburgh University Press, Edinburgh, pp 42–56
Manfield E (2021) Police access SafeWA app data for murder investigation, prompting urgent law change. ABC News. Retrieved from: https://www.abc.net.au/news/2021-06-15/safewa-app-sparks-urgent-law-change-after-police-access-data/100201340. Accessed 23 Aug 2021
Manson NC, O’Neill O (2007) Rethinking informed consent in bioethics. Cambridge University Press, Cambridge
Martin K (2019) Ethical implications and accountability of algorithms. J Bus Ethics 160:835–850
McDonald AM, Cranor LF (2008) The cost of reading privacy policies. I/S J Law Pol Inf Soc 4(3):543–568
McGuire J, Andersen CB (2019) Improving Aurecon's employment contracts through visualisation. Univ W Aust Law Rev 46(2):218–236. http://www.law.uwa.edu.au/data/assets/pdf_file/0007/3442651/4.-AndersenMcGuidre-Future-of-Works.pdf. Accessed 23 Aug 2021
Mittelstadt BD, Floridi L (2016) The ethics of big data: current and foreseeable issues in biomedical contexts. Sci Eng Ethics 22(2):303–341
Mittelstadt BD, Allo P, Taddeo M, Wachter S, Floridi L (2016) The ethics of algorithms: mapping the debate. Big Data Soc. https://doi.org/10.1177/2053951716679679
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (1978) The Belmont Report. United States Government Printing Office, Washington, DC
National Health and Medical Research Council (2007) The National statement on ethical conduct in human research. Available from: https://www.nhmrc.gov.au/about-us/publications/national-statement-ethical-conduct-human-research-2007-updated-2018. Accessed 23 Aug 2021
Newson A, Lipworth W (2015) Why should ethics approval be required prior to publication of health promotion research? Health Promot J Aust 26(3):170–175. https://doi.org/10.1071/HE15034
Nimmon LS, Stenfors-Hayes T (2016) The “handling” of power in the physician-patient encounter: perceptions from experienced physicians. BMC Med Educ 16:114. https://doi.org/10.1186/s12909-016-0634-0
O’Neil C (2016) Weapons of math destruction: how big data increases inequality and threatens democracy. Crown Publishing Group, New York
O’Neill O (2003) Some limits of informed consent. J Med Ethics 29(1):4–7
Page K (2012) The four principles: can they be measured and do they predict ethical decision making? BMC Med Ethics 13:10. https://doi.org/10.1186/1472-6939-13-10
Pasquale F (2015) The black box society: the secret algorithms that control money and information. Harvard University Press, Cambridge
Peterson A (2016) FCC cracks down on Verizon Wireless for using ‘supercookies’. The Washington Post. Retrieved from: https://www.washingtonpost.com/news/the-switch/wp/2016/03/07/fcc-cracks-down-on-verizons-supercookies/. Accessed 23 Aug 2021
Pollach I (2011) Online privacy as a corporate social responsibility: an empirical study. Bus Ethics Eur Rev 20:88–102. https://doi.org/10.1111/j.1467-8608.2010.01611.x
Postelnicu L (2019) Pregnancy club Bounty UK fined £400,000 by data protection regulator. HealthcareITNews. Retrieved from: https://www.healthcareitnews.com/news/pregnancy-club-bounty-uk-fined-400000-data-protection-regulator. Accessed 23 Aug 2021
Powles J, Hodson H (2017) Google DeepMind and healthcare in an age of algorithms. Heal Technol 7:351–367. https://doi.org/10.1007/s12553-017-0179-1
Przybylski AK, Murayama K, DeHaan CR, Gladwell V (2013) Motivational, emotional, and behavioral correlates of fear of missing out. Comput Hum Behav 29(4):1841–1848. https://doi.org/10.1016/j.chb.2013
Purtill J (2021) Apple’s iPhone has a new privacy feature that Facebook has tried to stop. ABC News. Retrieved from: https://www.abc.net.au/news/science/2021-04-29/apple-iphone-tracking-operating-system-update-facebook-privacy/100100172. Accessed 23 Aug 2021
Quelle C (2018) Enhancing compliance under the general data protection regulation: the risky upshot of the accountability- and risk-based approach. Eur J Risk Regul 9(3):502–526. https://doi.org/10.1017/err.2018.47
RACGP (2019) Informed consent: information sheet. Retrieved from: https://www.racgp.org.au/download/Documents/PracticeSupport/informedconsentinfosheet.pdf. Accessed 23 Aug 2021
RANZCOG (2018) RANZCOG medical schools curriculum in obstetrics & gynaecology (AMC Alignment). Retrieved from: https://ranzcog.edu.au/RANZCOG_SITE/media/RANZCOG-MEDIA/About/RANZCOG-Undergraduate-Curriculum-in-Women-s-Health.pdf. Accessed 23 Aug 2021
Robbins S (2019) A misdirected principle with a catch: explicability for AI. Mind Mach 29:495–514. https://doi.org/10.1007/s11023-019-09509-3
Rosmarin R (2020) Sustainability sells: why consumers and clothing brands alike are turning to sustainability as a guiding light. Business Insider. Retrieved from: https://www.businessinsider.com/sustainability-as-a-value-is-changing-how-consumers-shop?r=AU&IR=T. Accessed 23 Aug 2021
Sappideen C (2010) Bolam in Australia: more bark than bite. Univ New South Wales Law J 33(2):386–424
Schmelzer R (2019) Understanding explainable AI. Forbes. Retrieved from https://www.forbes.com/sites/cognitiveworld/2019/07/23/understanding-explainable-ai/?sh=122b8bc77c9e. Accessed 23 Aug 2021
Shaban-Nejad A, Michalowski M, Buckeridge DL (2021) Explainable AI in healthcare and medicine: building a culture of transparency and accountability. Springer. https://doi.org/10.1007/978-3-030-53352-6
Singer N, Conger K (2019) Google is fined $170 million for violating children’s privacy on YouTube. The New York Times. Retrieved from https://www.nytimes.com/2019/09/04/technology/google-youtube-fine-ftc.html. Accessed 23 Aug 2021
Solove DJ (2008) Understanding privacy. Harvard University Press, Cambridge, MA
Stahl BC, Antoniou J, Ryan M, Macnish K, Jiya T (2021) Organisational responses to the ethical issues of artificial intelligence. AI Soc. https://doi.org/10.1007/s00146-021-01148-6
Sunstein C (2002) Risk and reasons: safety, law and the environment. Cambridge University Press, Cambridge
Taylor J (2021) Government agencies could access personal data without consent under new bill. The Guardian. Retrieved from: https://www.theguardian.com/australia-news/2021/may/01/government-agencies-could-access-personal-data-without-consent-under-new-bill. Accessed 23 Aug 2021
Thompson SA, Warzel C (2019) Twelve million phones, one dataset, zero privacy. The New York Times. Retrieved from https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html. Accessed 23 Aug 2021
Thorbecke C (2021) What to know about Apple's new privacy update and why it's riling Facebook. ABC News. Retrieved from https://abcnews.go.com/Business/apples-privacy-update-riling-facebook/story?id=77340719. Accessed 23 Aug 2021
Tsamados A, Aggarwal N, Cowls J, Morley J, Roberts H, Taddeo M, Floridi L (2021) The ethics of algorithms: key problems and solutions. AI Soc. https://doi.org/10.1007/s00146-021-01154-8
Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int Data Priv Law 7(2):76–99. https://doi.org/10.1093/idpl/ipx005
Walker T (2020) Value of choice. J Med Ethics. https://doi.org/10.1136/medethics-2020-106067
Zuboff S (2019) The age of surveillance capitalism: the fight for a human future at the new frontier of power. Profile Books, London
Funding
Not applicable.
Ethics declarations
Conflict of interest
Not applicable.
Ethics approval
Not applicable.
Consent to participate
Not applicable.
Consent for publication
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Andreotta, A.J., Kirkham, N. & Rizzi, M. AI, big data, and the future of consent. AI & Soc 37, 1715–1728 (2022). https://doi.org/10.1007/s00146-021-01262-5