Investigation Into The Use of Data Analytics in Political Campaigns Final 20181105
1. Introduction
2.1 Failure to properly comply with the Data Protection Principles
2.2 The relationship between the GDPR and the Data Protection Act 1998
3.2 Cambridge Analytica (CA), Global Science Research (GSR) and the obtaining and use of Facebook data
3.6 The relationship between AggregateIQ (AIQ), Vote Leave and other Leave campaigns
4.4 Regulatory actions
Annex IV: Eldon Insurance Ltd Preliminary enforcement notice
Commissioner’s message
When we opened our investigation into the use of data analytics for
political purposes in May 2017, we had little idea of what was to come.
The invisible, ‘behind the scenes’ use of personal data to target political
messages to individuals must be transparent and lawful if we are to
preserve the integrity of our election process.
We have reviewed the practices of 30 organisations and are working through 700
terabytes – the equivalent of 52 billion pages – of data.
We have used the full range of our investigative powers and where there
have been breaches of the law, we have acted. We have issued monetary
penalties and enforcement notices ordering companies to comply with the
law. We have instigated criminal proceedings and referred issues to other
regulators and law enforcement agencies as appropriate. And, where we
have found no evidence of illegality, we have shared those findings
openly.
Updated data protection law sets out legal requirements, and it should be
government and regulators who uphold the law. While voluntary initiatives
by the social media platforms are welcome, a self-regulatory approach
will not guarantee consistency, rigour or public confidence.
I have also called for the UK Government to consider whether there are
any regulatory gaps in the current data protection and electoral law
landscape to ensure we have a regime fit for purpose in the digital age.
We are working with the Electoral Commission, law enforcement and
other regulators in the UK to increase transparency in election campaign
techniques.
Data protection agencies around the world must work with other relevant
regulators and with counterparts in other jurisdictions to take full
advantage of the law to monitor big data politics and make citizens aware
of their rights.
Elizabeth Denham
UK Information Commissioner
Executive summary
The investigation has become the largest of its type conducted by any
Data Protection Authority, involving online social media platforms, data
brokers, analytics firms, academic institutions, political parties and
campaign groups.
One of the recommendations arising from this report was that the
Government should introduce a statutory code of practice for the use of
personal data in political campaigns. We have launched a call for views
on this code.
Political parties
Facebook
take specified steps to comply with PECR regulation 22. We will
follow this up with an audit of the company.
• We are investigating allegations that Eldon Insurance Services
Limited shared customer data obtained for insurance purposes with
Leave.EU. We are still considering the evidence in relation to a
breach of principle seven of the DPA1998 for the company’s overall
handling of personal data. A final decision on this will be informed
by the findings of our audit of the company.
We have also begun a wider piece of audit work to consider the use of
personal data and data sharing in the insurance and financial sectors.
Remain campaign
inadequate third party consents and the fair processing statements
used to collect personal data.
Cambridge University
Data brokers
issues raised by their services has been expanded to include their
activities in political campaigns.
• 172 organisations identified.
• 71 witnesses of interest.
• 40 ICO investigators.
• 31 information notices issued.
• 2 warrants executed.
• 22 documents seized.
• 85 pieces of equipment seized, including servers.
• Monetary penalties and enforcement notices issued.
• 1 criminal prosecution.
1. Introduction
1.1 Background
1 https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/05/blog-the-information-commissioner-opens-a-formal-investigation-into-the-use-of-data-analytics-for-political-purposes/
these organisations of interest to the investigation are located outside the
UK.
Similarly, we spoke to nearly 100 individuals of interest, including through
formal interviews, and we continue to engage with people who hold
information of relevance to the investigation.
The aim was to understand how political campaigns use personal data to
micro-target voters with political adverts and messages, the techniques
used, and the complex eco-system that exists between data brokerage
organisations, social media platforms and political campaigns and parties.
We have used the full range of our powers under both the current and
previous data protection legislation, including:
the United States (US) to co-ordinate elements of our investigation. We
have legal gateways to share and receive information under the DPA2018,
which has assisted our investigation and those of other data protection
authorities. We also have links to data protection authorities worldwide
through the Global Privacy Enforcement Network (GPEN).
Rapid developments in technology and social media over the last 15 years
have, inevitably, led to data-driven campaigns, as political parties seek to
follow commercial organisations by taking advantage of increasingly
sophisticated marketing techniques to engage with voters.
The fact that political parties and campaigns all over the world have
invested heavily in digital messaging in recent years shows the potential
to reach more people in an efficient, targeted and accessible manner, for
a fraction of the cost of more traditional methods.
Our investigation focused particularly on the data protection principle of
transparency. If voters are unaware of how their data is being used to
target them with political messages, then they won’t be empowered to
exercise their legal rights in relation to that data and the techniques being
deployed, or to challenge the messages they are receiving.
Without a high level of transparency and trust amongst citizens that their
data is being used appropriately, we are at risk of developing a system of
voter surveillance by default.
There is no turning back the clock – digital elections are here to stay. We
need to work on solutions to protect the integrity of our democratic
processes. We believe that our call for a statutory code clearly setting out
the law, our enforcement action, and our engagement with political
parties, campaigns, social media platforms and Universities UK on reform
of the political eco-system are all positive steps.
2. Regulatory enforcement action
2.1 Failure to properly comply with the Data Protection Principles
Under the previous law, anyone processing personal data had to comply
with the eight principles of the DPA1998, which state that personal
information must be:
• fairly and lawfully processed;
• processed for limited purposes;
• adequate, relevant and not excessive;
• accurate and up to date;
• not kept for longer than is necessary;
• processed in line with individuals’ rights;
• secure; and
• not transferred to other countries without adequate protection.
2.2 The relationship between the GDPR and the Data Protection Act
1998
The DPA1998 was replaced by the GDPR and the Data Protection Act 2018
(DPA2018) on 25 May 2018. Throughout this investigation, consideration
has been given to all relevant legislation, including transitional provisions.
2.3 Failure to properly comply with the Privacy and Electronic
Communications Regulations
This report summarises the areas we investigated, actions taken and any
areas where our work needs to continue. The full details of our findings
are – or will be – set out in any final regulatory notices we issue to the
parties subject to investigation.
3. Summary of investigations and regulatory action
taken
The formal warnings included a demand for each party to provide Data
Protection Impact Assessments (DPIAs) for all projects involving the use
of personal data.
Because parties use special category data (relating to political opinions
and ethnicity), as well as automated decision-making and profiling, they
are required to undertake a DPIA under the GDPR.
One of the main recommendations from our Democracy Disrupted? report
is that the Government should legislate at the earliest opportunity to
introduce a statutory code of practice under the DPA2018 for the use of
personal information in political campaigns.
We have met with the Cabinet Office, DCMS and the Electoral Commission
to discuss how this can be achieved before the next General Election. We
have launched a call for views on the code.
We anticipate that the code will apply to all data controllers which process
personal data for the purpose of political campaigning. By ‘political
campaigning’ we mean activity which relates to elections or referenda, in
support of or against a political party, a referendum campaign or a
candidate standing for election. This includes but is not limited to
processing by registered political parties, electoral candidates,
referendum permitted participants and third party campaigners, as
defined in the Political Parties, Elections and Referendums Act 2000.
The Tribunal dismissed this appeal on 10 July 2018, stating that UKIP’s
response to the information notice was brief, inadequate and, in some
instances, possibly inaccurate - and that UKIP’s apparent willingness to
co-operate with the Commissioner’s enquiries, rendering an information
notice unnecessary, was insufficient grounds for allowing the appeal.
UKIP has since appealed this dismissal decision to the Upper Tribunal
(Administrative Appeals Chamber), and we are awaiting a date for the
hearing to be set.
Therefore, at the time of writing we are unable to progress the part of the
investigation involving this information notice for UKIP. We will pursue
this once the legal process has concluded, in order to ensure that we have
a complete understanding of UKIP’s practices and involvement with the
other organisations under investigation.
3.2 Cambridge Analytica (CA), Global Science Research (GSR) and the
obtaining and use of Facebook data
familiar trading name Cambridge Analytica (CA). For ease of reading, we
will refer to all the company entities as ‘Cambridge Analytica/CA’, unless
there is a specific point which requires further clarification.
Over the course of 2011 and 2012, the office of the Irish Data Protection
Commissioner (IDPC) audited Facebook’s European headquarters in
Ireland and identified concerns about the prominence of Facebook’s
privacy policies and about giving users more granular privacy controls
over the use and accessibility of Facebook friends’ data.
community. This included many individuals involved in research who
eventually went on to work at the company. We understand that this
engagement with academics continued up until 2016.
Any new apps on the platform were automatically added to API V2 and
did not have access to Facebook friend data.
By 2014, Facebook had begun to migrate third party apps from API V1 to
V2, which limited developers’ access to Facebook friend data. In order to
ensure continuity of service for Facebook users and app developers,
Facebook gave developers a one-year ‘grace period’ in order to allow time
to adjust their apps’ code and also to adapt their business models to
account for the withdrawal of access to Facebook friend data.
During the course of our investigation, the ICO has reviewed evidence
which suggests that, around the same time in 2014, CA wanted to take
advantage of the pre-existing access to Facebook friend data enjoyed by
app developers with access to V1 of Facebook’s API. They planned to use
this data to create data models which would inform their work on
electoral campaigns in the USA. However, CA themselves could not
access V1 at this time because they did not have a pre-existing app on
the platform.
friends and colleagues, many of whom had been involved in earlier
campaigns in North America.
The ICO has evidence that CA staff assisted Dr Kogan to set up GSR.
Once the company was set up and a contract signed with CA, Dr Kogan,
with some help from Chris Wylie, overhauled the ‘CPW Lab App’, changing
its name and terms and conditions to create the ‘GSR App’, which
ultimately became thisisyourdigitallife (the app). Information reviewed by
the ICO suggests that in order for a Facebook user’s data to be harvested
and processed by CA, the user, or one of their Facebook friends, would
have had to log into and authorise the app. The data of these users and
their Facebook friends was then available to GSR and, ultimately, to CA.
• news feed posts;
• Facebook Friends lists;
• email addresses; and
• Facebook messages.
The app also requested permission from users of the app to access the
following categories of data about their Facebook Friends (again, subject
to the settings they had selected):
The total number of users of the app, and their Facebook friends, whose
data was accessed through the use of the app, was estimated by
Facebook to be approximately 87 million.
A full list of the countries and locations of users affected has been
published by Facebook. For some of this data, estimated to involve
approximately 30 million US users, the personality test results were
paired with the Facebook data to seek out psychological patterns and
build models.
• The GSR app (the app) was able to obtain the data of Facebook
users who used the app.
• Additionally, the app was also able to obtain the data of the app
user’s Facebook friends (app user’s friend).
The precise nature and quantity of data which was available for the app to
access was defined by the particular ‘privacy settings’ which the app user
and the app user’s friend selected on their own Facebook profiles.
Unless specifically prevented by the app user and the app user’s friend,
the app was able to access the data of both by default.
Once the data had been obtained by GSR, it was then modelled and
transferred to a secure ‘drop-zone’. From this drop-zone, CA was then
able to extract the modelled data relating to data subjects that they were
interested in and for whom they had pre-existing data.
CA’s internal data scientists then performed further data modelling and
created ‘proprietary data models’ that they then used during their political
targeting work in the US.
Using our powers under the DPA1998, the ICO obtained a warrant for
access to the premises of CA. We executed the warrant at 20.00 on 23
March 2018 and concluded the search at 03.00 the following morning. We
subsequently secured a further warrant and searched other premises
linked to the companies.
been deleted. We will be making sure that any organisations which may
still have copies of the Facebook data and its derivatives demonstrate its
deletion.
controlling the manner and frequency with which that data was harvested
from the platform.
3.2.3 Regulatory issues for SCLE Elections Ltd (SCLE) and Cambridge
Analytica (CA)
On 3 May 2018, Cambridge Analytica and SCLE as part of the SCLE Group
were placed into administration. Since then the companies have ceased
trading.
Had SCLE still existed in its original form, our intention would have been
to issue the company with a substantial fine for very serious breaches of
principle one of the DPA1998 for unfairly processing people’s personal
data for political purposes, including purposes connected with the 2016
US Presidential campaigns. For ease of reading we’ll again refer to
Cambridge Analytica throughout this section.
Facebook users who accessed the app, together with friends of those
Facebook users, were not made aware:
• that their personal data would be used for the purposes of political
campaigning;
• that their personal data would be processed in a manner that
involved drawing inferences about their political opinions,
preferences and their voting behaviour.
The underlying objective of issuing a monetary penalty is to achieve
ongoing compliance and best practice, to hold the organisation to account
for previous failings, and to deter similar behaviour by others.
A specific example of CA’s poor practice with regard to data protection law
was its failure to deal properly with a subject access request submitted in
January 2017 by Professor David Carroll.
The terms of the enforcement notice were not complied with by the
deadline of 3 June 2018.
Given the seriousness of these issues and the public interest concerns
they raise, we have pursued criminal proceedings against the company as
the corporate entity responsible.
made it clear that the fine - the highest permitted by the DPA1998 -
would have been significantly higher had these failings occurred after the
GDPR and the DPA2018 replaced the DPA1998 in May of this year.
These failings meant Dr Kogan and his company GSR were able to harvest
the data of up to 87 million people worldwide, without their knowledge, as
described in section 3.3.1. A subset of this data was later shared with
other organisations, including CA.
regulator and other national data protection authorities to develop a long-
term strategy on how we address these issues.
Concerns have been raised about the closeness of these organisations,
including suggestions that AIQ, SCLE and CA were, in effect, one and the
same entity. AIQ did some work directly for some campaigns during the
EU referendum (see section 3.6), so when CA indicated that it had not
worked on the EU referendum, the claim seemed misleading.
Our concern, however, given our remit, was whether there was any truth
to allegations that UK data had been processed in Canada by AIQ, outside
the protections of the DPA1998.
telephone and email canvassing. In October 2014 AIQ also placed online
advertisements for SCLE on behalf of its clients. This work concluded in
November 2014.
AIQ has explained in its responses to us that all work was conducted with
SCLE, not with the trading-name company CA, and we have uncovered no
evidence in the material so far recovered that personal data, including
that of UK citizens, was shared with AIQ by CA.
While there was clearly a close working relationship between the entities
and several staff members were known to each other, we have no
evidence that AIQ has been anything other than a separate legal entity.
SCLE was listed as one of the main contacts for at least one of the AIQ
Facebook accounts, and the email address for that contact belonged to an
SCLE employee who was also involved in a number of payments. This
pattern is suggestive of a close relationship between the companies but
ultimately we have concluded that this was a contractual relationship -
AIQ provided adverts for SCLE. To ease the administration of this contract,
the payments and access arrangements described above appear to have
been put in place.
In summary, we found that the relationship between AIQ and SCLE was a
contractual one; AIQ supplied services as outlined above for work on US
campaigns. We found no evidence of unlawful activity in relation to the
personal data of UK citizens and AIQ’s work with SCLE. To date, we have
no evidence that SCLE and CA were involved in any data analytics work
with the EU Referendum campaigns. Our findings to date regarding UK
citizens have been informed by the federal Office of the Privacy
Commissioner of Canada. The Office of the Privacy Commissioner of
Canada and Office of the Information and Privacy Commissioner for
British Columbia have an ongoing investigation into AIQ and have not yet
made findings.
On 5 April 2018 the OPC and OIPCBC announced that they were jointly
investigating Facebook and AIQ as to whether the organisations were in
compliance with Canada’s Personal Information Protection and Electronic
Documents Act (PIPEDA) and British Columbia’s Personal Information
Protection Act (PIPA). That investigation is ongoing, but they have advised us that they
have not located any UK personal data, other than that identified within
the scope of our enforcement notice.
• On 23 October 2015, representatives of Leave.EU met with CA
staff; this was a basic introductory meeting to express interest in
potentially working together.
• On 18 November 2015, CA appeared at a press conference with
Leave.EU.
• On 20 November 2015, CA went to Leave.EU’s Bristol offices to
pitch their product.
• On 8 January 2016, representatives of Leave.EU met CA in London,
and CA presented a proposal for future work together.
During our investigation, allegations were made that CA was paid for work
on UKIP membership data in 2015, and that Leave.EU paid for this work.
On 11 October 2017 the ICO served an information notice on UKIP as part
of this investigation. UKIP appealed this information notice - we set out
the legal situation in relation to UKIP in section 3.1.1.
Leave.EU and Eldon are closely linked. Both organisations share at least
three directors, and there is further crossover of both employees and
projects.
We investigated allegations that Eldon shared customer data obtained for
insurance purposes with Leave.EU and that the data was then used for
political campaign purposes during the EU referendum, contrary to the
first and second data protection principles under the DPA1998.
We have evidence to show that some customers’ personal data, in the
form of email addresses, held by Eldon was accessed by staff working for
Leave.EU and was used to unlawfully send political marketing messages.
3.5.2 Leave.EU sending unsolicited marketing information to Eldon
Insurance (trading as GoSkippy) email subscribers
The full factual and legal considerations are set out in the notices. Taking
all of these factors into account, the Commissioner has notified her intent
to impose penalties of £60,000 on each company.
The notices of intent (NOIs) set out our areas of concern and invite representations from both companies.
Their representations are due by 5 December 2018 and we have taken no
final view on the case at this time. We will consider carefully any
representations both organisations may wish to make before finalising our
views.
3.5.3 Leave.EU newsletter sent to Eldon customers
Eldon claimed that the ICO had been made aware of the error. However,
we have no record of any such incident being reported to us and have
asked the company for details to confirm this. We established that this
incident occurred on 16 September 2015, when Leave.EU marketing staff
sent an email newsletter, intended for Leave.EU subscribers, to more than
319,000 email addresses on Eldon’s customer database.
The full factual and legal considerations are set out in the NOI (Annex ii),
but a key factor is that Leave.EU did not have the consent of the
subscribers for the 296,522 unsolicited direct marketing emails it sent.
The NOI sets out our areas of concern and invites their representations.
Their representations are due by 5 December 2018 and we have taken no
final view on the case at this time. We will consider carefully any
representations Leave.EU may wish to make before finalising our views.
3.5.4 Personal data and the University of Mississippi (UoM)
But Leave.EU did explore creating a new organisation, called Big Data
Dolphins, with a view to collecting and analysing large quantities of data
for political purposes. They explored this project with other organisations,
including the UoM.
We investigated Big Data Dolphins, and the possibility that the personal
data of UK citizens was ever transferred to the UoM. We engaged with
Leave.EU, Eldon and the University itself.
3.6 The relationship between AggregateIQ (AIQ), Vote Leave and other
Leave campaigns
The majority of the ads – 2,529 out of a total of 2,823 - were created on
behalf of Vote Leave.
In the run-up to the referendum vote in June 2016, AIQ ran 218 ads
solely on behalf of Vote Leave and directed at email addresses on
Facebook. In response to our information notice, Facebook stated that the
email addresses did not originate from data collected through Dr Kogan’s
app but came from a different source (as an analysis of the accounts
affected by the GSR app did not return a greater than random chance
match to the target audience).
Facebook confirmed that Vote Leave and BeLeave used the same data set
to identify audiences and select targeting criteria for ads. However,
BeLeave did not proceed to run ads using that data set. The Electoral
Commission report dated 17 July 2018 confirms that BeLeave did not
submit an electoral return.
Vote Leave ran 1,034 ads between 19 April 2016 and 20 June 2016.
Payment for all of these Facebook ads was made by AIQ, and amounted
to approximately $2 million (£1.5 million) between 15 April 2016 and 23
June 2016. Our regulatory concern was whether, and on what basis, the
two groups shared the personal data of UK voters between themselves
and others in order to target these ads.
3.6.1 The use of UK personal data
In response to our investigation, AIQ stated that it used the Git repository
as a form of version control for its work, allowing it to create back-ups of
code during development. Its response when asked about the 1,439 email
addresses was that the emails were stored as part of a back-up process
and were then not deleted, contrary to its usual procedure.
AIQ appealed our enforcement notice to the First Tier Tribunal, asking for
more specific details. After receiving its points of appeal, we decided to
vary the original enforcement notice to clarify the steps AIQ should take.
We have the legal power to vary the enforcement notice under section
153 of the DPA2018.
The enforcement notice was reissued on 24 October 2018 with specific
instructions for AIQ. The company has accepted the revised notice and
the Tribunal has allowed it to withdraw its appeal.
It was during this period that the Information Commissioner advised the
Canadian Parliament that AIQ had not been cooperating with our
investigations, noting that it had previously not answered our questions
fully - or at all. Since April 2018, AIQ has co-operated fully with our
investigation.
AIQ had in its possession and control personal data of individuals in the
UK as a result of work it did on behalf of a UK client.
DPA1998, and whether that personal data was also unfairly and
unlawfully processed.
The work AIQ carried out for BeLeave occurred towards the end of the EU
referendum campaign. We know that AIQ was asked to provide some
online advertising on BeLeave’s behalf. This included placing ads on
platforms and landing pages. AIQ provided input on the content and on
whether an advertisement would ‘work’. In respect of
this work, AIQ reported to BeLeave on the number of times an ad was
shown, how many people clicked on it and so on. Any data provided on
the website forms was sent directly to BeLeave; AIQ confirmed it had no
access to this information. We found no evidence that BeLeave unlawfully
processed this personal data.
In respect of Veterans for Britain, AIQ created and placed ads at the
campaign’s direction and reported on them. We have found no evidence
that personal data was misused by either organisation in respect of this
work.
In June and July 2018, we served information notices on Open Britain, the
successor organisation to BSiE, and the Liberal Democrats, under the
DPA1998, to investigate these issues.
out a simple enhancement service, for example, adding phone numbers
where available.
The party had further worked with BSiE to model electoral roll data, with
a view to highlighting potential voting intentions.
Both the Liberal Democrats and Open Britain denied that party members’
personal data had been sold. Instead, both confirmed that the In
Campaign bought Electoral Register information from the Liberal
Democrats.
We are still looking at how the Remain side of the referendum campaign
handled personal data, including the electoral roll, and will be considering
whether there are any breaches of data protection or electoral law
requiring further action.
Whilst the media and public focus on our investigation has understandably
been on the role of CA and whether it may have contravened the law, the
development of the targeting techniques at the centre of this issue dates
back over a decade and has its origins in the work of academics at
Cambridge University. The Psychometrics Centre at Cambridge University 2
was set up in 2005 and is a Strategic Research Network dedicated to
research, teaching and product development in both pure and applied
psychological assessment. One of its key objectives is to provide both
2 https://www.psychometrics.cam.ac.uk/about-us
academia and research and development (R&D) departments with
cutting-edge tools tailored for the online environment.
This engagement, and other work in the UK and abroad, has identified
some common and potentially serious data protection concerns that we
suspect are widespread across the university sector.
However, we have made 42 recommendations following our audit and
have raised the following significant concerns:
• There is currently a lack of oversight in relation to the management
of IT equipment and hardware assets and, particularly, the use of
non-university equipment by students and researchers. A new
process should be developed to provide this oversight and
assurance for the university.
The ICO’s investigation into access to personal data via the MyPersonality
application is ongoing. The data contained within the MyPersonality app
database was reported to have been anonymised. However, we are
currently finalising our understanding of the anonymisation techniques
used on the dataset, in order to ensure that appropriate measures were
taken to prevent de-anonymisation. It is vital that we evaluate the
likelihood of a full de-anonymisation of the dataset in order to reach a
conclusion about the potential detriment to those affected.
What is clear is that a serious overhaul of data protection practices is
needed in how higher education institutions handle data in the context of
academic research. While well-established structures exist to deal with
the ethical issues that arise from research, similar structures do not
appear to exist in relation to data protection.
We looked closely at the role of those who buy and sell personal data sets
in the UK and who were linked to the political campaigns. We had already
started work in this area, looking at common sources of data we came
across during our routine enforcement work. This identified links to this
investigation.
During the course of our investigation, we found that some political
parties had purchased datasets of personal data from data brokers and
used them for election and campaign purposes, but lawful consent for the
parties to use those lists in this way had not been obtained. For example,
when the data was gathered, the brokers had not explained who it would
be sold to or how it would be used.
We made enquiries with some of the key data brokers operating in the UK
supplying data to political parties, including Experian, Emma’s Diary
(Lifecycle Marketing (Mother and Baby) Ltd), CACI, GB Group and Data8.
Our investigation revealed that one of these companies, Lifecycle
Marketing (Mother and Baby) Ltd (Emma’s Diary), had illegally collected
and sold personal information belonging to more than one million people.
The company, which provides advice on pregnancy and childcare, sold the
information to Experian Marketing Services, a branch of the credit
reference agency, specifically for use by the Labour Party. Experian then
created a database which the party used to profile new mothers in the
run-up to the 2017 General Election.
The Information Commissioner fined the company £140,000 for this
breach. The full facts of the breach are set out in our Monetary Penalty
Notice dated 9 August 2018.
We have also been looking at the services and operations of the three
main credit reference agencies - Experian, Equifax and Callcredit - in
respect of the services they promote to political parties and campaigns.
4. Summary of regulatory action
4.3 Criminal prosecutions
5. Next steps
A number of the issues set out in this report are still ongoing, or require
further investigation or action, but this will be our final update on the
investigation as a whole. Any enforcement action required in the future
will be announced and absorbed as part of our general regulatory action.
Other issues raised have been merged into other existing operations, such
as the ongoing audits of the credit reference agencies, and any
enforcement that arises from these operations will highlight where our
data analytics investigation has contributed.
And some actions will themselves form the basis for new operations and
activities, such as pursuing the recommendations made in “Democracy
Disrupted? Personal information and political influence” - including
formulating our statutory code and working with the Higher Education
sector to make improvements to its handling of personal information
obtained during research.
Annex V: List of 30 organisations that formed the
main focus of our investigation
• Twitter
• UKIP
• Ulster Unionist Party
• Veterans for Britain
• Vote Leave