Data protection and working remotely
These new circumstances demand a different security stance than working from centralized offices, especially when it comes to maintaining the data security that the GDPR requires.
If you’re suddenly managing remote teams, it can be daunting to think about data security with everything else that’s going on. The GDPR, in general, requires that companies keep personal data private and secure.
This article will show you how, with a few simple actions, you can help ensure you stay GDPR compliant even as your team is spread out.
Many employees who are not familiar with data security issues may not grasp how a simple slip-up on their part could lead to a data breach that exposes the personal data you are charged to protect. These data breaches can not only undermine consumer confidence in your company but also lead to costly GDPR fines.
A cybersecurity policy that instructs your employees on how to keep your business’s data safe is an important tool in data protection. If you don’t have one, you should make one. If you have a policy but haven’t updated it since everyone began working from home, this is the time to do so. A good place to start is by reviewing the NIST cybersecurity framework, which provides you with a set of best-practice guidelines for all stages of threat identification and mitigation.
The NIST framework covers five areas, all of which are essential components of a successful cybersecurity framework:

- Identify: know which systems, devices, and data you need to protect.
- Protect: put safeguards in place, such as access controls and encryption.
- Detect: monitor your systems so you notice anomalies and incidents quickly.
- Respond: have a plan for containing an incident and communicating about it.
- Recover: restore any capabilities or services that were impaired.
Your IT security policy doesn’t have to be a complicated document. It should cover the reasons it exists in the first place and then lay out, in easy-to-understand terms, the exact security protocols your fellow employees should follow. If they’re confused, they can ask questions, but no one is exempt from the policy. You can also use the free templates offered by SANS, a globally recognized cybersecurity training and consultancy organization, as models for your policy.
Get a detailed guide to creating a security policy for your company with ProtonMail’s ebook on IT security for small businesses.
Recital 83 essentially stipulates that personal data must be protected both in transit and at rest. Data is in transit pretty much any time someone accesses it. The data passing from this website’s servers to your device is one example of data in transit. On the other hand, data at rest refers to data in storage, like on your device’s hard drive or a USB flash drive.
The two keys to maintaining data protection when your teams are all working remotely are encryption and controlling access.
Your company’s sensitive data should be encrypted both in transit and at rest. Both Recital 83 and Article 32 of the GDPR explicitly mention “encryption” when discussing appropriate technical and organizational security measures. Encryption is important because if your data is encrypted and there is a breach, the data will be illegible and useless.
Keeping sensitive personal data encrypted is much easier in an office, where your cybersecurity team can maintain server security and monitor your network. But there are simple steps your organization can take so that data remains encrypted, even if it is stored on a device at your employee’s home.
First, all devices that your employees use for work — including their work phone — should be encrypted. Your employees can encrypt the hard drives of their Android, iOS, macOS, and Windows devices. There is also third-party hard drive encryption software, like VeraCrypt, that will work on a wide variety of devices.
Much of the software your company likely uses, like Microsoft Office or Adobe Acrobat, also offers you the option to encrypt your saved files. This is another way you can keep your data encrypted at rest. You should follow other basic computer security steps and ensure that all employees follow them too, whether they work remotely or not.
The idea is simple. Hackers from afar aren’t the only danger posed to your data. Laptops and other mobile devices are lost or stolen all the time. Encryption software locks down files and folders so that unauthorized users can’t view the data even if they manage to get into the machine.
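To make the idea of data encrypted at rest concrete, here is a minimal Python sketch, assuming the widely used third-party cryptography package is installed; the file name and its contents are purely illustrative and not a reference to any product mentioned above.

```python
# Minimal sketch: encrypting a sensitive file at rest with symmetric encryption.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Create a throwaway example file so the sketch runs end to end.
with open("customer_export.csv", "w") as f:
    f.write("name,email\nAlice Example,alice@example.com\n")

# In practice the key belongs in a key manager or password vault,
# never stored next to the encrypted file itself.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("customer_export.csv", "rb") as f:
    plaintext = f.read()

# Store only the encrypted copy on the laptop or USB drive.
with open("customer_export.csv.enc", "wb") as f:
    f.write(fernet.encrypt(plaintext))

# An authorized user holding the key can later recover the original data.
assert fernet.decrypt(fernet.encrypt(plaintext)) == plaintext
```

Full-disk encryption remains the baseline; file-level encryption like this simply adds a second layer for the most sensitive records.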
You should revisit who in your company has access to sensitive data. Employees should only have regular access to the data they need to complete their daily tasks. Limiting the amount of data each individual can access mitigates the damage one employee’s security lapse can cause.
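The principle of least privilege can be sketched in a few lines of Python. The roles and datasets below are hypothetical; a real system would enforce the same mapping in its database, identity provider, or application layer.

```python
# Minimal sketch of least-privilege access: each role is mapped only to the
# datasets its holders need for their daily tasks. Roles and datasets are made up.
ROLE_PERMISSIONS = {
    "support_agent": {"support_tickets"},
    "accountant": {"invoices"},
    "hr_manager": {"employee_records"},
}

def can_access(role: str, dataset: str) -> bool:
    """Return True only if the role is explicitly allowed to use the dataset."""
    return dataset in ROLE_PERMISSIONS.get(role, set())

print(can_access("support_agent", "support_tickets"))   # True
print(can_access("support_agent", "employee_records"))  # False: not needed daily
```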
Your company should also use a corporate virtual private network (VPN) to limit access to your sensitive data. The VPN will encrypt your employees’ connection to your servers, letting them safely and securely access your company’s network. The corporate VPN’s encrypted tunnel will help keep your data safe in transit. It will also prevent attackers that do not have your corporate VPN from accessing your servers.
As a reminder, using public WiFi without a VPN is unwise, particularly if your work deals with sensitive data. These networks can easily be monitored by others. Your employees should even use a trustworthy VPN if they are working from home, just to be safe.
By encrypting your data, limiting each employee’s access, and using a corporate VPN to control access to your company’s servers, you significantly decrease the likelihood of there being a massive data breach.
Human error is one of the main causes of data breaches. Cybersecurity is difficult enough when everyone is in an office on a network you control. Relying on your employees to immediately pick up and master all the new cybersecurity policies and tools you implement while working from home will not be effective.
Your data protection officer or the team in charge of your cybersecurity should plan to run training sessions on the new policy with the entire company. This team should then train your employees (in small groups) on the new security tools and processes they will use in their day-to-day work.
Your employees will still need help even after they are trained on how to use these new tools. Your cybersecurity team should always have someone on standby to respond to questions. If possible, they should also schedule short follow-up video calls with all your employees to evaluate whether everyone is following your new security policy.
By putting some of these suggestions into practice, you can relieve some of the stress of remote work. These are the data security steps that can help you avoid costly GDPR fines.
To boil it down to four steps, the most significant things that you, a small business owner, can do to stay GDPR compliant while your team is working from home are:

1. Create a cybersecurity policy, or update your existing one, and share it with your employees.
2. Encrypt your data, both in transit and at rest, including on the devices your employees use for work.
3. Control access to personal data by limiting each employee’s access to what they need and by routing connections through a corporate VPN.
4. Train your employees on your security policy and tools, and make sure someone is available to answer their questions.
How the GDPR could change in 2020
However, there are other developments, like Brexit, other countries introducing their own data protection laws, and rulings from the Court of Justice of the European Union, that could have an immediate effect on the GDPR this year. This blog post will give you a sneak peek at what the next year holds in store for the GDPR, what could change, and how it could impact your business.
While it is true that Facebook, Google, and WhatsApp have received GDPR fines, their number and size disappointed the GDPR’s advocates. The French data protection agency fined Google a record €50 million, but this amount is a rounding error compared to the company’s overall revenue. For the largest tech companies to truly take data protection seriously, experts think that the fines will need to be much higher. None other than Margrethe Vestager, the European Commissioner for Competition, has called for stronger enforcement of the GDPR and policies that promote competition in the tech industry.
Additionally, as of July 2019, some countries — namely Greece, Portugal, and Slovenia — still had not brought their national laws into accordance with the GDPR. Others are still hiring and training staff for these new regulatory bodies. This lag means that the GDPR has not been fully enforced across the EU. Because a country needs national laws in place before it can have a data protection agency, this delay affects the number of people in the EU who can file a complaint or even just understand their rights. That should end in 2020 as these last countries implement national legislation, incentivizing Greek, Portuguese, and Slovenian companies to ensure they are fully compliant.
This could be a make-or-break year for the GDPR as it attempts to establish comprehensive and strong data protections.
The GDPR has inspired many imitators, from Brazil’s LGPD to the CCPA in California. While many of these laws agree on the broad terms of data protection, each implements these protections in its own way. And these two new regulations are just the start: Canada and Australia are both considering new data protection regulations, and India’s legislature will vote on its Personal Data Protection Bill. In the US, several states, including Nevada, New York, Texas, and Washington, are considering following California’s lead and passing their own data protection laws.
Brexit has dominated European news for the past several years, and UK and EU regulators will need to create an alternate data protection regulatory framework for the future. However, this will have relatively little impact on 2020, at least as far as data protection is concerned. Although the UK formally exited the EU on Jan. 31, 2020, it will still adhere to all EU standards and regulations throughout this year. That means the GDPR will still be the law of the land in the UK.
The oft-delayed counterpart to the GDPR, the ePrivacy Regulation, seems likely to fall even further behind schedule. In fact, the Permanent Representatives Committee of the Council of the European Union voted down its proposal in Nov. 2019. This makes it likely that a revised proposal will need to be put forward this year, meaning actual implementation is likely still at least a year off. The ePrivacy Regulation was meant to be implemented in 2017 to replace the ePrivacy Directive, the current law that governs how cookies are regulated throughout the EU.
In 2015, Max Schrems, an Austrian privacy advocate, filed a complaint with the Irish Data Protection Commissioner challenging Facebook Ireland’s reliance on standard contractual clauses as a legal basis for transferring personal data to Facebook Inc. in the U.S. Essentially, Schrems was arguing that such standard contractual clauses do not provide an adequate level of protection for EU data subjects. This led to a contentious ruling, which was then contested, leading to the Schrems II case, which is currently nearing a conclusion.
At the heart of the matter is Art. 46, which states that a data controller (a company that determines how and why data is processed) may transfer data to a third country or an international organisation “only if the controller or processor has provided appropriate safeguards, and on condition that enforceable data subject rights and effective legal remedies for data subjects are available.”
On Dec. 19, 2019, the Advocate General to the Court of Justice of the European Union (CJEU) released an opinion that upheld that standard contractual clauses could be used to transfer data internationally. However, in the fine print, the AG also suggests that the use of such clauses should be reviewed on a case-by-case basis. It also raises serious questions about the data protections in the US, throwing data transfers to the United States into doubt.
While the AG’s opinion is non-binding, it is often a preview of the CJEU’s ruling. You can expect the CJEU’s final decision — and for another fight over data transfers to begin — later this year.
Check in on this blog to follow up on these stories and other GDPR-related developments. If this is your first time visiting the blog, we have created a GDPR checklist and an overview of the regulation to help business owners with compliance. And if you run a business in the US, we have a checklist for you as well.
Italy fines Eni Gas e Luce €11.5 million for multiple GDPR violations
The first fine (in Italian), for €8.5 million, was issued because EGL was found to be illegally processing personal data by making marketing calls to individuals who had opted out of receiving such promotional calls. The Italian SA also determined the company did not follow the specific procedures requiring it to check the public opt-out register. These actions are clear violations of Article 6 and Article 13 of the GDPR.
In addition to the fine, the Italian SA is forcing EGL to put in place processes that will prevent it from making similar calls in the future. This includes forcing EGL to verify that it has a customer’s consent before it contacts them as part of any promotional drive. EGL is also banned from acquiring data from any third parties, namely list providers, that cannot prove that the customers consented to having their data shared.
The second fine (in Italian), which totaled €3 million, sanctioned EGL’s conclusion of unsolicited contracts (essentially, signing individuals up to EGL contracts without informing them that EGL was now their energy company) and its use of inaccurate and, at times, forged information on those contracts. This is a clear violation of multiple sections of the GDPR, including several sections of Article 5 and Article 7.
EGL entered into contracts with over 7,000 Italians without their knowledge. In many cases, individuals did not know that EGL was their power supplier until they received their first bill from the company. EGL worked through external agencies that allowed it to acquire new or expiring electricity and gas contracts without ever having to contact the end customer. The Italian SA has ordered the company to take steps to correct this violation of the fairness principle and to introduce checks to detect such procedural anomalies in the future.
That EGL is at fault is fairly cut and dried: even if these GDPR violations do not focus on online data, there is little to contest when a company uses inaccurate data to conclude contracts without an individual’s consent. The case also suggests that companies may not have considered how the GDPR impacts how they use and protect data offline.
These fines underline the importance every company should place on evaluating how it treats all personal data, not just online data. (It also suggests every company should review which third parties it is working with to get customer data and leads.) We have created a GDPR checklist and an overview of the regulation to help you get started. And if you’re a business in the US, we have a GDPR checklist for you as well.
What is the LGPD? Brazil’s version of the GDPR
Brazil’s Lei Geral de Proteção de Dados (or LGPD) brings sorely needed clarification to the Brazilian legal framework. The LGPD attempts to unify the over 40 different statutes that currently govern personal data, both online and offline, by replacing certain regulations and supplementing others. This unification of previously disparate and oftentimes contradictory regulations is only one similarity it shares with the EU’s General Data Protection Regulation, a document from which it clearly takes inspiration.
Another similarity is that the LGPD applies to any business or organization that processes the personal data of people in Brazil, regardless of where that business or organization itself might be located. So, if your company has any customers or clients in Brazil, you should begin preparing for LGPD compliance. Fortunately, you still have time before the law takes effect. And if you are already GDPR compliant, then you have already done the bulk of the work necessary to comply with the LGPD.
In addition to its extraterritorial application, the LGPD and the GDPR agree on several basics when it comes to data protection.
While the LGPD does not have a single definition for personal data, if you read the entirety of the text, you can see echoes of the GDPR’s definition of personal data. The LGPD states in various places that personal data can mean any data that, by itself or combined with other data, could identify a natural person or subject them to a specific treatment. While this definition will likely be clarified as Brazil nears implementation of the LGPD, as currently stated, the LGPD takes a broad view of what data qualifies as personal data, one even more expansive than the GDPR’s.
Article 18 is another section of the LGPD that will look familiar to businesses that have dealt with GDPR compliance. It explains the nine fundamental rights that data subjects have, which include:

1. The right to confirmation of the existence of the processing
2. The right to access the data
3. The right to correct incomplete, inaccurate, or out-of-date data
4. The right to anonymize, block, or delete unnecessary or excessive data or data that is not being processed in compliance with the LGPD
5. The right to the portability of data to another service or product provider
6. The right to delete personal data processed with the consent of the data subject
7. The right to information about public and private entities with which the controller has shared data
8. The right to information about the possibility of denying consent and the consequences of such denial
9. The right to revoke consent
While the GDPR is known for granting its data subjects eight fundamental rights, they are essentially the same rights the LGPD mentions. It seems the LGPD split “The right to information about public and private entities with which the controller has shared data” out of the GDPR’s more general “Right to be informed” to make it more explicit.
Despite their similar goals and the apparent influence the GDPR had on Brazilian lawmakers, there are some key differences to note between the two pieces of legislation.
Both laws call for businesses and organizations to appoint a Data Protection Officer (DPO). However, while the GDPR outlines when a DPO is required, Article 41 of the LGPD simply says, “The controller shall appoint an officer to be in charge of the processing of data,” which suggests that any organization that processes the data of people in Brazil will need to appoint a DPO. This is another area that will likely receive further clarification, but as written, it is one of the few areas where the LGPD is more stringent than the GDPR.
Possibly the most significant difference between the LGPD and the GDPR concerns what qualifies as a legal basis for processing data. The GDPR has six lawful bases for processing, and a data controller must choose one of them as a justification for using a data subject’s information. However, in Article 7, the LGPD lists 10. They are:

1. With the consent of the data subject
2. To comply with a legal or regulatory obligation of the controller
3. To execute public policies (by the public administration)
4. To carry out studies by research entities that, whenever possible, anonymize the personal data
5. To execute a contract or preliminary procedures related to a contract to which the data subject is a party
6. To exercise rights in judicial, administrative, or arbitration procedures
7. To protect the life or physical safety of the data subject or a third party
8. To protect health, in a procedure carried out by health professionals or health entities
9. To fulfill the legitimate interests of the controller or a third party
10. To protect credit
Having the protection of credit as a legal basis for the processing of data is indeed a substantial departure from the GDPR.
While both the GDPR and the LGPD require organizations to report data breaches to the local data protection authority, the level of specificity varies widely between the two laws. The GDPR is explicit: an organization must report a data breach within 72 hours of its discovery (although different organizations are already testing that deadline).
The LGPD does not give any firm deadline: Article 48 merely states that “the controller must communicate to the national authority and to the data subject the occurrence of a security incident that may create risk or relevant damage to the data subjects… in a reasonable time period, as defined by the national authority.” Since the national data protection agency has not, as yet, been established, there is no guidance for what constitutes a “reasonable time period.”
A regulation is only as strong as its teeth. That is why the maximum GDPR fines are substantial, requiring organizations that commit grave GDPR violations to pay up to €20 million or 4% of annual global revenue, whichever is higher.
The fines under the LGPD are much less severe. Article 52 states that the maximum fine for a violation is “2% of a private legal entity’s, group’s, or conglomerate’s revenue in Brazil, for the prior fiscal year, excluding taxes, up to a total maximum of 50 million reals” (this works out to roughly €11 million). The LGPD fines are in line with GDPR’s fines for less egregious infractions, but €11 million is not going to concern the world’s largest data processors.
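A quick back-of-the-envelope comparison shows how far apart the two caps can be for a large company. The revenue figures and the exchange rate below are illustrative assumptions, not figures taken from either law.

```python
# Rough comparison of maximum fines under the LGPD and the GDPR.
# The revenue figures and exchange rate are hypothetical.
BRL_TO_EUR = 0.22                      # assumed exchange rate
brazil_revenue_brl = 5_000_000_000     # hypothetical revenue earned in Brazil
global_revenue_eur = 10_000_000_000    # hypothetical worldwide revenue

# LGPD: 2% of revenue in Brazil, capped at 50 million reals.
lgpd_cap_eur = min(0.02 * brazil_revenue_brl, 50_000_000) * BRL_TO_EUR

# GDPR: the higher of EUR 20 million or 4% of worldwide annual revenue.
gdpr_cap_eur = max(20_000_000, 0.04 * global_revenue_eur)

print(f"LGPD maximum: ~EUR {lgpd_cap_eur:,.0f}")   # ~EUR 11,000,000
print(f"GDPR maximum: ~EUR {gdpr_cap_eur:,.0f}")   # EUR 400,000,000 in this example
```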
This is not an exhaustive overview of the LGPD, but it should reassure business owners that, in most respects, if you have achieved GDPR compliance, you are already well on your way to complying with the LGPD. Data protection laws are beginning to be considered all around the world, from India to the USA. GDPR.eu will be here to help you keep up with the latest developments and attain compliance.
Do consumers know their GDPR data privacy rights?
In our 2019 GDPR Small Business Survey, we asked European small business leaders how well they understood their obligations under the GDPR. The results were mixed. While many businesses invested heavily in complying with the GDPR, others seemed not to care. Around half reported that they were not GDPR compliant on two major aspects of the law.
The survey was a comprehensive look at whether organizations understood how to comply with the GDPR. And it made us wonder about the other side of the GDPR, the people it is intended to benefit: consumers.
The objective of the GDPR was to give individuals more control over their personal data, and it goes about doing this by requiring data protection (ensuring businesses keep data secure) and data privacy (ensuring people can exercise their right to privacy). If companies work hard to be GDPR compliant for the benefit of their current and potential customers, do the consumers know enough about the GDPR to recognize those compliance efforts? How well do they know their GDPR data privacy rights?
The GDPR covers consumers’ data privacy rights in Chapter 3. We’ve summarized all of the GDPR data privacy requirements for businesses in a previous article, but they generally deal with these main areas:

- The right to be informed about how personal data is collected and used
- The right of access to the personal data an organization holds
- The right to rectification of inaccurate or incomplete data
- The right to erasure (the “right to be forgotten”)
- The right to restrict processing
- The right to data portability
- The right to object to processing
- Rights related to automated decision making and profiling
We decided to conduct a non-scientific poll on Twitter. The polling is by no means rigorous, and the sample size is quite small. So take this with a grain of salt. However, the individuals that responded performed no better than the companies we polled. Some people seem to clearly understand their GDPR data privacy rights, while others… not so much. If anything, it seems likely the poll results are skewed by the fact that our followers (and those of our parent company, ProtonMail, which retweeted the poll) tend to be interested in privacy. Check out the results below.
The responses to question 1 reflect a misunderstanding of the right to erasure. Companies are not always required to delete personal data just because someone makes a request. There are several exemptions, such as when data is used to exercise the right to free expression.
Most people got this one correct. The various data privacy rights are listed here.
Most users were also correct here. Information is only considered as protected personal data if it can be used to identify someone.
The crowd was wrong here. Article 8 defines the age at which a person can legally consent to data processing at 16. Countries can set more lenient standards, but it can’t be below 13 years old.
Most people got this one right. All of these are good ways to hold organizations accountable for GDPR compliance.
So it appears that even one year after the GDPR came into force, many consumers and business leaders still do not understand the law. Indeed, the lack of understanding and awareness among consumers may be part of the reason more small businesses don’t prioritize GDPR compliance. We believe data privacy and security are important values and crucial to building a better Internet.
If you’re a business concerned about GDPR compliance, our website is filled with resources you can use. Start with our overview of the GDPR and then check out our GDPR checklist.
Millions of small businesses aren’t GDPR compliant, our survey finds
One year after the EU’s General Data Protection Regulation entered into force, we were curious to learn more about how the law would affect the 23 million small businesses in Europe and whether they were struggling to comply. Owners, managers, and other GDPR-compliance supervisors answered our questions from all over Spain, the United Kingdom, France, and Ireland. What we found surprised us.
Read the 2019 GDPR Small Business Survey
You can read the full report for more details about our findings and our methodology. But here are some of the key takeaways and results that surprised us most:
You can read more excerpts from the survey, graphs, and data by reading the full 2019 GDPR Small Business Survey.
What the first Italian GDPR fine reveals about data security liabilities for processors
The Italian Data Protection Authority, known as the “Garante,” issued the fine against Rousseau on April 4 for violating Article 32 of the GDPR. This is not the first time Rousseau has run afoul of the Garante. The Italian DPA presented the platform with a series of recommendations in December 2017 (in Italian) to address its vulnerabilities, and in 2018, it fined the platform €32,000 over concerns that it illegally shared member data with third parties. While authorities admit the security surrounding Rousseau’s data processing has improved, it is still not compliant with GDPR standards, which led to this most recent fine.
Rousseau’s two remaining violations were a failure to adequately anonymize e-voting data and a failure to regulate access to the personal data on the platform. The Garante found that a small group of individuals from the Rousseau Association and the 5 Star Movement can access the platform and its data (which includes sensitive personal data, such as political preferences) without leaving a trace. In paragraph 4.2, authorities wrote that there was:
sharing of authentication credentials by several employees with high privileges for the management of the Rousseau platform and [a] failure to define and configure the different authorization profiles in order to limit access to only the data necessary in the various fields of operation, which in the previous legal system were qualified as minimum security measures for data controllers… It is, therefore, evident that the failure to adopt such measures and, conversely, the sharing of the authentication credentials among subjects entitled to manage the platform represent a violation
If you would like to read the complete ruling by the Garante, click here. (In Italian)
Article 32 discusses the minimum standards of security that data controllers and data processors must meet. It requires that organizations use both technical protections and administrative processes to “ensure a level of security appropriate to the risk.” It mentions four specific measures that companies should implement:

- The pseudonymisation and encryption of personal data
- The ability to ensure the ongoing confidentiality, integrity, availability, and resilience of processing systems and services
- The ability to restore the availability of and access to personal data in a timely manner in the event of a physical or technical incident
- A process for regularly testing, assessing, and evaluating the effectiveness of the technical and organisational measures for ensuring the security of the processing
Rousseau actually did a respectable job meeting the Garante’s recommendations regarding those four factors. However, the fact that they allowed their staff and 5 Star Movement party members to share credentials made it impossible for Rousseau to comply with Section 4 of Article 32, which requires that “The controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller.”
Businesses can avoid these sorts of GDPR fines by preventing employees from sharing credentials. Adding additional encryption to sensitive data is another measure, and it could also assist with the anonymization of e-voting data. Rousseau, for its part, states that it plans on using blockchain technology to address the issues the Garante has pointed out.
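One way to picture the control the Garante expected is an access layer in which every read of personal data is tied to a named account and logged. This is only a sketch under assumed names and datasets, not a description of Rousseau’s actual platform.

```python
# Sketch: individual authorization profiles plus an audit trail, so that no one
# can access personal data "without leaving a trace". All names are invented.
from datetime import datetime, timezone

AUTHORIZATION_PROFILES = {
    "alice.admin": {"voting_results"},      # named accounts with narrow scopes,
    "bruno.support": {"member_directory"},  # never one shared high-privilege login
}

audit_log = []

def read_dataset(user: str, dataset: str) -> bool:
    """Allow access only for explicitly authorized users, and log every attempt."""
    allowed = dataset in AUTHORIZATION_PROFILES.get(user, set())
    audit_log.append({
        "user": user,
        "dataset": dataset,
        "allowed": allowed,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return allowed

read_dataset("alice.admin", "voting_results")     # permitted, and recorded
read_dataset("bruno.support", "voting_results")   # denied, and recorded
print(audit_log)
```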
The GDPR has created higher security expectations for the personal data that organizations process. It requires organizations to use advanced encryption, limit their employees’ access to only the personal data they need to do their work, and assess their overall security on a regular basis. The penalty against Rousseau also shows that data processors may be particularly liable for GDPR compliance in terms of data security. In this case, the 5 Star Movement was the data controller and Rousseau was the data processor, but the party avoided any sanction. To avoid GDPR fines, therefore, data processors must also be careful to meet the expectations of Article 32.
The GDPR makes many demands of businesses, but by being proactive and putting in the effort, you can achieve GDPR compliance. Our GDPR checklist and our overview of the law are great places to start. If you’re a business in the US, we have a checklist for you as well.
Cookies, the GDPR, and the ePrivacy Directive
Cookies are small text files that websites place on your device as you are browsing. They are processed and stored by your web browser. In and of themselves, cookies are harmless and serve crucial functions for websites. Cookies can also generally be easily viewed and deleted.
However, cookies can store a wealth of data, enough to potentially identify you without your consent. Cookies are the primary tool that advertisers use to track your online activity so that they can target you with highly specific ads. Given the amount of data that cookies can contain, they can be considered personal data in certain circumstances and, therefore, subject to the GDPR.
Before analyzing what the GDPR and the ePrivacy Directive have to say about cookies, it is essential to have a basic understanding of the different types of cookies.
In general, there are three different ways to classify cookies: what purpose they serve, how long they endure, and their provenance.
Duration

- Session cookies – These are temporary cookies that expire once you close your browser (or once your session ends).
- Persistent cookies – These cookies remain on your device for a set period of time or until you delete them.
Provenance

- First-party cookies – These are put on your device directly by the website you are visiting.
- Third-party cookies – These are placed on your device not by the website you are visiting, but by a third party, such as an advertiser or an analytics system.
Purpose

- Strictly necessary cookies – These cookies are essential for browsing the website and using its features, such as accessing secure areas of the site or keeping items in a shopping cart.
- Preferences cookies – Also known as functionality cookies, these remember the choices you have made in the past, such as your preferred language or your login details.
- Statistics cookies – Also known as performance cookies, these collect information about how you use a website, such as which pages you visited and which links you clicked on. This information is aggregated and cannot identify you on its own.
- Marketing cookies – These cookies track your online activity to help advertisers deliver more relevant advertising or to limit how many times you see an ad. They can share that information with other organizations or advertisers.
These are the main ways of classifying cookies, although there are cookies that will not fit neatly into these categories or may qualify for multiple categories. When people complain about the privacy risks presented by cookies, they are generally speaking about third-party, persistent, marketing cookies. These cookies can contain significant amounts of information about your online activity, preferences, and location. The chain of responsibility (who can access a cookie’s data) for a third-party cookie can get complicated as well, only heightening their potential for abuse. Perhaps because of this, the use of third-party cookies has been in decline since the passage of the GDPR.
The General Data Protection Regulation (GDPR) is the most comprehensive data protection legislation that has been passed by any governing body to this point. However, throughout its 88 pages, it only mentions cookies directly once, in Recital 30.
Natural persons may be associated with online identifiers provided by their devices, applications, tools and protocols, such as internet protocol addresses, cookie identifiers or other identifiers such as radio frequency identification tags. This may leave traces which, in particular when combined with unique identifiers and other information received by the servers, may be used to create profiles of the natural persons and identify them.
What these two lines are stating is that cookies, insofar as they are used to identify users, qualify as personal data and are therefore subject to the GDPR. Companies do have a right to process their users’ data as long as they receive consent or have a legitimate interest.
Passed in 2002 and amended in 2009, the ePrivacy Directive (EPD) has become known as the “cookie law” since its most notable effect was the proliferation of cookie consent pop-ups after it was passed. It supplements (and in some cases, overrides) the GDPR, addressing crucial aspects about the confidentiality of electronic communications and the tracking of Internet users more broadly.
To comply with the regulations governing cookies under the GDPR and the ePrivacy Directive you must:

- Receive users’ consent before you use any cookies except strictly necessary cookies.
- Provide accurate and specific information about the data each cookie tracks and its purpose in plain language before consent is received.
- Document and store the consent received from users.
- Allow users to access your service even if they refuse to allow the use of certain cookies.
- Make it as easy for users to withdraw their consent as it was for them to give it in the first place.
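As a very simplified sketch of the first two points above, a site can hold back every non-essential cookie until the visitor has actively opted in. The framework (Flask) and the cookie names here are illustrative assumptions, not a recommended implementation.

```python
# Sketch: only set non-essential cookies after consent has been recorded.
# Flask is used purely for illustration; cookie names are made up.
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("Hello")
    # A session cookie the site needs to function counts as strictly necessary.
    resp.set_cookie("session_id", "abc123", httponly=True)

    # Everything else waits until the visitor has opted in.
    if request.cookies.get("cookie_consent") == "accepted":
        resp.set_cookie("analytics_id", "xyz789", max_age=60 * 60 * 24 * 30)
    return resp

@app.route("/accept-cookies", methods=["POST"])
def accept_cookies():
    # Record the choice itself so it can be demonstrated (and withdrawn) later.
    resp = make_response("Preferences saved")
    resp.set_cookie("cookie_consent", "accepted", max_age=60 * 60 * 24 * 180)
    return resp
```

In practice, the consent record would also be stored server-side, with a timestamp and the wording the user saw, so it can be documented, and withdrawing consent would be just as easy as giving it.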
The EPD’s eventual replacement, the ePrivacy Regulation (EPR), will build upon the EPD and expand its definitions. (In the EU, a directive must be incorporated into national law by EU countries, while a regulation becomes legally binding throughout the EU on the date it comes into effect.)
The EPR was supposed to be passed in 2018 at the same time as the GDPR came into force. The EU obviously missed that goal, but there are drafts of the document online, and it is scheduled to be finalized sometime this year even though there is still no date for when it will be implemented. The EPR promises to address browser fingerprinting in ways that are similar to cookies, create more robust protections for metadata, and take into account new methods of communication, like WhatsApp.
The rules regulating cookies are still being set, and cookies themselves are continually evolving, which means maintaining a current cookie policy will be a continuous job. However, properly informing your users about the cookies your site is using and, when necessary, receiving their consent will keep your users happy and keep you GDPR-compliant.
Data anonymization and GDPR compliance: the case of Taxa 4×35
Taxa 4×35 is a Danish service that allows its users to hail cabs in Copenhagen with an app, similar to Uber. When a user hails a taxi, the Taxa system collects an assortment of data, including the customer’s name, telephone number, the date of the trip, the start and end time of the trip, the number of kilometers driven, the payment, the GPS coordinates of the beginning and end of the trip, and the written addresses and other coordinates. Taxa 4×35 then links this data to the user’s tax information to ensure that the proper amount of taxes are collected.
In October of 2018, the Danish data protection agency, Datatilsynet, found that Taxa had kept the data from nearly 9 million taxi rides for five years, well after it was no longer needed. This hoarding of records goes against Article 5 of the EU’s General Data Protection Regulation, which states that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed,” and “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed.”
Taxa 4×35’s management thought they were exempt from these two sections of Article 5, which represent the principles of data minimization and storage limitation, because they were anonymizing the data by deleting the names associated with the trip records from their database after two years. (The remaining data was then deleted after five years.) Datatilsynet found this attempt at data anonymization to be inadequate, pointing out that even without the user’s name, Taxa 4×35 still had enough personal information to identify an individual. The agency concluded that “Information about the customer’s taxi trips (including collection and delivery addresses) can therefore still be attributed to a natural person via the telephone number, which is only deleted after five years.”
You can read the full Datatilsynet statement on Taxa 4×35 here. (In Danish)
The GDPR draws critical distinctions between personal data, pseudonymized data, and anonymized data. Taxa 4×35’s reasoning that anonymized data can be retained much longer than personal data was correct. According to Recital 26, “The principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.”
However, Taxa 4×35 failed to meet the high standard that the GDPR sets for data anonymization. Earlier in Recital 26, it states that not only must an organization consider whether it can identify an individual using the data it has within its database, but it must also consider:
all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly. To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.
Since it is relatively easy to look up a phone number and match it to an individual, the Taxa dataset is not anonymous. Because the records are not anonymous, they are still subject to the full protections listed in the GDPR, which means that Taxa 4×35 should have deleted the data after two years and had documentation to prove it.
True data anonymization
Effective data anonymization is made up of two parts:

- Removing or masking the elements of a dataset that directly identify a person, such as names and contact details.
- Ensuring that the data that remains cannot be linked back to an individual by any means reasonably likely to be used.
In WP 216, the Article 29 Working Party examined several different methods of data anonymization and clarified what measures data processors and controllers have to take. They specifically say that “removing directly identifying elements in itself is not enough to ensure that identification of the data subject is no longer possible. It will often be necessary to take additional measures to prevent identification, once again depending on the context and purposes of the processing for which the anonymised data are intended.”
In the Taxa 4×35 example, their justification for maintaining their database for five years was business development. In this case, they could have made accurate models of when and where they needed drivers and anonymized their data by deleting all other data besides the date of the trip, the start and end time of the trip, the number of kilometers driven, and the GPS coordinates of the beginning and end of the trip. Then, they could have grouped this data by day or location rather than by account. This would have allowed Taxa to identify geographic hot spots and rush hours for its drivers, but would not allow it to identify individual data subjects.
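A rough sketch of that kind of aggregation is shown below, using pandas. The records, column names, and grouping choices are invented; the point is simply that the direct identifiers are dropped entirely and the remaining trip data is summarized so it no longer maps back to any one passenger.

```python
# Sketch: aggregating trip records so they stop referring to identifiable people.
# All data and column names are invented for illustration.
import pandas as pd

trips = pd.DataFrame([
    {"name": "A. Jensen", "phone": "+45 11 11 11 11", "start": "2018-03-01 08:10",
     "pickup_zone": "Norrebro", "km": 4.2},
    {"name": "B. Hansen", "phone": "+45 22 22 22 22", "start": "2018-03-01 08:40",
     "pickup_zone": "Norrebro", "km": 6.1},
    {"name": "A. Jensen", "phone": "+45 11 11 11 11", "start": "2018-03-02 17:55",
     "pickup_zone": "Osterbro", "km": 3.0},
])

# Drop the direct identifiers entirely, not just the name column.
trips = trips.drop(columns=["name", "phone"])

# Keep only coarse, aggregated figures: demand per zone and hour of day.
trips["hour"] = pd.to_datetime(trips["start"]).dt.hour
demand = trips.groupby(["pickup_zone", "hour"]).agg(
    rides=("km", "size"),
    avg_km=("km", "mean"),
).reset_index()

print(demand)  # zone/hour totals with no route back to a single passenger
```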
The GDPR aims to give individuals control over their personal data, not to prevent companies and organizations from reaping the benefits that analyzing big data can offer. By fully understanding the GDPR requirements regarding the anonymization of data, organizations can continue to process data and reduce their exposure to GDPR fines. Taxa 4×35 made a half-hearted attempt to anonymize its data, and it was caught.
The GDPR has many requirements for how personal data should be handled. It can be daunting, but we made this website to help businesses with the basics of GDPR compliance. See our GDPR checklist and overview of the law to get started.
Data sharing and GDPR compliance: Bounty UK shows what not to do
In the United Kingdom, Bounty is a well-known but somewhat controversial provider of pregnancy and parenting packages, advice, apps, and maternity ward photos. In the past, they’ve drawn criticism about privacy concerns because of their practice of sending representatives into new mothers’ rooms to sell picture packages. Now, Bounty is in even bigger trouble, this time for data privacy reasons.
This month the UK’s top data protection agency, the ICO, announced the findings of an investigation into Bounty’s data sharing practices. Until April 30 of last year, just before the GDPR entered into force, the company shared 34.4 million user records with outside firms like Equifax (of data breach infamy) without informing the data subjects. The data even included the birth date and sex of newborns. The ICO fined the company £400,000.
Because Bounty ended the practice just before the start date of the GDPR, the practices violated the Data Protection Act 1998, not the GDPR. This fact capped the possible fine at £500,000. The GDPR fine for a similar violation could have reached £17 million (€20 million).
The director of the ICO’s investigations issued a scathing reproach of the company:
The number of personal records and people affected in this case is unprecedented in the history of the ICO’s investigations into data broking industry and organisations linked to this.
Bounty were not open or transparent to the millions of people that their personal data may be passed on to such large number of organisations. Any consent given by these people was clearly not informed. Bounty’s actions appear to have been motivated by financial gain, given that data sharing was an integral part of their business model at the time.
Such careless data sharing is likely to have caused distress to many people, since they did not know that their personal information was being shared multiple times with so many organisations, including information about their pregnancy status and their children.
There’s nothing inherently wrong with sharing people’s personal data with third parties. But you have to go about it the right way. Below are the relevant GDPR requirements if you want to share your users’ personal data outside your organization.
Be clear about your intentions
People have a right to know how their personal data will be used. GDPR Article 12 explains these requirements. These communications must be in a “concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child.”
You must communicate this information at the moment you collect the data. Article 13 lists the information that must be provided and when.
You must have a lawful basis
GDPR Article 6 and Article 7 deal with the lawful bases for processing personal data. Most likely, in the case of selling user data to third parties, the lawful basis will be consent, which involves extra caution to ensure consent is properly sought and freely given. We’ve previously explained the GDPR consent requirements in detail.
It may seem obvious, but you must gain explicit consent for each of the processing activities you intend to carry out with people’s data. In the Bounty case, the company shared personal data with 39 organizations. Bounty members were unaware that their data would be shared with so many third parties. This infringed upon their ability to exercise their data privacy rights because they didn’t know where their data was being stored or how it was being used.
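One common way to keep this straight is to record consent separately for each processing purpose, together with enough context to demonstrate it later. The fields and purposes below are an illustrative sketch, not a prescribed format.

```python
# Sketch: recording consent per processing purpose so it can be demonstrated
# (and withdrawn) later. Field names and purposes are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str              # e.g. "newsletter" or "sharing_with_partner_x"
    granted: bool
    wording_version: str      # which consent text the person actually saw
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

consents = [
    ConsentRecord("user-42", "newsletter", True, "v1"),
    ConsentRecord("user-42", "sharing_with_partner_x", False, "v1"),
]

def has_consent(user_id: str, purpose: str) -> bool:
    """Only act on a purpose if the most recent record grants it."""
    matches = [c for c in consents if c.user_id == user_id and c.purpose == purpose]
    return bool(matches) and matches[-1].granted

print(has_consent("user-42", "sharing_with_partner_x"))  # False: do not share
```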
International data transfers
If you intend to share information with organizations in other countries, this triggers extra responsibilities covered in Chapter 5 of the GDPR. Specifically: “A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection.”
There’s no question the GDPR makes it more difficult to profit from other people’s personal data. But that’s the point of the law: it’s other people’s data; if you want to use it, you need to have a good reason, or just ask. Bounty’s data sharing practices clearly crossed the line, and they knew it. That’s why they ended the practice just before the GDPR drastically increased their exposure to fines.
That said, GDPR compliance doesn’t have to be difficult. We built this website to make it easier for businesses to comply. Our GDPR checklist and our overview of the law are great places to start. If you’re a business in the US, we have a checklist for you as well.