Major Issues

#1

Consent

Since 1995, EU law has required that consent to a processing operation be given "unambiguously". This means that there has to be some affirmative action by the person. However, some have interpreted the term to include e.g. "using a webpage", not turning off "cookies" in your browser, or using services that are subject to (lengthy and unclear) privacy policies.

The European Commission wanted to clarify once and for all that only an active "yes" constitutes consent ("explicit consent"). This does not mean that data cannot be used on another basis (e.g. a contract, a law or "legitimate interests"), but if a company relies on consent, it must be a clear "opt-in" by the user.

#2

Minimizing / "Not Excessive"

See Article 5(c)

The principle of "data minimization" means that only data that is really necessary should be collected, stored and processed. It is currently known in many member states, but was removed by the Council in the current proposal. Instead companies should be allowed to collect and store as much information as they wish, as long at the storage is "not excessive" – whatever this may mean in practice.

Example: Companies may collect and store much more data than they need and e.g. ask for additional details when you visit a web shop. Data would no longer have to be deleted as soon as it is no longer necessary, leading to unnecessarily large "digital footprints" that can be breached or misused.

#3

Legitimate Interests

See especially Article 6(1)(f) and Recitals 38-40

A controller can process your data on many grounds (e.g. consent, a legal duty, a contract with you). If none of these grounds are applicable, the law allows controllers to also use your data when they pursue a so-called "legitimate interest". The key question is: What is a "legitimate interest" that overrides the user’s right to privacy in these cases?

Recitals 38-39 define "legitimate interests" so broadly that they cover basically any business activity. This is done by naming e.g. "direct marketing", "credit information services" and "transmitting personal data for public security purposes, which are not required by law" as examples of "legitimate interests".

Examples: Sale of financial data (from banks, PayPal or a credit card company) to "advertisement" or "credit scoring" agencies, without the consent of the data subject; use of transmission data by an internet provider for advertisement, credit ranking and (without any legal basis) monitoring users for illegal behavior.

What used to be an "exception" in national laws (e.g. the use of data for "direct marketing"), subject to restrictions in many member states, is now defined as the "typical case" of a "legitimate interest". If "sending people advertisements" is a "legitimate interest" that overrides the fundamental right to data protection, it is hard to think of any other business activity that would not be a "legitimate interest".

#4

Purpose Based Processing / Further Processing

See especially Article 5(b), Article 6(3a) & (4), Recital 40

EU law does not regulate data sharing between companies as such. The main European legal limitation on data flows is the "specific purpose" of each processing operation. EU law focuses on "purposes" because sharing information is often necessary and useful, while unexpected use (even within a company) is typically problematic.

Example: It is reasonable that an online store forwards your data to a credit card company and the postal service in order to process payments and deliver the product. On the other hand, the store should not e.g. sell your purchase history to an unrelated party for another purpose (e.g. advertisement or credit scoring).

Many other principles of EU data protection law relate to the purpose (e.g. we consent to the use of data for a specific purpose, and data must be deleted when it is no longer necessary for that purpose). This makes "purpose based processing" the backbone of European data protection law. The principle is also recognized as a "fundamental right" in Article 8(2) of the Charter of Fundamental Rights and features prominently in the current law.

The newly introduced concepts of "compatible purposes" and "further processing" circumvent this crucial protection and in essence abolish "purpose based processing" in Europe – allowing almost unlimited sharing, linking and trading of personal data.

Examples: Sale of your health data to the pharmaceutical industry to generate e.g. statistics on the prescription of certain drugs by each hospital. Sale of financial data (e.g. from banks, PayPal, credit card companies) to "advertisement" or "credit scoring" agencies, without the consent of the data subject. Your Facebook, Gmail or iCloud data could be sold to "data brokers".

Council documents mainly name "economic interests" as the reason to change the law and abolish "purpose based processing" in Europe. Lobby papers show that data brokers want to benefit from unlimited sharing, cross-referencing and analytics of data, independent of the original source. It is questionable whether this approach is compliant with Article 8 of the Charter of Fundamental Rights.

#5

Profiling

See especially Article 20

We are currently seeing fascinating developments in the field of data analytics (e.g. "big data"). Such systems can also be used against the interests of users and consumers. The law has contained a section limiting "automated decisions" since 1995, but given these developments the provision was updated.

The Council has now limited the scope of the provision by replacing the word "measure" with "decision" – one that "produces legal effects" concerning a person or "significantly affects" a person. In combination with the exceptions in Article 20, the overall wording is so narrow that the law does not cope with the realistic threats of "big data" analytics.

Examples: An airline may engage in "price discrimination" and give only specific people a discount on a certain day (e.g. because the airline knows that they would otherwise not fly), while everybody else has to pay a much higher price than before (e.g. because the airline knows that they need to fly on certain days).

An advertisement company may perform deep analytics of all your data and e.g. calculate your income, interests, location, political views, sexual preferences and the like, none of which constitutes a "decision" in the sense the Council proposal requires.
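To illustrate the gap, here is a minimal Python sketch of a pricing routine driven purely by an inferred profile. All field names, thresholds and multipliers are invented for illustration; it is not any real airline's system. At no point is a formal "decision" with "legal effects" taken, yet each person sees a different price:

```python
# Hypothetical sketch of profile-driven price discrimination.
# All fields, thresholds and multipliers are invented for illustration.

from dataclasses import dataclass

@dataclass
class InferredProfile:
    user_id: str
    est_income: int             # estimated from purchase/browsing history
    must_fly_fixed_date: bool   # inferred e.g. from search patterns

BASE_FARE = 200.0

def personalised_fare(p: InferredProfile) -> float:
    """Return a per-user fare: higher for captive customers,
    discounted for users who would otherwise not buy at all."""
    if p.must_fly_fixed_date:
        return BASE_FARE * 1.4   # the profile suggests little choice
    if p.est_income < 20_000:
        return BASE_FARE * 0.8   # discount to win a price-sensitive user
    return BASE_FARE

for profile in (
    InferredProfile("user_a", est_income=60_000, must_fly_fixed_date=True),
    InferredProfile("user_b", est_income=15_000, must_fly_fixed_date=False),
):
    print(profile.user_id, personalised_fare(profile))
```

Under the Council wording, such a "measure" would arguably fall outside Article 20: displaying a price produces no "legal effects", and it would be hard to show in each individual case that it "significantly affects" the person.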

#6

Risk Based Approach

The idea of a "risk based approach" is logical: the duties of data controllers should be scaled according to the risk of the processing. The problem is that "risk" is very subjective, and it seems impossible to define "risk" precisely enough to make these rules enforceable.

This results in a situation where controllers can simply claim that, according to their own assessment, their data processing operations constitute a "low risk" and thereby bypass legal obligations that would otherwise apply to them.

On the other hand, it seems that small companies in particular will face an even more complicated legal situation, as they will need to perform a complex legal analysis just to figure out whether a legal obligation applies to them or not.

#7

Right to Access

See especially Articles 12 and 15

When it comes to privacy violations it is typically very hard to find out what a company knows about you, where it got the data and what it uses it for. Most EU countries also have no procedural rights to obtain such information when a dispute arises. This means that the right to access is typically the core element for even understanding whether a company is complying with the law, or whether old or incorrect data is stored about a person (this is e.g. important in the credit ranking context).

Some Council proposals limit this right massively, by requiring fees, allowing extensive times to respond to access requests or introducing wide-reaching exceptions that companies can use to hinder the right to access, especially against average citizens with no legal expertise.

#8

Codes of Conduct ("Lobby-Laws")

See Article 38 & 38a

The Article was changed from providing for non-binding "guidelines" to allowing industry groups to "specify" vague terms of the law (e.g. "legitimate interests") or the details of how a data subject should "exercise his/her rights".

Industry groups even have a right to get these codes approved by national data protection authorities (DPAs); this was achieved by changing "may" to "shall". DPAs have to approve these codes unless they violate the Regulation (which will be hard to prove, given the numerous vague terms). If a DPA does not approve a code, industry groups can appeal against the DPA, which allows them to generate substantial legal pressure. Users, consumer rights organizations and NGOs, on the other hand, will not be heard in these procedures, while the result (the "code of conduct") will be binding for them.

The main argument, reported by persons involved in the decision, was that the Council could not agree on specifying certain terms. To avoid "administrative burdens", the conclusion was to simply let industry "self-regulate" the fundamental rights of citizens.

In fact, Article 38 privatizes the power to pass unilateral de facto laws. It is questionable whether this approach is legal under the EU treaties and national constitutions.

Example: A financial lobby group may agree that it is a "legitimate interest" for its members to exchange information on requests for financial products, payment histories, income, property, jobs, family status and other information on each European. Because they are subject to "access requests", they agree that it is unreasonable for customers to request their personal data more than once every three years. To make it even harder to get a copy of your data, they also introduce complicated forms and additional requirements to prove your ID. Through a code of conduct, they would be able to "specify" the law in such a way and make it binding for everyone else.

#9

Pseudonyms

See especially Article 4(3b)

"Pseudonymous data" is data, which can be attributed to a person, even though the information that obviously relates to a person was removed.

Example: A company has a normal database, but splits it in two parts. One part holds information that relates only to your "Customer ID"; the second database holds your personal details (name, address, email) and the Customer ID. Together, all the information constitutes "personal data"; looked at separately, the first database only holds "pseudonymous data" about you.
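A minimal Python sketch of this split (all table and field names are invented for illustration) shows how mechanical the separation is, and how trivially the two parts recombine:

```python
# Hypothetical sketch of the two-database split described above.
# All table and field names are invented for illustration.

import uuid

identity_db = {}   # Customer ID -> directly identifying details
pseudonym_db = {}  # Customer ID -> behavioural/transactional data

def store_customer(name, email, purchases):
    customer_id = str(uuid.uuid4())              # the "pseudonym"
    identity_db[customer_id] = {"name": name, "email": email}
    pseudonym_db[customer_id] = {"purchases": purchases}
    return customer_id

cid = store_customer("Jane Doe", "jane@example.org", ["book", "lamp"])

# On its own, pseudonym_db holds no name or email ("pseudonymous data") ...
print(pseudonym_db[cid])
# ... but one dictionary lookup turns it back into ordinary personal data.
print({**identity_db[cid], **pseudonym_db[cid]})
```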

In reality most databases and tracking methods already use only "pseudonyms", such as a "cookie ID" to track your browser or a "device ID" to track your smartphone. When you log into a system you usually use only a "user ID", not your real name. This shows that "pseudonyms" do not really make things more private – they are merely a technical measure. Only in the case of a "data breach" may it be helpful, for example, that personal data is separated from "pseudonymous data", as long as hackers only capture one database.

Some member states are trying to lower the overall level of protection for "pseudonymous data", which may mean that you can be tracked or analyzed more extensively and lose your rights, as long as companies keep information like your real name separate from the data that is used to analyze or track you.