Data protection is an area of law that is evolving rapidly. In the European Union (“EU”), the General Data Protection Regulation (EU) 2016/679 (“GDPR”) became applicable on 25 May 2018. Switzerland is currently in the process of revising its Federal Act on Data Protection (“FADP”).[1] Both texts introduce new provisions to regulate data processing, especially in light of ongoing technological developments. Indeed, profiling and automated decision-making are used in a growing number of sectors, both private and public. On the one hand, they allow services and products to be tailored to each individual’s needs. On the other hand, these processes are often opaque, and individuals may not be aware that they are being profiled or understand what profiling involves.
This newsletter aims to give an overview of the specific provisions that come into play when processing personal data under Swiss law (with guidance from the GDPR and the Article 29 Working Party), and to offer a critical analysis of the limits of those provisions.
Scope of automated individual decision-making and profiling
Profiling means any form of automated processing of personal data that uses personal data to evaluate certain personal aspects relating to a person, in particular to analyse or predict elements of that person’s behaviour, interests, preferences, economic situation, etc.[2] Profiling is often used to make predictions about people, drawing on data from various sources to infer something about an individual based on the qualities of others who appear statistically similar.[3]
Automated decision-making refers to the ability to make decisions by technological means, without human involvement. Automated decisions can result from profiling. For the specific provision of art. 22 GDPR/art. 19 P-FADP to apply, the automated decision-making (including by profiling) must produce legal effects concerning the data subject or similarly significantly affect him or her.
A legal effect requires that the decision affects the person’s legal rights, such as the freedom to associate with others (for instance through the conclusion or cancellation of a contract), to vote in an election, etc.[4] Even if a decision-making process does not affect people’s legal rights, it can still fall within the scope of art. 22 GDPR/art. 19 P-FADP if it produces an effect that is equivalent or similarly significant in its impact. The law does not set a threshold for the degree of impact required, which must be assessed on a case-by-case basis, taking into consideration the following elements:
- whether the decision significantly affects the circumstances, behaviour or choices of the individuals concerned;
- whether it has a prolonged or permanent impact on the data subject; or
- if it leads to the exclusion or discrimination of individuals.[5]
Lawful bases for processing: highlight on consent
Under the GDPR and the P-FADP, general principles must be complied with whenever personal data[6] is processed. According to art. 5 (1) GDPR/art. 5 P-FADP, the principles of data processing are the following:
- data processing should be lawful, fair and transparent;
- the purpose of the processing must be specified, explicit and legitimate;
- the data that is processed should be the minimum necessary (data minimisation);
- data should be accurate; and
- data should not be retained longer than is necessary for and proportionate to the purposes for which it is being processed (storage limitation).
According to art. 22 GDPR and art. 19 P-FADP, automated individual decision-making, including profiling, requires a lawful basis for processing.
One of the key lawful bases is that the data subject has given consent to the processing of his or her personal data for one or more specific purposes. The notion of consent is less precise under Swiss law (art. 5 para. 6 P-FADP) than under the GDPR, so the controller should seek to comply with the GDPR definition. According to art. 4 (11) GDPR, consent must be:
- freely given;
- specific;
- informed; and
- unambiguous.
For automated individual decision-making, the data subject’s consent must, in addition, be “explicit”. This means that the data subject’s attention must be specifically drawn to the kind of data processing that will occur and to its consequences. Controllers will need to prove that data subjects understand exactly what they are consenting to. Thus, controllers should provide sufficient information about the envisaged use and consequences of the processing to ensure that any consent the data subject gives represents an informed choice.[7] In other words, for automated decision-making, the specificity of the consent covers both the kind of processing that will be carried out and its purpose.
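The cumulative nature of these requirements can be illustrated with a short sketch. The record structure and field names below are assumptions for illustration, not drawn from any statutory text; the point is that consent supports automated decision-making only if every element is present at once.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One data subject's consent for one specific processing purpose."""
    subject_id: str
    purpose: str          # the specific purpose consented to
    informed: bool        # subject received the required information
    freely_given: bool    # no coercion or bundling with the service
    explicit: bool        # additionally required for automated decisions
    timestamp: datetime
    withdrawn: bool = False

def consent_valid_for_automated_decision(c: ConsentRecord) -> bool:
    """Consent covers automated decision-making only if every element of
    art. 4 (11) GDPR is present AND the consent is explicit and current."""
    return (c.informed and c.freely_given and c.explicit
            and not c.withdrawn and bool(c.purpose))
```

A record failing any single check (for instance, consent later withdrawn) no longer supports the processing, which mirrors the cumulative reading of art. 4 (11) GDPR set out above.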
Right to be informed
Both in the EU and in Switzerland, the reform of data protection regulations seeks to adapt to the development of new technologies.[8] From this perspective, transparency is a key aspect of the P-FADP provisions. This is especially tangible in the rights of the data subject over his or her personal data.
Closely linked to the condition of explicit consent, art. 13 and 14 GDPR set out a list of information that must be provided to the data subject when his or her data is collected. Together, these provisions form the right to be informed.
The data subject’s right to be informed involves the obligation for controllers to ensure they explain clearly and simply to individuals how the profiling and automated decision-making process works.[9] It should be made clear that the processing is for the purposes of both profiling and making a decision based on the profile generated.[10]
In the case of automated decision-making, including profiling, “meaningful information about the logic involved” must be given about how the algorithm arrives at the decision. This means that the controller should explain, in an understandable way, the rationale behind the decision or the criteria relied on in reaching it.[11] This does not necessarily require a complex explanation of the algorithms used, or disclosure of the full algorithm. In addition, the “significance” and the “envisaged consequences” of the processing must be explained to the data subject.
The information should be meaningful and understandable. Such notions should be analysed from the point of view of the data subject.[12] In particular, the quantity and nature of the data used, as well as the weighting of all information on the data subject must be explained. Examples should be given; in the case of credit scoring, data subjects should be entitled to know the logic underpinning the processing of their data and resulting in a “yes” or “no” decision.[13]
An issue arises here: very often, the logic involved in an algorithm cannot be fully expressed in plain language.[14] In addition, some mechanisms may include random elements that are difficult to explain. In light of this, it can be argued that it would be more effective and feasible to inform the data subject about the general operation of the algorithm, rather than about how a specific decision was taken.
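One way to approach this, sketched below under the assumption of a simple linear scoring model (the factors, weights and threshold are invented for illustration, not taken from any real scoring system), is to disclose each factor's signed contribution to the outcome: this conveys the general logic and the weighting of the information without dumping the full algorithm.

```python
# Hypothetical linear credit-scoring model: the factors, weights and
# threshold below are illustrative only.
WEIGHTS = {"income": 0.5, "years_employed": 0.3, "open_defaults": -0.8}
THRESHOLD = 1.0

def score_and_explain(applicant: dict) -> tuple[bool, list[tuple[str, float]]]:
    """Return the yes/no decision plus each factor's signed contribution,
    ranked by absolute impact -- a human-readable rationale rather than
    a disclosure of the full model."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    decision = sum(contributions.values()) >= THRESHOLD
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return decision, ranked
```

For the credit-scoring example discussed above, this yields a ranked list of the factors that pushed the decision towards “yes” or “no”, which is closer to the “meaningful information” the WP29 guidance contemplates than the raw model itself.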
Right of access and safeguards
The right of access involves the right of the data subject to obtain details of any personal data used for profiling, including the categories of data used to construct a profile.
The data subject does not have to justify the request or demonstrate any particular interest: the right of access can be invoked out of mere curiosity.[15] Furthermore, the data subject cannot waive the right of access in advance.[16]
Art. 23 P-FADP gives a non-exhaustive list of the information that the data subject can ask to be given, in order to invoke his or her rights. For the most part, they coincide with the information that is given under the right to be informed and in order to obtain explicit consent.
The GDPR gives more precise definitions than the P-FADP; it therefore provides useful guidance for interpreting the P-FADP.
Recital 63 GDPR sets a limit to the data that must be provided: the right of access should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular copyright protecting any software.
According to art. 22 (3) GDPR, if the processing is based on a contract between the data subject and a data controller (art. 22 (2) (a) GDPR) or on the data subject’s explicit consent (art. 22 (2) (c) GDPR), controllers must implement suitable measures to safeguard data subjects’ rights and legitimate interests. At a minimum, these measures must include the possibility for the data subject to [17]:
- obtain human intervention, by a person who has the appropriate authority and capability to change the decision;[18]
- express their point of view;
- contest the decision.
The possibility to have the decision reviewed by a human being is a key element. The reviewer undertakes an assessment of the automated decision, using all the relevant data, including any additional information provided by the data subject.[19] However, this safeguard has clear limits for automated decisions, because some algorithms, and especially artificial intelligence systems, evolve over time and not all of their elements can be understood and reviewed by humans. Thus, if a controller wants to strictly follow the provisions of the GDPR/P-FADP, it might need to limit its use of artificial intelligence and algorithms, which might not benefit the data subjects themselves, since the purpose of such algorithms is to reach more accurate decisions, and faster, than human beings.
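The minimal safeguards listed above can be modelled as a small workflow. The class and method names below are hypothetical, chosen only to show the three guarantees in code: the data subject can express a point of view, contest the decision, and obtain review by a person with the authority to change it.

```python
from dataclasses import dataclass, field

@dataclass
class AutomatedDecision:
    """An automated decision together with its art. 22 (3) safeguards."""
    subject_id: str
    outcome: str                       # e.g. "rejected"
    subject_comments: list = field(default_factory=list)
    review_outcome: str = ""           # set only by a human reviewer

    def contest(self, comment: str) -> None:
        """The data subject expresses a point of view and contests
        the decision; the comment becomes part of the review file."""
        self.subject_comments.append(comment)

    def human_review(self, reviewer_can_change_decision: bool,
                     new_outcome: str) -> None:
        """Review by a person who, per the WP29 guidance, must have the
        authority and capability to change the decision."""
        if not reviewer_can_change_decision:
            raise PermissionError("reviewer lacks authority to change the decision")
        self.review_outcome = new_outcome
```

The `PermissionError` branch encodes the WP29 point that a rubber-stamp review by someone without authority to change the outcome does not satisfy the safeguard.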
Conclusion
In conclusion, if the controller is making automated decisions, including by profiling, they must, before such processing (and during the processing, if the data subject invokes their right of access):
- tell the data subject that they are engaging in this type of activity;
- provide meaningful information about the logic involved; and
- explain the significance and envisaged consequences of the processing.
The obligations of the controller and the rights of the data subject, as set out in both the GDPR and the P-FADP, seek greater transparency of the process and greater involvement of the data subject. To some extent they fulfil these objectives. However, as seen above, these regulations are partly incompatible with how an algorithm works, especially when it comes to explaining how a given decision was taken. The future will tell whether these new regulations have a stimulating or rather a discouraging effect on the use of automated decision-making, including profiling, of personal data.
Footnote
[1] The Project of the revised FADP can be found in the Federal Council’s Message FF 2017 p. 6567 ss. It will be referred to throughout the present newsletter as “P-FADP”.
[2] Art. 4 (4) GDPR; art. 4 lit. f P-FADP.
[3] Article 29 Data protection Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, 2017, p. 7 (cited: WP29, p. […])
[4] WP29, p. 21.
[5] WP29, p. 21 ff.
[6] Personal data means any information relating to an identified or identifiable natural person (“data subject”) (art. 4 (1) GDPR)/ art. 4 lit. a P-FADP).
[7] WP29, p. 13.
[8] FF 2017 p. 6567
[9] WP29, p. 16.
[10] WP29, p. 16.
[11] WP29, p. 25.
[12] Mitrou Lilian, Data protection, artificial intelligence and cognitive services – is the General Data Protection Regulation (GDPR) “Artificial Intelligence-Proof”?, p. 56.
[13] Council of Europe, Draft Explanatory Report on the modernized version of CoE Convention 108, paragraph 75.
[14] Mitrou, p. 58.
[15] FF 2017 p. 6685. ATF 138 III 425, recital 5.4.
[16] Art. 23 para. 5 P-FADP.
[17] Recital 71 GDPR.
[18] WP29, p. 27.
[19] WP29, p. 27.