{"id":3736,"date":"2019-04-24T08:59:44","date_gmt":"2019-04-24T08:59:44","guid":{"rendered":"http:\/\/homodigitalis.gr\/posts\/3736"},"modified":"2023-05-15T10:51:50","modified_gmt":"2023-05-15T08:51:50","slug":"%cf%84%ce%bf-%ce%bd%ce%ad%ce%bf-%cf%84%ce%b5%cf%8d%cf%87%ce%bf%cf%82-%cf%84%ce%bf%cf%85-gdpr-today-%ce%b5%ce%af%ce%bd%ce%b1%ce%b9-%ce%b5%ce%b4%cf%8e-copy-2-copy-copy-copy","status":"publish","type":"post","link":"https:\/\/homodigitalis.gr\/en\/posts\/3736\/","title":{"rendered":"Algorithmic transparency and accountability of online service providers: Exercising the right to explanation"},"content":{"rendered":"<div class=\"wpb-content-wrapper\"><p>[vc_row][vc_column column_width_use_pixel=&#8221;yes&#8221; gutter_size=&#8221;3&#8243; overlay_alpha=&#8221;50&#8243; shift_x=&#8221;0&#8243; shift_y=&#8221;0&#8243; shift_y_down=&#8221;0&#8243; z_index=&#8221;0&#8243; medium_width=&#8221;0&#8243; mobile_width=&#8221;0&#8243; uncode_shortcode_id=&#8221;668898&#8243; column_width_pixel=&#8221;746&#8243;][vc_column_text el_class=&#8221;blogtext&#8221; uncode_shortcode_id=&#8221;196994&#8243;]<\/p>\n<h2><strong>The copyright owner of the above image is Any IP Ltd. You can visit their website at <a href=\"https:\/\/anyip.io\/\">https:\/\/anyip.io\/<\/a><\/strong><\/h2>\n<p>Written by <em>Theodora Firingou *<\/em><\/p>\n<p>While making use of various online services, one cannot help but notice that we are constantly bombarded with suggestions regarding not only content that we may like, but also products and services of multiple advertisers. <em>YouTube<\/em> and <em>Netflix<\/em>, for instance, offer recommended videos and movies; <em>Spotify<\/em> even provides you with the <em>Spotify Radar<\/em> to help you discover new music; <em>Facebook<\/em> and others advertise products according to your previous searches. 
The common denominator in the various online service providers\u2019 use of recommendation algorithms is their aim to \u201cplease\u201d the user by enhancing his\/her personalised online experience.<\/p>\n<p>However,<strong> is this attempt to personalise your interaction with the service really that innocent? How does an algorithm decide what I would like to watch or listen to?<\/strong><\/p>\n<p>Most of us have probably experienced having just met someone and then, all of a sudden, receiving a suggestion to befriend this person on <em>Facebook<\/em>. Or have you ever noticed that a product you were discussing on the <em>Messenger<\/em> app is afterwards advertised to you on your <em>Facebook<\/em> newsfeed? Doesn\u2019t that feel creepy? And what happens if things get serious? For instance, a user could be discriminated against when offered a recommended job or could even end up stranded in a filter bubble with a limited worldview or limited choices.<\/p>\n<p>Baffled by this series of unanswered questions, I was motivated to carry out research on the right to explanation. Five months later, and after having conducted empirical research by exercising my right to explanation against major online service providers, I received an award for my thesis on algorithmic accountability and the right to explanation. Herewith I would like to share some of my findings.<\/p>\n<p><strong><i>Problem statement<\/i><\/strong><\/p>\n<p>Due to the complex and opaque nature of algorithmic systems, their extensive use in automated decision-making and profiling has raised questions of transparency and accountability. 
Algorithms are perceived as a \u2018black box\u2019 and thus hinder any attempt to assess the decision-making process and its results.<\/p>\n<p>At the same time, the European data protection legislation, and predominantly the General Data Protection Regulation (GDPR), demands transparency and accountability on the part of the data controller and lays down relevant safeguards which controllers must respect. One of these safeguards is the right to explanation, the existence and scope of which have, however, initiated an extensive academic debate.<\/p>\n<p>In this context, <i>whether or not the implementation of the right to explanation in practice reflects its underlying scope<\/i> became the main research question of my master thesis. To that end, the following methodology was applied: Firstly, in order to identify the right\u2019s scope, I focused on mapping and analysing the European legislative framework and the relevant legal literature. Secondly, emphasis was given to empirical research on the implementation of the right to explanation. In particular, the right to explanation was exercised against five different online service providers, who were asked how their recommendation algorithms for personalised content and targeted advertisements work.<\/p>\n<p><strong><i>The legal framework<\/i><\/strong><\/p>\n<p>Despite the lack of a provision explicitly labelled \u2018right to explanation\u2019 in either the Data Protection Directive or the GDPR, the right derives from Article 22 and Recital 71 GDPR on the safeguards against automated decision-making, Articles 13(2)(f) and 14(2)(g) GDPR regarding controllers\u2019 notification duties and, lastly, Article 15(1)(h) GDPR as well as Article 12 of Directive 95\/46 on the right of access. 
In particular, according to Article 22 of the GDPR, \u2018data controllers shall implement suitable measures to safeguard the data subject&#8217;s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his\/her point of view and to contest the decision.\u2019<\/p>\n<p>Moreover, as laid down in Articles 13 to 15 of the GDPR, the data subject shall have access to the personal data and to information about \u2018the existence of automated decision-making, <i>meaningful information about the logic involved<\/i>, as well as the significance and the envisaged consequences of such processing for the data subject.\u2019 Additionally, according to Article 12 of the Data Protection Directive, controllers must provide data subjects with \u2018knowledge of the logic involved\u2019 in any automated decision-making.<\/p>\n<p><strong><i>Scope and applicability of the right to explanation<\/i><\/strong><\/p>\n<p>Through the analysis of the relevant legal provisions, the academic debate around them and the rebuttal of the arguments against the right to explanation, it was possible to identify the right\u2019s scope and applicability. It was concluded that a systemic and teleological reading of the provisions (especially in the light of the GDPR\u2019s spirit of empowering individuals\u2019 data protection) confirms the existence of the right to explanation.<\/p>\n<p>In particular, the analysis showed that the right to explanation entails the provision of meaningful information about the logic involved to the data subject, in the sense that the meaningfulness of the information provided must be interpreted <i>flexibly<\/i>. Thus, the information may refer to either the <i>system functionality<\/i> or a <i>specific decision<\/i>, and it can constitute an either <i>ex ante<\/i> or <i>ex post<\/i> explanation in relation to the time at which the decision was reached. 
Moreover, in order to assess whether an explanation is meaningful or not, the information provided must be examined in the light of its <i>functional value<\/i> (especially with regard to enabling the exercise of the data subject\u2019s rights and freedoms). Additionally, the explanation should lead to <i>individualised transparency<\/i>, in the sense of a personalised understanding of information of a meaningful depth. The information should also be <i>intelligible<\/i> and provided in clear and plain language, so that a regular data subject (i.e. usually a user with no expertise in technology-related matters) would be able to fully comprehend it.<\/p>\n<p>Regarding the right\u2019s applicability criteria, there must be automated decision-making (including profiling) which results in a <i>solely automated decision<\/i> being reached without any meaningful human intervention. Furthermore, the automated decision must have <i>legal or similarly significant effects<\/i>, which, however, should be interpreted in a broad sense, including cases where the data subject\u2019s freedoms and rights are endangered or even the case of targeted advertising based on profiling.<\/p>\n<blockquote><p><strong>Lastly, the right to explanation must be respected regardless of <i>trade secrets and IP rights<\/i>. This means that they cannot serve as a justification for refusing to provide information and that data protection rights outweigh trade secrecy or IP rights.<\/strong><\/p><\/blockquote>\n<p><i><strong>Compliance issues revealed<\/strong> <\/i><\/p>\n<p>Taking into consideration the scope and applicability of the right to explanation, my empirical research focused on examining whether the right to explanation fulfils this scope when exercised in practice. 
In particular, I filed a number of explanation requests regarding recommended content and targeted advertisements with five online service providers, namely Facebook, YouTube, LinkedIn, Spotify and Netflix.<\/p>\n<p>Unsurprisingly, the analysis of the empirical research\u2019s results revealed a great number of <i>compliance issues<\/i> and a <i>gap between theory and practice<\/i>.<\/p>\n<p>Filing explanation requests and obtaining meaningful information about the logic involved in algorithmic systems responsible for automated decision-making was actually an extremely challenging procedure; it required legal literacy on the matter, organisation, persistence and patience. In other words, it is doubtful whether a regular data subject would ever manage to efficiently exercise his\/her rights after facing such hurdles.<\/p>\n<p>Although <i>privacy policies<\/i> were easily found, they were often problematic in terms of completeness and clarity. Identifying the right <i>communication channels<\/i> to contact the controllers was even more troublesome.<\/p>\n<p>However, the most worrying findings resulted from the correspondence with the controllers. Various malfunctions, such as <i>organisational and administrative avoidance strategies, lack of awareness, ignorance and refusal to address the requests<\/i>, rendered the procedure complicated. Moreover, the explanations provided were not satisfactory; <i>generic, fragmentary and misleading information<\/i> was provided and could thus not possibly fulfil the scope and rationale of the right to explanation, since it could not be conceived as <i>meaningful<\/i> information. 
Some controllers refused to provide a full explanation and justified their position either on <strong>trade secrecy grounds or by arguing that Article 22 GDPR, and consequently Article 15(1)(h), do not apply since the automated processing does not produce legal or similarly significant effects.<\/strong> However, none of these arguments constitute valid grounds on which data controllers could rely in order to avoid providing an explanation to the data subject.<\/p>\n<p>To sum up, the findings of the empirical research on a limited number of broadly used online service providers indicated that <strong>the right to explanation does not fulfil its scope under the European data protection legislation when practically exercised against data controllers.<\/strong> Most worryingly, it was confirmed that data subjects\u2019 rights are being <strong>significantly disrespected<\/strong> in the online environment. After all, maybe we should think twice before celebrating this generously offered \u2018enhanced personalised experience\u2019, since the legally provided safeguards meant to protect us against malicious processing of our personal data, especially during automated decision-making and profiling, do not seem to be implemented by major controllers. It is thus <strong>doubtful<\/strong> that we could rely on a transparent and accountable processing of our data.<\/p>\n<p><i>* Theodora Firingou is a lawyer holding an LL.M in Penal Law (University of Hamburg) and an\u00a0<\/i><i>LL.M in IP\/ICT Law (KUL). 
She focuses on data protection &amp; privacy law and mainly on the issues arising from the use of new technologies such as Artificial Intelligence (&#8216;AI&#8217;).<\/i>[\/vc_column_text][\/vc_column][\/vc_row]<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Theodora Firingou presents her award-winning research on algorithmic accountability, the right to explanation and targeted advertising.<\/p>\n","protected":false},"author":7,"featured_media":3738,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"give_campaign_id":0,"footnotes":""},"categories":[74],"tags":[],"class_list":["post-3736","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-74"],"_links":{"self":[{"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/posts\/3736","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/users\/7"}],"replies":[{"embeddable":true,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/comments?post=3736"}],"version-history":[{"count":3,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/posts\/3736\/revisions"}],"predecessor-version":[{"id":119949,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/posts\/3736\/revisions\/119949"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/media\/3738"}],"wp:attachment":[{"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/media?parent=3736"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/categories?post=3736"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/homodigitalis.gr\/en\/wp-json\/wp\/v2\/tags?post=3736"}],"curies":[{"name":"wp","href":"https:\/\/api.w.or
g\/{rel}","templated":true}]}}