
VIDEO

Legality Attentive AI: Awareness Conference on Explainability of AI

28th of January, 2022

Time 16:00 – 17:30 CET

Teams Platform

For further info on the event and registration click here Legality Attentive AI Conference | Brussels Privacy Hub

CALL FOR EXPRESSION OF INTEREST FOR THE PREDICTIVE JUSTICE PROJECT

The LiderLab of the Scuola Superiore Sant’Anna and EMbeDS are seeking applications from candidates who can make outstanding contributions to the development and experimentation of an innovative platform for querying and interpreting legal documents through natural language processing techniques based on deep learning models.

FOR FURTHER INFO ABOUT THE CALL AND TO SUBMIT THE EXPRESSION OF INTEREST


Legality Attentive AI: Awareness Conference on Explainability of AI

Webinar organised by LeADS in collaboration with the Brussels Privacy Hub

28th of January, 2022

Time 16:00 – 17:30 CET

Teams Platform

For further info on the event and registration click here Legality Attentive AI Conference | Brussels Privacy Hub

To celebrate Data Protection Day, the Legality Attentive Data Scientists H2020 project, together with the Brussels Privacy Hub, will explore one of the biggest open challenges of data protection law: explainability and accountability of AI. The event, with high-level stakeholders and experts, will address the twists and turns of developing legality attentive AI as a standard for our societies.

AI raises concerns in many fields of its actual and potential application because of the risks of extending control over individuals and further unbalancing power among individuals and/or between them and institutions and businesses. Keeping AI in line with the law and with EU fundamental values and ethical principles is more than a need: it is the hallmark of the European approach to, and the benchmark for, research and production of AI-based solutions.

Explainability is often offered as an answer to many concerns related to AI development and deployment. Nevertheless, explainability is not yet always possible, and explainability itself can be problematic for personal data protection.

The webinar will be moderated by Giovanni Comandé, Professor of Private Comparative Law at Scuola Superiore S. Anna Pisa. Gianclaudio Malgieri, Co-Director of the Brussels Privacy Hub, will give introductory remarks.

Confirmed speakers are:

  • Paul Nemitz – Principal Advisor in the Directorate-General for Justice and Consumers of the European Commission
  • Catelijne Muller – President of ALLAI, Member of the High-Level Expert Group on AI and of the OECD Network of Experts on AI
  • Fosca Giannotti – Director of Research in Computer Science at the Information Science and Technology Institute “A. Faedo” of the National Research Council, Pisa, Italy
  • Riccardo Masucci – Global Director of Privacy Policy – INTEL
  • Dafna Feinholz – Bioethics and Ethics of Science Section, UNESCO

 

 

SoBigData ++ and LeADS joint Awareness Panel

Recent Perspectives on Dynamic Consent in Research: a Combined Legal and Technical Approach

18th of January 2022, 1:00 – 2:30 PM CET

Join us on Webex
password: WGkgwhYa563 (normally not requested)

Speakers: Roberta Biasotto; Tommaso Crepax; Cristian Lepore; Deborah Mascalzoni; Giulia Schneider

Dynamic Consent (DC) uses information technology to enable continuous communication and interactive consent. It allows research participants to change their choices and preferences on participation and receive updated information on the research that is being conducted with their data and samples.

In this awareness panel, we therefore reflect on Dynamic Consent:

  1. From an ethical and legal point of view, particularly on how DC may:
    • Enable an appropriate balance between the protection of participants’ fundamental rights, including their right to data protection under the GDPR, and the promotion of data sharing and research, with particular regard to the case of secondary uses. In this respect, the definition of different data protection regimes will be advanced according to the nature of the research targeted by secondary uses and to the nature of the subjects involved.
    • Empower the research participants with more control over their data within the new European Health Data Space.
    • Be enhanced to address the inequalities entailed by the digital divide.
  2. From the research participant’s point of view, particularly on how DC may help build a transparent trust relationship between participants and researchers.
  3. From a technological point of view, particularly on how DC technically works and the related technical opportunities and challenges.
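To make the third point more concrete, the mechanics of a Dynamic Consent record can be sketched as a small data model. This is a purely illustrative sketch under our own assumptions (the class and field names are not taken from any existing DC platform): it keeps an append-only history, so participants can change their choices over time while an audit trail of past states is preserved.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentEvent:
    purpose: str          # e.g. "primary_study", "secondary_use" (illustrative labels)
    granted: bool
    timestamp: datetime

@dataclass
class DynamicConsentRecord:
    participant_id: str
    history: list = field(default_factory=list)  # append-only audit trail

    def update(self, purpose: str, granted: bool) -> None:
        """Record a (possibly changed) choice; earlier events are kept."""
        self.history.append(
            ConsentEvent(purpose, granted, datetime.now(timezone.utc))
        )

    def is_granted(self, purpose: str) -> bool:
        """The latest event for a purpose determines the current choice."""
        for event in reversed(self.history):
            if event.purpose == purpose:
                return event.granted
        return False  # no recorded consent means no consent

record = DynamicConsentRecord("participant-001")
record.update("secondary_use", True)
record.update("secondary_use", False)      # participant later withdraws
print(record.is_granted("secondary_use"))  # False: the latest choice wins
```

The design choice of never deleting past events mirrors the transparency goal discussed above: researchers can demonstrate what consent was in force at any processing moment.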

Schedule

13.00 – 13.15 Greetings
Prof. Giovanni Comandé, Lider Lab, Scuola Superiore Sant’Anna

13.15 – 14.15 Roundtable

Dynamic Consent in practice: the CHRIS Study, Dr. Deborah Mascalzoni, EURAC Research Center

Participant’s Perspective and Dynamic Consent, Dr. Roberta Biasotto, EURAC Research Center

Dynamic Consent for Responsive Data Governance: a Legality Attentive Analysis, Dr. Tommaso Crepax, Lider Lab, Scuola Superiore Sant’Anna; Dr. Cristian Lepore, IRIT, University Paul Sabatier Toulouse III

The Possible Interplay Between a «Dynamic» Consent and other Legal Bases for Research in the GDPR, Dr. Giulia Schneider, Catholic University of Milan

14.15 – 14.30 Final Discussion

Moderator
Prof. Giovanni Comandé, Lider Lab, Scuola Superiore Sant’Anna

Join us on Webex
password: WGkgwhYa563 (normally not requested)

Contacts: Segreteria Lider Lab
email address: segrliderlab@santannapisa.it
ph. +39 050883533

 

 

PRE-PRINT “Elgar Encyclopedia of Law and Data Science”

Elgar Encyclopedia of Law and Data Science

Edited by Giovanni Comandé, Professor of Law, Sant’Anna School of Advanced Studies (www.santannapisa.it) and Coordinator, LIDER-Lab (www.lider-lab.it)

Publication Date: February 2022

For further info

 

SELECTION for 4 MARIE CURIE EARLY-STAGE RESEARCHERS (ESR) positions at Scuola Superiore Sant’Anna, Italy, funded in the framework of the “Legality Attentive Data Scientists (LeADS) Project” (Grant Agreement n. 956562) – RANKING LIST

ranking list_LeADS

Admission to the Programme
In case of equal score the youngest candidate prevails.

Solving the conflicts between data owners and data exploiters through a spectrum of quasi-property models

The European Union keeps moving forward with its plans for a regulatory framework to guide the development of the data economy and foster data-driven innovations for further economic and societal growth.[1] The use of and access to data play a key role in this context, and different actors can have different priorities. In particular, individuals and companies both have an interest in enjoying a degree of control over the information used to fuel these data-driven innovations: individuals because the use of data related to them might affect them, and companies (and other controllers) because they might wish to generate economic and societal development by processing data.

 

This reopens and further develops a question to which no single uniform answer has yet been found: what exactly is data, to whom does it belong, and what legal relationship exists between the subject and the data? The answers to these questions are extremely relevant, in particular where the data economy has gone as far as using personal data as consideration for digital services.[2] Seeking answers from a legal perspective can be troublesome, as there are different regulations, even in the European context, that provide different and, in some cases, contradictory solutions.

 

The question is particularly timely, as a proposal for a Data Act should be published soon by the European Commission. While the exact content of the Data Act is still unknown, since the proposal from the European Commission has not yet been published, this piece of legislation is intended to address a considerable number of issues surrounding the data economy and the possibility of data ownership.

 

Currently, from a legal point of view, there are different notions of what exactly data means. Often, we tend to defer to the General Data Protection Regulation (GDPR) and the realm of data protection regulations to answer this, where data associated with an individual is known as ‘personal data’.[3] We can also find ‘non-personal’ data where it is not related to a natural person, as in the case of the Free Flow Regulation.[4] However, the matter does not stop here: in upcoming legislation, such as the Data Governance Act, we can also find wider, more general notions of data.[5] As such, in different situations, we might be confronted with a particular regulatory framework that deals with a given set of situations. Consequently, a comprehensive and systematic view is necessary to tackle this first question in a holistic manner.

 

On the questions of to whom data belongs (if it belongs to anybody as such) and what legal bond exists between them and the data, the literature has discussed different approaches and has tried for quite some time to find a balance between the interests of the involved stakeholders. When it comes to companies, the database sui generis right, trade secrets, or copyright were seen as potential solutions.[6] On the other hand, the legal literature dealing with ‘ownership’ of data by individuals, while a tempting solution,[7] is beset by the fact that personal data is also safeguarded as a fundamental right.[8] In this sense, it has been pointed out that people would not own personal data but rather control access to it via the notice-and-consent scheme and/or the general data protection framework, including the exercise of associated data rights, even on a collective basis.[9]

 

This latter scenario, a more active approach to the exercise of data rights, is finding an echo in recent technological developments, such as decentralized identity management systems.[10] Until now, companies have acted as data controllers and overseen every single activity related to data processing, from the collection of personal data to its destruction, passing through its usage and possible sharing. Decentralized identity management systems, such as self-sovereign identities or personal data stores, allow for further control by data subjects themselves, rather than having to file a request before a data controller and wait for an answer.[11] In this sense, data controllers do not select which data they are going to collect but rather have to accommodate the data that individuals create and make available for use.

 

This difference in the existing approaches to answering our initial question shows that there might be tensions between the involved stakeholders, as their rights over the same object are different and, in some cases, express contradictory concerns. It is also unclear how rights transfer between the involved parties should operate. To achieve a balance between the different positions, the adoption of a quasi-property model has been suggested, with a different grounding in a particular right depending on the scholar analyzed.[12] Through it, it would be possible to adopt a practical and hands-on solution to the issue of data ownership and, consequently, bridge the different positions mentioned above. Exploring whether or not this approach is compatible with the GDPR shall be one of the main challenges for the LeADS project.

 

As mentioned, the European regulatory framework is currently in flux, attempting to tackle future economic developments sustainably in the long run. There are currently different proposals under discussion at different levels that deal in a unified manner with the uneasy question of what exactly (personal) data is from a legal point of view and try to answer the question ‘what is data ownership?’, which forms one of the main research crossroads for the LeADS project, alongside the other research questions that form its core.

 

Authors: Prof. dr. Paul de Hert, Prof. dr. Gloria González Fuster, Andrés Chomczyk Penedo

 

[1] ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: A European Strategy for Data’ (European Commission 2020) COM(2020) 66 final.

[2] Carrie Gates and Peter Matthews, ‘Data Is the New Currency’, Proceedings of the 2014 New Security Paradigms Workshop(Association for Computing Machinery 2014) <https://doi.org/10.1145/2683467.2683477> accessed 1 April 2021.

[3] Art. 4(1) GDPR: ‘(…) any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person (…)’.

[4] Art. 3(1) Free Flow Regulation: ‘(…) means data other than personal data as defined in point (1) of Article 4 of Regulation (EU) 2016/679; (…)’.

[5] Art. 2(1) DGA: ‘(…) means any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audiovisual recording; (…)’.

[6] Gianclaudio Malgieri, ‘“Ownership” of Customer (Big) Data in the European Union: Quasi-Property as Comparative Solution?’ (Social Science Research Network 2016) SSRN Scholarly Paper ID 2916079 <https://papers.ssrn.com/abstract=2916079> accessed 15 July 2021.

[7] Ignacio Cofone, ‘Beyond Data Ownership’ (Social Science Research Network 2020) SSRN Scholarly Paper ID 3564480 <https://papers.ssrn.com/abstract=3564480> accessed 1 April 2021; Václav Janeček, ‘Ownership of Personal Data in the Internet of Things’ (2018) 34 Computer Law & Security Review 1039; Patrik Hummel, Matthias Braun and Peter Dabrock, ‘Own Data? Ethical Reflections on Data Ownership’ [2020] Philosophy & Technology <http://link.springer.com/10.1007/s13347-020-00404-9> accessed 1 April 2021.

[8] Gloria González Fuster, The Emergence of Personal Data Protection as a Fundamental Right of the EU (Springer Science & Business 2014).

[9] Nestor Duch-Brown, Bertin Martens and Frank Mueller-Langer, ‘The Economics of Ownership, Access and Trade in Digital Data’ (European Commision, Joint Research Centre 2017) JRC Digital Economy Working Paper 2017–01 <https://www.ssrn.com/abstract=2914144> accessed 1 April 2021; Tommaso Fia, ‘An Alternative to Data Ownership: Managing Access to Non-Personal Data through the Commons’ [2020] Global Jurist <https://www.degruyter.com/document/doi/10.1515/gj-2020-0034/html> accessed 1 April 2021.

[10] Christopher Allen, ‘The Path to Self-Sovereign Identity’ (Life With Alacrity, 25 April 2016) <http://www.lifewithalacrity.com/2016/04/the-path-to-self-soverereign-identity.html> accessed 27 June 2019.

[11] Andrés Chomczyk Penedo, ‘Self-Sovereign Identity Systems and European Data Protection Regulations: An Analysis of Roles and Responsibilities’ (Gesellschaft für Informatik 2021) <https://dl.gi.de/bitstream/handle/20.500.12116/36505/proceedings-08.pdf?sequence=1&isAllowed=y>.

[12] Malgieri (n 6).

Public-private data sharing from “dataveillance” to “data relevance”

Data sharing has become a common practice between public and private entities all over the world. The reasons for this are broad and varied, ranging from making more data available for data-rich scientific research to allowing law enforcement agencies to pursue criminal activities with greater precision. While data collection remains a fundamental activity, as it enables an ever-growing amount of data to exist, the sharing of data and its subsequent repurposing can unlock further major economic and social value. A single data controller can collect only a small amount of information in comparison to the data that can be made available by several third parties.

Regulators have taken notice of this and are planning accordingly to reap the supposed benefits of the data economy by further enabling and pushing for the sharing of data. In this sense, the recent European Strategy for Data puts this practice at its core, envisaging an environment of trusted data-driven innovations fueled by data sharing between digital platforms, governments, and individuals alike.[1]

In the field of law enforcement, the amount of data available also caught the attention of competent authorities a long time ago, as it allows for more ‘smart’ crime prevention, yet at the expense of more privacy-invasive practices.[2] In this respect, the increasing amount of available data is highly interesting for the deployment of a forever-expanding surveillance apparatus by public authorities.[3] This has led to the emergence of what has been described as ‘dataveillance’[4] and its considerable expansion in the last decades, rooting itself in our society to become a troublesome practice.[5] In this context, the private sector makes available, either voluntarily or not, a considerable portion of its data to law enforcement agencies,[6] with limitations.[7]

Data sharing can also involve access by private businesses to data generated by the public sector. In this respect, the open data movement has for years been pushing in this direction and, in certain cases, triggering legislation that reduces the obstacles to making such data available for re-use by, for example, companies. While it is possible to find certain regulations that either foster or mandate such data sharing practices, all of them must be subject to generally applicable rules, such as the General Data Protection Regulation (GDPR).

As mentioned above, regulators intend to foster data sharing between the private and public sectors. As the recent European Strategy for Data points out, certain kinds of information, such as that generated within smart cities, can provide an interesting field where public-private data sharing would be beneficial to society and individuals.[8] For example, data generated by the financial services industry provides a considerable amount of information, both in quantity and quality.[9] Nevertheless, a single payment can provide a sensitive insight into an individual’s life, from health data (for example, from recurring pharmacy expenses) to religious information (as in the case of monthly contributions to a religious organization). This could be overcome by sharing certain information about payments in an aggregated manner, for example merely their time and date, which could help in understanding citizens’ movements in a city and in planning city policies accordingly for citizens’ benefit.[10]
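The aggregation step described above can be sketched in a few lines. This is a hypothetical illustration (the records, field names, and suppression threshold are our own assumptions): sensitive attributes such as payer, merchant, and amount are dropped, only coarse time and location are counted, and cells with too few payments are suppressed before release, in the spirit of a k-anonymity-style threshold.

```python
from collections import Counter
from datetime import datetime

# Hypothetical payment records; field names are illustrative only.
payments = [
    {"payer": "alice", "merchant": "pharmacy", "amount": 12.5,
     "time": datetime(2022, 1, 18, 8, 40), "district": "centre"},
    {"payer": "bob", "merchant": "bakery", "amount": 3.2,
     "time": datetime(2022, 1, 18, 8, 55), "district": "centre"},
    {"payer": "alice", "merchant": "church", "amount": 20.0,
     "time": datetime(2022, 1, 18, 18, 5), "district": "north"},
]

def aggregate_for_sharing(records, k=2):
    """Count payments per (hour, district) cell, discarding payer,
    merchant and amount; suppress cells below the threshold k so that
    small groups cannot be singled out in the released data."""
    counts = Counter((r["time"].hour, r["district"]) for r in records)
    return {cell: n for cell, n in counts.items() if n >= k}

print(aggregate_for_sharing(payments))  # {(8, 'centre'): 2}
```

Note how the lone payment in the "north" district at 18:00 is suppressed: releasing it would re-identify a single individual's (possibly religious) transaction, which is exactly the risk the paragraph above describes.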

But how can we prevent these public-private data-sharing activities from ending up contributing to more ‘dataveillance’? While the GDPR covers a significant amount of data processing activities, we also need to consider other relevant pieces of legislation that contemplate public authorities, particularly law enforcement agencies, such as the Law Enforcement Directive. While the obligations and rights within the relevant legal frameworks diverge, it is possible to highlight that most of these activities should be conducted following some common principles.

Among these we point out that only accurate and relevant data should be used for a particular and specific purpose. In this respect, we can ask when the data are relevant enough for the intended purposes; in other words, we need to question when we have “good enough data” [11] for the intended public-private data sharing. By doing so, we can assess whether compliance with these rules has been reached. Through this, we can effectively implement the principles of data accuracy and minimization, alongside other applicable and relevant principles.

Understanding how these rules are effectively applied to, and guide, these public-private data sharing practices is crucial as regulators seek to foster them. For example, the European Union is currently working on a proposal for a Data Governance Act, which introduces data sharing services,[12] as well as data altruism.[13] Both of these categories, with their particularities, seek to foster data-sharing activities between private and public entities alike. Data protection watchdogs have raised their concerns regarding the current wording and extent of this proposal.[14] Among these concerns, the lack of clear integration between them and, in particular, the GDPR was highlighted as a troublesome issue.

Public-private data sharing activities are not likely to stop. On the contrary, the current data strategy for the European Union is to further expand the sharing of data in an automated manner using APIs, such as in the case of open finance.[15] The question that remains open on this front is whether these new data governance schemes can make us move from a dataveillance perspective towards a data relevance scenario. Within this context, we intend to explore this broad question in the different crossroads that this topic is present in the LeADS project and seek ideas to tackle the matter in an interdisciplinary manner.

Authors: Prof. dr. Paul de Hert, Prof. dr. Gloria González Fuster, Andrés Chomczyk Penedo

[1] ‘Citizens will trust and embrace data-driven innovations only if they are confident that any personal data sharing in the EU will be subject to full compliance with the EU’s strict data protection rules’ (see ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: A European Strategy for Data’ (European Commission 2020) COM(2020) 66 final).

[2] David Wright and others, ‘Sorting out Smart Surveillance’ (2010) 26 Computer Law & Security Review 343.

[3] Margaret Hu, ‘Small Data Surveillance v. Big Data Cybersurveillance’ (2015) 42 Pepperdine Law Review 773.

[4] Roger Clarke, ‘Information Technology and Dataveillance’ (1988) 31 Communications of the ACM 498.

[5] Roger Clarke and Graham Greenleaf, ‘Dataveillance Regulation: A Research Framework’ (2017) 25 Journal of Law, Information and Science 104.

[6] David Lyon, Surveillance After Snowden (John Wiley & Sons 2015).

[7] ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: A European Strategy for Data’ (n 1).

[8] ibid.

[9] V Ferrari, ‘Crosshatching Privacy: Financial Intermediaries’ Data Practices Between Law Enforcement and Data Economy’ (2020) 6 European Data Protection Law Review 522.

[10] Ine van Zeeland and Ruben D’Hauwers, ‘Open Banking Data in Smart Cities’ (VUB Chair Data Protection on the Ground – VUB Smart Cities Chair – imec-SMIT-VUB 2021) Round table report <https://smit.vub.ac.be/wp-content/uploads/2021/02/Report-roundtable-Open-Banking-Smart-Cities_def.pdf> accessed 6 September 2021.

[11] Angela Daly, Monique Mann and S Kate Devitt, Good Data (Institute of Network Cultures 2019).

[12] According to the current wording of the proposal, under this service, we can include: (i) intermediate between data holders and data users for the exchange of data through different means; (ii) intermediate between data subjects and data users for the exchange of data through different means for the purpose of exercising data rights provided for in the GDPR, mainly right to portability; and (iii) provide data cooperatives services, i.e. negotiate on behalf of data subjects and certain data holders terms and conditions for the processing of personal data.

[13] According to the current wording of the proposal, under this term, we are referring to “(…) the consent by data subjects to process personal data pertaining to them, or permissions of other data holders to allow the use of their non-personal data without seeking a reward, for purposes of general interest, such as scientific research purposes or improving public services”.

[14] ‘Joint Opinion 03/2021 on the Proposal for a Regulation of the European Parliament and of the Council on European Data Governance (Data Governance Act)’ (European Data Protection Board – European Data Protection Supervisor 2021) Joint Opinion 03/2021 <https://edpb.europa.eu/sites/edpb/files/files/file1/edpb-edps_joint_opinion_dga_en.pdf> accessed 25 March 2021.

[15] ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on a Digital Finance Strategy for the EU’ (European Commission 2020) Communication from the Commission (2020) 591 <https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52020DC0591&from=EN> accessed 1 December 2020.

SELECTION for 4 MARIE CURIE EARLY-STAGE RESEARCHERS (ESR) positions at Scuola Superiore Sant’Anna, Italy, funded in the framework of the “Legality Attentive Data Scientists (LeADS) Project” (Grant Agreement n. 956562) – List of candidates admitted at the interview

  1. ABOLHASSANI MARYAM
  2. BEYSÜLEN ANGIN BERFU
  3. BRIFA PINELOPI MARIA
  4. CASALUCE ROBERTO
  5. CREPAX TOMMASO
  6. GAUR MITISHA
  7. LIYEW CHALACHEW MULUKEN
  8. POE ROBERT LEE
  9. SATKA ZENEPE
  10. SPERA FRANCESCO
  11. ULLAH ZAHID
  12. YANG QIFAN

IMPORTANT INFORMATION FOR THE CANDIDATES

The interviews are scheduled on September 15th at 4 p.m. CET online in the following public meeting room: https://sssup.webex.com/meet/g.comande

 

Data Privacy in the Financial and Industrial Sectors

Nowadays, one of the most important issues for enterprises across the financial services industry is privacy and data protection. Records, and financial records in particular, are considered sensitive by most consumers, and good data-handling practices are promoted by the respective regulators, including for the increased customer profiling used to identify potential opportunities and perform risk management analysis. To this extent, the management of data privacy and data protection is of great importance throughout the customer cycle. For example, there are several use cases in the finance sector that involve the sharing of data across different organizations (e.g., sharing of customer data for customer protection or faster KYC, sharing of businesses’ data for improved credit risk assessment, sharing of customer insurance data for faster claims management, and more).

To facilitate such cases, several EU-funded projects have already discussed the need to reconsider data usage and regulation in order to unlock the value of data while fostering consumer trust and protecting fundamental rights. Permissioned blockchain infrastructure is utilized to provide privacy control, auditability, secure data sharing, as well as faster operations. The core of the blockchain infrastructure is enhanced in two directions: (i) integration of tokenization features and relevant cryptography, as a means of enabling the trading of assets (e.g., personal data) through the platform; and (ii) utilization of Multi-Party Computation (MPC) and Linear Secret Sharing (LSS) algorithms to enable the querying of encrypted data, as a means of offering higher data privacy guarantees. Based on these enhancements, the project will enable the implementation of disruptive business models for personalization, such as personal data markets.
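The MPC/LSS idea mentioned above can be illustrated with a minimal additive secret-sharing sketch, the simplest linear scheme. This is a toy illustration of the principle only, not the protocol of any particular project: each party splits its private value into random shares distributed across servers, and the servers can jointly compute an aggregate without any single one of them ever seeing an individual value.

```python
import random

PRIME = 2**61 - 1  # modulus for the share arithmetic (a Mersenne prime)

def share(secret, n=3):
    """Split `secret` into n additive shares mod PRIME; any subset of
    fewer than n shares reveals nothing about the secret."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares: their sum mod PRIME is the original value."""
    return sum(shares) % PRIME

# Three parties each share a private value across three servers.
secrets = [15, 27, 8]
all_shares = [share(s) for s in secrets]

# Server i locally sums the i-th share from every party; only these
# partial sums are combined, so only the total is ever revealed.
server_sums = [sum(col) % PRIME for col in zip(*all_shares)]
print(reconstruct(server_sums))  # 50 = 15 + 27 + 8
```

Because the scheme is linear, sums (and other linear queries) can be computed directly on the shares; this is the property that makes LSS-style querying of protected data possible in the architecture described above.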

LeADS builds upon those results and steps forward by setting a more ambitious goal: to experiment, in partnership with businesses and regulators, with a way to pursue not only the lawfulness of data mining and AI development but both the amplest protection of fundamental rights and, simultaneously, the largest possible data exploitation in the digital economy. Using coexisting characteristics of data-driven financial services, LeADS helps to define: Trust; Involving; Empowering; Sharing. Participation in several of the mentioned projects (e.g. XAI, SoBigData++) and/or close scientific connections with the research teams (e.g. CompuLaw) by several consortium members ensure close collaboration with the named projects. There is great potential to be found in data science and AI development, entailing at the same time great risks in terms of privacy and industrial data protection. Even considering legal novelties such as the Digital Single Market strategy, the GDPR, the Network and Information Security (NIS) Directive, the e-Privacy Directive and the new e-Privacy Regulation, legal answers are often regarded as inadequate compromises, where individual interests are not really protected. As far as the academic debate on the subject is concerned, there are several challenges that still need to be addressed in data-driven financial services and that LeADS could meet, regarding the empowerment of individuals (users, clients, stakeholders, etc.) in their data processing, through “data protection rights” or “by design” technologies, such as the blockchain described before.

The approach of the LeADS Early-Stage Researchers will be developed along two lines: 1) the study of digital innovation and business models (e.g. multi-sided markets, freemium) dependent on the collection and use of data in the financial sector. This line will also: a) link this analysis to the exploration of the online behaviour and reactions of users to different types of recommendations (i.e. personalized recommendations by financial/industrial applications) that generate additional data as well as large network effects; and b) assess (through efficiency and impact studies) the many specific privacy regulations that apply to online platforms, business models, and behaviours; and 2) the proposal of a user-centric data valorisation scheme. By analysing user-centric patterns, the project aims to: a) identify alternative schemes to data concentration that place the user at the heart of the control and economic valorisation of “his” data, whether personal or not (VRM platforms, personal clouds, private open data); and b) assess the economic impact of these new schemes, their efficiency, and the legal dimension at stake in terms of liability and respect for privacy. The project will also suggest new models allowing the user to obtain results regarding the explainability of the algorithms utilized by financial organizations to provide the aforementioned personalized recommendations for their offerings. LeADS research will overcome contrasting views that consider privacy as either a fundamental right or a commodity. It will enable clear distinctions between notions of privacy that relate to data as an asset and those that relate to personal information affecting fundamental rights.

Against this background, the LeADS innovative theoretical model, based on new concepts such as “Un-anonymity” and “Data Privaticity”, will be assessed within several legal domains (e.g. consumer sales and financial services, information society contracts, etc.) and in tight connection with actual business practices and models and the software they use. Finally, given the increasing potential of Artificial Intelligence information processing, a fully renewed approach to data protection and data exploitation is introduced by LeADS, building a new paradigm for information and privacy as a framework that will empower individuals’ awareness in the data economy, wherein data is constantly gathered and processed without awareness and the potential for discrimination is hidden in the design of the algorithms used. Thus, LeADS will set the theoretical framework and the practical implementation template of financial smart models for co-processing and joint-controlling information, thereby answering the specific need to clarify and operationalize these newly-introduced notions in the GDPR.