Special Edition Blog Series on PhD Abstracts (Part II)
This post is a continuation of the blog post series on PhD abstracts. You can find the first part of the series here.
Tommaso Crepax: Unchaining Data Portability in a Lawful Digital Economy.
Data portability is a key instrument for realizing the EU policy vision on data governance. Because it enables data sharing and re-use through forms of access control, it has the power to benefit all players while adequately protecting their rights. Regrettably, economic, legal, and technical issues have hindered the development of information exchange systems supporting data portability. To create platforms and tools for data portability, developers need the emerging expertise of “legal engineers” to identify the legal requirements, ensuring that users, consumers, and “prosumers” can enjoy their rights securely, effectively, and without infringing on others’ rights and legitimate interests. This research aims to identify such legal requirements within the current, dynamic wave of EU legislation on data governance (including data sharing, access, control, and re-usability), competition in digital markets, and the provision of digital services. This quest for legal requirements moves beyond black-letter law, drawing on developments in case law as well as guidance from relevant European and national authorities. The goal is to clarify what is required of developers of portability services and of personal data controllers in terms of implementable organizational and technical measures. This clarification effort uses established methods of requirements engineering for elicitation and documentation, and is carried out with the use of relational databases. It is coordinated with a mapping of relevant ISO standards (most importantly, ISO/IEC 27701), which are further evaluated for compatibility with the elicited requirements in a loop that potentially leads to guidelines for either reform or implementation. Lastly, this work provides a list of technical solutions as identified by relevant authorities, case law, and field experts.
Cristian Lepore: A Framework to Assess E-Identity Solutions
Digital identity is important for the growth of businesses and governments. When apps or websites ask us to create a new digital identity or log in through a big platform, we do not know what happens to our data. That is why experts and governments are working on creating a safe and trustworthy digital identity. Such an identity would let anyone file taxes, rent a car, or prove their income easily and privately. This new digital identity is called Self-Sovereign Identity (SSI). In our work, we propose an SSI-based model to evaluate different identity options, and we then demonstrate the value of our model on the European identity framework.

In this special edition series of blog posts, we are excited to present the PhD abstracts of our 15 Early Stage Researchers (ESRs). Each ESR has not only contributed to the interdisciplinary research within the LeADS project and its four Crossroads but has also pursued their own individual research within the scope of their PhD thesis.
Qifan Yang: Reciprocal interplay between personal data protection under the GDPR and market competition in the data-driven society.
Data processing and AI-based techniques are now widely used in multiple sectors, including business, sociology, healthcare, mobility, and research. Moreover, companies and public organizations have produced and/or collected various types of data, which today are stored in data silos that need to be integrated to build a data economy that drives innovation. Such data spaces should involve different stakeholders in collaborative data processing, including a distributed data life cycle as well as decentralized data governance. When several systems are interconnected to carry out each step of the data life cycle, the data life cycle can be described as distributed; when multiple entities manage data governance, the governance is called decentralized. Collaborative data processing raises several issues and challenges, especially ensuring the reliability of distributed systems, trust in the decentralized governance of data processing, and compliance with legal requirements concerning data processing. Data quality plays a central role in meeting these challenges and creating a data economy: its evaluation is a potential indicator for enhancing the reliability, trust, and legal compliance of data shared across collaborative data processing. The main contribution of my research will respond to questions such as: Are data governance stakeholders able to make the right decisions to maintain data quality? What data quality criteria can be used to assess trust in all data governance stakeholders based on their actions and decisions? Which data quality criteria are pertinent to data governance? How can we assess the reliability of all components in distributed systems, i.e. the ability of each component to perform correctly and not degrade the quality of the data? How can data quality contracts be created at each step of the data life cycle based on appropriate data quality criteria? Finally, how do we respond to the fact that no existing work categorizes data quality criteria according to different EU regulations, such as the GDPR, the Data Act, or the Data Governance Act?


Barbara presented her research, co-authored with her colleague Pablo Rodrigo Trigo Kramcsak, on the topic of “