IRIT and LeADS Conference on Data Sciences and EU Regulations

The LeADS project will co-organize a conference titled “Conference on Data Sciences and EU Regulations”, held in a hybrid format at IRIT, Université Toulouse III – Paul Sabatier, Toulouse, France, on Tuesday 6 December 2022.


9:00 – 9:15 – Introduction by Jean-Marc Pierson (IRIT)

9:15 – 10:30 – Session 1

Keynote talk on “Technological barriers & opportunities for Data Sciences” – Jean-Michel Loubes (IMT-ANITI)

Keynote talk on “Barriers and opportunities for Data Sciences brought by EU regulations” – Emanuel Weitschek (Italian Competition Authority)


10:30 – 11:00 – Session 2

Posters presentation by Early Stage Researchers of the LeADS project

11:00 – 12:30 – Session 3

Panel on “Best practices for digital technology development in the era of big regulation” – Gabriele Lenzini (University of Luxembourg), Jessica Eynard (Université Toulouse 1 – Capitole), Nicolas Viallet (Université de Toulouse), Teesta Bhandare (Art Garde)


12:30 – 14:00 – Lunch


Participation is free and open to all, but prior registration is mandatory by December 4 at the latest.

ESR Research Pitches are available on YouTube!

Our Early Stage Researchers present their research topics on the LeADS channel on YouTube!

Each researcher has prepared an individual one-minute research pitch in which they present their research topic, their main research questions, why their topic is important, and how they aim to research it. A summary of each topic is also available in the description of the video.

The videos are also organized in an easy-to-watch playlist that can be accessed here. We invite you to watch them and follow the LeADS project on YouTube!

ESR Barbara Lazarotto’s participation in the 1st Democracy & Digital Citizenship Conference Series

Early Stage Researcher Barbara Lazarotto (ESR 7) presented her research at the 1st Democracy & Digital Citizenship Conference Series, hosted by Roskilde University on 29–30 September 2022 in Roskilde, Denmark.


As part of the panel “Data bodies/digital panopticon”, Barbara presented her topic, “The myth of ‘dataism’ and the construction of citizen-centered cities”, exploring the role of sociotechnical imaginaries in the datafication of cities.

To do so, Barbara first presented the concept of sociotechnical imaginaries, explaining its role in pushing for the datafication of society, and of cities in particular, by promising impartial, reliable, and legitimate decision-making while in practice driving the datafication and categorization of citizens and entire populations.

Barbara then proposed placing citizens at the center of the decision-making process behind smart cities, connecting with Lefebvre’s idea of the “Right to the City”. This new participatory city-making does not intend to replace the current democratic system, but to add new forms of citizen participation to it. Lastly, Barbara highlighted that data protection is also essential to increasing citizen empowerment in cities, by enhancing data minimisation, transparency, and proportionality.

ESRs on their first day in Kraków

A Recap of LeADS Training Module 5

From the 14th to the 23rd of September 2022, the 15 ESRs met again for their fifth and final training module. This time they had the chance to meet in the beautiful city of Kraków at Jagiellonian University, one of the seven beneficiaries of the project. The LeADS training programme is structured around several training modules that together aim to train a new generation of researchers as legally attentive data scientists: experts in both law and data science, capable of working within and across the two disciplines. Whereas the fourth training module in Crete focussed on the computer scientist’s perspective, this module focussed again more on the legal perspective.

Training Week 1: On Data and Ownership

The first week kicked off with issues surrounding the propertization of information. Ewa Laskowska-Litak from Jagiellonian University introduced the ESRs to existing legal regimes as well as technical solutions that can provide legal or de facto protection to data. Dariusz Szostek and Rafał Prabucki presented the benefits and risks of blockchain as an opportunity for privacy and intellectual property rights. Furthermore, Adrianna Michałowicz presented and discussed two key deliverables of the European Strategy for Data, the Data Governance Act and the Data Act, and how they reflect the latest European approach to data governance. Marietjie Botes from the University of Luxembourg presented legal approaches to ownership and how, prior to the European Strategy for Data, the Commission had reflected on introducing new property rights in machine-generated non-personal data.

ESRs on their first day in Kraków

Katarzyna Południak-Gierz from Jagiellonian University focussed her presentation on the consumer perspective and how personalisation techniques (e.g. behavioural or contextual tailoring) are used on consumers online. Together with Fryderyk Zoll, the ESRs had the opportunity to discuss the methodology of legal research and the difficulties they had encountered throughout their first year of research. A completely different topic was subsequently presented by Katherine Lee, research engineer at Google Brain. In her talk on language models, i.e. models that learn a probability distribution over a sequence given the previous tokens, she elaborated on how they are trained with data and how they might pose privacy risks if they leak information. The first training week ended with a session by Marietjie Botes on discrimination and the debiasing of algorithms.
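That definition of a language model can be made concrete in a few lines. The sketch below is a purely illustrative toy bigram model with made-up probabilities (not anything presented in the talk): it scores a token sequence with the chain rule, summing the log of each token’s conditional probability given the previous token.

```python
import math

# Toy bigram "language model": hand-picked conditional probabilities
# P(token | previous token). These numbers are illustrative only.
BIGRAM_PROBS = {
    ("<s>", "data"): 0.5,
    ("<s>", "law"): 0.5,
    ("data", "protection"): 0.6,
    ("data", "science"): 0.4,
    ("protection", "</s>"): 1.0,
}

def sequence_log_prob(tokens):
    """Chain rule: log P(t1..tn) = sum_i log P(t_i | t_{i-1})."""
    logp = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        # Unseen pairs get a tiny floor probability instead of zero.
        logp += math.log(BIGRAM_PROBS.get((prev, cur), 1e-9))
    return logp

score = sequence_log_prob(["<s>", "data", "protection", "</s>"])
```

A real language model replaces the lookup table with a neural network conditioned on all previous tokens, but the probabilistic reading is the same; it is also why a model that assigns unusually high probability to a particular training record can reveal, i.e. “leak”, that record.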

Training Week 2: The Next Steps of the LeADS Project and Conference on Human Rights Centred AI

Week 2 kicked off with a session by Arianna Rossi from the University of Luxembourg on the ethical and legal aspects of social media data mining from a researcher’s perspective. In her lecture on transparency and algorithmic decision-making, Agnieszka Jabłonowska focussed on transparency, its promises as well as its shortcomings. Using the example of consumer law, she discussed with the ESRs the efficiency of the newly imposed transparency obligations on online platforms with regard to the disclosure of ranking parameters. Looking at the concrete example of a platform where consumers can reserve hotels, the group concluded that, in its current implementation, the imposed transparency obligations might not achieve their goal of informing consumers about why certain offers are ranked above others. An IP law perspective was adopted in the following lectures. Michał Wyrwiński explained approaches to the “protection of image” in different jurisdictions. Furthermore, Żaneta Zemła-Pacud and Gabriela Lenarczyk discussed approaches to the propertization of information in the life sciences and to what extent the IP system already provides control over data.

Patricia Thaine adopted a more technical perspective again in her presentation on privacy-enhancing technologies such as differential privacy and synthetic data. Mohamed Ali Kandi from Toulouse University introduced the ESRs to the fundamentals of networking, e.g. by presenting the technical structure ‘behind’ the internet. Claudia Castillo Arias from INDRA, one of the LeADS partners closely involved in LeADS activities, provided the ESRs with insights into their business activity in her lecture on military operations planning and its difficulties and challenges. On Wednesday 21st, all ESRs attended the conference “Designing Human Rights Attentive AI – An Interdisciplinary Perspective”, co-organized by LeADS and the LiderLab of Scuola Superiore Sant’Anna. The conference perfectly reflected the interdisciplinary approach of the LeADS project, with diverse presentations on topics such as “Can we trust Fair-AI?” (Salvatore Ruggieri – University of Pisa), “Ethical and Legal Issues in Accessing Biobank Data to develop AI tools” (Andrea Parziale – Maastricht University) and “Explaining AI in Cyber Defence” (Marco Antonio Sotelo – Indra).
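Of the privacy-enhancing technologies mentioned, differential privacy lends itself to a compact illustration. The sketch below is a toy example of our own making (not code from the session): it releases a counting query under ε-differential privacy with the Laplace mechanism, where a count has sensitivity 1, so noise with scale 1/ε suffices.

```python
import random

def dp_count(values, predicate, epsilon):
    """Noisy count under epsilon-differential privacy (Laplace mechanism).

    A counting query changes by at most 1 when one record is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    epsilon-DP. Illustrative sketch only, not a vetted DP library.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Laplace(0, 1/epsilon) sampled as the difference of two exponentials.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Smaller epsilon = stronger privacy = noisier answer.
noisy = dp_count(range(100), lambda v: v < 50, epsilon=0.5)
```

Synthetic data takes the complementary route: instead of perturbing query answers, one trains a generative model (ideally itself under a differential privacy guarantee) and shares samples from it rather than the raw records.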

Conference on Human Rights Attentive AI

On the day after the conference, the ESRs and beneficiaries of the LeADS project met to discuss and evaluate the overall progress of the project. The training in Kraków constituted the last training module for the ESRs. They will next come together in December in Toulouse for the Technical innovations In Law Laboratories (TILL) workshop, where they will work on challenges that the non-academic partners of the LeADS project face in their business activity.




Finding a Way Out of the Ethico-Legal Maze of Social Media Data Scraping

For the latest issue of the European Research Consortium for Informatics and Mathematics (ERCIM) news journal, Arianna Rossi from LeADS beneficiary University of Luxembourg wrote an article titled “Finding a Way out of the Ethico-Legal Maze of Social Media Data Scraping”.

In her contribution, Arianna writes about the experience of the interdisciplinary project “DECEPTION” with regard to compliance with data protection rules and research ethics principles when doing research on internet data.

She concludes that, in order to help researchers find their way through the ethico-legal maze, practical guidance drafted in layman’s terms, as well as best practices and toolkits tailored to specific technologies, should be created. Her article is available on this website. A more extensive version of her findings has been published at the Privacy Symposium 2022.

TcIoT Workshop “Trusted Computing and the Internet of Things”


We are pleased to invite you to the workshop “Trusted Computing and the Internet of Things”, which will take place on Thursday, November 10 in hybrid mode at the Institut de Recherche en Informatique de Toulouse – IRIT (a LeADS partner). Participation in this event is free and open to all, but prior registration is mandatory. For more information and registration, go to the event website.

Program:

9:15-9:30 – Introduction

9:30 – 10:30 – Session 1 

  • Trust management in large-scale IoT – Pr. Mawloud OMAR (Université Bretagne Sud)
  • How to authenticate things (objects) remotely? Opportunities and challenges of today’s technologies – Dr. Gabriele LENZINI (University of Luxembourg)

10:30 – 11:00 – Coffee Break

11:00 – 12:00 – Session 2

  • Analyzing the risk of IoT enabled cyber-physical attack paths against critical systems – Pr. Panayiotis KOTZANIKOLAOU (University of Piraeus)
  • Privacy Preserving Authentication for Internet of Vehicles (IoV) – Dr. Khaled HAMOUID (ESIEE Paris)

12:00 – 14:00 – Lunch Break

14:00 – 15:00 – Session 3

  • Secure integration of IT and OT – Prof. Sokratis KATSIKAS (Norwegian University of Science and Technology)
  • Trust in the IoT ecosystem – Dr. Youcef IMINE (Université Polytechnique Hauts-de-France)

15:00 – 15:30 – Coffee Break

15:30 – 16:30 – Session 4

  • Greater reliability in the IoT thanks to the group – Pr. Maryline LAURENT (Télécom SudParis)
  • Blockchain-based cryptographic key management for IoT – Dr. Mohamed Ali KANDI (Paul Sabatier University – Toulouse 3)

16:30 – 17:30 – Round table and conclusion

The social contract sauce. Contains: Europol, big data, spyware, employment contracts (May contain traces of privacy)

Credit: Europol

On the 19th and 20th of October I was invited to participate in the Europol Cybercrime Conference held at the headquarters of Europol in The Hague, Netherlands.

This year’s theme was “The evolution of policing”, and it brought together law enforcement (“LE”) agents, private and public cybersecurity experts, Data Protection Officers, researchers, and professors from all around the world to answer the question of whether there is a need for a social contract in cyberspace.

Although the topic of cyber policing may seem somewhat distant from LeADS’ scope, the two are surprisingly connected. Many links exist between aspects of European cybersecurity and law enforcement and key issues in the LeADS project, such as the regulation of cyberspace amid the tension between individual freedom and public interests, the concept of trust and its variations in law enforcement of the metaverse, fair vs effective data governance, the use of big data vs machine learning, and the opportunities and challenges of portability, interoperability and re-usability of data for policing purposes.

Introducing one of the first debates, the Commissioner for Home Affairs, Ms. Ylva Johansson (one of only 8 female speakers out of 39 at the conference), opened with the statement that security is the social contract. It is understandable why Rousseau’s idea of the social contract would be intertwined with that of entrusting security to the power of the state, with its checks and balances, taking the pursuit of justice out of the hands of people driven by their individualistic amour propre. However, a part of that reading is missing, one that I personally believe is the most important and that was discounted too many times throughout the conference, sometimes accidentally, sometimes wilfully: in Rousseau’s vision of the social contract, each person should enjoy the protection of the state “whilst remaining as free as they were in the state of nature.”

Security is a necessary pillar for the existence and evolution of democratic societies, but it is only a starting point, one of the bases, not the social contract itself. It is a condition for the existence of the social contract, but it is far from exhausting its functions. There is so much more that citizens of a democratic society can and should expect from a nation state than the mere prevention and investigation of crimes, offline and online: for example, the upholding of policies for improving social welfare, civil rights, healthcare, and protection from discrimination or anti-competitive behaviors. Understanding the social contract in its myopic security meaning would legitimize Orwellian states that secure people through mass surveillance and social credit scoring. Privacy, in this context, is the first line of protection for that Rousseauian individual freedom, together with personal data protection, which functions as a proxy for the protection of every other fundamental right, freedom, and legitimate interest enshrined in the European “constitutions”. The essence of the social contract lies in anticipating law enforcement action before fundamental rights are violated, while keeping all fundamental rights in the balance, security being only one among many.

This underlying leitmotiv of the conference resurfaced on many occasions. Representatives of law enforcement repeatedly lamented that bureaucracy concerning the rule of law and privacy often ends up dulling investigative tools, for example by limiting the collection of personal data to specific legal bases, along with the time for its retention and analysis. However, what these laws limit is only the indiscriminate, trawl-like collection of non-contextual data for unspecified use and unlimited time, in case it might come in handy in the future. It also seems clear that LE is still holding on to the promise of big data analytics, with its tenet of always collecting and retaining everything possible, while discounting privacy-friendlier alternatives powered by machine learning algorithms that do not need such amounts of data, but rather smaller, sanitized, quality datasets to train and test models. A hybrid system combining machine learning models with targeted data analysis would dramatically reduce the need for voluminous, noisy, cumbersome, leakable data collection and storage, while respecting the privacy of uninvolved citizens: the first would help in the hunt for suspicious activities online, while the second circumscribes the area of investigation to suspected individuals only, thus upholding proportionality.

LE’s request for more access to data depends on people’s trust in governmental institutions. And such trust is hard to establish, but breaks easily. One investigative journalist, in this regard, raised the thorny issue of the use of the Pegasus spyware by European LE agencies. The reference was to the spyware found installed on phones belonging not only to criminal suspects, but also to journalists, European prime ministers, members of parliament, and civil society activists; in total, it collected 4 petabytes of data on innocent people before being exposed by Citizen Lab, a Canadian research centre. Mutatis mutandis, but with the same critical lens, we should look at the current EDPS legal action against Europol. In the case pending before the ECJ, the EDPS is challenging the legitimacy of the new Articles 74a and 74b of the Europol Regulation, which retroactively legalize Europol’s processing of large volumes of individuals’ personal data with no established link to criminal activity. It is no wonder that such events erode people’s trust in LE. Transparency in operations and decision-making could have played a positive role in establishing trust between private citizens and LE, yet on these occasions the lack thereof backfired badly, perhaps irremediably.

The problem that LE faces is not only the need for more data and easier access to it, but also the need for data to be formatted, visualized and shared in a way that is actionable. Data actionability, in the context of coordination and crime prevention, requires both understandability by operators (starting with human-readability) and portability to receiving systems (starting with machine-readability). Unfortunately, on the operators’ side, many high-level officers lamented the extreme lack of human resources with data science skills, which stands in stark contrast with their pledge to big data and their concomitant jettisoning, or non-hiring, of digitally competent personnel coming from civil society or the private sector: most open vacancies at Europol are restricted to seconded officials. On the side of portability and interoperability of data and systems, there is a lack of standardization, which renders communication and coordination among national police forces cumbersome and inefficient, much like in the European market for data.

All in all, the conference left a bitter taste in my mouth. One of the biggest lessons that years of research in the regulatory aspects of technology taught me is that technology regulation is complex. To make sense of it, analysts need a granular, expert and sensible look at the specific context in which technologies are deployed, but also an understanding of their effects on the macroscopic picture of international geopolitical, economic and social systems. Cybercrime prevention and repression is one such complex system, whose analysis and management need multidisciplinarity, out-of-the-box thinking, lateral and longitudinal vision, innovative skills, and state-of-the-art tools. But most importantly, this evolutionary process of policing will need to be built on the essence of Rousseau’s social contract, the credo that security is a corollary of freedom, not the other way around, and must serve its purposes.

Unfortunately, at least from an organizational standpoint, it seemed that Europol is following a different, if not altogether opposite, path to reach its security goals: the call for more data retention, the discounting of machine learning, the lack of digital expertise and the admitted difficulty in acquiring it, and the hunt for human and technical resources only from inside LE seem less like the evolution of a trustworthy, pioneering, EU values-driven agency, and more like a gradual transformation into an old-school police department.

Scuola Superiore Sant’Anna ESRs at Bright Night – Night of the Researchers

On 30.09.2022, teams from Scuola Superiore Sant’Anna and the Consiglio Nazionale delle Ricerche were scouting “intergalactic parliamentarians” to solve the most pressing legal challenges of the next 100 years. And what better parliamentarians than those who will be living then? Children and parents, participants in the discussion game Regolare Tecnologie che Regolano (“regulating technologies that regulate”), got to try it for themselves!

The ESRs joined efforts to deliver an electrifying interactive spectacle during the Bright Night – Night of the Researchers, as the discussion game gave participants a unique chance to learn about the regulatory aspects of new technologies while engaging in dynamic and family-friendly debates based on the idea of a tug-of-war. Participants were presented with an idea or problem [“Should we install one thousand new CCTV cameras in Pisa?”, “Should we restrict video game time to only three hours a week?”, “Should we provide housecare robots to everyone over 65?”] and then asked to express their opinion by moving around the room and placing themselves in one of the five sectors of the parliament (strongly in favor, in favor, not sure, in disagreement, strongly in disagreement).

Participants were then confronted with a set of facts, based on real-life events, specifically picked to question their initial opinions, and hopefully make them switch sides, repeatedly! In line with the game’s first goal, this part was designed to make them reflect on how hard it is to regulate technologies, as changes of context and use would significantly affect their “gut” opinions. Once the round of facts was finished, teams were formed based on participants’ positioning in one (Agree) or the other (Disagree) “hemisphere” of the parliament. Eventually, the two opposing sides clashed in a heated and often unexpectedly funny debate about the pressing issues. Based on the outcome of the debate, a final vote was cast, and the proposition was either adopted, abandoned, or modified to reach an agreement. The highlight of the game came when the kids, stacked against their parents, snatched a result of 1.5 hours of play a day!

The procedure was designed to reach the game’s second goal, to critically evaluate facts and put forward the most convincing argumentation, and its third, to learn how democratic debates develop by mimicking the actual rules of parliamentary democracies (albeit in a slightly simplified version).

The game lasted for more than 4 hours, and dozens of participants debated more than seven available topics touching on different areas of regulatory challenges. Given how much attention and enthusiasm the game generated, we expect re-editions in the upcoming years.

Join us there!

RGPD: Une maturité sans cesse challengée Conference

The LeADS supervisor Prof. Jessica Eynard is co-organizing a conference titled “RGPD: Une maturité sans cesse challengée” at the Université Toulouse 1 Capitole – Amphithéâtre Maury, Toulouse, on Friday, October 21, 2022.


13:30 – Introduction by Prof. Jessica Eynard

13:45 – Introduction by Prof. Reinout Van Tuyll

14:00 – Quelle Effectivité des droits de la personne concernée? – Prof. Jessica Eynard and Remi Cauchois

15:00 – Le casse-tête des durées de conservation des données – Prof. Guillaume Desgens-Pasanau and Dr. Benjamin Laroche

16:00 – Pause

16:30 – Les impossibles (?) transferts de données vers les États-Unis – Prof. Cécile de Terwangne and Reinout Van Tuyll

17:30 – Une approche par le risque à renouveler? – Fabien Crozet and Prof. Yves Poullet

18:30 – Closing remarks

For registration, please contact


ESR Mitisha Gaur presents her work at the IE Law School’s LawTomation Days Conference 2022

The IE Law School (Madrid, Spain) hosted the LawTomation Days conference on the 29th and 30th of September, focused on examining and discussing the development of AI in various areas of society. ESR Mitisha Gaur presented her work on predictive justice as a panel member on the discussion track Legal Tech and E-Justice.

Her work, titled “The core tenets for designing a reliable predictive justice AI system”, investigated the use of AI systems in courts and the basic design issues that make justice merely statistical and prediction-based instead of deliberative in nature. Through her work, she highlighted the importance of creating AI systems that include the context and background of the facts as a core component, allowing them to better understand the issues placed before them.

Her work also highlighted the hyper-reliance on substantive law and how it skews the ability of judicial officers and lawyers to rely on the computations of predictive justice algorithms: these systems completely ignore procedural law and therefore produce results that are incompatible with real-world application. She argued that including procedural law in the design of predictive justice systems is crucial for the fairness, reliability and accountability of the AI system.