Nouri, Zouhaier, Walid Ben Salah, and Nayel AlOmran. "Artificial Intelligence and Administrative Justice: An Analysis of Predictive Justice in France," Hasanuddin Law Review 10, no. 2 (August 2024): 119-143. DOI: 10.20956/halrev

Hasanuddin Law Review, Volume 10, Issue 2, August 2024. P-ISSN: 2442-9880, E-ISSN: 2442-9899. This work is licensed under a Creative Commons Attribution 4.0 International License.

Artificial Intelligence and Administrative Justice: An Analysis of Predictive Justice in France

Zouhaier Nouri,1 Walid Ben Salah,2 Nayel AlOmran3
1 Faculty of Law and Political Sciences, Tunis El Manar University, Tunisia. E-mail: zouhaier.nouri@fdspt.
2 College of Humanities and Social Science, Zayed University, United Arab Emirates. E-mail: bensalah@zu.
3 College of Humanities and Social Science, Zayed University, United Arab Emirates. E-mail: alomran@zu.

Abstract: This article critically analyzes the ethical and legal implications of the adoption of predictive analytics by the French administrative justice system. It raises a key question: Is it wise to integrate artificial intelligence into the administrative justice system, considering its potential benefits, despite the associated risks, ethical dilemmas, and legal challenges? The research employs a method based on an extensive literature review, a qualitative analysis of the adoption of predictive analytics tools by the French administrative justice system, and a critical evaluation of the benefits and issues these tools bring. The study finds that AI can make the administrative justice system more efficient, reduce backlogs, and enhance the consistency and predictability of judicial decisions. However, the study also identifies important risks and serious ethical and legal issues associated with integrating AI tools into the justice system. In particular, AI utilization can lead to the dehumanization of justice and poses real risks to the independence and impartiality of justice. While AI can offer significant benefits to all the stakeholders of the administrative justice system, its integration must be approached with caution. A progressive and responsible approach to AI adoption is necessary to avoid compromising judicial integrity and to uphold fundamental justice values.

Keywords: Artificial Intelligence; Administrative Justice; Legal; Judicial; Transparency; Technology

Introduction

Artificial Intelligence (AI) is being integrated across a wide range of fields, including security, defense, transportation, finance, and education, exerting a transformational influence. This assimilation enhances efficiency and drives innovation, while sometimes sparking fears and ethical concerns. The legal domain is no exception: as a product of human intellectual activity, it has increasingly embraced AI technologies. The structured nature of legal texts provides a valuable foundation for constructing automatic analyses,1 announcing a new era in which AI's potential to transform legal practices and principles is both promising and challenging. Indeed, digital technologies can serve as valuable tools for judges. According to some scholars, AI is not just supplementing but potentially replacing human intelligence, specifically that of the judge and those involved in the administration of justice.2

1 Clément, Marc. "Algorithmes au service du juge administratif, peut-on en rester maîtres ?" Actualité Juridique Droit Administratif: 2453-2460.
2 Yves Gaudemet, "La justice à l'heure des algorithmes," Revue du droit public, 3: 651-664.
What exactly is artificial intelligence? Before answering this question, it should be noted that the definition of AI has evolved over time since its emergence in the 1950s. Its modern origin is commonly attributed to Alan Turing and the test described in his foundational article, "Computing Machinery and Intelligence."3 Turing's test posits that artificial intelligence encompasses humanity's endeavor to create a mechanism that simulates or replicates human cognition. This creation relies on algorithms and computational power. The adjective "artificial" is used in contrast to a biological development process.

A distinction is commonly drawn between two types of artificial intelligence: weak artificial intelligence and strong artificial intelligence.4 The former is a simulation of intelligence, generating a mere programmed mimicry of human behavior. The latter is a production of intelligence, giving rise to the ability to develop reasoning, and possibly even consciousness and emotions, similar to humans.

At first glance, justice and algorithms seem so alien to each other.5 "Until recently, administrative judges, and more broadly public law jurists, either knew nothing about predictive justice or had vaguely heard of it and looked at it with amused disdain, seeing it as just the latest trend of some civil judges or some private law scholars, overly fond of futuristic novels and science fiction movies."6 However, in the current era of big and open data,7 computing, and the digitalization of justice, the integration of artificial intelligence into the judicial sphere is no longer pure fiction. This integration signals a transformation in how the judiciary operates, influencing not just access to justice but also altering the working methods of judges, clerks, and legal assistants.

AI in the justice system involves using sophisticated computer algorithms and machine learning to replicate aspects of human thinking. In the judicial context, AI can process massive amounts of legal data, forecast case outcomes, and aid judges by uncovering patterns and insights that might be missed by humans. This technology aims to make judicial processes more efficient, consistent, and transparent, ultimately assisting judges with informed, data-driven decisions. Among the various technologies employed, predictive justice emerges as a revolutionary application. In recent years, algorithms have made their way into the domain of judicial activity, leading to the emergence of what lawyers have somewhat prematurely labeled predictive justice. This concept "burst into the routine and hushed world of administrative justice like thunder suddenly breaking in a clear sky."8

3 Alan Turing, "Computing Machinery and Intelligence," Oxford University Press, Vol. 59, n° 236 (October 1950): 433-460.
4 Council of Europe, European Commission for the Efficiency of Justice (CEPEJ), "European Ethical Charter on the Use of Artificial Intelligence (AI) in Judicial Systems and Their Environment," December 2018. Accessed June 24, 2024. https://rm.coe.int/ethical-charter-en-for-publication-4-december2018/16808f699c.
5 Lasserre, Marie-Cécile. "L'intelligence artificielle au service du droit : la justice prédictive, la justice du..." Les Petites Affiches, 130: 6-12.
6 Benoît Plessix, "Vers une justice administrative prédictive ?", in Le droit administratif au défi du numérique (Dalloz, coll. Thèmes et commentaires, série AFDA), 81-103.
7 Loïc Cadiet noted that "without data, there is no data openness, nor data exploitation, no artificial intelligence, no virtual trial, nor predictive justice". Blanc, Nathalie, and Mustapha Mekki. "Le juge et le numérique : un défi pour la justice du XXIe siècle," Dalloz: 93.
8 Benoît Plessix, "Vers une justice administrative prédictive ?", 81.

Predictive justice software is a computational tool designed to indicate, based on judicial data input into a computer, the solution that statistically has the highest probability of being chosen. Loïc Cadiet, in his 2017 report on the open data of judicial decisions, defined it as follows: "Predictive justice consists of tools developed by analyzing a vast amount of judicial data that suggest, especially through probability calculations, how a dispute might turn out. This means it will be possible, from exploiting various data from all decisions made on an issue, whether legal (nature of the action, norm applied, solution provided by the judge) or factual (profile and behavior of the parties, context of the dispute), to predict what a judge might decide in such a specific case, estimate the expected gain, evaluate the foreseeable duration of a procedure."9

In essence, predictive justice uses algorithms to assess the likelihood of success in contentious procedures.10 However, it is important to understand that predictive justice does not equate to justice itself but rather serves as a tool for analyzing case law and parties' submissions to predict the direction of future court decisions based on a comprehensive review of a large number of previously resolved disputes.11

Thus, the rise of the internet and dematerialization, the open data of court decisions,12 combined with the development of algorithms and AI, present the judiciary, and particularly administrative justice, with a new and exciting challenge: the challenge of predictive justice. This requires forward-thinking and particular vigilance from lawyers and legal professionals. There is a clear link between predictive justice and the open data of court decisions, which ensures the reproduction, free accessibility, and reuse of digital data related to court decisions. These freely available and accessible data are now capable of automated processing by algorithms. In other words, referring to the definition given by the French Council of State in its study on public power and digital platforms of September 28, 2017, an algorithm operates "by applying a finite series of rules and operations to obtain a result."13

9 Loïc Cadiet, "L'open data des décisions de justice" (Report, French Ministry of Justice, November 2017). Accessed June 24, 2024. https://w. fr/sites/default/files/migrations/portail/publication/open_data_rapport.
10 Garapon, Antoine. "Les enjeux de la justice prédictive," La semaine juridique 1: 47-52. See also Dondero, Bruno. "Justice prédictive : la fin de l'aléa judiciaire ?" Recueil Dalloz 10: 532. For this author, predictive justice relies on "the idea (...) of having tools which, based on an analysis of existing jurisprudence, allow predicting what future jurisprudence will be, that is to say, identifying which solution will be given to a dispute X by a judge Y, either in view of the data of the dispute X, or through an analysis of the parties' submissions (this is not about graphology but textual analysis)".
11 Guy Canivet, "Justice : faites entrer le numérique" (Institut Montaigne, Report, November 2...), 27. Accessed June 24, 2024. https://w. org/ressources/pdfs/publications/justice-faitesentrer-le-numerique-rapport.
12 It is important to acknowledge the close relationship between the terms open data and big data. Open data pertains to digital data that is predominantly gathered or maintained by public entities and, to a lesser extent, private individuals. This data is freely accessible and can be used by anyone without any cost (open data). Conversely, big data describes exceptionally vast collections of digital data generated by modern technologies and stored with the aid of advanced computing resources (massive data or mega data). For a comprehensive study, see: Bourcier, Danièle, and Primavera De Filippi. "L'Open Data : universalité du principe et diversité des expériences ?" La Semaine juridique, Édition générale 38: 1-9.
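Cadiet's probability-based definition quoted above can be made concrete with a minimal, purely illustrative sketch. Nothing below describes any actual French tool: the case features, the data, and the function are invented for the sake of the example, and a real system would work on thousands of anonymized decisions with far richer features and models.

```python
from dataclasses import dataclass

@dataclass
class PastDecision:
    # Hypothetical, highly simplified description of an adjudicated dispute.
    dispute_type: str        # invented category label
    norm_applied: str        # legal basis invoked, as a plain label
    claimant_won: bool       # outcome of the past case
    duration_months: int     # time from filing to judgment

def estimate(archive, dispute_type, norm_applied):
    """Estimate chance of success and expected duration as simple frequencies
    over past decisions sharing the same legal characteristics."""
    similar = [d for d in archive
               if d.dispute_type == dispute_type and d.norm_applied == norm_applied]
    if not similar:
        # No comparable precedent: the tool has nothing meaningful to report.
        return None
    p_success = sum(d.claimant_won for d in similar) / len(similar)
    avg_duration = sum(d.duration_months for d in similar) / len(similar)
    return p_success, avg_duration

# Toy archive standing in for a mass of digitized decisions.
archive = [
    PastDecision("permit_refusal", "norm_A", True, 7),
    PastDecision("permit_refusal", "norm_A", False, 9),
    PastDecision("permit_refusal", "norm_A", True, 8),
    PastDecision("tax_dispute", "norm_B", False, 14),
]

print(estimate(archive, "permit_refusal", "norm_A"))
# (0.666..., 8.0): a statistical tendency drawn from precedent, not a judgment.
```

The final comment is the point of the sketch: what such a tool returns is a frequency observed over comparable precedents, which is precisely why the article insists that it informs, rather than replaces, the judge.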
This development raises the question: can predictive justice replace human justice? With the digitization of the justice system and the availability of judicial data, predictive justice is likely to influence the future of legal proceedings, possibly leading to the introduction of "robot-judges" or a more automated form of justice.

The purpose of this article is to critically analyze the consequences of the adoption of predictive analytics by the French administrative justice system. The research presents the historical evolution of the integration of AI into the French judiciary and analyzes both the benefits and drawbacks of integrating AI into judicial proceedings. Based on this critical analysis, the article advocates a cautious, progressive, and responsible adoption of AI tools by the judiciary.

Between reality and fiction, hope and mistrust, predictive justice is no longer an illusion. It has transitioned in France from a futuristic concept to a tangible reality, marking a logical evolution resulting from the digitization of the public service of administrative justice. However, this evolution is accompanied by a need for careful consideration and balanced perspectives.

Predictive Administrative Justice in France: A Logical Evolution

Digital technologies are now widely used as tools in the work of administrative judges. The digital space is profoundly transforming the traditional working methods of administrative justice. The deployment of digital tools by the administrative judge, combined with the development of algorithms, artificial intelligence, and the open data of court decisions, has led to a more radical evolution characterized by the digitization of the judging function. This experience with predictive justice raises the question of a future judgment rendered by, or at least assisted by, a robot-judge.

From Human Justice to Algorithmic Justice

Like any public service, the administrative justice system is fundamentally designed to fulfill the needs of its users by facilitating the resolution of disputes within reasonable timeframes through a simple, efficient, accessible, and transparent process. Achieving these goals necessitates the dematerialization and digitization of the administrative justice service, leveraging new information and communication technologies. These technologies are essential for evaluating the quality of service in an environment that is increasingly digital and automated. The integration of computer tools within the administrative justice process is intrinsically tied to the principle of the mutability of public service.
This principle underscores the necessity for the administrative justice system to evolve in response to societal and technological advancements, aiming to modernize administrative justice and enhance the services provided. This is because "justice, in any form, is not rendered for the convenience or security of judges or legal staff but for the litigants. It is through their perspective that we must evaluate the standards of well-administered justice."14

13 Conseil d'État, "Étude annuelle 2017. Pouvoir public et plateformes numériques : accompagner l'ubérisation," La documentation française, 2017. Accessed June 24, 2024. https://www.vie-publique.fr/rapport/36918-etude-annuelle-2017-du-conseil-etat-accompagner-luberisation
14 Jacques Robert, "La bonne administration de la justice," AJDA 1995, 117.

In France, the "Télérecours" application serves as a pivotal tool for facilitating electronic communication at every procedural stage before the administrative tribunal. This application was mandated by Decree No. 2016-1481 of November 2, 2016, which dictates the use of electronic procedures before the Council of State, administrative courts of appeal, and administrative tribunals, and which came into effect on January 1, 2017.15 The decree establishes a comprehensive requirement for legal professionals to use the "Télérecours" application for submitting case documents to the administrative court, rendering any case filed otherwise inadmissible. This mandate extends to defending parties and other case participants, necessitating the digitization and electronic cataloging of documents, complete with detailed bookmarks for easy identification according to the inventory.16 Moreover, video hearings have been possible in administrative courts since 2005, proving particularly beneficial for overseas (outre-mer) administrative tribunals that lack resident magistrates.

The digital transformation of administrative justice in France extends beyond the "Télérecours" system and the use of video hearings. Presently, French administrative judges, along with third parties, can access a comprehensive database of rulings and judgments that were previously available exclusively through specialized legal journals, such as the Lebon and the GAJA. These jurisprudence databases democratize the consultation of decisions made by the various administrative jurisdictions, significantly enhancing transparency and accessibility in the legal process.

Within the same framework, the Law for a Digital Republic of October 7, 2016,17 as amended by the Law of March 23, 2019,18 mandates that administrative jurisdictions make their judgments available to the public, at no cost, in electronic form.19 This requirement is specified in Article L. 10 of the Administrative Justice Code, as amended by Law n° 2019-222 of March 23, 2019.

15 Also see the "Télérecours" decree of October 9, 2020 (Official Journal of October 11, 2020, text n°...), whose provisions have been in effect since January 1, 2021, and which aims to reorganize and improve the drafting of the provisions of the Administrative Justice Code concerning electronic communication before administrative courts.
16 For a comprehensive study on the Télérecours application, see Laurence Helmlinger, "Télérecours : la dématérialisation devient obligatoire devant les juridictions administratives pour les avocats et les administrations," RFDA 2017, 12. For this author, "it is not forbidden to think that, like the revolution experienced in documentary research, the gradual appropriation by legal professionals of techniques specific to working on digitized documents and the development of increasingly sophisticated computer applications are renewing their methods of apprehending a case".
17 See the application decree n° 2020-797 of June 29, 2020, relating to the public availability of decisions of judicial and administrative courts. Articles 1 to 3 of the first title of the decree are devoted to making the decisions of administrative jurisdictions available to the public, while Articles 4 to 6 of Title 2 are devoted to making the decisions of judicial jurisdictions available to the public.
18 Law No. 2019-222 of March 23, 2019, law for digital justice reform. For a comprehensive study on the said law, see: Thierry, Jean-Baptiste. "Réforme de la justice. La loi n° 2019-222 du 23 mars 2019, loi de réforme pour la justice numérique ?" La Semaine juridique, Édition générale n° 19, 13 Mai 2019: 524-543.
19 Provided that the conditions set out in paragraphs 3 and 4 of Article 10 of the Administrative Justice Code, along with paragraphs 2 and 3 of the Judicial Organization Code, are fulfilled. See Articles 4 to 6 of Title 2 of the aforementioned decree of June 29, 2020.

Furthermore, the "Skipper" software, used by administrative magistrates and, particularly, by clerks, incorporates algorithms designed for processing and analyzing cases pending before any administrative jurisdiction. This software executes three main functions. Firstly, it employs a sorting algorithm that organizes statistical case data daily, distinguishing cases based on their phase (first instance, appeal, or cassation) and their adjudication status. Secondly, Skipper processes case information to direct the activities of clerks and other members of the jurisdiction. This includes investigatory measures, notifications to parties, assessment of jurisdictional competence, and regularization of applications. Lastly, the software includes an application named "juradinfo," which identifies patterns in litigation in order to suspend contentious proceedings temporarily. This suspension facilitates coordination between the administrative courts involved until a designated "lead" court resolves the dispute, ensuring the completion of all legal remedies, or it initiates the contentious opinion procedure automatically upon detecting a group action before the administrative judge.

The administrative justice system's public service is deeply influenced by the rapid advancements in computer technology and the digitization of society, marked notably by the development of algorithms and big data. These technological advancements have sparked debates around the introduction of predictive justice, driven by algorithmic tools.

The Emergence of Predictive Justice

The concept of predictive justice, although not new, has transitioned from theory and science fiction to a tangible reality. The emergence and development of predictive justice in France is the result of both technological and legal advancements.

The Concept and Reality of Predictive Justice

Does predictive justice exist, or is it merely a temptation to forecast judicial outcomes? Is it feasible for administrative justice to adopt predictive methodologies? Could robots equipped with algorithmic software be the judges of tomorrow?
Or should we heed the words of Carbonnier, who contended that "the judge is a man and not a syllogism machine: he judges as much with his intuition and sensitivity as he does with his knowledge of the rules and his logic"?20

At first glance, administrative justice is not an act of prophecy and prediction. The judge does not predict the law; he states the law.21 Justice, in its essence, is not and cannot be predictive. Human administrative justice and algorithmic administrative justice seem, at first glance, foreign to each other. Yet asking about the existence of predictive administrative justice, without being burdened by the interrogative form to allow for doubt, is neither bold nor provocative. It is less about wondering whether administrative justice and algorithms intersect than about wondering why they were late in doing so. The esteemed Italian legal theorist Norberto Bobbio once wrote that since the time of what was called the fetishism of the law, much water has flowed under the bridges, and no one seriously believes in the automated judge anymore.22

20 Carbonnier, Jean. Droit civil : Introduction, les personnes, la famille, l'enfant, le couple. Quadrige, 2004.
21 Yves Gaudemet, "La justice à l'heure des algorithmes," Revue du droit public, 3: 651-664.

Predictive justice has become a concrete reality today, though the idea itself is not novel. Visionaries from Gottfried Wilhelm Leibniz23 to Hans Kelsen,24 along with proponents of formal logic,25 have long aspired for the law to embody mathematical precision through deductive logic. The 18th-century concept of the judge as the "mouthpiece of the law" closely mirrors the contemporary notion of a robot-judge. Historical figures such as Nicolas de Condorcet26 in 1785, Pierre-Simon Laplace27 in 1814, and Siméon Denis Poisson in 1837,28 endeavored to determine the probability of a judicial decision being correct.29

Extending this analysis further, it could be argued that algorithms and trials operate under similar logics and pursue identical outcomes. On the one hand, a trial involves applying rules to resolve a dispute; on the other hand, an algorithm is defined as a set of instructions designed to solve a problem. Bridging the gap between rules and instructions, and between disputes and problems, leads to the inference that a trial can be likened to an algorithm.30

Predictive justice tools are computational tools designed to forecast the possible outcomes of future legal proceedings based on jurisdictional data, particularly the history of adjudicated cases. This is done using large volumes of case law data (big data) processed by artificial intelligence. However, the choice of the term "justice" may be questioned.31 If the term is meant to designate the virtue of justice, then it concerns a moral quality, which is not something that can be predicted. If it refers to the judicial institution, including its buildings, staff, magistrates, and clerks, then the term "predictive" is irrelevant.

While current artificial intelligence can "simulate" human intelligence, as noted in the 2017 report by the French Council of State,32 it appears that AI has not yet achieved the capability to fully replicate human legal reasoning.

22 Norberto Bobbio. Essais de théorie du droit. LGDJ - Bruylant (Coll. La pensée juridique), 38.
23 Wilhelm Leibniz. Le droit de la raison (Librairie philosophique J. Vrin).
24 Kelsen, Hans. "Qu'est-ce que la théorie pure du droit ?" Droit et société 22: 551.
25 Kalinowski, Georges. La logique déductive. Essai de présentation aux juristes. PUF, 1996.
"La logique dyductive. Essai de prysentation aux juristes. " PUF, 1996. 26 De Condorcet. Nicolas. Essai sur l'application de l'analyse y la probability des dycisions rendues y la plurality des voix. Cambridge University Press, 2014. Accessed June 24, 2024. https://gallica. fr/ark:/12148/bpt6k1057808s/f11. 27 Pierre Simon De Laplace. Essai philosophique sur les probabilitys, (Paris. Bachelier. Imprimeur-Librairie. Accessed June 24, 2024. 28 Simyon Denis Poisson. Recherches sur la probability des jugements en matiyre criminelle et en matiyre (Paris. Bachelier. Imprimeur-Librairie. Accessed June https://gallica. fr/ark:/12148/bpt6k110193z. 29 Barbin. Evelyne, and Yannick Marec. "Les recherches sur la probability des jugements de Simon-Denis Poisson. " Histoire & Mesure . : 39-58. 30 Jean-Baptiste Duclercq. Les algorithmes en procys, 131. 31 Lasserre. Valyrie. "Justice prydictive et transhumanisme. " Archives de philosophie du droit 60, no. : 311-320. 32 Conseil dAoEtat. Etude annuelle 2017. Pouvoir public et plateformes numyriques : accompagner lAo. La documentation franyaise, 2017. Accessed June 24, 2024. https://w. vie-publique. fr/rapport/36918etude-annuelle-2017-du-conseil-etat-accompagner-luberisation P-ISSN: 2442-9880. E-ISSN: 2442-9899 further in the process of digitizing justice,33 the emergence of a robot judge remains a distant prospect34. From our perspective, while we do not foresee algorithms replacing judges in the near future, they could undoubtedly serve as valuable tools for assistance. Evolution of the Legal Framework The development of predictive justice in France was facilitated by a legal framework applicable to predictive justice has been shaped by several key legislative and regulatory initiatives, emphasizing the country's approach to balancing technological innovation with legal and ethical considerations. Notably, the French law of January 6, 1978, on information technology, files, and freedoms has evolved over time to address the integration of automated data processing in judicial decisions. Initially, it prohibited basing any judicial decision implicating an assessment of a person's behavior on automated personal data processing intended to evaluate personality However, in 2004, the Council of State interpreted this rule in a way that allows such automated processing results to be considered among other factors in making a decision, thus not outright banning the use of algorithmic analysis in judicial contexts but limiting its influence. This stance towards algorithmic justice is further reflected in European regulations, such as the General Data Protection Regulation (GDPR) and the proposed AI regulation by the European Commission. The GDPR permits processing sensitive data, if necessary, for courts acting within their judicial capacity, while the proposed AI regulation classifies high-risk AI systems under strict legal requirements concerning quality, transparency, and human oversight, addressing legal professionals' legitimate concerns. Overall, while algorithmic justice cannot replace human judges, it can support decisionmaking processes when combined with other elements at the judge's disposal. The development of a legal framework that reconciles the use of algorithms with justice's demands is fundamental and has become indispensable today. 35 From our perspective, while we do not foresee algorithms replacing judges in the near future, they could undoubtedly serve as valuable tools for assistance. 
33 In the United States, for example, evidence-based sentencing, which has developed in the context of criminal law, must be distinguished from predictive justice in that it does not aim to replace the judge: the software acts as an additional expert that assesses the probabilities of an offender's recidivism and provides expertise to the judge. Similarly, the British experience with Online Courts involves completely dematerialized jurisdictions, since it is the software that proposes the legal solution, which is then examined by the judge, who validates it or not; these jurisdictions only decide relatively minor civil cases, with a threshold of up to £25,000.
34 Marc Clément writes along these lines: "While the substitution of the judge by the machine remains a distant prospect, which can fuel interesting but very abstract reflections, it is easy to identify today significant evolutions in the profession of administrative judge compared to what it was before the 2000s". Marc Clément, "Algorithmes au service du juge administratif : peut-on en rester maître ?", 2455.
35 Rambaud, Romain, Alya Hafsaoui, and Caroline Bligny. "Une justice algorithmique pour les élections politiques ?" Actualité juridique Droit administratif 25: 1323.

In conclusion, although predictive justice is making inroads into the legal field, practical experience suggests that this concept still navigates the boundary between speculative fiction and tangible application. The vision of a fully predictive judicial system, while enticing, remains an ambitious goal yet to be realized.

Predictive Administrative Justice: A Mixed Evolution

Regardless of the effectiveness of predictive justice or the accuracy of its predictions compared to actual judicial outcomes, the deployment of predictive justice tools undoubtedly offers advantages and signifies a positive development. Nonetheless, it is crucial to acknowledge that the benefits of predictive administrative justice do not overshadow its potential flaws and risks.

Promising Advantages

The integration of algorithms into administrative justice has sparked considerable debate. Currently, "the world seems to be divided between technophiles and technophobes. Technophiles highlight the inevitable nature of technological progress and see in the rise of these tools greater transparency in justice as well as a democratization of the law. For technophobes, on the other hand, the law will always resist algorithms due to reasoning, the complexity of which will always elude machines; thus, the machine represents a concerning impoverishment."36

However, to transcend these dichotomous and somewhat ideological debates, an analysis of the French experience shows the tangible benefits that predictive justice tools can provide to the various stakeholders: the public service of administrative justice, administrative judges, litigants, and lawyers.

Benefits for the Public Service of Administrative Justice

The adoption of predictive justice tools significantly affects the quality of service delivered by the public service of administrative justice. This quality is assessed, firstly, in terms of accessibility, which encompasses physical access, the dissemination of clear and pertinent information, transparency, and the simplification of procedures. Secondly, it is measured by the improvement of the service offered, including attentiveness to citizens' needs and delivering prompt services within brief periods.
Given these considerations, the public service of administrative justice must embrace these new predictive justice tools, as it, like all public services, is obligated to fulfill the "constant demands for the highest quality of service."37

From this vantage point, the digitization of administrative justice could offer a faster and more economical public service, particularly through the implementation of tele-appeals. This system permits litigants to submit their appeals online, eliminating the need to physically visit court premises or to dispatch documents by mail, thus relieving court clerks of many repetitive tasks such as logging requests, categorizing, and sending documents.

36 Marc Clément, "Algorithmes au service du juge administratif : peut-on en rester maître ?", 2454.
37 René Chapus. Droit administratif général, Tome 1 (Paris: Montchrestien), 797.

Economically, the utilization of tele-appeal procedures has the potential to substantially decrease the operational costs of the public service of administrative justice over time, especially in terms of printing, archiving, and communication expenditures. The dematerialization of exchanges enhances and increases the accessibility of administrative jurisdictions, allowing almost instantaneous interactions between the parties and said jurisdictions through the rapid dissemination of investigative measures.38

It follows from the above that the use of predictive justice tools enables better case management before administrative courts,39 as they exert a decisive influence on the final decision. The quality of the decision rendered by the administrative judge largely depends on the quality of the resources deployed by the public service of administrative justice to ensure accessibility, speed, efficiency, and timely control of ongoing cases.

However, the public service should not over-rely on these tools. Human oversight should always be part of the process: without it, errors or biases embedded in the algorithms may go unchecked. Moreover, reducing complex legal decisions to mere data points can weaken the nuanced understanding that is necessary, in a large number of court cases, to preserve the fundamental rights of litigants.

Benefits for Administrative Judges

From the perspective of administrative judges, the benefits of using predictive justice tools extend well beyond the potential efficiency gains in streamlining administrative tasks. The complete digitization of contentious administrative proceedings40 enables these digital tools to utilize all the data from every case filed in administrative courts. This capability offers a significant improvement over merely analyzing data from past decisions.

Predictive justice tools provide administrative judges with swift, accurate, and comprehensive insights into administrative jurisprudence, reducing the time they need to spend on research while enhancing their understanding of their colleagues' jurisprudential practices. Such tools contribute to the stabilization, harmonization, and convergence of jurisprudence across the various courts nationwide, thereby increasing the predictability of justice and legal certainty.

38 Sauvé, Jean-Marc, "Le numérique et la justice administrative," Les annales des mines, 3: 44-47.
39 Boyer-Capelle, Caroline. "Gestion des dossiers et qualité de la justice administrative," Revue française d'administration publique 3: 727-738.
40 For a comprehensive study on the dematerialization of trials, see: Duclercq, Jean-Baptiste. "Les algorithmes en procès," Revue française de droit administratif 1: 131. The automation of the legal process refers to a process, more or less artificial or human, of mechanical application of procedural positive law, but also of creation or modification of internal directives, practices, and organizational and operational modes aimed at litigation. For this author, a distinction must be made between the automated algorithmic process, the semi-automated algorithmic process, and the non-automated algorithmic process. The automated algorithmic process refers to all the procedural rules applicable to the dispute produced by an algorithmic decision-making channel without human validation. The semi-automated algorithmic process refers to all the procedural rules applicable to the dispute produced by an algorithmic decision-making channel with human validation at the end of a binary choice (validation or invalidation of the algorithm's result). The non-automated algorithmic process refers to all the procedural rules applicable to the dispute produced by a human decision-making channel at the end of a choice for which the algorithm constituted a simple decision-making aid.
Moreover, by minimizing the time required for extensive research on similar facts and legal principles, predictive justice algorithms enable judges to allocate more time to addressing new or complex issues, where their expertise is most valuable.

For judges, predictive justice algorithms serve as valuable tools that support decision-making by providing relevant and essential data, thereby enabling judges to make decisions more efficiently and within the reasonable timeframes required by the right to a fair trial. These tools prove particularly useful for administrative judges in handling straightforward or repetitive cases, such as those involving damage assessments, the application of scales, or predefined frameworks.

Predictive justice software offers judges a comprehensive, organized, and graphical overview of the entirety of existing litigation. This includes not just the landmark decisions of the jurisdiction's plenary assembly but also the everyday practices of the various chambers of the administrative court, including regional first-instance courts. Access to a broad spectrum of decisions across different administrative jurisdictions helps to alleviate the isolation and solitude often experienced by administrative judges as they process requests, conduct jurisprudence research, and arrive at final decisions. It does so by providing instant access to a vast knowledge base of jurisprudence, facilitating real-time, quantitative, and qualitative analysis of all similar cases previously adjudicated by their peers.41

In effect, the predictive algorithm renders the administrative judge more informed, reducing reliance on intuition, enhancing objectivity, and improving cost-effectiveness through shorter judgment times due to more efficient case backlog management. Without usurping the judge's decision-making role, predictive algorithms significantly ease their workload.42 This is not science fiction: it is the core purpose of artificial intelligence to create machines that can replicate aspects of human intelligence, thereby assuming certain tasks.

Despite these advantages, AI tools can challenge judicial independence. Judges may feel pressured to conform to predictive analytics recommendations. They can also become over-reliant on these tools.
This reliance can restrain their creativity and undermine their critical thinking skills, which are essential for interpreting and fairly applying the law in unique cases.

Benefits for the Litigants

The utilization of predictive justice tools undeniably benefits litigants by granting them insight into their legal futures, thereby ensuring more predictable and less arbitrary outcomes. These tools also empower litigants to make well-informed decisions on the most effective means of resolving their disputes. Providing litigants with estimates of their chances of success in court and the expected duration of the trial, based on statistical data from predictive justice tools, is clearly advantageous in terms of both time and financial resources. With access to precise and reliable statistics, litigants can make more informed decisions about whether to pursue a potentially long and costly legal procedure.

41 Thomas Cassuto, "La justice à l'épreuve de sa prédictibilité," AJ Pénal, 2017, 334.
42 Benoît Plessix, "Vers une justice administrative prédictive ?", 89.

Predictive justice software offers litigants essential case-related information and the ability to track or even influence its progress, from filing requests and exchanging briefs to submitting documents, receiving public hearing notices, or the case being assigned to a judge. The transparency facilitated by predictive justice software provides litigants with real-time visibility into aspects of their case that were previously opaque, allowing for "instantly cross-referenced results of all relevant information to predict the likely outcome of a dispute, facilitated by the rapid collection and processing of decisions from all administrative judges."43

As Éric Sadin notes, a cognitive deepening is established, signaling the emergence of an era of measurement and quantification of every organic or physical unit, surpassing the mere factual knowledge of things, towards a qualitative and constantly evolving evaluation of individuals and situations.44

Predictive justice software enables litigants to independently analyze administrative jurisprudence in real time (by case, solution, party, judge, argument, and reasoning) without needing a specialist intermediary. This direct access to jurisprudential data provides litigants with transparent and objective legal insights, circumventing the potential errors or omissions that might arise from human judges' summarizations of jurisprudence. In short, predictive justice tools give litigants a measure of control over their contentious future and make the decision to invest time and money in a trial a more informed one.

However, predictive tools can perpetuate the biases that are already present in historical data. Decisions rendered on the basis of these tools can therefore lead to unfair outcomes, especially for historically marginalized groups.
Furthermore, litigants who cannot access, have only limited access to, or do not understand these tools are disadvantaged, which can aggravate inequalities in the justice system.

Benefits for Lawyers

Lawyers, much like judges and litigants, stand to gain significantly from predictive justice tools. These tools have the potential to streamline their workload, enabling them to dedicate more time to crafting their legal strategies rather than engaging in extensive research.45

43 Benoît Plessix, "Vers une justice administrative prédictive ?", 87.
44 Sadin, Éric. La vie algorithmique. Critique de la raison numérique. L'Échappée, 2015: 29. Accessed June 24, 2024. http://digamoo. fr/sadin2015.

Predictive justice algorithms offer lawyers the capability to review case law and draw comparisons with previously adjudicated cases. This allows them to assess their likelihood of success, potential compensation awards, and the reasoning behind judicial decisions, and to identify the most persuasive arguments for their cases. For example, a lawyer could enter the details of a specific case into a system and receive detailed predictions, such as a 77% chance of success against Mr. A from firm B before Judge C, with an average case duration of 8 months.46 To attain these insights, law firms utilize predictive justice tools that leverage machine learning to evaluate the potential outcomes of cases and estimate likely compensation amounts across different types of disputes. In France, for example, legal tech startups like Case Law Analytics47 and Prédictice48 have created algorithms designed to forecast the results of legal proceedings.

On the downside, lawyers' reliance on this kind of tool can harm their legal strategies, rendering them ordinary and predictable. Lawyers could over-rely on predictive tools and their suggestions rather than trust their experience, creativity, and professional judgement. This could jeopardize the important role that lawyers play in the development of unique and innovative court decisions through the legal arguments they make and the legal strategies they elaborate.

Despite the benefits offered by these tools (such as improved accuracy and objectivity in information gathering, expedited trial processes, and eased judicial workloads) the adoption of predictive justice technologies is not without its challenges.

Numerous Shortcomings

The integration of predictive algorithms into the French justice system, particularly within administrative justice, brings not only positive developments but also raises significant concerns. A primary challenge is the potential commercialization of administrative justice. Furthermore, this integration introduces several additional issues and limitations, including subjectivity and uncertainty in predictive analysis, risks of opacity and misinterpretation, threats to judicial independence and impartiality, dehumanization and standardization concerns, and concerns related to computer security and personal data protection.

45 For a comprehensive study see: Nourissat, Cyril. "Justice prédictive et profession d'avocat : entre fantasme et réalité," La Semaine juridique, Édition générale: 878; Bruguès-Reix, Béatrice, and Ashley Pacquetet. "La justice prédictive : un 'outil' pour les professionnels du droit," Archives de philosophie du droit 60: 279-285.
46 The primary issue emerges when dealing with a unique dispute: if the reference is a single dispute, what prediction will the machine provide? Will it indicate a 100% chance of success or a 0% chance? In fact, applying predictive justice tools to cases without clear precedents or comparable historical data poses significant challenges.
47 This is an application that relies on artificial intelligence and on fine legal expertise to quickly analyze the risks associated with a contentious case or a contract. This application also allows for the precise measurement of the influence of a particular element of a case on the judge's decision, or of how best to adjust a clause in a contract. It quantifies judicial uncertainty, anticipates the judge's decisions, and determines the amount that litigants can expect in different types of disputes.
48 This service, launched in France in September 2016, offers law firms and legal departments of companies a tool capable of estimating the chances of success of a legal proceeding in all branches of law, including administrative law. This tool also helps optimize litigation strategies by identifying and prioritizing the elements that can positively influence the outcome of a dispute and the arguments most likely to sway judges, on which it would be opportune to rely. It can also provide an estimate of the compensation to be obtained in the context of litigation, as well as a map of the most favorable jurisdictions according to the field involved.
Commercialization of administrative justice

The primary concern with the adoption of predictive algorithms is their potential to commercialize administrative justice, reducing it to a mere commercial transaction or service. Moreover, the use of predictive justice tools may compromise the principle of equal access to public justice services. Only litigants with access to algorithmic predictive justice tools can benefit from comprehensive and enlightening advice for managing their cases. Conversely, those without access to these tools may face an invisible, standardized, and mechanized justice, lacking personalization and tailored to the majority's outcomes.49 This situation creates a dichotomy where justice becomes a privilege for those who can afford the technological means, leaving others with an impersonal and rigid justice experience.

Subjectivity and uncertainty in predictive analysis

The reliability of predictive algorithms remains uncertain. While designed to analyze case law and identify the most relevant elements for decision-making, there is no standardized methodology or universally accepted approach for conducting such analyses.50 The process of categorizing the collected data relies heavily on the discretion and personal judgment of those operating the system, thus introducing a degree of subjectivity that underscores the importance of human intuition and individual perspectives in the classification process.

The impartiality of algorithms in trials also presents a concern. It raises the question of whether algorithmic trials can achieve greater neutrality and impartiality than human-led trials. Initially, one might argue that algorithmic trials cannot surpass the inherent impartiality of human judgment. The ability of a party to request the recusal of a judge serves as a fundamental mechanism to ensure impartiality. Yet algorithmic trials face accusations of bias and an inability to appreciate the nuances of individual cases, casting doubt on their capacity to administer justice impartially.
The concept of an "algorithmic trial" appears contradictory when considering that human judgments input into algorithms cannot guarantee the same level of unbiased execution by computational systems.

49 Yves Gaudemet, "La justice à l'heure des algorithmes," 654.
50 Clément, Marc. "Les juges doivent-ils craindre l'arrivée de l'intelligence artificielle ?" Recueil Dalloz 2: 102-106.
51 Frison-Roche, Marie-Anne, and Serge Bories. "La jurisprudence massive," Recueil Dalloz 39: 287. Accessed 25 June 2024. https://mafr. fr/media/assets/publications/frison-roche-m-a-bories-s-lajurisprudence-massive-1993.
52 Jean-Baptiste Duclercq, "Les algorithmes en procès," 138.

Opacity and Misinterpretation Risks

Navigating through "black boxes" that compile legal decisions without fully understanding their analytical processes can lead us towards a situation similar to the financial assessments long produced by rating agencies. The rating exists, bearing its impact and influence on the rated entity's reputation and creditworthiness. Yet the specifics of how these ratings are assigned remain obscure. These ratings merely represent a likelihood of default, and we are aware of the infrequent yet significant repercussions stemming from rating agency inaccuracies.

The inability of predictive justice software to provide a clear-cut answer regarding a lawsuit's outcome could have detrimental consequences. It would be quite damaging to trust statistics provided by predictive justice software only to find out later that a particular decision had been misinterpreted and misunderstood by analysts. Basing a theory solely on statistics can be misleading. Relying only on probabilistic calculations tainted by the presence of spurious correlations, without considering the causal mechanisms that can lead a judge to rule based on certain factual data, results in inevitable misinterpretations. This is the danger of predictive databases and, more broadly, of all algorithms: one must ensure their correct and unbiased functioning.53 In most cases, the obtained result heavily depends on the values assigned by the software designers to the collected data. Therefore, we must abandon the notion that predictive justice tools are neutral and impartial.
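The spurious-correlation problem described in the preceding paragraph can be illustrated with a deliberately artificial example: a naive frequency-based predictor fed a legally irrelevant feature will happily report it as predictive. The data and feature below are invented purely for illustration and describe no real tool.

```python
# Invented corpus: the filing month has no legal relevance, yet in this tiny
# sample it happens to correlate with the outcome.
past_cases = [
    {"filed_in_june": True,  "claimant_won": True},
    {"filed_in_june": True,  "claimant_won": True},
    {"filed_in_june": True,  "claimant_won": False},
    {"filed_in_june": False, "claimant_won": False},
    {"filed_in_june": False, "claimant_won": False},
    {"filed_in_june": False, "claimant_won": True},
]

def win_rate(cases, **features):
    """Naive conditional frequency: share of wins among cases matching the
    given features, with no check of causal relevance whatsoever."""
    matching = [c for c in cases if all(c[k] == v for k, v in features.items())]
    return sum(c["claimant_won"] for c in matching) / len(matching)

print(win_rate(past_cases, filed_in_june=True))   # 0.666...
print(win_rate(past_cases, filed_in_june=False))  # 0.333...
# The tool "finds" that filing in June doubles the chance of winning, even
# though the filing month has no bearing on the judge's legal reasoning.
```

The same mechanism applies to any feature the designers happen to encode: the statistic is real, but the inference drawn from it may be meaningless, which is exactly the misinterpretation risk the text warns about.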
Risk for Judicial Independence and Impartiality

In the domain of administrative justice, the digitalization of the process leads to a "weakening" of the trial.54 This challenges the principles of independence, impartiality, and social acceptability of the trial. Regarding the principle of independence, using algorithms in trials can lead to certain forms of dependency, especially for administrative magistrates, with the emergence of a power of litigants to monitor the work of magistrates. The information provided by predictive justice software allows litigants to calculate the duration of each step of the procedure, and even to remind the magistrate, if necessary, of a potential oversight in their inquisitorial duty, such as a missed formal notice.

Algorithmic trials also contribute to the decline of the social acceptability of trials. Indeed, although the social acceptability of the trial is not a "guiding" principle of the trial, it is nevertheless a factor in the effectiveness of judicial decisions, just like legislative texts,55 and an indicator of the formal quality of justice.56 This social acceptability of the trial is conceivable, on the one hand, through the transparency of the rules governing the drafting of the judicial decision and, on the other hand, through the clarity of the reasoning behind this same decision. Resorting to algorithms in trials risks jeopardizing this dual requirement. Indeed, while algorithms contribute to enhancing the trial's speed and efficiency, litigants are ultimately humans who, beyond the dispute's resolution, expect even algorithmic justice to be understandable and accessible to their human intelligence.

53 Bruno Dondero, "La justice prédictive : la fin de l'aléa judiciaire ?", 537.
54 Jean-Baptiste Duclercq, "Les algorithmes en procès," 137.
55 Ray, Jean-Emmanuel. "Droit public et droit social en matière de conflits collectifs," Droit social 03: 220.
56 Bonnotte, Christophe. "L'acceptabilité sociale est-elle un indice de la qualité de la justice administrative ?" Revue française d'administration publique 3: 689-700.

However, even if we agree with Judge Antoine Garapon that the modes of reasoning embedded in algorithms are perfectly explicit and controlled by the judge,57 there is a fear that the demand for transparency will not make the algorithms comprehensible to human intelligence. Beyond the litigants, full transparency of algorithms, which would involve publishing source codes, raises several computer security issues. If these codes were known to hackers, the chances of intrusion would increase. In that case, the pursuit of computer security would mean that judicial secrecy would significantly expand at the expense of the imperative of transparency. Regarding the clarity of the rationale for decisions, we argue that a clear and comprehensive justification for decisions made by algorithms is not necessarily an indicator of intelligible algorithmic justice. Conciseness will no longer be a drafting quality but an admission of impotence.

Using predictive justice algorithms could also challenge the judges' duties of impartiality and neutrality. Indeed, if the concept of impartiality dictates neutrality, conversely, a lack of neutrality implies a lack of impartiality. Impartiality is defined by the absence of bias or prejudice. A judge who has previously given an opinion on a case and subsequently judges the same case could at least appear to be biased.59 The European Court of Human Rights and the Council of State believe that the appearance of impartial justice is essential, as it ensures the applicant's trust. Favoring an objective examination of each situation, they examine the exact circumstances of each case to decide whether the judge visibly upheld the duty of impartiality and independence. If so, any doubt in the applicant's mind alone is not sufficient: it is up to them, and them alone, to prove that this doubt is legitimate.60 Thus, the equation becomes:

Accumulation + Identity of Person + Identity of Dispute = Legitimate Suspicion Regarding the Judge's Impartiality

In any case, the judge's impartiality is always presumed, and it is up to the litigant who challenges it to provide contrary evidence. However, this proof is very difficult for the litigant, because it seems almost impossible to determine a particular judge's opinion unless they imprudently express their bias.61

57 Marc Clément, "Les juges doivent-ils craindre l'arrivée de l'intelligence artificielle ?", 104.
58 Duclercq, Jean-Baptiste. "Les algorithmes en procès," Revue française de droit administratif 01.
Hiam. "L'impartiality devant le Conseil d'Etat: la continuity d'une jurisprudence liye y l'office du juge du concret. " Presses de lAoUniversity Toulouse 1 Capitole. LGDJ 2018: 281-298. https://books. org/putc/800?lang=en . 60 CEDH, 1er octobre 1982. Piersack c/Belgique, nA8692/79, syrie A. nA53. CE, 5 avril 1996. Syndicat des avocats de France. Rec. 61 For example, by using, in the reasoning of the decision, injurious terms . or example, excessive, racist, or revisionist remarks: Civ, 2yme, 14 septembre 2006, nA04-20. Bull. II, nA222 or vexatious towards Hasanuddin Law Rev. : 119-143 result merely because a judge or panel has ruled repeatedly against one party or in favor of an adversary,62 or if they're called to rule on recurring disputes between the same 63 The fact that a judge's position on a legal issue presented to them is predictable does not challenge their impartiality. 64 This means that the disqualification of a judge cannot result solely from previous decisions made for or against a party or a specific group of people with similar characteristics. 65 However, the fear stems from the fact that analyzing judges' personal characteristics and decisions will lead to evasive strategies that will favor individual confrontations and provoke recusals. We know that predictive justice tools are widely used by law firms eager to discern the psychological and ideological profile of the judges they will face in order to assess their chances of success, the influence a particular argument might have on a judicial decision, or to challenge their impartiality. This is what was done in France by the company Supralegem, which, based on the analysis of decisions made by various administrative judges, claimed to be able to predict with an accuracy between 90 and 99% the chances of success, depending on the judge in charge of the case, of appeals filed against decisions requiring a foreigner to leave French territory, thereby deducing that some of these magistrates were apparently biased. The decree implementing "open data for judicial decisions" was published on June 30, marking a significant shift by making judicial and administrative decisions accessible online, as outlined by the Lemaire Law of October 7, 2016, and the justice reform law of March 23, 2019. These decisions must be made available online within two months for administrative judgments and six months for judicial decisions after being provided to the court registry. The responsibility for organizing this publication falls to the Conseil d'yOtat and the Cour de Cassation, pending the completion of the Portalis portal and the automation of a system to anonymize names. This change allows legal professionals access not only to published case law but also to thousands of daily decisions from courts across France, except for those made in closed While this enhances legal knowledge and access, concerns arise over privacy protection for individuals mentioned in these decisions and the security of judges. The decree includes measures for "anonymizing" sensitive data, but identification risks remain due to potential distinctive elements within the decisions. Furthermore, the broad access to decisions raises questions about the potential for predictive justice and "profiling" of judges, despite the 2019 law's prohibition against analyzing or predicting judges' professional practices. Compiling all decisions could reveal judgment patterns, a capability within the reach of legal tech algorithms. the litigant or their counsel . 
Finally, there is a concern about maintaining the humanity and unpredictability of judicial decisions, and the freedom of judges to innovate and invent law. The exhaustive knowledge provided by open data challenges the unique human intelligence at work in justice, raising the question of whether there can still be room for groundbreaking judicial creativity in the age of open data.

Similarly, the use of predictive justice tools can call into question the neutrality of the algorithm itself. The fact that predictive justice software produces reliable, objective analyses devoid of human subjectivity does not indicate their impartiality. Legal prediction is not neutral: it largely reflects the arbitrary choices of the software designers, so that the result cannot be considered bias-free, as it largely depends on the social and ideological biases attributed to the collected data. A study conducted in the United States in 2016 found that the COMPAS software,68 used to calculate the recidivism risk of defendants or convicts, had a low reliability rate because it disadvantaged the African-American population. The software embedded these biases into the algorithm by associating social traits most common in this population, such as skin color, address, or profession, with recidivism risk.69

For lawyers, using predictive justice tools is not without risk. It is regrettable that these tools might deter lawyers from physically attending hearings, especially when the information provided by prediction tools argues in favor of their clients' appeals succeeding, thereby undermining the quality of adversarial debate in hearings. Similarly, lawyers might fear being consulted less as litigants get used to using predictive justice tools themselves. Regarding judicial decisions, one of the major challenges of applying open and big judicial data is the reuse of these data by companies and law firms exploiting predictive justice tools.

Dehumanization and Standardization Concerns

From the administrative judge's perspective, the dangers are multiple. The first danger is the dehumanization of administrative justice. Does a judge without ethics, without professional conduct, without responsibility, and without humanity still deserve the title of "judge"? Entrusting certain jurisdictional tasks to software may lead the judicial institution towards a dehumanized justice in which humans no longer intervene, or scarcely do. Indeed, even if a human judge can be prone to error or emotion, they have the ability to rectify and correct their mistakes, which is not the case for judgments rendered by predictive justice software. These tools, which are not infallible, will reproduce the biases of those who programmed them without any fear of making a mistake that should be corrected.

67 Basile Ader, "L'open data des décisions judiciaires et le droit au juge," Légipresse 2020. Accessed 27 June 2024. https://w.com/011-50826-lopen-data-des-decisions-de-justice-et-le-droit-aujuge.
68 Correctional Offender Management Profiling for Alternative Sanctions.
69 For a comprehensive study, see: Barraud, Boris. "Un algorithme capable de prédire les décisions des juges: vers une robotisation de la justice?" Les Cahiers de la justice 1: 121-139.
70 Benoît Plessix, "Vers une justice administrative prédictive?" 94.

The use of predictive justice algorithms could lead to a standardization of judicial decisions and undermine the creative role of the administrative judge, placing upon them an obligation to conform to the judgments rendered by their peers. In that case, will the judge be compelled to justify any divergence from the norm resulting from algorithms? At the very least, there is a concern that the use of predictive justice tools will encourage imitation and conformity on the part of judges, posing a risk to their freedom and becoming a source of conservatism and rigidity in decisions.71 This solidification of case law reduces the role that administrative case law has played in shaping and adapting administrative law to legislative, economic, and social changes.

The perils may concern the very way the judging function is exercised. There is a risk that judges unlearn how to judge: in other words, how to analyze individual cases, to search for the appropriate case law, and then to think, reason, and provide a concrete solution to the dispute. That is why the judge must remain in control of both the question posed and the interpretation of the results given by algorithms and of their implications.72 What judges understand about the hierarchy of norms and the relationships between the different national and international legal orders, an algorithm does not seem equipped to grasp.

With the advent of judicial big data, the computer virtually offers the judge access to thousands of pieces of information without knowing anything itself. This omniscience risks relieving judges of their duties to search, reason, analyze, critique, reflect, and think. This passive role may turn the judge into an automaton, mechanically conforming to the results of predictive algorithms, judging automatically and through imitation, without reflection and without seeking to innovate. Moreover, the results produced by algorithms risk being repetitive and amplified, and any "atypical" decision, even if justified, may appear unacceptable unless specifically reasoned.73 We must also recognize, and be wary of, the danger of vicious circles and feedback effects, as computer scientists well understand them.

This influence of predictive justice on the judge's role risks causing litigants to lose all confidence in the judicial institution. Moving from equality before the law to a two-speed justice, "litigants who turn to legal tech will be discouraged from entering into litigation and will be offered settlements based on average statistics and conformist legal decisions, while those who have not used these tools will approach the judge and benefit from individualized judgments, perhaps innovative, based on the specific circumstances of their case."75

The French Court of Cassation consistently holds that legal certainty does not guarantee a right to unchanging jurisprudence,76 as the evolution of case law is part of the judge's duty in applying the law. The use of algorithms should not prevent jurisprudential evolution or lead to a "sterilization" of civil liability law. This is particularly important in the context of compensation for bodily harm. The Dintilhac nomenclature, designed to categorize such damages, is not meant to be exhaustive or rigid: its creators emphasized that it should serve as a flexible guide, open to the inclusion of new categories of damages as needed. The Court of Cassation acknowledges additional damages when justified by specific cases, such as compensating a victim's spouse for the inability to have biological children, a form of damage distinct from loss of companionship. Furthermore, the Court of Cassation has affirmed the indicative nature of the Dintilhac nomenclature in decisions concerning the fear of imminent death and the waiting period suffered by relatives, which are not covered by the existing categories of suffering or emotional loss.

In conclusion, while algorithms could potentially streamline the process of compensating bodily harm, there is a significant risk if their application leads to automatic processing by AI systems. Judges must retain control over their decisions, using algorithms merely as decision-support tools. The idea of replacing judges with "robot judges" is not a current prospect. However, there is concern that such guidelines could heavily influence judicial decisions, risking standardization and hindering the evolution of case law. Optimistically, these guidelines could promote more equitable treatment of victims and more focused debates on specific damages, serving as a basis for dialogue among parties and between judges and the legal community, enriching rather than standardizing judicial thought.

71 Garapon, Antoine. "Les enjeux de la justice prédictive." La Semaine juridique 1: 47-52.
72 Rouvière, Frédéric. "La justice prédictive, version moderne de la boule de cristal." RTD civ. Revue trimestrielle de droit civil 2: 527.
73 Buat-Ménard, Éloi, and Paolo Giambiasi. "La mémoire numérique des décisions judiciaires." Recueil Dalloz 26: 1483-1489.
74 O'Neil, Cathy. Algorithmes: la bombe à retardement. Les Arènes, 2018.
75 Benoît Plessix, "Vers une justice administrative prédictive?" 95-96.
76 Civ. 1re, 21 mars 2000, n° 98-11, Le Collinet c/ Compagnie d'assurances Rhin et Moselle.

Another limit to the predictability of administrative justice lies in the reversal of jurisprudence, so intrinsic to the entire history of administrative jurisprudence and a factor in its rooting, its development, and its renewal. The reversal of jurisprudence, a result of the judge's imagination and creativity, is by nature unpredictable. It arises suddenly to face an extraordinary situation that requires a reaction from the judge, a reaction which then consists precisely in taking a tangent, deviating from everything that has been judged until now, and therefore imagining something new and unprecedented.78 The prediction algorithm, on the other hand, is not programmed to react to the unexpected, to the unprecedented. It predicts solutions based on probabilities calculated from decisions already made in similar cases, lagging behind an ever-evolving human administrative jurisprudence that is suddenly innovative, deviates from the average, and remains unpredictable, something that cannot be anticipated by a computer, no matter how powerful.
Jurisprudence is ultimately a flexible, living matter in which each case is assessed concretely and comprehensively, depending on its circumstances, under the sovereign judgment of the judge, who must continue to exercise their functions independently by applying the relevant texts and case law to the dispute, and who must do so in consideration of the facts and circumstances specific to each case, in the context of a debate that must remain public and adversarial.

77 Cayol, Amandine. "L'indemnisation des dommages corporels à l'heure de l'open data." Dalloz IP/IT: droit de la propriété intellectuelle et du numérique 03: 164-171.
78 Benoît Plessix, "Vers une justice administrative prédictive?" 101.

Computer Security and Personal Data Protection

We know that, on the one hand, digitized judgments contain data that are personal,79 significant, and often sensitive, and that, on the other hand, court decisions are public data. As provided by the law on the administrative court, pleading hearings are public:80 this means that judgments pronounced publicly become accessible to anyone after their pronouncement, simply by requesting communication from the registry. But court decisions are not ordinary public data, for at least three reasons. First, because of their origin (the judicial institution): these decisions relate to the exercise of the judicial function and cannot be equated with administrative documents. Second, these decisions contain personal information, such as the names of the parties and their addresses, as well as the names of the judges who issued the judgment.81 Finally, because of their subject (the rights of litigants). For all these reasons, the dissemination of these decisions calls for special protection. Who protects these data against the risks of misuse, abuse, or threats to their security? Who protects these personal data from external "hackers"?

Furthermore, the French legislator adopted a forward-looking position as early as 1978 by enshrining the following in Article 10 of Law No. 78-17 of January 6, 1978, on computing, files, and freedoms: "No court decision involving an assessment of a person's behavior can be based on automated data processing intended to evaluate certain aspects of their personality. No other decision producing legal effects concerning a person can be taken solely on the basis of automated data processing intended to define the profile of the person concerned or to evaluate certain aspects of their personality." This prohibition of profiling algorithms concerning court decisions was strengthened following the amendment of Article 10 by the law of June 20, 2018, and also by the interpretation provided by the Constitutional Council.82

79 Article 4 of Law No. 2004-63 of July 27, 2004, on the protection of personal data, defines personal data as "all information regardless of its origin or form and that directly or indirectly identifies a natural person or makes them identifiable, with the exception of information related to public life or considered as such by law". For its part, EU Regulation 2016/679 (General Data Protection Regulation) defines personal data in its Article 4(1) as "any information relating to an identified or identifiable natural person".
80 Article 51.
81 In application of the provisions of Article 53, paragraphs 2 and 3, of the law relating to the administrative court, each judgment indicates the names, qualifications, and addresses of the parties, as well as the names of the members who rendered it and of the clerk.
82 Although this interpretation was issued regarding the provisions of the law relating to administrative decisions, it seems to apply a fortiori to judicial decisions. See Cons. const., June 12, 2018, No. 2018-765 DC (D. June 15, 2018, obs. Janue), paras. 65-72, spec. para. 70: "The individual administrative decision must be able to be subject to administrative appeals, in accordance with chapter one of title one of book four of the code of relations between the public and the administration. The administration, when addressed in these appeals, is then required to make a decision without relying exclusively on the algorithm." "Exclusive recourse to an algorithm is excluded if this processing concerns one of the sensitive data mentioned in paragraph I of Article 8 of the law of January 6, 1978, that is, personal data 'which reveal the alleged racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership of a natural person', genetic data, biometric data, health data, or data concerning a person's sex life or sexual orientation." On the issue, see: Loïc Cadiet, "L'open data des décisions de justice" (Report, French Ministry of Justice, November 2017), 24. Accessed June 24, 2024. https://w.fr/sites/default/files/migrations/portail/publication/open_data_rapport.

This respect for the personal data mentioned in court decisions during their processing by predictive justice algorithms is due to the fact that making these decisions available to the public is not a trivial operation: it involves essential principles on which the legal order of the State is based, such as the publicity of case law, the independence of justice, and respect for privacy.

Another major problem results from the lack of control over predictive justice algorithms in the event of an infringement of personal data. Indeed, even if we can control the "feeding" and the output of algorithms, and sanction potential infringements of the protection of personal data at these two stages, who controls the algorithms themselves, who controls the machine itself, who controls its ability to deliver exactly what it claims to reveal? It must be acknowledged that there is an ineffectiveness in controlling algorithms.84 In this situation, public authorities must not remain passive. They must intervene to control algorithms and regulate this new market. This requires, it seems to us, the creation of an independent administrative authority, made up of scientific experts in the field, responsible for controlling the quality of algorithms or even for certifying or approving them.

Conclusion

While predictive justice tools offer significant potential for enhancing the efficiency and consistency of judicial decisions, they cannot yet supplant human justice. Justice transcends mere calculation and rests on reasoned deliberation. Central to this process is the unique and sovereign judgment of human judges, who exercise discretionary powers. This discretion gives judges the flexibility to navigate, make choices, and select from multiple legally viable solutions, underscoring their critical role in a justice system where human insight and judgment are irreplaceable.
That said, it cannot be denied that predictive justice tools have provided administrative judges with undeniable prospects for progress, which they must seize in order to work effectively in the service of litigants, while remaining vigilant about the inviolability of the fundamental principles of administrative justice. Predictive justice tools should therefore be adopted with caution. These tools must accommodate the essential nature of justice, which requires reasoning, discretion, and human sensitivity, so that the justice system remains fair, impartial, and capable of adapting to the unique aspects of each individual case.

Compliance with Ethical Standards
- Conflict of Interest: The authors declare that they have no conflicts of interest.
- Funding: The authors did not receive support from any organization for the submitted work.
- This article does not contain any studies involving animals performed by any of the authors.
- This article does not contain any studies involving human participants performed by any of the authors.
- Data Availability Statement: Research data are not shared.

83 Gaudemet, "La justice à l'heure des algorithmes." Rev. dr. publ. 3: 651.
84 Cadiet, Loïc. "Open et big data, procès virtuel, justice prédictive." Op. cit., p.

References