litigation analytics and prediction sector, the French Government has banned the publication of statistical information about judges’ decisions – with a five-year prison sentence set as the maximum punishment for anyone who breaks the new law. Owners of legal tech companies focused on litigation analytics are the most likely to suffer from this new measure.

The new law, encoded in Article 33 of the Justice Reform Act, is aimed at preventing anyone – but especially legal tech companies focused on litigation prediction and analytics – from publicly revealing the pattern of judges’ behaviour in relation to court decisions.

A key passage of the new law states: ‘The identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.’*

As far as Artificial Lawyer understands, this is the very first example of such a ban anywhere in the world.

Insiders in France told Artificial Lawyer that the new law is a direct result of an earlier effort to make all case law easily accessible to the general public, which was seen at the time as improving access to justice and a big step forward for transparency in the justice sector.

However, judges in France had not reckoned on NLP and machine learning companies taking the public data and using it to model how certain judges behave in relation to particular types of legal matter or argument, or how they compare to other judges. In short, they didn’t like how the pattern of their decisions – now relatively easy to model – was potentially open for all to see.

Unlike in the US and the UK, where judges appear to have accepted the fait accompli of legal AI companies analysing their decisions in extreme detail and then creating models as to how they may behave in the future, French judges have decided to stamp it out.