
We’re two data journalists who investigated the data mining algorithm that decides which French households get audited for benefit fraud. AMA about the use of algorithms in public services and discrimination.
by LeMonde_en

Hi everyone! We’re Manon Romain and Adrien Sénécat, two data journalists at Le Monde. We conducted an investigation with Lighthouse Reports into the data mining algorithm that the CAF, France’s social security agency, uses to decide which households get audited for benefit fraud.
The CAF can’t audit everyone, so it uses this algorithm to identify cases where the risk of fraud is higher by assigning a “risk score” to each household. The agency says the system is neutral, but through its use of statistical correlations, the algorithm often ends up targeting the most vulnerable groups.
This audit system is opaque, and according to the agency, a majority of the most thorough home checks are triggered by the algorithm. Our investigation shows that the system relies heavily on discriminatory criteria (age, family situation, entitlement to the disabled adults’ allowance, and economic vulnerability) that are prohibited by law. This is the first time these practices have been proven; until now, a lack of transparency had made that impossible.
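To make the idea of a “risk score” concrete, here is a purely illustrative sketch of how a linear scoring model of this general kind works. The feature names and weights below are invented for demonstration; they do not come from the CAF’s actual model, which remains undisclosed.

```python
# Illustrative sketch of a linear "risk score": weighted features passed
# through a logistic function to produce a score between 0 and 1.
# Feature names and weights are hypothetical, NOT the CAF's real model.
import math

def risk_score(features: dict[str, float], weights: dict[str, float]) -> float:
    """Return a score in (0, 1): higher means more likely to be audited."""
    z = sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# Hypothetical weights: positive values raise the audit-risk score.
weights = {
    "low_income": 1.2,           # economic vulnerability
    "single_parent": 0.8,        # family situation
    "disability_allowance": 0.9, # disabled adults' allowance recipient
}

# A hypothetical household: low income, single parent, no disability allowance.
household = {"low_income": 1.0, "single_parent": 1.0, "disability_allowance": 0.0}
print(round(risk_score(household, weights), 3))  # prints 0.881
```

The point of the sketch is that once criteria like these carry positive weights, households matching them are mechanically pushed to the top of the audit queue, even if no individual criterion was intended as discriminatory.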
**Read our articles about the investigation here:**
[How an algorithm decided which French households to audit for benefit fraud](https://www.lemonde.fr/en/les-decodeurs/visuel/2023/12/05/how-an-algorithm-decides-which-french-households-to-audit-for-benefit-fraud_6313254_8.html)
[The use of opaque algorithms facilitates abuses within public services](https://www.lemonde.fr/en/les-decodeurs/article/2023/12/05/the-use-of-opaque-algorithms-facilitates-abuses-within-public-services_6314051_8.html)
[Thomas Piketty: ‘Anti-poor ideology ultimately leads to a general deterioration in the quality of public service’](https://www.lemonde.fr/en/opinion/article/2023/12/10/thomas-piketty-anti-poor-ideology-ultimately-leads-to-a-general-deterioration-in-the-quality-of-public-service_6329290_23.html)
**AMA about our investigation, our methodology, the use of algorithms in public services, discrimination… and more!**
On a scale from 1 to 4, based on race and/or nationality, how discriminatory is the algorithm?