The CJEU’s 2023 SCHUFA judgment (C-634/21) clarified that producing and transmitting a credit score can itself amount to an automated decision under Article 22 GDPR where the score is determinative for the contract outcome. That ruling has now translated into concrete enforcement: in 2025, both the Austrian and Hamburg DPAs issued decisions applying these principles directly to the credit and financial sectors.
Austrian DPA Prohibits KSV1870 Scoring
In September 2025, following complaints initiated by noyb, the Austrian Data Protection Authority prohibited KSV1870 from processing the complainant’s personal data to calculate and transmit scoring values where those values are used for an automated decision, unless the complainant consents.
A consumer’s online application to Unsere Wasserkraft, an energy supplier using KSV1870’s RiskIndicator, was initially confirmed and then rejected within about a minute. The DPA reconstructed the process and found there was no human involvement in the rejection workflow. It held that Article 22 para. 1 GDPR applied to KSV1870 because its score was a decisive criterion relied on by the energy provider, and that Article 22 para. 2 lit. a GDPR applied to the energy provider’s automated screening. Both entities were found to have breached transparency duties.
Key findings:
- Unlawful automated decision making (ADM): By generating and transmitting a RiskIndicator score that directly led to a contract rejection, KSV1870 carried out an automated decision within the meaning of Article 22 para. 1 GDPR.
- No valid exception: None of the Article 22 para. 2 GDPR exceptions applied — there was no contract between KSV1870 and the consumer, no legal authorisation, and no explicit consent.
- Prohibition order: KSV1870 is now prohibited from processing the complainant’s data to calculate scoring values used for automated decisions absent the complainant’s consent.
- The energy provider’s position: Unsere Wasserkraft’s automated accept-reject workflow constituted ADM, but the DPA accepted Article 22 para. 2 lit. a GDPR (“necessary for entering into, or performance of, a contract”) in this setting, while reprimanding transparency failures, including insufficient information about the use and logic of ADM.
This decision matters beyond the parties: it confirms that a credit agency can itself be the controller making an automated decision when its determinative score is relied on for contract outcomes.
Wider Context: noyb Actions
Beyond DPA enforcement, noyb has intensified its focus on automated credit scoring. It coordinated access requests by 2,440 individuals and evaluated more than 40,000 query records concerning CRIF, covering over 28,000 transmitted scores, and has raised concerns about data sources and transparency.
In its analysis, noyb highlighted several critical shortcomings in CRIF’s practices:
- Opaque data sourcing: CRIF is said to aggregate data from address publishers, telecoms, banks, and online platforms in ways that data subjects often cannot trace.
- Lack of transparency: According to noyb, individuals frequently lack awareness that their data is used for credit scoring, and explanations of the scoring logic are often cursory or inadequate.
- Potential GDPR violations: These practices could amount to breaches of the GDPR principles of lawfulness, fairness, and transparency; failures to provide meaningful information under Articles 13–15; and reliance on unlawful automated decisions under Article 22.
These coordinated actions by civil society actors and regulators signal mounting pressure on opaque credit scoring systems across Europe.
Hamburg DPA Fine
On 30 September 2025, the Hamburg DPA fined a financial company EUR 492,000 for failing to provide meaningful information about automated rejections of credit card applications.
The authority noted that applications were rejected through automated decisions “ohne menschliches Eingreifen” (without human intervention) and that the controller responded with boilerplate rather than the intelligible information about the logic involved that Article 15 para. 1 lit. h GDPR requires.
Lessons Learned
- Scoring can be ADM: Where a determinative score is relied upon for contract decisions, the scorer can fall within Article 22 para. 1 GDPR.
- Exceptions are narrow and must be evidenced: Article 22 para. 2 lit. a GDPR may justify ADM in specific, demonstrated contracting workflows, as the Austrian DPA accepted for the energy supplier in this case.
- Transparency is non-negotiable: Controllers must provide clear, intelligible explanations of the logic, variables, and consequences of automated decisions on request. Boilerplate is insufficient.
- Human oversight is essential: End-to-end automated pipelines raise risk; ensure effective post-decision review and escalation are available.
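The oversight and transparency lessons above can be illustrated with a minimal sketch. Everything here is hypothetical (the `CreditDecision` structure, function names, and thresholds are invented for illustration and are not drawn from any of the cases discussed): rather than auto-rejecting on a determinative score, negative outcomes are escalated to a human reviewer, and every decision records an intelligible explanation instead of boilerplate.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an Article 22-aware decision pipeline:
# adverse outcomes are never finalised without human review, and each
# decision carries a meaningful explanation of the logic applied.

@dataclass
class CreditDecision:
    outcome: str                 # "approved" or "pending_human_review"
    explanation: str             # intelligible information about the logic
    factors: dict = field(default_factory=dict)

def decide(score: float, approve_threshold: float = 0.8) -> CreditDecision:
    """Approve automatically, or escalate instead of auto-rejecting."""
    factors = {"score": score, "threshold": approve_threshold}
    if score >= approve_threshold:
        return CreditDecision(
            outcome="approved",
            explanation=(
                f"Score {score:.2f} met the approval threshold "
                f"{approve_threshold:.2f}; no adverse automated decision taken."
            ),
            factors=factors,
        )
    # Do NOT reject automatically: route the file to a human with full
    # context, so any adverse decision is not based solely on the score.
    return CreditDecision(
        outcome="pending_human_review",
        explanation=(
            f"Score {score:.2f} fell below threshold {approve_threshold:.2f}; "
            "application escalated to a human underwriter for review."
        ),
        factors=factors,
    )

print(decide(0.91).outcome)  # approved
print(decide(0.40).outcome)  # pending_human_review
```

The design point is the asymmetry: only the non-adverse path completes automatically, while the adverse path produces a reviewable record, which is one common way to keep a workflow outside "decisions based solely on automated processing".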
With Austria prohibiting an unlawful scoring practice in the KSV1870 case and Hamburg imposing a substantial fine for transparency failures around automated rejections, automated credit decision-making is a clear enforcement priority. Companies need to ensure any ADM fits within Article 22 para. 2 GDPR and that transparency obligations are fully implemented, or face orders and fines.
Early involvement of DPOs or specialist advisors is not cosmetic; it is risk-controlling.