Automated decision-making and profiling are finally being considered before the Court of Justice of the European Union (CJEU). Article 22 GDPR states that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” – but its provisions raise many doubts among legal scholars – and for the referring court in the SCHUFA case. The problem in this case law lies in the opacity of inferences or predictions resulting from data analysis, particularly by AI systems – inferences whose application to everyday situations determines how each of us, as personal data subjects, is perceived and evaluated by others. The CJEU has the opportunity to assess the existence of legal remedies to challenge operations which result in automated inferences that are not reasonably justified. However, the effective application of the GDPR to inferred data faces several obstacles. This has to do with the fact that the GDPR was designed for data provided directly by the data subject – not for data inferred by digital technologies such as AI systems. This is the difficulty behind the Advocate General’s Opinion.