University of Groningen (founded in 1614, top 100 university)

E.H. (Eveline) van Beem

PhD student (she/her)
E-mail: e.h.van.beem@rug.nl

Explainability of AI systems in European digital field crop farming from a legal perspective

My research addresses the explainability of Artificial Intelligence (AI) systems, specifically in the agricultural sector. While AI is beneficial in many fields, its systems can be complex and opaque, potentially infringing on fundamental rights. These 'black-box' models have prompted the development of Explainable AI (hereinafter: XAI), which aims to make AI decision-making processes understandable to humans.

In agriculture, AI applications are increasingly important. However, these models often lack transparency, which is crucial if farmers are to trust and adopt AI systems.

The EU AI Act regulates AI systems according to their risk level. It includes rules on explainability that could be interpreted as XAI requirements, but how these rules should be interpreted and measured remains unclear.

My research explores the theoretical meaning and practical application of explainability as a legal requirement for (high-risk) AI systems in the EU. By combining doctrinal and empirical research, it connects legal theory with practical implications. I examine how explainability can be operationalised throughout the lifecycle of AI systems for field crop farming, in order to determine what is needed in practice.

Last modified: 29 October 2025, 4:44 p.m.