Autonomous systems should be fairer and more democratic

Autonomous systems that utilize AI are ubiquitous, ranging from Google search queries to smart traffic lights. However, we must be careful using them, says philosopher Herman Veluwenkamp. ‘When autonomous systems are developed, too little attention is paid to the question of whether the choices they make are fair, and in many cases there is a lack of democratic control.’ He hopes to change that by means of a rating model.
Text: Thomas Vos, Corporate Communication UG / Photos: Henk Veenstra
‘I worked in the IT sector for many years. I was a software developer for major corporations in the telecommunications industry. Sometimes, we’d discuss ethical concerns, but only during coffee machine chats. Afterwards, I’d continue performing my duties as usual,’ Veluwenkamp says. ‘Users [of our products] had no control whatsoever over how we used their data.’ This increasingly came to bother him.
Radical choice
Veluwenkamp, who was appointed assistant professor at the Faculty of Philosophy two years ago, made a radical choice: he left the IT industry and embarked on a philosophy degree at the UG. He then completed a Research Master’s programme, and was awarded a PhD in 2020. Even now, one question preoccupies him: how can we ensure that autonomous systems make fair choices, and how can we ensure that citizens remain in charge of this process?
Bridging the gap between philosophers and technicians
Veluwenkamp seeks to combine two matters in his research project. ‘On the one hand, we must consider important values such as fairness. “Fairness” may have many different definitions, which may contradict each other. Take, for instance, a job application system. Is it fair to have all the candidates go through the exact same process, or is it fair to use a system that accounts for disadvantaged backgrounds to promote equal opportunity? It’s important to take a good look at the context in which the concept is being applied. On the other hand, you have to ask: how do we incorporate this into the development and application of autonomous systems? In this way, I’m really trying to bridge the gap between philosophers and technicians.’
High degree of urgency
Veluwenkamp has made a conscious decision to focus on autonomous systems. He believes this is the most urgent area because these systems operate independently and lack transparency: ‘Many people have a hard time understanding how exactly autonomous systems work. As a result, we are inclined to ascribe all sorts of qualities to those systems that they don’t actually have. Since they seem to operate autonomously, we tend to think that they are neutral and fair. However, this is a misconception. Public organizations such as municipalities should consider this critically.’

More than just a technical decision
After all, the development and application of autonomous systems are not just technical matters, as Veluwenkamp emphasizes. ‘We can influence the degree to which the systems operate fairly or unfairly, and we must be accountable to others in that regard. Right now, we are not seeing such accountability. For instance, I can see it in the Municipality of Groningen’s Data and Technology Ethics Review Committee, of which I am a member. Decisions regarding tenders and the design of autonomous software are generally made on the basis of technological aspects. That is understandable, but political and ethical choices are involved as well. It is quite conceivable that a GroenLinks-PvdA-affiliated politician would want software to meet rather different requirements than a PVV-affiliated politician would.’
Smart traffic lights
For this reason, Veluwenkamp is working on a model allowing democratic control of autonomous systems. He is doing so by means of a case study of smart traffic lights, financially supported by the SIDN Fund. ‘Those traffic lights use cameras to make all sorts of decisions that have a direct impact on our lives. For instance, they decide whether cyclists take priority over drivers during a downpour, or whether a truck carrying sugar beets is given priority during the sugar beet harvesting period, etc. At present, it is hard for me, an outsider, to understand how these traffic lights make decisions, and what those decisions are based on. Furthermore, it is hard for citizens to file objections against these decisions, and no politician can be held accountable. There is no democratic control. We don’t know how fairly these systems operate. I should note, though, that this is probably true for many autonomous systems. The Dutch childcare benefits scandal is a good example. This is intriguing to me, precisely because such systems affect so many people’s lives,’ says Veluwenkamp.

Rating model
Veluwenkamp explains that the model uses a star rating system to assess democratic control of autonomous systems: ‘The model outlines the conditions to be met by autonomous systems and gives them a rating from one to five stars. The more democratic control, the more stars. A higher score can be awarded if you allow citizens to file an objection, provide an insight into the values underlying the system, and organize citizen panels to gather public input. The goal is for the model to be broadly applicable within public organizations.’
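As a rough illustration of the idea behind such a star rating, the sketch below encodes a checklist of democratic-control criteria and counts how many a system meets. The criteria names and the one-star-per-criterion scoring are invented for this sketch; they are not taken from Veluwenkamp’s actual model.

```python
# Hypothetical sketch of a star-rating model for democratic control of
# autonomous systems. Criteria and scoring are illustrative assumptions.

CRITERIA = [
    "citizens can file objections",
    "underlying values are documented",
    "citizen panels gather public input",
    "decision logic is publicly explained",
    "a politician is accountable for outcomes",
]

def star_rating(met_criteria: set[str]) -> int:
    """Return a 1-5 star score: one star per criterion met, minimum one."""
    stars = sum(1 for criterion in CRITERIA if criterion in met_criteria)
    return max(1, min(5, stars))

# Example: a smart traffic light that only allows objections and
# documents its underlying values would score two stars.
print(star_rating({"citizens can file objections",
                   "underlying values are documented"}))  # 2
```

In practice, a real model would weigh conditions rather than count them equally; the point of the sketch is only that the rating makes the degree of democratic control explicit and comparable across systems.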
Collaboration
Veluwenkamp also seeks to raise awareness by inviting other faculties to collaborate with him: ‘In association with researchers at the Faculty of Arts who study linguistic models, I taught workshops to a wider audience. The question we discussed was how society should take into account the ethical consequences of using ChatGPT. Researchers at the Faculty of Arts were able to tell the audience more about the technology behind the linguistic model, while I provided an ethical perspective.’ He also collaborates with Nynke Vellinga from the Faculty of Law. Veluwenkamp explains: ‘She conducts research on legal liability in relation to self-driving cars and examines how democratic control can be enshrined in law. That is highly valuable.’
The collaborations with other faculties were established through the Jantina Tammes School of Digital Society, Technology and AI. Veluwenkamp was accepted as a Jantina Tammes Scholar earlier this year. The Jantina Tammes School seeks to bring together researchers who focus on similar themes, even though they represent different disciplines.
Last modified: 24 September 2025, 4.37 p.m.