“What exactly are you worried about?” When she gave her first lectures on data protection some 15 years ago, Aline Klingenberg, associate professor of IT law, was often obliged to explain her own concerns. But as time passed, the question became easier to answer. ‘These days, the consequences of not handling personal data with care are clear for all to see. We have the technological ability to collect and use all kinds of personal data, but having the ability to do something doesn’t necessarily mean that you should.’
Text: Nynke Broersma, Communications department
New Year’s Eve, 1999. The champagne was on ice, the clock on the television was counting down to midnight. Ten, nine, eight, seven, six, five, four, three, two, one… Would the entire computer network shut down now that all of the systems had been reset to 00? Looking back, we can laugh at the idea. But at the time, it was a real concern, says Aline Klingenberg. Her first job revolved around the ‘millennium bug’, as it was called, and around the law that preceded the predecessor of the GDPR. It marked the start of Klingenberg’s specialization in privacy, data protection and electronic government communication.
After spending a few years working for the Municipality, Klingenberg returned to the UG in 2001, this time as a lecturer. One of the questions that she enjoys puzzling over with her students is: can you answer modern questions using old case law?
The blocking of Donald Trump’s accounts by Twitter, Instagram and Facebook is a recent example of one of these ‘legal conundrums’. Whereas traditional media are obliged to appoint an editorial board to ensure that any facts that are published are accurate, communication via social media provides a direct connection between sender and recipient. ‘The question that now arises is: Should social media companies also be held accountable as journalistic editors? Twitter has already taken on this role by denying Trump his platform’, says Klingenberg.
Can you legally oblige a company such as Twitter to remove a message? ‘If a platform knows that a message is illegal, you can do that,’ explains Klingenberg. ‘But only in retrospect, after a complaint has been made, for example. You cannot censor in advance. Twitter isn’t automatically responsible for messages that appear online, which is actually quite logical. The postman isn’t responsible for the contents of the letters that drop onto your doormat. But just imagine if the postman kept tabs on the letters that you send and receive. If you read a brochure about running shoes, the postman would send you more mail on the subject. In theory, social media are neutral transmitters, but in practice, they behave quite differently. This is what makes it such a complex legal question.’
As things stand, social media companies are allowed to deny someone access to a platform based on their own private terms and conditions. They cannot be forced to do so. But is this the direction in which we are heading? Klingenberg: ‘If it is, it would have to be laid down in law. The argument could then be: social media platforms have become so big and powerful that they can now be seen as a public utility, just like water and electricity. Following this logic, we should regulate them strictly, despite the fact that they are private companies.’
It would be entirely possible to drive along the Diepenring at 100 km per hour or to practise driving figures of eight on the Grote Markt. Your car is capable of it, but the government has restricted this kind of use for safety reasons. That isn’t always how things work with IT, continues Klingenberg. ‘We are more inclined to think: “Why not, if it’s possible?” But the fact that you can do something doesn’t necessarily mean that you have to do it. We have the technical know-how to collect all kinds of personal data and use it to build algorithms to combat fraud, for example. But sometimes it’s better to think: we know we can make them, but it’s in the general interest not to use them. It could easily lead to racism and other forms of social inequality.’
Picture this: someone steals your driving licence and uses your identity to register hundreds of cars. As a result, countless traffic fines and car tax bills are filed under your name. You go to court to have the false registrations under your name removed from the computer systems with retroactive effect. But the Dutch court rules: ‘No, we will not do that. The legal interests of the system take priority over the legal interests of the individual.’ This is what happened in the Romet case.
In Klingenberg’s mind, the Romet case is a prime example of what she calls people’s ‘inflated, magical belief’ in IT systems. ‘The commonly-held idea is: if a law has been laid down in a particular system, it must be right and it should be enforced to the letter. Even if that law is far too strict and impossible to enforce in practice, or if people make mistakes when complying with the system. The Romet case dates back to 2004. At the time, the Dutch court said that the system took priority over the individual. The case dragged on for years, until the Dutch State was rapped over the knuckles by the European Court of Human Rights. The fact that the child benefits affair was able to take place shouldn’t really come as a surprise to us.’
“What exactly are you worried about?”, is a question that Klingenberg has heard repeatedly over the past 15 years. ‘But there is a lot more societal attention for privacy these days. People now know what can happen if you don’t take proper care of personal data. Take the Cambridge Analytica data scandal. Or compromising photos and videos, which can haunt people for years.’
So Klingenberg’s answer to the question “What exactly are you worried about?” has not changed during her 15 years as a lecturer. ‘Taking care of personal data is in the general interest: it ensures that everyone is given an equal opportunity in society.’