LØRN Case #C1181
Dark patterns and the importance of design in regulation
How is personal data used as a currency, and at what exchange rates? How does the Norwegian Consumer Council work to protect consumers' right to privacy in a global marketplace? We welcome Finn Myrstad, Director of Digital Policy at the Norwegian Consumer Council, where he leads the development of more ethical digital policies, to engage us on this topic. This is part of the podcast series we are creating together with BI Norwegian Business School to explore the main challenges and opportunities in creating good AI, as part of the BI Business School course Responsible AI Leadership. Our co-host is Samson Yoseph Esaias, Associate Professor at the Department of Law and Governance at BI.

Finn Myrstad

Director of digital policy

Norwegian Consumer Council

Samson Yoseph Esaias

Associate Professor, Department of Law and Governance, BI

BI

"I feel it's definitely a case of I wouldn't be doing my job if I didn't try to think 5 to 10 years ahead"

Duration: 45 min


Topic: Digital ethics and policy
Organization: Norwegian Consumer Council
Perspective: Large enterprise
Date: 220602
Location: OSLO
Host: Silvija Seres

This is what you will learn:


What is the mission of the Norwegian Consumer Council, and what work does it do in relation to digital policy?

Dark patterns. What are they and why do we need to worry about them?

Surveillance-based ads. What is the problem, and why is it a good idea to ban them?

The Grindr complaint. How did it start, and what is it about?

More learning:

“Freedom to Think: The Long Struggle to Liberate Our Minds” by Susie Alegre.


This is LØRN Cases

A LØRN CASE is a short and practical, light and fun innovation story. It is told in 30 minutes, is conversation-based, and works equally well as a podcast, video or text. Listen and learn wherever it suits you best! We cover 15 thematic areas in technology, innovation and leadership, and 10 perspectives such as founder, researcher etc. On this page you can listen, watch or read for free, but we recommend that you register, so that we can create personalized learning paths just for you.

We would like to help you get started and keep up lifelong learning.


More cases on the same topic

#C0061
Digital ethics and policy

Glenn Weyl

Professor

Princeton

#C0147
Digital ethics and policy

Hans Olav H Eriksen

CEO

Lyngsfjorden

#C0175
Digital ethics and policy

Hilde Aspås

CEO

NCE iKuben

Finn Amundsen

CEO

ProtoMore

Transcript of the conversation: Dark patterns and the importance of design in regulation

Welcome to Lørn.Tech, a collaborative learning effort about technology and society. With Silvija Seres and friends.

 

Silvija Seres: Hello and welcome to a Case by LØRN and BI Norwegian Business School. This is part of the series we are creating together to explore the main challenges and opportunities in creating good AI, as part of a BI course on responsible AI leadership. I'm Silvija Seres from LØRN, and my role here is to ask all the simplifying questions and paint the pictures in very bright colors. My co-host, who will be asking the more academic questions and putting this into the context of the new course he is co-building with another professor, is Samson Esaias, associate professor at the Department of Law and Governance at BI. Our joint guest is my dear friend from Forbrukerrådet, the Norwegian Consumer Council. He is the director of digital policy there and will in a second tell us a little bit more about himself. So welcome, both of you.

 

Finn Myrstad: Thank you.

 

Samson Esaias: Thank you.

 

Silvija: So we have about 30 minutes to talk about something as fancy as dark or deceptive patterns and design: how do we regulate them, and how do we make sure people know that they are being out-designed in the digital space? Not only is their data being used against them long term, but there are also many psychological patterns being applied very consciously by the people who develop these new digital products and services, in order to keep people in the game, in the service, and draw them deeper into their market. Somebody who has been studying this for many years, documenting and analyzing it, is Finn, and he will help Samson explore this space. We always start these podcasts with a very short introduction of our guests. So Samson, could you very briefly say who you are and why you wanted to invite Finn as a guest lecturer to your course? And then please tell us a little bit about yourself.

 

Samson: I'm an associate professor at the Department of Law and Governance here at the Norwegian Business School. I'm a lawyer by background, and I have been working on law and technology for almost ten years now. I was associated with the Norwegian Research Center for Computers and Law at the University of Oslo for several years, and I have worked on many legal issues related to technology. I also completed my PhD on data protection and competition law, looking particularly into control over information and information flows, and how that plays into power, both economic and social. And as you mentioned, Silvija, we are now developing a course at BI on responsible AI leadership. Our motivation is that we produce many managers and leaders every year who are going to be in charge of the next wave of new technologies; these are the people who will be leading that development. Our idea in developing this course is: how do we equip these next leaders with the knowledge and tools so that they can develop and use technology responsibly, in a way that respects existing legal rules, but also ethical principles?

 

Samson: That's why the work that Finn and his team do at the Norwegian Consumer Council is very important: they have published influential reports on how companies use deceptive design to push users into choices that run counter to their interests, whether in terms of sharing their data, spending time on a service, or spending money on those services. In that sense, it paints a good picture of how not to do things, and by contrast how to do things responsibly. I think we can start with the examples that show how not to do things. My hope is that we can use those examples as a stepping stone to paint a positive picture of how to do things properly. So I think we can start there.

 

Silvija: Very good. And Samson, very briefly, where are you originally from and what's your most eccentric hobby?

 

Samson: I'm originally from Ethiopia, so I moved here to Norway in 2010 to study my master's, actually. And ten years later, I'm still here.

 

Silvija: Ten years later, you have a baby.

 

Samson: Yeah, I actually do.

 

Silvija: Very nice. Yeah. And maybe that's your most eccentric hobby as well.

 

Samson: Yeah. Nowadays I think spending time with the kids is part hobby, part work. But I don't know. I'm not sure if it's exotic, but I like to dance as well. My friends and my wife tell me that I move to whatever music I hear; I get that urge. So that's perhaps one thing. And sometimes when I'm listening to really nice music, I can't wait to get to the elevator so I can move a little bit. So that's perhaps something that I do enjoy also.

 

Silvija: That's very nice. It is perhaps a bit exotic, at least in Norway. So I think that's a wonderful hobby. Well, what about you, Finn?

 

Finn: Well, I moved to Oslo from Brussels actually, but I'm originally from the west of Norway. I lived five years in Brussels, and a year in London before that, before I moved back to Norway in 2013, when I started my job here at the Norwegian Consumer Council.

 

Silvija: What are you by training? I don't know.

 

Finn: No, good question. I have a master's in political science and European politics, but I also have an executive MBA from a few years back. I've worked in the private sector and in the non-profit sector, so I have a mixed background, which is actually quite helpful in the job I have now, which is basically to be a watchdog on behalf of consumers, which is more or less everyone in Norway and around the world. We try to expose tactics that are not great, but also to promote a safe market, because consumers want a lot of these services. So you want to try to get the best out of the market. The way we do that is obviously through reports and research, and we do a lot of advocacy and communications. So we meet with companies and we do research as well, which I guess is also one of the reasons why I'm here.

 

Silvija: Very good. So, Finn, both of you are actually great examples of this opportunity to mix subjects, these kinds of T-shaped individuals who have one original professional identity but then build in breadth. Digitalization is a very important part of this kind of horizontal skill building, and both of you are more advanced in thinking about the future of digitalization in our country than most technologists. I often feel that we technologists just build and don't worry about scenarios going forward.

 

Finn: I feel it's definitely a case of I wouldn't be doing my job if I didn't try to think 5 to 10 years ahead. Where are we heading? How can we identify risk, and how do we evaluate what's important and what's not? You can look at something that's a big problem today, but is it going to be a big problem in the future? And are there things on the horizon that are not a problem today but could be in the future? I think that's where we're trying to make a difference, by working very closely with international partners. Just this week we launched a huge report on gaming, and I think we ended up with 18 countries participating. Then today the United States joined, and we had around 35 organizations calling for regulation of gaming, because we've identified how certain aspects of gaming are very predatory, very exploitative, and a lot of people are very vulnerable to these tactics. So that's just one area where we're trying to improve things. And it's a huge sector, right? It's bigger than Hollywood. I think it generates somewhere between 150 and 280 billion dollars a year, which is a lot of money, and it is growing. And that's an industry that's not really been looked at by anyone. So that's how we try to be at the forefront of protecting people, because gaming should be a fun activity. It should be safe when you're doing it. And for a lot of users it isn't today.

 

Silvija: I think what you talked about is exploitative and predatory tactics in product development. Much of business marketing is based on this: they want you to buy, and they need to build that desire to buy more into you. That's part of business. At the same time, if you have a lot of help from AI and data applied to human psychology, it can become overly efficient, and this is what we are trying to figure out. So I'm going to leave the floor to Samson to play ball with you. I want to just start with two personal examples that I associate with you. One of them is a stunt you did some years ago, where you were reading for, I don't know, 8 hours or 31 hours non-stop in order to get through the terms of use, the terms of service documents that we all accept when we accept services from these big or small digital providers. To me, that really brought home the idea that it's not just me being lazy or stupid, not wanting to read that text: I don't know enough law, and I don't have the time to understand what I'm actually accepting when I accept them. And when you have the world's best lawyers creating these terms of service in order to counterbalance all the regulation that's trying to stop them from using the data, and so on, I think we have a real competitive issue going forward. And then the other example that I remember you talked about was a doll, I think with a camera or something like that.

 

Finn: Yeah, a microphone.

 

Silvija: A microphone that was recording far more than it really had to, and sending some of this data on, when you thought what you had was just a little, kind of half-smart doll that could talk to your kid.

 

Finn: That was a doll basically targeted at kids. It was connected to the Internet and you could talk to it. We basically discovered that the data was being sent to a contractor in the US which was doing voice printing. Your voice is unique, so they were voice-printing children. This company was also working for the US military and intelligence services. The data was being transferred to lots of different places, children could be targeted with advertising based on this, and the doll was also very easy to hack. The good thing about that story was that it became such a symbol of what was wrong with connected devices at the time that the EU is actually now proposing a horizontal law on cybersecurity, and they quote this example as one of the main reasons why we need protection in this area. That's why I'm hopeful, and why this work is important: even though that was 2016 and we're in 2022 today, and it takes a long time, we are slowly moving forward in a lot of these areas. That's what I'm hoping for with the work on deceptive design as well, where we've published four reports looking at these things, and where I'm hoping the issue of how to regulate this is now being seriously discussed. That's also why we're here today. But regulation will always be behind, right? So that's why this discourse, this discussion, is really important, because companies will always be at the vanguard, always in front of regulation, and we will always be running behind. Companies that are responsible will, I really believe, have a competitive edge in the future. But that also requires authorities to crack down on the companies that are breaking the law. We need to have an interplay here. Yeah.

 

Silvija: Samson, the headline that you defined for this conversation is Dark Patterns and the Importance of Design in Regulation. If I'm thinking very simply, dark patterns are exemplified by what we just described here. But then I'm not sure what design in regulation means. Is it regulating the design? So I'll let you help us understand where we want to go now.

 

Samson: Yes, I think perhaps the first thing is to see if Finn can give us good, specific examples of what we mean by dark patterns or deceptive design. Then, of course, the next part of the talk would be: is design actually taken seriously in regulation? We talk about, for example, control over data, but should we perhaps focus on how companies design products? Of course, data protection law affects design, but perhaps not that much. So I think it would be interesting to hear Finn's perspective on what we all should do to address these design problems we see. That is the second part. But I think it would be interesting to start with what dark patterns are, and with examples that have intrigued him personally.

 

Finn: Yeah, thank you. I'll try to answer the first question, and then please repeat the second question if I don't respond to it now. So dark patterns, or deceptive design, can be a lot of things, right? I think a lot of people recognize the consent pop-ups we get every day. That's generally a dark pattern, because we are incentivized to click the accept button: it's blue and it stands out, and it's more difficult to say no to the tracking, for example. That's a classic dark pattern. Another dark pattern could be when you're booking a hotel and you're constantly bombarded with messages like "someone booked in your region right now"; that's called a social proofing dark pattern. You can have the stressful dark pattern where they say it's urgent that you book within ten minutes or a day because rooms are running out. They're triggering the psychological fear of missing out, often in combination with scarcity. All these things are dark patterns that are now more and more covered in the literature.
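To make the categories Finn lists a bit more concrete, here is a minimal, purely illustrative sketch of how such tactics could be represented and flagged in interface text. It is not taken from the Consumer Council's reports or any real detection tool; the category names, trigger phrases and function are assumptions made up for this example.

```python
# Illustrative only: a toy taxonomy of the dark-pattern types mentioned above,
# with example trigger phrases. Real detection is far more nuanced than this.
import re

DARK_PATTERN_PHRASES = {
    "social_proofing": [
        r"\b\d+ (people|others) (booked|bought|are looking)",
        r"booked in your (region|area)",
    ],
    "scarcity_urgency": [
        r"only \d+ (left|rooms? left)",
        r"offer (ends|expires) in \d+ (minutes?|hours?)",
    ],
    "confirmshaming": [
        r"no thanks, i (don't|do not) want to (save|protect|improve)",
    ],
}

def flag_dark_patterns(text):
    """Return the categories whose trigger phrases appear in the given text."""
    hits = {}
    for category, patterns in DARK_PATTERN_PHRASES.items():
        matched = [p for p in patterns if re.search(p, text, re.IGNORECASE)]
        if matched:
            hits[category] = matched
    return hits

if __name__ == "__main__":
    example = "Hurry! Only 2 rooms left. 14 people booked in your region right now."
    print(flag_dark_patterns(example))
```

Run on the hotel-booking example above, this toy matcher flags both the social proofing and the scarcity wording; the point is only to show that the categories are distinct and, in simple cases, machine-recognizable.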

 

Silvija: Can I just add one personal example there? Sometimes, when I just need to get out of whatever I'm doing and thinking about and into a bubble, I play a game where I try to sort a board of some sort of mathematical vegetables. And if you are missing just a few points to pass a level, it will offer to sell you five more moves, and this costs money. Finding a way to avoid that is almost impossible: there is a big button that says continue, which takes you to the purchase, and you have to find a little cross somewhere in the corner that lets you just give up. It's actually quite aggressive, I believe, and it's been like that for years now. So do we have any influence over this kind of thing?

 

Finn: I think that's a good example, and I could just add to it. We've had reports covering Facebook, Google, Microsoft and Amazon. With Amazon, for example, they make it really hard for you to unsubscribe from Amazon Prime, the membership that gives you lots of benefits. We documented step by step how it was super easy to sign up but super difficult to get out, and they used all these psychological tricks: the fear of missing out, social proofing, what we call confirmshaming, shaming you into not leaving. We documented how Facebook used similar tactics by hiding the privacy-friendly option when they wanted users to turn on face recognition technology, which is highly invasive. We also documented how Google did this with privacy settings. You set up your Google account, often right after you have purchased your mobile phone, and they have 80 or 85 percent of the mobile market in the world; once you've broken the seal of the phone, it's hard to return it, right? Then you're prompted by Google to set up your account, and in that process they make it very easy for you to switch on location sharing by default, because the click flow is designed in such a way that you're very likely to do it. The same goes for the way they portray your choices, and how they communicate what choices you have and the impact of those choices. That actually led us to file a legal complaint against Google three and a half years ago, in multiple countries, because it was in breach of GDPR requirements: data minimization (you shouldn't collect more data than you need), data protection by design and by default, information requirements, and so on. But sadly, the enforcement on this scale has been very, very slow.

 

Samson: Yes, perhaps you can explain this a bit more. I've read some of the reports on deceptive design, including the Facebook example on facial recognition technology and this confirmshaming that you mentioned. Could you elaborate on that example? I think that's quite interesting.

 

Finn: The face recognition example is very interesting because, again, your face is unique. It's like a fingerprint, a biometric identifier, a fingerprint in your face. With that technology you can identify people in pictures. You can track whether someone suddenly stops smiling in pictures, and if you combine that with other data that Facebook collects, you can easily identify if someone is feeling depressed, for example. You can see if they gain weight over time, or lose weight, and you can infer a lot of things about people's lives just based on face recognition. This was so controversial that Facebook actually turned the feature off last year. We had been complaining about it for years, but they had been collecting and training their algorithms on all our faces. And the way they got permission to do this was super, super deceptive. First of all, you got this confirmation button up on the screen, which you often get on your phone when you're on your way home or on the bus or somewhere, and you just want to check messages from friends, and they know this. That blue button that came up was the accept button. If you wanted to say no, you had to click on an almost invisible button, similar to what Silvija just described, that said something like "more options" or "manage your data settings", which no one knows what means. And then they used skewed language to basically portray this initiative as something that is OK.

 

Finn: So, if you're a Facebook lawyer, I'm paraphrasing here. But they basically said: we're going to use this technology to fight identity theft, to identify who's in the picture, and to help people with visual impairments see who's in the picture. And then they wrote that if you don't turn on this feature, we won't be able to fight identity theft or help people with visual impairments. So you feel bad; they used confirmshaming. What they should have written is that we can use this technology to map your friends and map your feelings, as well as, of course, to do the identification they described. They received enough criticism that they turned the feature off. But the problem is that they have now trained their face recognition algorithm on billions of people's pictures. They still have the algorithm, and they still reserve the right to use it in the metaverse. So we could say they have illegally obtained the pictures of all, or most, of their users by using a dark pattern, and they're going to profit from that in future products.

 

Samson: There has been a discussion about whether, when you do something illegal, you should also have to delete the models you have developed based on the illegally obtained pictures. We haven't reached that point yet, but I think that's also quite an interesting discussion.

 

Silvija: Can I just play with that idea for one round? Because I don't think we understand how much data is being collected, for example in China, from everybody in public spaces, among other places. There are lots and lots of cameras, but data is also collected in industrial processes and in digital, virtual settings like finance, and so on. And we don't understand how important it is to use this data to train these algorithms over time, and the advantage you gain by training these algorithms earlier and faster than other actors. Players in this field might gain such a competitive edge that it becomes impossible to catch up with them over time. That's what makes this such a paradox and such an impossible dilemma, because we would like some of our suppliers, some of our welfare providers in Norway, to be competitive, right? So we can continue to get very good services for free, rather than paying people from other parts of the world. But without the data, without the ability to train their algorithms on this data, they won't be competitive. So, can you both just say a couple of words about how you regulate this? As you say, it isn't very easy to remove some of these patterns, partly because the companies are not part of your jurisdiction, so the only ones you can regulate are the local companies. But then you're also kind of punishing them in terms of their competitive ability. Just help me think about this.

 

Finn: So we've actually filed a complaint against a company called Grindr, a dating app, because we mapped how they collect data and then pass it on to potentially hundreds of third parties. You could say that if I was sharing data with Grindr or another dating app, and they only kept it in the dating app for the specific purpose of matching me with a person and things like that, that would have been okay. But the fact that they shared it with potentially hundreds or thousands of third parties is a breach of trust. So I think that in such a case, when data is obtained illegally, it should actually be deleted. That was one of the requirements we had, because that would reduce the competitive edge that you describe, Silvija. At the same time, I do think there is a huge market in the future for sharing your data with companies that you trust. I can definitely foresee a health tech actor who says they need my data for this and this purpose, but is very clear on what purposes they're not going to use it for and won't share it with third parties, because I would be interested in sharing my health data if that would prevent me from getting cancer.

Right? But if that same data is being shared with Google, who could then use it to profile me, because we know there are cancer categories in the advertising industry, and that prevents me from getting health insurance, or even a mortgage so I can buy a house, because a financial actor deems me a risk because I use a specific dating app, then it comes back to bite you. So I really think there is a nuanced approach where we can have data sharing; it's not either-or. But the wild west of data sharing that we see today, driven by the advertising industry's need for data, if we can put an end to that, then we can maybe have a discussion about legitimate data sharing, where I actually think a lot of people would like to share data that is unlikely to be misused. Because data sharing also has collective effects, right? If I share my DNA with you because you're running a DNA company, I'm also giving up my family's DNA and my daughter's DNA, for example, and they haven't consented to it. So there is a collective aspect to data as well, which we also need to take into consideration; it is not only an individual choice.

 

Finn: So what I'm hoping is that we can ban certain practices from ever being put into place. One such example is that we would like to ban what we call surveillance-based advertising, because we see few benefits to it and so many downsides to data being shared in that way. The same goes for facial recognition in public spaces: I think we should just ban that outright, because once the damage has happened it is really hard to repair. And then we should have strict enforcement of the rules we already have, because that will also create a more level playing field for all the companies that are actually trying to abide by the law, and I think a majority of them really want to do that. I speak to Norwegian companies and European companies all the time, and they're very frustrated that the big tech companies, for example, don't get regulated the way they should be, because they themselves are paying a lot of money to have good systems in place. They don't collect more data than they need to, but they're basically being punished in the competitive field because other companies just acquire more data, and I would call it illegally.

 

Samson: Just one point on this level playing field. Some companies, especially those operating or at least founded outside the European Union, might perhaps be gaining a competitive advantage because they can collect as much data as they want and use it for whatever purposes they want. That might be true in some sense, but as soon as companies start providing their services in Europe, the idea is that they should abide by these rules, which are aimed at creating a level playing field. And one important development over the last few years is that data protection is not just about protecting the individual; it's also about competition. We increasingly see, for example in Europe, that competition authorities are paying attention to how companies actually use data. We have a case from Germany where the competition authority said: Facebook actually collects and uses data in breach of the GDPR, and we're not going to wait for the data protection authority to take action, because it's distorting competition in the advertising market. Because if you are an advertiser, a company that wants to provide advertising, then unless you also disregard the data protection rules, you won't be able to provide the same level of targeted advertisement as Facebook. So the data protection authorities might fine them for the breaches, but for the competition authority this is a breach of competition law as well, and under competition law they said that Facebook has to stop combining data from websites as well as from the subsidiaries it owns. So we have to use all the possible tools we have: not just data protection law, but also consumer protection, which deals with design in many respects, and competition law, which actually has a much stronger toolkit in terms of enforcement, including breaking up companies and imposing quite high fines for breaches of the rules.

 

Silvija: Samson, you also wanted to talk about the state of regulation. You just said we need to combine different sets of law here. You talked about data protection, you talked about competition. I'm sure there will be lots of ethics and God knows what else. So what's your thinking? What are the right steps to regulate this the right way?

 

Samson: So, what is the state of regulation? Do you think we have adequate regulation? Are the problems you have uncovered in your reports occurring because of a lack of regulation, or because of a lack of enforcement? And if it's a lack of regulation, what kind of laws do you think would address the problems?

 

Finn: I think we have some laws in place that could help. The data protection regulation here in Europe, the GDPR, is pretty good in theory, but it's not being enforced. We're still waiting for even an initial response to our complaint against Google, which was filed three and a half years ago, and if that is the rate of enforcement we're going to have, we need to look at other remedies. Consumer law is another aspect; we have filed a complaint against Amazon using European consumer law as well. Enforcement is slow there too, and even if Amazon gets fined by the Norwegian consumer protection authority, it will only apply to Amazon, so it won't create a proper precedent for other companies to respect. We have been taking part in the EU expert group on dark patterns, or deceptive design, over the last two years, and we have been discussing whether we could put bans in place on certain types of practices, because some of them are really obvious, right? For example, you could say it should be as easy to leave a service as it is to join it. That would have an impact, and it's not the case today. It's very easy to identify these things. So I think we could have specific bans on specific practices, but also something that is a bit more future-proof, because I'm afraid that if you ban one practice, another one will just pop up somewhere else. So you need a flexible framework, and we need to find ways of enforcing the law more swiftly. There we can also use technology: I think Princeton University has created bots that can detect certain types of dark patterns by crawling thousands of websites. So you can use modern tools in enforcement too, but enforcement agencies then need to have the correct mandate to penalize companies swiftly. I think this also has to come in combination with a carrot, right? There needs to be a carrot, and I think there is one, because the fact that companies that breach the rules are punished is a carrot for everyone else. But it also has to be combined with requirements, for example with regard to interoperability. If you're in a service and you feel that Amazon is breaching your right to choose because of a dark pattern, where do you go? Same with Facebook, same with Google. So you need requirements for interoperability and portability, so you can take your data out or use a competing service but still be in contact with that big service, for example. I also think that the work that both Silvija and you are doing is important. Education, of course, is a foundation here. I think we need a focus on design and deceptive design in marketing schools in particular, because I think a lot of dark patterns are being taught in schools by people who are not aware that they are ethically very questionable. I think you also need to look at incentive structures in companies, but I'm not quite sure how you regulate that, because today a lot of designers and marketing departments have KPIs on reducing churn, for example. And how do you reduce churn? There was an exposé on Amazon just a few weeks ago about how they reduced churn on Amazon Prime by creating dark patterns, and of course the marketing department was heavily rewarded for that; they reduced churn by up to 10%. So some of these things are difficult to regulate, but I do think specific bans could help.
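Finn mentions automated crawlers as one way to detect dark patterns at scale. The sketch below is a rough, hedged illustration of that idea, not the Princeton tool itself: it fetches a couple of placeholder pages and applies two simple heuristic checks (pre-ticked checkboxes and urgency wording). The URLs, phrases and thresholds are assumptions made up for the example.

```python
# Illustrative sketch of crawler-based dark-pattern screening, loosely inspired
# by academic work on large-scale detection. Not any real enforcement tool.
# Assumes: pip install requests beautifulsoup4; the URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

CANDIDATE_URLS = [
    "https://example.com/checkout",   # placeholder URLs, not real targets
    "https://example.org/signup",
]

def screen_page(url):
    """Fetch one page and return a list of heuristic dark-pattern findings."""
    findings = []
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Heuristic 1: pre-ticked consent or marketing checkboxes.
    for box in soup.find_all("input", {"type": "checkbox"}):
        if box.has_attr("checked"):
            findings.append("pre-ticked checkbox: " + box.get("name", "unnamed"))

    # Heuristic 2: wording that suggests false urgency or scarcity.
    text = soup.get_text(" ", strip=True).lower()
    if "offer expires in" in text or "only a few left" in text:
        findings.append("urgency/scarcity wording detected")

    return findings

if __name__ == "__main__":
    for url in CANDIDATE_URLS:
        try:
            print(url, screen_page(url))
        except requests.RequestException as err:
            print(url, "fetch failed:", err)
```

In practice, tools like this would only flag candidates for human review; the point is to make enforcement scalable, not to automate legal judgment.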

 

Silvija: Right. We have 5 minutes left. What would you like to focus on in those five?

 

Samson: I think we can also talk a little bit about the new legislative initiatives at EU level. To what extent do you think the new legislative instruments, including the Digital Services Act and the Digital Markets Act, actually touch upon some of the issues that you talked about? Do you think that design, this concept of dark patterns, is receiving the attention it needs?

 

Finn: Yeah, good question. I think the two new proposals, the Digital Markets Act and the Digital Services Act, are a good step in the right direction, because they give the European Commission tools to enforce against anti-competitive behaviour. The DSA, for example, includes requirements to increase transparency from the companies. They theoretically give access to independent researchers and regulators, and they require companies to have risk mitigation strategies. If they do this properly, you can probably avert some of the worst disasters, and it gives regulators an opportunity to better enforce the laws we have and to understand how the companies work. So that's good. The challenge was that there was a huge discussion at the European level about a ban on dark patterns, which was great, but sadly, at the last minute, from what I've heard (the text isn't even final as we're recording this), the ban on dark patterns was heavily hollowed out. I don't really know what's left, because from what I've heard, dark patterns that are covered by consumer law or data protection law will not be covered by the DSA, which you could argue is more or less everything. So I think we'll have to wait for future initiatives. There is a fitness check of EU consumer law coming up. Not a very sexy topic for listeners outside the EU bubble, but it is an opportunity to revise EU law, and there's a serious discussion there about having some sort of ban list for certain patterns, which could move this discussion onwards. Because I do fundamentally think it's good not to have dark patterns in the long run; I think that creates trust. If you experience as a consumer that you're being locked into a service, it creates a lot of animosity; you start hating the service if you have nowhere to go and no way of voting with your feet. But if you have a healthy, competitive environment, things change. We see that, for example, in cloud storage services and in streaming services. I think it was Netflix that introduced a policy where, if they saw a user had been inactive for a long period of time, they stopped taking their money, because one of the dark patterns has been that it's hard to leave and they just keep taking your money even if you don't use the service. I think Dropbox had a similar feature, and in both those areas, cloud storage and streaming, there's actual, real competition. So there is a strong argument to be made that in a competitive area you will see fewer dark patterns, because consumers are fed up with them. But the problem is that really good deceptive design is something you, as a consumer, don't notice, because it is deceptive and manipulative and you don't know you're being manipulated. So there's still room for enforcement and regulation in this area, because you cannot leave it to consumers alone to deal with these things.

 

Silvija: So shall I do my summary? And then maybe you can fill in what I've missed. I've actually learned a lot; every time I listen to you or speak with you, I learn a lot. Maybe the most important point is the last one you made: we consumers are unable to take on this fight alone. We need people, both regulators and enforcers, who have the mandate, the interest and the knowledge. And then there has to be some kind of big carrot for the people who do things right, because they are the necessary competition, like the example you gave with Netflix or Dropbox. It's the same thing with cloud services: there are three mega providers, Google, Amazon and Microsoft, and where do you go if you want something different? So we need at least one of them to be doing things in a way that doesn't involve complete lock-in, and so on. There are some very interesting ideas here about banning surveillance marketing. If I can throw in a recommendation for a book, I think The Age of Surveillance Capitalism by Shoshana Zuboff would be a really interesting follow-up for the students in this conversation. And then there is the idea that there are very important collective aspects: even though you tried to hide your newborn child from social media, they discovered her, and even though I only share my own DNA with some sort of health service, I am actually also affecting my children's privacy. So how do we think collectively about this? There is lots of super interesting work to be done for students. What do you think?

 

Samson: Yes, this was, as always, quite interesting to listen to. I have also used many of the examples from these reports in classes; they really give you a vivid picture of what is wrong, or of the wrong way of thinking about design. It's also well known that the law always lags behind technology and business practices, and as mentioned in an earlier podcast, one of the ideas is to engage with technology while it's still developing. So the work the Data Protection Authority is doing with its regulatory sandbox, where you engage with and shape the technology before it's too late, is very, very important. And the work that Finn and his team are doing is equally important in terms of raising awareness of what is happening in practice, and the advocacy and communication work they have done. They have actually succeeded in getting products removed from the market before the law came into place; with the doll, I think the product was removed from some countries before the legislator could take action. So this kind of discourse and advocacy is as important as any other measure we have. And I just want to commend Finn and his team for all the great work they are doing at the Consumer Council.

 

Finn: Thank you very much.

 

Silvija: There are also a couple of points from what you said that I want to underline. One is slow regulation: the world is changing so fast, and our democratic processes, which are very good and very consolidating, are too slow to regulate this at the speed at which these companies develop. So how do we play that correctly? Maybe there is a need to think in new ways about developing regulation. A company I admire very much is Norske Veritas. They work with energy and shipping and so on, and they have an energy status report every year. One of its biggest conclusions is that there is enough technology to save the world: they are technology optimists, but regulation pessimists. And you remind me of that conclusion. There is so much technology in all this fancy marketing and these services, but we are lacking regulation.

 

Finn: I'm really happy that we're having this discussion. I think it's really important that we engage across our different bubbles, our different areas of expertise; we need to talk more together. We had a conference, I think it was in March, together with Amnesty and the labour union in Norway, as a first step in gathering organizations. I think we gathered eight organizations to discuss 15 different challenges, everything from labour rights to human rights, but all with a focus on how technology impacts those different areas. And I think that's a discussion we need to extend to academia, to regulators, to policy, and start talking together, because we need to be preventative. We have the precautionary principle in my world, and I think we really need a discussion based on what we know and what we think about the future, so that we can actually separate the good things from the bad things. Going forward, with AI, big data, super smart computers, and everything in the house being connected, often without displays, and with voice assistants coming into our homes, dark patterns will also appear in voice assistants, and we won't even see them; they will be written into the code in a way that will be super hard to detect. So dark patterns, for example, is a discussion that's not going to go away unless we find solutions to it.

 

Silvija: It's just going to get bigger and more important. So thank you both for enlightening us, inspiring us, maybe even scaring us a little bit. I think we all need to be scared a little bit, and also to understand the huge potential for good contributions, both from students and from people who are already in very interesting jobs. Thank you both.

 

Finn: Thank you very much. Thank you.

 

You have now listened to a podcast from Lørn.Tech, a collaborative learning effort about technology and society. You can also get a learning certificate for having listened to this podcast at our online university, lorn.university.

 
