LØRN Case #C1166
AI in public sector
In this episode of #LØRN, Silvija Seres and Christian Fieseler talk to Heather Broomfield, an internationally recognized resource on AI and digitalization in the public sector. This is part of our series on artificial intelligence created with BI. Among other things, they discuss the opportunities the Nordic public sector holds in digitization and in developing scalable and improved public services.

Heather Broomfield

PhD student

University of Oslo

Christian Fieseler

Professor of Communication Management


Duration: 39 min



Topic: Digital ethics and politics
Organization: University of Oslo
Perspective: Large enterprise
Date: 220406
Location: Oslo
Host: Silvija Seres

What you will learn:

Why are the Nordics leading the way in digitization?

What role does the citizen have in challenging the public sector when it comes to AI?

What are the challenges with AI from a public sector viewpoint?


This is LØRN Cases

A LØRN case is a short and practical, light and fun innovation story. It is told in 30 minutes, is conversation-based, and works equally well as podcast, video or text. Listen and learn wherever it suits you best! We cover 15 thematic areas in technology, innovation and leadership, and 10 perspectives such as founder, researcher etc. On this page you can listen, watch or read for free, but we recommend registering, so that we can create personalized learning paths just for you.

We would love to help you get started and keep up your lifelong learning.



Transcript of the conversation: AI in public sector

Welcome to Lørn.Tech - a collective learning effort about technology and society. With Silvija Seres and friends.


Silvija Seres: Hello and welcome to Case with LØRN and BI Norwegian Business School. This is part of the series we're creating together to explore the main challenges and opportunities in creating good AI, as part of a BI course called Responsible AI Leadership. Our guest today is Heather Broomfield, a PhD fellow at the Institute for Public and International Law at the University of Oslo, where she's researching the data-driven public sector. My co-host, as in the whole series, is Christian Fieseler, a professor of communication management at BI. A very warm welcome to both of you. We do the same little dance and song in every series: first you have to introduce yourself. And Christian, we're going to do your introduction very briefly, to make sure everybody remembers absolutely the right pronunciation of your surname, because I can't.


Christian Fieseler: That's actually the first time a German surname has been referenced as exotic. But yeah, I'm Christian, part of the faculty at BI and part of a research group concerned with finding good, sustainable ways of managing digital transformation. And to pronounce my very Germanic name - it's Fieseler!


Silvija: Very good. And Heather, who are you?


Heather Broomfield: I'm Heather Broomfield. I'm originally Irish, but I've lived here in Norway for 20 years now, give or take, so I'm probably a little bit Irish-Norwegian by now. I work in the digitalization agency, but at the moment I'm doing a PhD on the public sector - 75%, in fact, with the other 25% still working in the digitalization agency. So I have a long background in digitalization across the public, private and research sectors. That's me in a nutshell.


Silvija: And as your exotic hobby, I have to ask you, what brings you to Norway? Yeah.


Heather: It was many years ago. I was working with Norwegians on EU projects - I used to manage EU research projects - and they asked me to come and work in Norway for a year. And then, you know the old story about coming to Norway for a year: you meet a husband, and two children later I'm still here.


Silvija: I keep saying this but they don't believe me. But I think that Norwegian men are the best export article Norway has.


Heather: Yeah, I don't know if I'd say that about my husband, but yes, I agree.


Silvija: He's nodding, so we are allowed to say that. Okay. I also want you to comment a little on the expert group on open government data and the European Commission's public sector information expert group. That's a mouthful. What is that?


Heather: Well, both of them are about open government data, because that's actually where I came from in the area of data-driven government. I led Norway's open data sharing initiative that started 11 years ago. It evolved into the common data catalog, guidelines, and all those kinds of things for raising and improving data sharing in Norway. Through that, I was invited to the European expert group in this area, the Public Sector Information Group, which is all about data sharing. The OECD also has a huge initiative on open data, so I sit there too. They are all about sharing best practices, learning from each other, and how we can improve data sharing but also do it responsibly - particularly around open data.


Silvija: Excellent. So, Christian, you decided we should talk to Heather. Why?


Christian: I like the research, and actually Heather's work - I'm obviously not the only one. I find it really fascinating, on the one hand, to look at the way the Nordics are using digital technology in public services and public administration. But the wonderful thing about Heather's work is that it also puts this into context, right? To look at what the Nordics are doing well, and where the rest of Europe, and arguably also the world, can learn from Norway and the Nordics. And you can always become better - I think that comes across wonderfully in your work. Norway and the Nordics have already made great strides in using technology, making public services arguably more accessible, more transparent, easier to use. But we can still do better. I think your work is really instructive, at least to my reading, on where we are, where we stand, and where we can go next, to make something which is really, really fantastic.


Silvija: I'm going to summarize this very briefly, Heather. Just so you know, I often translate things into layman's language. What I heard Christian say now is that there is something about the Nordic governance model in the public sector that helps us do digitalization, automation, and a data-driven future society very, very well. It might be knowledgeable workers, it might be a lack of corruption, it might be a really good digital infrastructure. What we'd like you to help us understand is why you think we are really good at this. And then, what he's also saying is that we can get even better, and that might have to do with governance, sharing of data, and an understanding of state-driven digitalization that's very different from the Chinese one. So why are the Nordic countries good at digitalization, and how does their public sector contribute?


Heather: You hit the nail on the head, because as Christian says, we're good, but we can be better, and I think that's the core of this conversation today - how we would like it to grow. As we say in Norwegian, that's the foundation: we're good, but we can be better. So what are we good at, and what are our unique strengths? It is the data. We'll come back to that a lot, I would imagine, in our conversation, because it also gives us a unique data context. We coordinate and cooperate - that's the way to put it. It's not necessarily a top-down approach that we have here in Norway, which is a real strength. It often slows things down, let's be honest about it, but it is a real strength. The common components of our infrastructure, and the way we cooperate to provide those common components, are also a huge positive part of what we do.


Silvija: I understand what you mean when you say we have the data. You mean we have this historical data, like the registries, but also that all the data is structured, decent and reusable now. I think there are maybe very few countries other than Iceland that have such good coverage of people's data as we have in Norway and the Nordics.


Heather: I mean, this goes back to the 1970s, right from the foundation of the welfare state. So it's not just that we have it now; we also have that historical data, which is a huge opportunity for us in Norway. But because we have this data, there is also an extra responsibility that I think we don't always remember here in Norway. It brings questions with it that maybe other countries don't need to consider, because other countries have not done this. And you mentioned the Cancer Registry (Kreftregisteret) - here is an example. Sweden and Denmark have very similar systems, so it's not just the Norwegian cohort; we can have the cohort of the Nordics, which gives us even more data to work from. So it's a huge, huge opportunity and advantage for us, but of course it also is and can be - I don't like to use the word threat, because that's not what I mean - an extra responsibility.


Silvija: So we need to share it more; we need to get even better at creating platforms for sharing data. That was the first one: we have the data, with actually more structure and quality than we are really aware of, and that should be used as a strength more progressively. The second thing I heard you say is that we cooperate and coordinate. Do you mean that we are able to cut across the sector silo principle?


Heather: I think in a cooperative way. For example, we have the Skate forum, where many of the leading public sector bodies come together to coordinate and agree on common ways forward - something that, in other countries, I don't think exists to the same extent. Also, the organisation I work for, the digitalisation agency: we try to coordinate, we don't instruct. We try to bring everyone together and build things on consensus, and that's a very important thread through the Norwegian public sector and something that I know Norwegians hold very, very dear.


Silvija: I think this is a point worth dwelling on a little longer, because the system we have in Altinn, which we all take for granted by now, is, as far as I know, quite unique. And it is unique because of the breadth of the services it connects in order to give us, customers or citizens, a really easy tax and financial management experience. It's amazing how quickly we get used to this. It takes only one meeting with the tax system of another country - especially further south in Europe, where I have to spend some of my summers - or one attempt to apply for a building permit, to see how they manage their data versus Norway. It's a completely different world. So we have some amazing structures here.


Heather: Yeah, it is a completely different world - the services that we have here in Norway, and the type of opportunities we have around digital services. And these services are only possible because of the data sharing that we have in Norway, and because there is trust in the system. So citizens do actually share their data very openly with the public sector.


Silvija: And this might even be a virtuous circle. Because I know the tax authorities have good data, my tax return is very transparent - but I know it's transparent for everybody else as well. So I'm not as worried about unfairness or lack of transparency in the tax system as I would be in other countries. We know it works and we trust it. And so it can get even better if we allow it to grow, in a way.


Heather: Yeah, exactly. But the thing is, with a data-driven public sector we need to maintain that trust. We want to be able to build more services and meet the public with even better services. That's why we really have to think through what data-driven is going to mean for us here in Norway: how we avoid the type of scandals other countries have experienced, how we make sure we don't become too invasive, and how we make sure that, with this enormous amount of data we have, we don't overuse it. That's a lot of what my research is about - making sure that we minimize the unintended consequences of the use of data. Because one thing is making sure the data gets shared, but sharing it isn't enough. We have to be using it right. And we have to make sure, as much as possible - because we can't foresee every eventuality, obviously - that we prepare and ensure we do this responsibly.


Christian: I think we can talk about other countries, and where other countries do have problems. What type of worst case would you like to avoid in Norway? Is there an example from another country where you think public administration or public services did not use data well - something where you would say: I would try to avoid that happening in Norway?


Heather: I think there are a few examples now that are pretty well known. And I want to say as well that these were in themselves unintended consequences - the potential impact wasn't foreseen. So I think it's important to point out that this isn't new.


Christian: I think that's a really important point you're raising. I guess nobody in the public sector is really planning the harm they now do with algorithms, right? It is oftentimes the outcome of systems built on top of systems.


Silvija: I want to challenge you on that, because I'm thinking of Cambridge Analytica, which was owned by the British state, in the mother company at least. But it is an unintended consequence, just like Christian is saying, because it was never intended to be used in Western societies for Trump and Brexit. It's funny how technology ends up being used. It's not good or bad; it's the people who use it who can decide to abuse it, if they have strong enough political or financial motives.


Heather: Yes. And it's not apolitical - that's a huge issue here. None of this is neutral; it's never neutral. We need to think properly about these things. But if we think about some examples: there's the case in the Netherlands, where they were trying to predict welfare fraud, and the courts found it was an invasion of privacy. It went against the European Convention on Human Rights. And this was done by the state - by the tax and welfare authorities.


Silvija: What was the story? Just give us two sentences.


Heather: The story was that the welfare administration in the Netherlands used an algorithm to try to classify and predict who might be committing welfare fraud. Civil society and many external activists started looking into this and discovered that it was unfair practice and an invasion of privacy, and, as I say, it went against the European Convention on Human Rights. So that has been through the whole court system. But I also want to say - and this is hugely important - many countries are having these experiences, but they're learning from them, right? In the Netherlands now, there is actually a role within the system, with responsibility for human rights, to check and work with public administration on how they use artificial intelligence, so as to protect human rights. Scotland, for example, learned from this type of problem with AI in the public service when they wrote their AI strategy recently: there was a huge citizen participation process around it, where they went out and really discussed dilemmas and potential dilemmas with citizens. So there's a lot of learning happening as we go. This is really in the early phases, and we do need to learn from each other along the way. Some other examples, then: in the States there are two particular examples around racism. One is predictive policing - the predictive policing systems sent police into poor and predominantly black neighborhoods, which was discriminatory. And then we have recidivism, or re-offending, in the court system. That's COMPAS, an algorithm used in the courts where racial bias was found: a black person would be considered more likely to reoffend than a white person. That's the COMPAS system, which is quite well known. So there are a number of these examples. Robodebt in Australia is another one - again welfare fraud, which again went through the court system, and the Australian government was found to be at fault.
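One common way the kind of bias Heather describes is detected is by comparing false positive rates across groups: among people who did not reoffend, how often was each group still flagged as high risk? The sketch below illustrates that check. The data, group names and field names are entirely hypothetical; this is not the COMPAS data or methodology, just a minimal illustration of the idea.

```python
# Illustrative sketch: detecting disparate false positive rates across
# groups in a risk-scoring tool. All records below are invented.

def false_positive_rate(records):
    """Share of people who did NOT reoffend but were still flagged high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    if not non_reoffenders:
        return 0.0
    flagged = [r for r in non_reoffenders if r["flagged_high_risk"]]
    return len(flagged) / len(non_reoffenders)

records = [
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": True},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": True,  "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": True},
]

# In this toy data, group A's non-reoffenders are flagged twice as often
# as group B's - the disparity a fairness audit would surface.
for group in ("A", "B"):
    rate = false_positive_rate([r for r in records if r["group"] == group])
    print(group, round(rate, 2))  # A 0.67, B 0.33
```

Equal overall accuracy can hide exactly this kind of group-level disparity, which is why the audit conditions on the true outcome rather than looking at flag rates alone.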


Silvija: Yeah. So I have two questions here, and then I'd like to hand it over to Christian again. One is that I don't completely understand this whole issue of privacy intrusion, because in some ways, don't we need the state and the public sector to optimise for the masses? Would it be possible to have a smarter management of this: you discover cases, and then you have people checking them - is this real, is this not real? We have to be super cautious. But I think the whole point of having a transparent, fair and efficient system is that people really can't cheat. If everybody can cheat and get away with it in the name of privacy, then we might not believe in the system anymore. So I'm trying to provoke with the horror image from the film Minority Report, where everything is predicted - but we still want to use AI in some way.


Heather: Of course, and no one is suggesting that. No one is suggesting that we don't try to prevent welfare fraud - we do need to catch welfare fraud, and that's not the question. But Philip Alston talks very much about this very case, saying that there was a particular group who were unfairly prioritised and unfairly chosen. That's the problem. The SyRI case itself wasn't about the judgment, or about them actually making decisions based on this data; it was that a particular group was predominantly classified - people from poor neighborhoods. So that's the balance we need to strike here: how do we make sure it's not too intrusive, and at the same time make sure we don't discriminate? It wasn't the people in the wealthy parts of Oslo or Amsterdam who were chosen; it was the poor neighborhoods. There's a lot of work going on in this, and I think we can do it, but we have to be aware of it first. And it's actually these examples and experiences that are making us aware of these potential problems.


Silvija: I guess part of the problem here is biased data, because data is created by past experiences, which have probably also been unfair. So it's about recalibrating the way we use these data, and being really human about interpreting the outcomes.


Heather: Yeah - and understanding the questions we're asking of the data, because, depending on your question, all data is biased, right? It depends on your question and how you're asking it. That's why, as you mentioned in the introduction, we need more social scientists in the public sector; we need more public administration experts involved. This isn't to stop anything - it's just to know the questions we need to be considering. We won't have the answers overnight, and that's a real mind shift for solution-oriented people. I'm a solution-oriented person naturally. Many people actually need to start asking: are we understanding the problem correctly? And then the solutions will hopefully come from that.


Silvija: I also want to ask you - or perhaps I'll ask Christian that question, actually. Christian, your students are studying management, and some of them might actually go into public sector management. What would you like them to remember from this conversation? What should they learn from Heather in terms of doing their tasks?


Christian: I'm pondering something Heather said before, which I find really, really important. Heather, you've said that organizations do fail, right? Or software fails, but people normally learn; organizations normally learn. What I find important to remember - though it comes with a caveat in a second - is that when we hear about these scandals, we have data scientists walking out of large companies and saying: this is not correct, we are discriminating by race, by gender and everything, it shouldn't have happened. But then again, we are living in a messy world, right? Mistakes do happen. I don't want to deny that harm has been done, but I personally am of the opinion that this is not necessarily a situation where the emperor has no clothes. Normally, I think all of us as human beings have this tendency to become better; nobody really wants to develop an AI that does harm. What I think would be great for students - for everyone - to remember is not only the question of how we can avoid every mistake (we should avoid mistakes), but how we can also learn from mistakes, and how we can find ways to become better. To put it bluntly: to figure out the technology as we go. Because as soon as you roll it out at scale with people who are vulnerable or previously underserved, you are of course entering new territory. Again, I'm not saying the harm is not real. But an equally important question is how we can learn from it so that it doesn't happen again - how we can find systems of learning to become better.


Heather: Yeah, I completely agree. And that's what this is about - learning, right? It's about working out what we need to learn here. I think the public sector needs to get more comfortable with these unintended consequences. We need to be able to talk more openly about what we need, because there's such pressure on us to deliver. But sometimes we actually need to take a little step back in order to go ten steps forward, and sometimes we will need to ask questions along the way that maybe we're a little reluctant to ask, because we're so conscious of delivering.


Silvija: First of all, I think this is the first time I've heard a German say "we are living in a messy world, and that's okay". What I think you're both pointing to is something super important: it's not just a messy world, it's a world changing so fast that we can't rely on established systems of understanding, of public management, of management theories. Christian, I think that only if we run as fast as the world is changing can we hope to catch up, to learn and to develop. And I have two questions for both of you. One is that our democracy is designed to go slowly; it has all kinds of security mechanisms built in, based on consensus, and you don't take risks, because in the public sector you'll be on the front page of every newspaper if something didn't end up the way it was supposed to. So how do we establish a management culture, and how do we provide the necessary freedom for the people who manage this in the public sector to actually use these systems of learning that you mentioned, Christian?


Christian: That is a very good question. At some point I'll have to pass it over to Heather, and my answer may be overcomplicating it a little. But I think it's also a question of how much trust you have as an organization that uses these systems - whether you're allowed to experiment, to figure it out. Trying to answer the question more immediately - as easy as it may sound, and I know it's not easy - I think it comes down to a topic we also discussed previously: the idea that whenever you design something for somebody else, who might be very different from you - different gender, different demographics, different income - you need to stay empathetic about that. We are building systems which impact lots and lots of people; we are developing something at scale, which doesn't have the easy recourse mechanism a human conversation would normally have. So I think it comes down to having enough trust to figure things out - maybe, in the run-up, trying to establish some sort of trust in society, in your populace - and then approaching whatever you develop with a healthy dose of empathy.


Heather: I'll try to build on that, because that is a very difficult question. I agree that trust is continuously built; it's fluid. So issues like those they had in the Netherlands damage trust. They also cause questions to be asked, which is not a bad thing at all - that there are external questions asked. Maybe in the public sector in Norway, because we have trust and because we generally do a pretty good job, we don't necessarily get the external questions; we don't necessarily engage as actively with the citizens as other countries might. Civil society, for example, isn't active in this area in Norway, and maybe we need more questions from the outside, because we cannot depend just on ourselves to generate these questions. So I think that's something to really try to encourage: that we go to the public - in a democracy, we go to them with the dilemmas, to ask them questions and get their input. I think we need to be closer to the citizens, and open not just about our failures. We also talk a lot about algorithm registers - that's a very big thing - so there needs to be openness and transparency about which algorithms exist in the public sector. I would like to see that, but I would also like to see a register, or information, that tells us what hasn't gone ahead and why, because I think that's a trust-building thing as well. Did it not go ahead because of a specific creep factor that was felt to be too invasive, because it was too expensive, or whatever it was? I think that would be a really big contribution to learning, but also to citizen trust and our democracy, to be open about that.


Silvija: Not just what has not gone ahead, but perhaps also what has gone ahead and been stopped.


Heather: You could have three different areas. You have the algorithms that are out there, open and in use, where they can be evaluated or whatever; the ones that went into production before we realized, okay, we need to stop; and then the ones that were stopped long before they could ever be implemented. I think that would be a help as well. This is complex - we've got to learn by doing, but we have to be open. I think there are a lot worse newspaper headlines based on things we didn't think about than on the ones we did think about, where we can stand over it and say: okay, there was an unintended consequence there, but we had all this work in place to try to help you, we did our best - rather than just being naive and going ahead. Does that make sense?
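Heather's three-tier register - algorithms in use, algorithms stopped after deployment, and algorithms stopped before implementation - can be sketched as a simple data structure. Everything below is hypothetical: the field names, the example entry and the agency name are invented purely to illustrate the shape such a transparency register might take.

```python
# Hypothetical sketch of a three-tier public-sector algorithm register,
# as discussed in the conversation. All names and entries are invented.
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    IN_USE = "in use, open to evaluation"
    STOPPED_IN_PRODUCTION = "deployed, then stopped"
    STOPPED_BEFORE_LAUNCH = "stopped before implementation"

@dataclass
class RegisterEntry:
    system: str
    agency: str
    purpose: str
    status: Status
    reason_if_stopped: str = ""  # e.g. "too invasive", "too expensive"

register = [
    RegisterEntry(
        system="FraudRiskModel",
        agency="ExampleAgency",
        purpose="prioritise welfare-fraud inspections",
        status=Status.STOPPED_BEFORE_LAUNCH,
        reason_if_stopped="privacy assessment judged it too invasive",
    ),
]

# The trust-building part: publishing not just what runs, but what was
# stopped and why.
for entry in register:
    if entry.status is not Status.IN_USE:
        print(f"{entry.system}: {entry.status.value} - {entry.reason_if_stopped}")
```

The point of recording a `reason_if_stopped` is exactly what Heather argues: the stopped projects, and the reasoning behind stopping them, carry as much learning value as the deployed ones.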


Silvija: Christian, how would you like to summarize what new ideas are you taking with you now?


Christian: It's a bit embarrassing to say, but the ideas that are so interesting to me are exactly what you just said. This idea of systematically keeping records of what works and what doesn't work, and for what type of reasons - I think that was a really interesting point. Maybe in public services you do some internal experimentation and then decide it's not something you want to go ahead with. First of all, it's great that you as an organization are flexible enough to do that, but then also to use it as a source of learning: documenting it, making it available. Here I think we may be going back a little to this idea of sharing - not sharing data in the narrow sense, but maybe sharing expertise on governing these types of new technologies. That's a really interesting thought for me, which is ruminating in my head right now.


Silvija: If I can build on that: one thing you said implicitly, Heather, is that there is actually quite a lot of activity in this direction in Norway, and in the Nordics in general. I'm wondering if it also has to do with progressive leadership, because many of the people at the top traditionally are not necessarily data heads. Yet there seems to be some sort of energy, some optimism, some constructive willingness to try to digitalize every department, every public institution - not just the one you're in. They want people to share their data; they want other people to use their algorithms and data. So there seems to be something happening at the high levels of leadership in the public sector, which, to be honest, I didn't expect so quickly.


Heather: Yeah, it's gone mainstream, right? When I started working with open data, nobody was talking about data. Initially there were some people talking about it, but now it's gone mainstream - everybody, the dogs on the street, are talking about this, and that is super. It also brings with it a pressure to deliver, which is also great - we have to deliver. But that means there might sometimes be difficulty within the system in asking questions and maybe slowing things down, so we need to build that in too. Leadership needs to be aware that this has to happen. Yes, this has to go ahead, but there will be challenges along the way, and we need to be ready for them. It's not just a case of getting the data shared and a beautiful world emerging, because there are many more issues to take into consideration. That's not to put a dampener on things at all, but just to say we need balance here. I think one of the greatest potential threats to a data-driven government is not embracing that complexity and just going ahead.


Silvija: Very cool. So, Christian, did we cover the most important points? Did you get the ideas you wanted out of Heather's head?


Christian: I'm smiling a little because I'm learning so much from Heather right now. I think the short answer is yes and no - there's so much more to learn from her, and I think we will also share her writing. But we got a really, really good idea of the complexities, and also of the pathways to using AI well.


Silvija: Christian said something super important in another conversation we had, and I love that attitude: there are challenges, and there are things we have to learn about quickly enough. But we have to learn, we have to go, we have to build, and we have to use the advantages we have - because if we are not doing that, then maybe we are not doing our job as public servants either. We have made this point in some other areas, but it holds in public service too. I guess the most important thing is: go for it, and then learn quickly enough. Yes?


Heather: Yes - and have the right people in the room, because ultimately this is about people. If we have the right multidisciplinary competence working together on these things, we are going to minimize the unintended consequences, and that is the ultimate goal. We need to use this for good within the public sector, and for the best for society - which, of course, is a very general term that we need to understand at a much deeper level. Whether you're in the private or the public sector, you need your legal, technical and domain expertise, and the public sector has got quite good at that. But within the public sector we also need the social science and public administration expertise. So bring the good old public administration experts back into the system. If we do that, I think we will manage to learn and go and build, as Christian says, because that's ultimately what we want to do. Right?


Silvija: Excellent. Thank you so very much, both of you, for an inspiring, and perhaps also a little challenging, conversation.


You have now listened to a podcast from Lørn.Tech - a collective learning effort about technology and society. You can also get a certificate of learning for having listened to this podcast at our online university, lorn.university.

