LØRN case C0251 -
LØRN. RESEARCH

Chris Yiu

Head of the Technology and Policy Solutions team

Tony Blair Institute

Regulations for the future

In this episode of #LØRN, Silvija talks to Chris Yiu from the Tony Blair Institute about how we improve the dialogue between those who change the world with new technology and those who want to respond to it with politics and regulation, and how we can give politicians the knowledge they need for the future. Chris is Executive Director of the Technology and Public Policy team at the Tony Blair Institute for Global Change. His work focuses on improving the dialogue between those changing the world with new technologies, and those seeking to respond to it with policy and regulation.

51 min


Welcome to Lørn.tech, a collective learning effort about technology and society with Silvija Seres and friends.


SS: Hello and welcome to Lørn.Tech. My name is Silvija Seres and the topic today is regulation for the future. I have a very exciting guest with me, it is Chris Yiu from the Tony Blair Institute, welcome Chris!

CY: Hi there, nice to be here!

SS: It's really a pleasure to have you here. We had a little warm-up chat and I'm so excited to go into several of the topics you described. But before we do that, could you please tell us a little bit about who you are?

CY: Yes. I look after technology policy at the Tony Blair Institute. My role is very much to think about the intersection between technology, politics and policy makers, and how we bring those worlds a little bit closer together. My background is mostly in public policy. I worked for the UK government for a period of time, and also in consulting, and before this job I was at Uber running operations for a number of UK cities. So, I've seen technology from the implementation side, and I've also seen it from the government and public policy side. And now I'm just trying to bring those worlds together.

SS: What is Tony Blair Institute?

CY: So, the Tony Blair Institute is a non-profit organization. We're based in the UK, and the mission is to make globalization work for the many, not the few. We have about 200 staff around the world, and we focus on a range of topics, from improving the quality of governance in developing countries, particularly in Africa. We have streams of work focused on building tolerance between global religions and helping communities to live side by side. And we have work more focused on Western countries, around public policy and how you help politicians grapple with a world which is changing very rapidly and is sometimes hard to understand. A lot of the concerns now are expressed through the U.S. presidential elections and through Brexit. We try to help political leaders understand this and shape a way through which, rather than riding the anger, tries to actually solve the problems.

SS: Not a painkiller policy, but rather a cure policy?

CY: Exactly. And I think it's harder to do the job of actually addressing the concerns, doing the analysis, identifying the solutions. It's much easier to just throw your hands in the air and say it's all terrible, and as a political candidate I just want to blow the whole thing up. Unfortunately, at the moment all of this has emotional resonance, but it doesn't solve the problems.

SS: So, tell us a little bit about why regulation is so hard. Actually, before we get there, I just want to tell you a little story. I was at Web Summit in Lisbon about two weeks ago, and the most exciting presentation I heard was by Christopher Wylie, the Cambridge Analytica whistleblower. He was so passionate about the need to regulate both those who develop this technology and the many industries that will be changed by the technology in completely new ways. With a new language, in a way, and new problems. And he said, "First they sued me, and now they want me to mansplain technology to them". To tell them how the internet really works. And he says that it worries him, because these are the politicians who are regulating the internet, and they don't understand how it works. So, two things. One is understanding how the thing works, but the second is also understanding the real effects it will have on society. How do you do that?

CY: Yeah. But before I answer that, we were at Web Summit as well, actually. There are too many streams to keep on top of them all. But Tony did a really excellent session on the main stage where he talked about global politics and technology.

SS: What was his main point?

CY: He talked about this question of the populist era and the impact that it has on modern politics. And on the question of technology, it was around this idea that if you want to make progress on any of the big issues that matter to people, like housing, healthcare, transportation and law and order, you've got to have technology at the heart of your public policy. It's the only way. Sure, you can make incremental improvements by making small changes to the status quo. But if you want to go big and really address these issues, there is huge potential to use technology to improve outcomes.

SS: Can I ask you something just for clarification. Because I want you to go a bit deeper into that. My problem is that very often people talk about technology. And of course, we're for technology, and of course we will use it. The same way we talk about education, of course we are for education. But I think you said something very important. It's not just about incremental changes. It's not just about running faster. It's about understanding how you can do it differently.

CY: Yeah. And this is kind of the focus of what Tony was saying, and the focus of a lot of our work has been around the gap in understanding. The people who are changing the world with new technologies on the one hand, and the people who are trying to respond to that with policy and regulation on the other. They talk straight past each other. Partly they don't speak the same language.

SS: Why is that?

CY: So, in a lot of the interactions that I have with politicians, I think they don't necessarily appreciate the complexity of what they're dealing with. Certainly in my experience in the UK, you wouldn't necessarily expect an MP to be highly technically literate. They also have enormous pressures on their time, and maybe they don't have the capability in their private office to really get on top of this either. So, you have this situation where they might not prioritize it. And sometimes they also use the sort of mental shortcuts that we all do. They try to use common sense as a way to navigate, and the trouble is that unfortunately, when common sense collides with the internet, it doesn't always survive. So, you have that on the one side, where the politicians maybe don't understand the technology, with all the opportunities and challenges, as well as you might hope they would. And on the other side you have a lot of the technologists, the change makers. All the technology people I know in that world are well-intentioned; I don't think there's a lot of malicious behavior. I do think, though, that a lot of them are very naive about the reality of governing and the sense of urgency: how messy and complicated it is to govern a country, the compromises that you have to make in order to pass legislation, the reality of balancing all the different constituencies. And so, you have these two world views that are so different. And you see a lot of talk about the backlash against technology, and you can understand it. If I'm a political person, and I see this kind of turmoil and change, and I get the sense that people are angry, but I don't understand any of the technology, the easiest thing to do is to attack it. And by the way, all of my peers are attacking it as well, so if I don't, I look weak. If I'm the only person saying we should take our time and think about this, while everyone else is saying it is just terrible and we must stop it all, politically it's very hard for me to do. So, we try our best to bring those worlds together. To produce analysis and insight which policy people can understand, that's not super technical. It doesn't require you to have a PhD in AI or something. You just need to know enough about the issues and the trade-offs, the opportunities, the challenges and the questions you should be asking, so that you can have a good dialogue with people in the industry. And on the other side, when we talk with people in the technology world, we try to help them understand that one thing is the data and the models and the science that they love, but there's also the reality of how complicated society is, and how they actually exist within a framework and a set of values, and you can't really separate those things out. So that's what all of our work is about. And yeah, it's hard, but I think it's important. A lot of this debate really suffers because unless you have a good sense of the dynamics and the business models of the technology companies, you're basing your response not on objective fact or the real situation, but on your prejudices and gut reactions.

SS: You had a really nice explanation to me about how to regulate for the future when we don't really know the exact effect of these technologies on society and on businesses, but we know it's going to be big and disruptive. You recently published a report about this, I think, about regulation?

CY: Yeah, we did. It's called "A New Deal for Big Tech", and it's all about how you build regulation that is fit for the internet age. In there we back up and say that before you can answer this question about the right way to regulate, you have to understand the economics of the internet, and you have to understand the business models and the incentives of technology companies. So, what the report does is unpack the ways in which the internet has fundamentally changed the cost structure of businesses, what that does to market dynamics and incentives for innovation, and why it is that a search engine is actually quite different to a social network, which is quite different to a streaming media provider. A lot of people just lump it all together under the label of technology. And actually, that's not enough if you really want to understand how people behave and the incentives they face. That goes a long way to explaining a lot of the big macro trends that you see in the world at the moment. A lot of people talk about the decline of gatekeepers in many industries. This idea that it's difficult for me to get my product in front of customers, or get my voice heard, because if I want to write a letter to a newspaper, it's got to be approved by the editor. Or if you want to get a program on television, you have to go through a commissioning process. Now I can post around the internet. I can record a video on my smartphone and post it to YouTube with no intermediary or approval required. So, effects like that are radically changing pretty much every industry that we deal with. And once you appreciate what's driving these changes in the world, you start to see that there is a long list of things that are difficult.
And we talk in the report about challenges: economic challenges, the disruption of old industries, the way that technology is changing the nature of work and what it means for the future of jobs, taxation, competition, but also cultural challenges. And privacy, whether we really understand the relationship that we have with some of the large technology companies. The impacts on well-being and addiction. We all have the list of issues. But what we say about all of them is that you've got to be really careful, because all of these things are trade-offs. They're not black and white. So, you can't just say "technology is terrible for privacy and therefore it must be stopped", because actually it's really efficient. And the truth is, when you leave your home in the morning and you step out onto the street, you sacrifice some privacy in order to get some stuff done. But the issue is that the dynamics have changed a lot. Gradually over time, we learned that when we use our credit cards in supermarkets, people accumulate data on our purchases. And we know that cameras read number plates in the UK. But what's happening online now may be a whole order of magnitude more intensive.

And you've got to understand what price you are prepared to pay for different levels of efficiency versus privacy, and so on.

SS: I loved your starting point, because you are saying that first of all we have to understand their business models. We can't even start opining about all of this until we understand how the business works, and then its effects. I recently had a couple of really good conversations with people who do gamification. And you know, it's fascinating how we are scared of games and gamification when it's about manipulating people to get their money or their time in a disproportionate amount. But if you are manipulating people to read more, or to train more, or exercise more, suddenly it's great. They are heroes. And I think every business is trying to manipulate, every policy maker is trying to manipulate. And I think we can't regulate this correctly unless we understand the drivers that we're working with.

CY: I think that's exactly right. And at the heart of why this is causing so much difficulty in the public debate, we boil it down and we say that the actual issue is that a relatively small number of companies have got enormous power, and very little legitimacy attached to that power. It doesn't really matter which company we're talking about; they will have some variant on community standards, or the rules. But the issue is who decides those rules, and at the moment largely a bunch of white men in California make the decisions about what's appropriate and what's not appropriate, and how strongly they will enforce the policies, and so on. The reality, of course, is that they have such an outsized impact on the world around them and on society. It can't be appropriate for that intensity of power to exist without enough engagement and scrutiny. But it's a different sort of regulation than what we've been used to in the past. In lots of industries before the internet, you used to have long and detailed rules and regulations, and you tried to enumerate all the things that aren't allowed and all the things that must be done. And before the internet that worked in lots of cases, whether it is taxi regulation or the way they regulate banks. They're all some variant on this kind of rulebook and licensing. The trouble with the internet is that it has blown up all the assumptions about the way businesses operate, and the business models. And it's also blown up the scale and the speed all of this happens at. So, you can't possibly hope to write a set of rules that governs every eventuality for what happens on a video sharing platform. What you can do, though, is be really clear and say: "As a society these are the values that are important and that we expect companies to uphold, and we're going to have a strong regulator which isn't trying to micromanage or prejudge every single tiny decision, but which has the technical expertise and the authority to hold people to account". So, if you're alerted to something where the behavior of a large company is completely contrary to the public interest, then the regulator should have enough authority to conduct an inquiry to get to the bottom of whether they've behaved in a reasonable way, and if they have, to dismiss it, and if they haven't, to impose a penalty which is appropriate to the level of the transgression. But that should be done in a much more structured and realistic way. Obviously, there have been lots of issues and lots of things that have not been done as well as they should. But I think we are sometimes guilty of holding companies to an unfair standard, expecting 100% perfection. We think there should be no offensive content on social media, and of course that's not attainable. In any situation you will have false positives and false negatives; human beings will make mistakes. So, you have a system which accepts that companies do their best, and if they do their best and genuinely try to improve and solve the problems that come along with their business models, I think we should support that.

SS: I think it's so interesting. I'm an old fan of Monty Python, and sometimes I think of Life of Brian when I think of Mark Zuckerberg, among other things. It's one of the statements: "I didn't lead them here, they just followed me". So appropriate, because I think they do have an outsized effect, because they do have so much understanding in so many dimensions, and all these weak spots that can be exploited commercially or politically. So, the danger is that it can be really easily abused. And as you say, I don't think we can protect ourselves from all of these potential abuses, because then there won't be any personalization. There won't be any data. And every country has a different set of cultural rules, often not very well captured even in basic law. I think things that work in China might be very different from things that work in the UK, or things that work in the US or Saudi Arabia. Yet they should be able, in a way, to make their own localizations of all of these global solutions. I met a man called Jacob Randall who wrote about Cambridge Analytica in a really interesting way. He's in California, and he's trying to develop models for corporate social responsibility related to ethics and the use of technology. And his main thing was about understanding that if you use technology produced in China, you might be using Chinese models. And they suddenly trample all these political borders. So how do you adjust those models locally to your sets of values? And I have no idea. I mean, we're talking about ethics algebra. How do you get people to be so aware of the most important of the non-negotiables?

CY: It's hard, isn't it? What you definitely can't do is just leave it to the technology companies. You need a much more structured process: when they're setting their standards or their rules or whatever they call them, that should be done in consultation with governments and with civil society. It should be a process that everybody is engaged in. And I think we need a much higher quality of discussion amongst politicians, and between politicians and the public, about a lot of these issues. But one of the things that we confront in the work is that there are going to be some fundamental differences in terms of the core values. Yes, in an ideal world you would love a global solution to this, but you and I know that that's never going to happen, or at least not in my lifetime. So, the question is what is the feasible scale at which you try to address the issue. Yes, there will be some cultural differences between different European countries, for example. But in our view, when you take it all the way down to the fundamentals, the advanced liberal democracies of Western Europe and North America have enough in common that we would be able to build a consistent approach to this. One that understands the business models and incentives, and that anchors this in values and accountability, rather than in very prescriptive rules. And to the extent that you can, you try to build some kind of transatlantic consensus around this. I think that's all the more important given the direction China is heading in, in particular, but also Russian aggression on the internet.

SS: Cybersecurity and fake news you mean?

CY: All of it. Putin said, "Whoever controls AI rules the world". So, that's hardly a benign thing to say. But when you think about it, the companies that dominate our lives and run all the apps on our phones are largely American companies, and some big European companies. They are the products of advanced liberal democracies, and of the values that animate our societies. And we want to say that these companies should uphold those values. They should champion them. A lot of people historically have recoiled from that and said technology is neutral, thank you very much. And what I would say is no, it's completely okay, totally all right, to expect American and European companies to reflect and uphold the values of European and American societies.

SS: But I have to ask you, because to me there was a fascinating prehistory to Cambridge Analytica, with the SCL company, which as far as I understand was really funded by the British government, with the assumption that it would never be abused against Western democracy. But then things kind of slipped. So how do we deal with that slippage?

CY: You put the accountability in place. So, when we talk about the regulator, one of the concerns that people often have is that you put strong regulation on top of a lot of the technology, and then it can be abused by the political leadership of the day, or what have you. So, one of the things that we talk about is that you need to find a way to build in some degree of independence. Whilst it's legitimate for our governments to set the terms of reference and write the values that you expect companies to uphold, the job of holding people to account, and deciding which inquiries to make and who to sanction, needs to be outside the day-to-day political process. It's closer to a judicial process than to political interference. But yes, some people will try to get around that. Many politicians right now would say they're not sure they want to put that much power at arm's length, because they've got very strong opinions about what should be done around technology. Certainly, the president of the United States.

SS: One of the experiences I have from being a board member is that you're not really free to admit what you don't know. There is some training that has to do with new regulations in every board. But if you start saying "I'd really love to learn more about blockchain", it's like: "You need to deal with that on your own. We as a company have to assume that you are competent enough to do this job as a board member, which really means you should be done with your education". I imagine that those kinds of pressures are even harder on politicians.

CY: Yeah, it's a great question. A lot of what we think about is two things. We try to produce material which is easily digestible by politicians and by policy people, rather than by technical people. There's a way of doing that which is quite different to the way you might communicate inside a big tech company. And we try to create more opportunities for a dialogue between the technologists and the policy people, in a way where everybody feels comfortable that they are improving their understanding. Because of the point you made about the politicians: I think the same is actually true for some of the technology companies. It's very difficult, if you work for a technology company, to publicly come out and say "we don't really understand the implications of what we've built. We know that it's having XYZ social impacts, but I couldn't really tell you right now whether that's good or bad or how much harm it's doing". You could never say that in public, because the moment you do, the media will come down on you like a ton of bricks.

SS: Because they don't understand it either.

CY: Right. So you need a space where on both sides the politicians can say “help us understand how we should think about this and what questions we should ask”, and where some of the technology companies can say in good faith “we're struggling with these sorts of issues and we need help, and we need more guidance from the politicians”.

SS: You are asking for a very brave sort of humility here. But let's hope!

CY: I struggle to see any other way through. One answer might be if we wait long enough, then people who've grown up with the technology and are much more comfortable with it will be able to ask these sorts of questions, and the new generation of entrepreneurs will build more mindful companies and be more socially responsible because they get a wider view. But I'm not sure we can wait that long.

SS: Chris, I have to ask you what's your background educationally?

CY: Economics.

SS: I'm really fascinated. Because I worked as a fellow in Oxford for a while and was fascinated by all the kids who went to do PPE: politics, philosophy and economics. It has to do with this diversity of perspectives, and I loved it. I think we should have many more subjects like that. Because you are one of these hybrid people. You have this experience from Uber, you understand the platform technology, and then you understood the difficulties of regulating it. When you talk about regulators, you speak with real respect and understanding. And that's the sort of perspective I sometimes miss from technology leaders. And vice versa. So how do we get more of these hybrid people? And the second question is if we could dive a little bit into Uber?

CY: So, I would love to see a lot more people who have got experience of both policy and industry. And as the companies get more mature, as you get bigger, you bring in different sorts of skills. So, I think part of it is about the industry growing up.

For a long time, a lot of people with good technical skills didn't want to work in government, because it was viewed by many as massively frustrating, somewhere you never make any progress. I think that's changing to an extent, but a lot more needs to be done, and there are good examples around the world of political leaders who have really said they're going to try to transform the public sector. That makes public service attractive for people from industry: they can use their technical skills, and then they are able to build up a more rounded view of the world. But turning to Uber, I think you know..

SS: Let's just start at the start. You were responsible for rolling Uber out in several big cities?

CY: Yes, I was on the UK team, so cities in Scotland and in the north of England. In all UK cities, Uber is generally private hire. So, we're talking about a particular regime of taxis and private hire cars which is very heavily regulated, but with legislation that depends on which city you are in. The legislation might be as new as the early 2000s, or it might be as old as the 1970s.

SS: And it says things about logbooks and things that you don't really use anymore?

CY: Well, because it predates the internet, never mind smartphones. So, a lot of the regulation didn't really anticipate it. Then you have the questions that come with interpretation, and every local authority in the country has its own interpretation of the rules.

And that makes it very challenging. Obviously, you stay within the law at all times. But the way that the regulators interpret the law might differ. So, one local authority might say that if you're using cloud-based technology and smartphones to process bookings, that's a modern interpretation of the law which says bookings must be made in advance.

Other places might say the law doesn't mention using a smartphone, so that's not allowed. And so, you have this continuous discussion, trying to explain to people that while they might not recognize the device, if they're used to making a booking by walking up to a minicab office in the street and saying over the counter to someone "I'd like to book a car", what's happening at the end of the day is the same, it's just the modern expression of it. So, it's a really interesting case of a company that has to fight quite hard to explain why what it's doing is perfectly fine from a regulatory perspective, but also a positive improvement for the world around it. And obviously a lot has been written and debated about the mentality of the company overall, and whether it was too aggressive in many dimensions. I continue to believe that in the long run, you look around you in any large city, at the congestion, the pollution and the time wasted, and there is tremendous potential in using technology to solve these urban problems. Uber is part of that, but so are the electric scooter companies being deployed in lots of cities, and better technology for public transport. All of these together could make life radically better for a lot of people.

SS: But the problem with regulating and developing or scaling this is that any disruption that creates new value also gobbles up a few people in the process. There will be some people losing their jobs, and it's incredibly easy to focus on those and forget about the added value for all the rest.

So, would we be able to do this more easily if we somehow found a good solution also for the people being displaced? It's a similar thing to rolling out autonomous vehicles. If you want them to drive perfectly, you'll be waiting so long that many people will die in human-caused accidents in the meantime. What's the right thing to do?

CY: Yeah. A few things happen. One is, as you say, that for something like Uber or other ride-sharing companies, there is the cohort of taxi drivers who spent a lot of time and effort to get their taxi medallion or taxi license in whatever city they're in, and spent a lot of money on a particular taxi. If I were them and had put all that effort and money in, and then suddenly technology could effectively operate a similar service, I would be pretty frustrated. But then you weigh that against millions of people in a city whose mobility is massively improved, often people who were most disadvantaged in terms of access to transportation. Services like Uber are superior in areas where you haven't got a bus or a subway, or where traditional taxi companies wouldn't operate or would be very expensive. But you are right: when you scale it up to something like self-driving cars, or self-driving trucks that maybe will happen before cars, you can look at the data, and it's a really astonishing proportion of people in countries like the U.S. whose occupation is truck driver, and they will all plausibly be automated out of work at about the same time. Millions of people.

And you've got to provide opportunities for them to retrain, to pick up new skills and to find a different profession. Obviously, that helps with the impacts. I actually think one of the hardest challenges will not be a lack of jobs. We know for certain that we're going to need a lot more people in health and social care.

And yes, it's a profession and you need training, but there's no reason why a truck driver couldn't care for an elderly person, apart from the massive cultural stigma attached to going from driving a truck to looking after someone in their home. And so, as a society, that's the issue that we've got to get around. Yes, there's a question of formal training and compensation. But the bigger thing is going to be different sorts of work. I know people can do it, but are they prepared to? And as a society, will we respect that, or will we make it hard? That's where political leadership is required; it's not a technical thing. It's political and cultural.

SS: I think there are two dimensions where people might be better than machines, robots or AI. One of them is the kind of dexterity task. It's a paradox that a toddler can do stuff a robot can't: a robot can't fold a towel as easily as a person, or sort socks. But things that are heavy and very repetitive, they are great at. And the same with the intellectual axis. They are great at analytics and rational stuff, but in the social and emotional stuff we can't be replaced. And so, by having people accept that that's our humanity, and being proud of that. You know, this is actually who we are. We are not Hercules, who can lift a ton, and very few of us are the super-rational analytics machine. But the beauty of being a human, caring for another human and being able to navigate a very unpredictable environment, we should be proud of that.

CY: Exactly, that actually should be one of the things we hold in the highest regard: somebody who teaches young children or cares for elderly people. And it's a massive question how society views those roles, and whether we'll be able to make that transition in a way that's comfortable. I hope we can.

SS: So Chris, translating all of this into some sort of step-by-step policy making. Where do we start?

CY: That is an excellent question. Well, I'll tell you a few of the things that we're doing. Some of it, I think, is just about the policy. We need to identify and present some big, compelling ideas about how you use technology to improve the things that people care about. Because at the end of the day, in our world we can talk forever about the ethics of AI or the impact on competition policy and disruption. But when you go and talk to people as a politician on the doorstep, no one mentions technology. They might moan about how fast their internet is, but what they really say is "I want my local schools to be better", "I want to be confident that the healthcare system will work", and "I want to be sure that I'll be able to find a good house and it will be affordable".

We have to figure that out. We do the analysis and make the effort to say to the politicians: if you're serious about improving healthcare and keeping people as far as possible out of hospital, rather than doing a kind of salvage operation when people get sick in ways that were avoidable, then this is the bundle of policies and technologies that we would use to shift our healthcare system towards one that is much more preventive.

But a lot of that, when it happens at the moment, I think is very technocratic. When I think about a lot of the things you read about the impact of technology, on the one hand you've got a lot of very emotive content, largely helped by traditional print media, which is all about how technology is terrible, it's destroying your children's brains, and so on and so forth. But on the other end you've got excellent, thorough analysis done by organizations like the World Economic Forum or the OECD, which is superb but has no resonance with a regular person. And in the middle there's this completely unoccupied space, which is to say: I'm going to take the analysis and the appreciation of what technology can do, as well as its challenges, and I'm going to take the concerns that people have, which are legitimate, and I'm going to put the two together and offer an answer. As a society, whether you are in Europe or in the States, we under-index on that, and we could do something about it. So that's where we spend a lot of our efforts. I don't for one second think that it will be quick or easy, but if you're serious about building a politics which gets away from the populism that has done so much damage, and gets us back to a place where people can be really confident that the future will be better than the past, I don't see another way through.

SS: I think we need another kind of rhetorical genius like Tony. Because what you're asking politicians to do is go from short-term, very populistic problem fixing to long-term, visionary, predictive value creation. The state as an engineer of growth. It requires some big bets. It requires some really brave mobilizing of people. How do we make space for this?

CY: Well, I think we have to find the political leadership. They've got to have access to the insights, the analysis and the ideas, but then you need the political leadership, the curation and the ability to turn that into a compelling story, which is what politicians are good at. You give them the raw material and they express it in a way that wins public support. But one of the challenges, when you think about the world at the moment, is who's going to step into that role. A lot of people are put off politics at the moment because it is such a toxic environment. On the other hand, I think if somebody can find a new sweet spot of being pro-technology, but also progressive and compassionate and addressing people's concerns, and they get that right, I think there's the potential to be incredibly successful. The people or the party that masters that should command politics for the foreseeable future.

SS: A Canadian prime minister comes to mind. Some of this also involves quite big turnarounds of some of the processes in the public sector, not just institutions or people. I think even Obama tried, but then he got completely held down by all the internal politics. Is it possible to ignore all the short-term internal politicizing?

CY: I would say with strong enough political leadership you can overcome a lot of it. And there are examples. Obama did make progress in some areas. In the UK we have this particular experience with the UK government digital service. Which was all about modernizing the way the government interacts with citizens online. And behind the scenes there was a lot of radical change in the way that they worked, the kind of the operating model.

SS: Which was, sandbox?

CY: Some sandbox. A lot more agile project delivery, organized around teams with product managers, rather than big waterfall projects that you brought big external consultancies in to deliver. Much more focus on user research and user-centered design, rather than just building stuff that served the interests of the bureaucracy.

SS: And this worked?

CY: It massively improved on what went before. But when you do the analysis, obviously you need the good people, the talent and everything else, but it would not have been possible without very strong political leadership, and somebody willing to say, "I'm going to force this through".

SS: Was it a person or a group of people?

CY: It was a particular minister in the UK: Francis Maude was Minister for the Cabinet Office for a long period of time. He stayed in that job when other people before him had done it for a period and moved on; he stuck with it. And he was prepared to use some political capital to get it done. When other departments were saying, "well, I'm not sure about this", he had enough political strength to just push it through and say: it's not good enough, we're not going to let everybody carry on doing different things. So, it is possible, but it's really hard. And if you're elected with a slim majority, or if you're distracted (in the UK, the horrendous distraction of Brexit), you never get to have that kind of political focus on something. But you need it. Certainly in the UK, the officials in the civil service are superb at running the ship as it is, and if there's a crisis they spring into action and deal with it. But what we know from many decades of history is that when it comes to actually reinventing themselves, reinventing the organizations and completely rewiring the way that government operates, they are not really able to deliver. And we can debate forever whether that's to do with the incentives, the mindsets or the expertise, but the fact of the matter is that unless you force it through, it's not really possible.

SS: I think it's really interesting, and I think you need somebody who dares to think very long term, to almost be a little bit long-sighted and have blinders for the short-term distractions. And every now and then there are politicians who do, and I admire them. Including Tony in this. It must be a fun place to work.

CY: Yeah. Certainly, gives me plenty to think about.

SS: So Chris, where do we go to read your best reports?

CY: All of them are online. Our website is Institute Global, so that's a great place to start. There you can find our work on technology, but also other areas of public policy, and also the work I mentioned earlier around issues like Africa and global tolerance between religions.

SS: Very cool. Chris Yiu from Tony Blair Institute. Thank you so much for coming and inspiring us for the future of politics.

CY: Thank you, it’s been fun.

SS: And thank you for listening.


You have listened to a podcast from Lørn.tech, a learning collaboration about technology and society. Follow us on social media and on our website, Lørn.tech.


What do you work with?

The Tony Blair Institute is a non-profit organization. We're based in the UK, and the mission is to make globalization work for the many, not the few. We have about 200 staff around the world, and we focus on a range of topics, including improving the quality of governance in developing countries, particularly in Africa.

What is the most important thing you do at your work?

I look after technology policy at the Tony Blair Institute. But my role is very much to think about the intersection between technology, politics, and policymakers and how we bring those worlds a little bit closer together.

What are the central concepts in your tech?

The focus of a lot of our work has been around the gap in understanding. The people who are changing the world with new technologies, and the people who are trying to respond to that with policy and regulation. They talk straight past each other. Partly they don't speak the same language.

Why is it exciting? What drives you here?

For a long time, people with good technical skills didn't want to work in government, because it was viewed as massively frustrating since you never make any progress. I think that's changing to an extent, but more needs to be done and there are good examples around the world of political leaders who have really said they’re going to try and transform the public sector.

Your own favorite example?

A report called "A New Deal for Big Tech", about how you build regulation that is fit for the Internet age.

If people are to remember only one thing from our conversation, what would you like it to be?

Many people are put off politics at the moment because it is such a toxic environment. On the other hand, if somebody can find a new sweet spot of being pro-technology, but also progressive and compassionate and addressing people's concerns, there's the potential to be incredibly successful. The people or the party that masters that should command politics for the foreseeable future.

Chris Yiu
Leder for Technology and Policy Solutions team
Tony Blair Institute
CASE ID: C0251
TEMA: DIGITAL ETHICS AND POLITICS
DATE : 181128
DURATION : 51 min
LITERATURE:
Institute Global
YOU WILL LØRN ABOUT:
Public policy, Regulations, New technology, Consequences, Technology trends and business strategy
QUOTE
"The focus of a lot of our work has been around the gap in understanding. The people who are changing the world with new technologies, and the people who are trying to respond to that with policy and regulation. They talk straight past each other. Partly they don't speak the same language."