InfoSec Practice Lead
SS: Hello and welcome to Lørn. My name is Silvija Seres. Our topic today is information security, or cybersecurity, and my guest is Ryan Matterson from Nagarro. Welcome!
RM: Hi, Silvija.
SS: Good to have you here Ryan.
This is really a blind date for both of us, because I don't know much about Ryan or Nagarro, and we'll go into your perspectives on information security. You said something I liked, which is that information security has a value that can be measured, so I'd like you to explain that to us eventually. But before we do all that, I would really like you to tell us who you are and what drives you.
RM: Sure, my name is Ryan Matterson. I'm a Canadian who has been living in Oslo, Norway, for three years now. I have been working in information security for the last ten years, and right now I am very passionate about helping business leaders understand the risk that comes from using the technology they already have, or from investing in new technology, and how to make better decisions about that.
SS: We all hear that information security is paramount in this time where data is the new oil, gold, whatever, and everybody knows that boards, CEOs and everybody else should really care more about it, and that it should be at the top of our priorities. But in order to do that in any relevant way, we need to learn more about cybersecurity, and even thinking about learning about cybersecurity is still, to most people, enough to give them a headache. Tell us: what is information security, why is it important, and is there any way that one can measure one's own security level?
RM: That's a lot of questions Silvija.
RM: Let's start at the beginning, I guess: what is information security? Information security is a means to understand risk. One of the really important concepts that many people miss when they think about information security is the fact that security is not a binary state; it's not a zero or a one. It's not the case that a system, an organization or a computer is either secure or insecure. Rather, there is always risk, not just in information technology but in everything we do as humans moving around in the world, and so it's really about how we manage that risk. The security that may be appropriate for one system, say high-value critical infrastructure for society, may need to be much higher than the security needed on a personal computer, but the goal in both cases is not to eliminate all risk. It is to understand the trade-offs between the cost of security and the risk reduction that you get from it.
SS: But basically, if we agree that there is no perfectly safe state, as you said it's not binary, then there are measures that can be taken to move you closer to a safe state, and those measures have costs?
SS: What measures are there and how do you think about their costs?
RM: Well, let's take something basic that everyone can relate to: a username and password to log into an account. Anyone who has usernames and passwords for a few accounts, which is everybody these days I would expect, knows that there are challenges with remembering all of those. And if they are adding two-factor authentication, which is recommended for accounts you really care about, like banking or a business remote-access account, then this can add a cost in terms of your time, in terms of extra hassle for the user. So that's an important consideration: what is the trade-off versus the risk reduction that we are getting from these types of controls, and how can we as security people help users to be more secure without increasing the hassle they have to go through to actually use the technology on a daily basis?
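[Editor's note: the two-factor codes mentioned here are typically time-based one-time passwords (TOTP, RFC 6238). As a rough illustration of how little machinery they require, here is a minimal Python sketch of the code-generation step; the secret shown is the RFC test value, not a real credential.]

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6) -> str:
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    if for_time is None:
        for_time = int(time.time())
    counter = for_time // step                       # 30-second time steps since the epoch
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: at Unix time 59 the 6-digit SHA-1 code is 287082
print(totp(b"12345678901234567890", for_time=59))
```

Because the code changes every 30 seconds and is derived from a shared secret, a stolen password alone is no longer enough to log in, which is the risk reduction being traded against the extra hassle.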
SS: Because basically, security has a convenience cost. But all this GDPR stuff and so on also has an attractiveness cost: the more you limit people's ability to use the information that's there, the harder it is to make super-personalized services, for example. So how should we think about priorities here?
RM: The priorities should start with the things that are easiest to implement and that give the most risk reduction for the least cost, whether that is a direct monetary cost or a cost in the productivity of the user.
SS: Of freedom.
RM: Sure. Yeah, that's right.
SS: You reminded me of a situation I was in some 30 years ago, in the early internet days. There was a politician who was very gung-ho about the internet, and then somebody tried to deflate her balloon a little bit and said: well, do you know that most of what's on the internet is pornography? And this female politician said: oh well, then we'll just close it down. But we can't live our lives anymore if we go for extreme security. You can't unplug your computer, turn off your phone, crawl into a hole where nobody will take a picture of you, and so on. So we have to assume that there is connectedness, and then somehow we need to learn how to navigate the risks. How do you teach people to do that?
RM: I think we can't blame the user. When I began my career in InfoSec 10 years ago, I was very technical. I was very much focused on things like ethical hacking, incident response and digital forensics. Over the past 10 years, up to now becoming the global practice lead at Nagarro InfoSec Consulting, I have come to do more and more strategic work, because I realized that security is really not a technical problem in the way that many people think it is. What I mean by that is that it's not something IT can solve, or that a user can solve by themselves; we need help from business leaders, and even from politicians, to make these kinds of changes happen. I mentioned earlier that security has a cost and a value. One of the analogies that I really like for explaining the value of information security, both socially and in business, is that insecurity is like pollution. I don't know who said this first, but it's been said by many people, and the analogy is not perfect, but it's one I quite like because it gets at a couple of important points. One is that digital insecurity in today's world is an economic externality. What that means, simply, is that the cost of insecurity is typically not borne, not paid, by the organisation or the actor that creates the insecurity, the one that exposes data through insecure IoT devices, smart-home connected devices. They will often not pay the cost of the insecurity of those devices when they get hacked, and the buyer also may not pay the cost. This is the same as with pollution: if a producer builds a very polluting vehicle, the producer does not necessarily pay the cost of that pollution, nor does the driver necessarily...
SS: ...all the health effects and all the long term...
RM: ...sure, it tends to be paid by society at large, in different ways, over time. And so we have begun to regulate in ways that try to bring those costs back into the production process for the producers, if we think in terms of pollution of the natural environment. If we think about insecurity in this way, we start to understand some of the key challenges of why it's so hard to address growing insecurity or instability in the global infrastructure that everybody depends on, and also why it's so important that we actually find ways to deal with it.
SS: You mentioned that in your early days you were very focused on the techie side of cybersecurity, things like ethical hacking and digital forensics. Can you explain what those things mean?
RM: Sure. Ethical hacking may also be known as pentesting, and some types of ethical hacking could be called red teaming, but the basic idea is that you, as a service to a business or perhaps a public-sector organisation, simulate attacks or carry out certain tests to find the vulnerabilities that a hacker would use to exploit those systems for some gain, some purpose of their own. That might be to steal credit-card numbers, or to add that system to a botnet, a large group of compromised machines that the attacker controls and can then rent out, either to send spam or to launch denial-of-service attacks that take down other websites and systems, and so on. That was a lot of fun. I think it's fantastic to get paid to actually hack into companies and organizations without any risk of going to jail yourself, and maybe helping them...
SS: ...because you're touching on something. I'm sorry, I'm interrupting, I'm terrible at that. I think many people try to understand why these hackers do what they do, and you always think of somebody who's after a huge gain. But I remember reading the hacker manifesto some time ago, and I think people underestimate the whole "let's just see if it can be done, am I smart enough to do this?" side of it. So is there a way to do more of what you do? Recruit the smart guys to the good side?
RM: Sure. I think it's really important to have everybody learning more about technology as part of even primary education, not necessarily learning to code. I think there's maybe a bit too much focus on coding. What would be more useful is a broader view of the interactions between technology, society and culture, and of this risk-versus-reward trade-off that comes with technology, because code is kind of the problem, or certainly part of the problem. Different researchers have looked at how to estimate the number of bugs or security vulnerabilities per line of code, and the number of lines of code being written is expanding extremely rapidly. We don't necessarily need more technology in every area; we also have a lot to learn about how to more effectively use and manage all of the incredible technologies we've already created. We've only begun to scratch the surface in most cases...
SS: ...we are not worried about what's going to happen 10 years from now, we are worried about what you can do today.
RM: Sure! I'm very bullish on the potential of technology to really improve human societies and quality of life, but I'm also very underwhelmed by how far we still are from exploiting that potential, just with what we've already got today.
SS: Relatively recently I was at Web Summit, and Christopher Wylie was there, the Cambridge Analytica whistleblower. He gave a flaming, glowing presentation about the need for regulation of the data profession, the best of the whole conference, and there were 70 thousand people there. He says: all the other kinds of jobs put a lot of requirements on how you're supposed to exercise your skills and your profession, and we haven't done anything like that yet with the data profession, which is what's really driving our society, in health, traffic, infrastructure and everything else. So I think what you're saying is: help techies to understand the social drivers and business drivers, and understand that this is not just about some arrogant CEO or chairman of the board who should be taught some new skills; it's about understanding why they do what they do and how you can help them do the right thing. Going back to regulation of the data profession, you said software is made by humans and it will always have bugs. What are the consequences?
RM: The consequence is that there will be risk. A bug that affects a functional requirement is usually going to get fixed, but a bug that is a security hole behind the scenes, one that doesn't actually prevent the typical user from using the software to do what they wanted to do with it, to fulfill a business function, order a plane ticket, send a message to a friend and so on... If the software does what it says on the tin, then in most cases there is no strong financial incentive for a business to fix it, and so we leave the door open to exploitation by someone who may be able to extract additional value for themselves. In some cases that won't affect the user, but it may affect society at large, if that machine becomes one of thousands of machines. Maybe it's a smart coffee maker connected to the user's smart-home network, which is connected to the internet. The user can perhaps still make their coffee, like a thousand or a million other users who are all still making their coffee, but maybe those machines are being rented out to attack the website of some political activist group, by someone who doesn't like the message of that site and has rented a million hacked coffee machines to take it down. That's a fictitious example, but certainly possible, and we've already seen botnets built from these types of vulnerable IoT devices. So the challenge, again, is that when it's an economic externality, when the user can still make their coffee, still log in, send the message, get their email and so on: how do we put incentives in place to address the greater risk that all of these small holes pose to society at large, given our reliance on digital infrastructure? How do we control that?
SS: You were originally a hacker; now you are maybe more of a kind of social developer. I'd like you to try to give people a picture of what you do when you're a hacker. You try to find vulnerabilities. The image people have is that you wear your black hoodie, you sit in front of some keyboard, and then the magic happens.
RM: I am actually wearing a black hoodie right now. Full disclosure.
SS: I know, very cool.
RM: I think hacking is about the mindset of wanting to explore technology: being excited about technology and wanting to explore systems to understand how they work, the function, the business function or the use case they're intended to fulfill, and then look at all of the other things that could be possible with that system. So anyone who likes puzzles or challenging problem-solving scenarios, approached in a systematic way, could potentially be good at this. It's even better if people have a development background, but these days it's becoming easier and easier to do so-called hacking. I used to do demos where there would be a lot of command-line stuff and black windows and code and so on, and now I d...
SS: Fast typing...hehehehe
RM: Sure, yeah. Now I do demos where I actually show people how you can use search engines like Google and Bing to find leaked information from companies they've heard of, maybe even companies in Norway and other countries, and show them how easy it is to quote-unquote hack now. You can Google the right things and you will find usernames and passwords, and you will find personal data that you can use to access accounts or create fraudulent credit cards, these types of things. So we're moving in a direction where this is getting easier and easier. But I'd like to say a little bit about some of the stuff that's going on locally in Oslo, if that's all right.
SS: That would be great.
RM: Something that has been really important to me is involvement in the local information security community here in Oslo, since I moved to the city three years ago. One of the things I've been doing is a lot of meetups, like OWASP and Security Researchers.
SS: What is the first thing you said, OWASP?
RM: It's the Open Web Application Security Project. They have a Norwegian chapter with fairly regular meetups that can be found on meetup.com. There is also a meetup called Security Researchers that happens a couple of times a year. And there is one that I started a few years ago with some other members of the local InfoSec community, called OsloSec. You can find that as well on meetup.com; it happens on the third Thursday of every month. And we have a Slack channel as well.
SS: Can one go there without being a coder?
RM: Sure, absolutely. I'm not a coder; I've never been a professional developer, so I've actually been doing different types of information security. I've written a fair amount of code over the years, but the type of code a security person writes is typically much shorter and specific to a certain problem, engagement, task or exploit, compared to what a professional developer would do. So I think it's a common misconception that all of this is very technical and very difficult. What we actually need in InfoSec are more people who come from a bit of a different background, like economics for example, so we can better understand and communicate these risk-reward trade-offs. The last thing I would like to mention is a conference that's coming up, BSides Oslo. It's part of a loosely related family of international information security conferences that are non-profit and open, and that try to provide an alternative to mainstream information security conferences, which tend to be more heavily focused on vendor talks and sales presentations and those kinds of things. So we very much want to give space to people who may want to move into a new career path, or to students who are just getting started and thinking about the next step when they're done with their education, and maybe give them a chance to present their own research at an information security conference for the first time.
SS: When is BSides?
RM: May 23rd 2019 will be the first BSides Oslo, and the website is besidesoslo.no
SS: Very cool. Ryan, you mentioned you are a Canadian?
SS: I imagine you have personal reasons for coming to Norway?
RM: That’s correct.
SS: Not purely professional. But it is often easier to ask foreigners what Norway does uniquely well. So what is your perspective? Why is it fun to do this kind of work from Oslo, or from Norway, if at all?
RM: It's an interesting question. I think Norway is quite unique in terms of technology, and specifically information security or IT security, because Norway was, I believe, the sixth country connected to what was then known as the ARPANET, the global computer network that was the precursor of today's internet, in 1971. The reason for this was that the US wanted as close to real-time data as they could get about Russian nuclear testing, and the Norwegian seismic monitoring instruments close to the Russian border could potentially detect underground testing of nuclear devices. The best way to get this data directly to US monitoring facilities was to connect Norway, or at least this seismic monitoring organisation, to the ARPANET. So Norway was one of the first countries in the world on the internet, and it's been a very rapid and interesting ride since then. Today Norway is one of the top countries in the world for the adoption of digital payments, for example, and folks are very bullish on all sorts of new technologies and the upside that comes with them. But at the same time, Norway is a very trusting society, and so we really struggle to see the downside and to manage the cost-benefit trade-offs that come with adopting new technology. So it's exciting and challenging at the same time.
SS: We've run out of time, but I still want to ask you one question, going back to what you said earlier. What fascinates me is this idea of having some objective way of measuring risk, not just saying that there will always be some risk. Can you give us a couple of sentences on how to start thinking about that?
RM: I don't try to measure risk objectively, because it comes down to values. I do try to measure quantitatively; when we say measure, we mean a quantitative reduction in uncertainty. But to say "objective" when we are talking about values in this way would imply that every organisation and every person has the same values when it comes to things like privacy, or the reliability they require from certain infrastructure, and so on. That is subjective, because we humans each view the world through our own lens. So I would say the focus should definitely be on quantitative approaches to understanding and managing risk, because you can't answer questions like "how much risk do we have?", "how much should we invest in reducing that risk?", "where should we invest it?" without a quantitative approach. But that approach is still going to be somewhat subjective, so the challenge is to agree on a shared approach that is acceptable, but still subjective.
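[Editor's note: to make "quantitative but subjective" concrete, here is a minimal Monte Carlo sketch in the spirit of the approach described: subjective probability and impact estimates go in, and a simulated annual loss distribution comes out. The scenario names and all numbers are purely illustrative assumptions, not benchmarks.]

```python
import random
import statistics

def simulate_annual_loss(scenarios, trials=100_000, seed=1):
    """Monte Carlo sketch of annual cyber loss from subjective estimates.

    Each scenario is (annual_probability, low_impact, high_impact); when an
    event occurs in a simulated year, its impact is drawn uniformly from the
    estimator's range. (Practitioners often prefer a lognormal draw.)
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        total = 0.0
        for prob, low, high in scenarios:
            if rng.random() < prob:            # does this event occur this simulated year?
                total += rng.uniform(low, high)
        losses.append(total)
    losses.sort()
    return {
        "expected_loss": statistics.fmean(losses),
        "p95_loss": losses[int(0.95 * trials)],  # a "bad year" tail estimate
    }

# Hypothetical scenarios: (probability per year, low impact, high impact) in NOK
scenarios = [
    (0.10, 500_000, 5_000_000),     # ransomware outage
    (0.30, 50_000, 500_000),        # phishing-driven fraud
    (0.02, 2_000_000, 20_000_000),  # major data breach
]
print(simulate_annual_loss(scenarios))
```

The subjectivity lives entirely in the input estimates; the simulation itself is mechanical, which is why it supports questions like "how much should we invest in reducing this risk?" once a shared set of estimates is agreed.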
SS: You mentioned these conferences and some meetups, but is there a book or a news channel of some sort that you would like to direct people to?
RM: For business leaders, or people who are interested in thinking about cybersecurity risk at a strategic level and know nothing about coding or technical stuff, I think the book How to Measure Anything in Cybersecurity Risk is a fantastic starting point. If someone is looking more at the technical side, there are just so many resources out there. That's one of the great things about information security as a field: the barrier to entry is low, and access to really great information and research is easy. So I'll just say: go on YouTube and take a look at conference videos from some of the popular conferences like Black Hat, DEF CON, ShmooCon, any of the BSides conferences, or DerbyCon. There are so many. Find the topic within information security that excites you, or a speaker who really gets you excited and inspired, and then go down that rabbit hole and see where it takes you.
SS: See where you go. What would you like people to remember from our conversation if there is one idea?
RM: I can't remember who said this first, but I think security is too important to be left to the security people. We all can and should do our part, and for most people, all that means is basically being interested, that's the starting point, and taking the time to ask questions when the opportunity arises.
SS: Excellent. Ryan Matterson from Nagarro. Thank you so much for coming and inspiring us to take a more active stance in Information Security.
RM: Thank you Silvija, my pleasure.
SS: Thank you for listening.
What do you work with?
I lead a team of information security consultants and ethical hackers at Nagarro. We help our clients to better understand and manage the information security risks associated with investing in and using technology.
What are the central concepts in your tech?
Security is not a binary state; there is no such thing as secure or insecure in an absolute sense. Security is a means to manage risk.
Why is it exciting?
Humans are adopting new technology at an incredibly fast pace. New technology comes with new vulnerabilities so the stakes are higher than ever.
What do you think are the relevant controversies?
The ethical challenges of surveillance vs. privacy, regulation vs. cyber pollution and the sale of so-called cyber weapons.
Your own favourite example?
Although I have found and responsibly reported a couple of vulnerabilities in the last year, I don't do as much coding or hacking as I used to. These days I'm more focused on connecting and supporting others in the local community through initiatives like OsloSec and BSides Oslo.
Your other favourite examples, internationally and nationally?
I think the Security BSides movement is fantastic.
How do you explain your tech?
For the average citizen, infosec done right is a way to make the digital environments we inhabit safer and more productive. For businesses, it is a way for them to measure and manage the risk associated with using technology and to be socially responsible.
What do we do particularly well in Norway or in your country?
Norway has a long history of being kind to researchers who find and report bugs. Even though companies sometimes react badly, public opinion and the law have tended to be on the researchers' side. There is a downside to this, but it is overall a good thing.