Soulful CXO Podcast

Master Class: Learn Proactive Incident Response Techniques | A Conversation with Sarah Armstrong-Smith | The Soulful CXO Podcast with Dr. Rebecca Wynn

Episode Summary

In this episode, you will learn proactive incident response techniques, crisis management, and cybersecurity secrets. Don't miss this engaging conversation filled with valuable insights and practical advice for enhancing security strategies.

Episode Notes

Guest: Sarah Armstrong-Smith, Chief Security Advisor for Microsoft [@Microsoft]

On LinkedIn | https://linkedin.com/in/sarah-armstrong-smith

On Twitter | https://twitter.com/SarahASmith75

Host: Dr. Rebecca Wynn

On ITSPmagazine  👉  https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/rebecca-wynn

________________________________

This Episode’s Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

________________________________

Episode Description

In this episode of the Soulful CXO, Dr. Rebecca Wynn welcomes Sarah Armstrong-Smith, Microsoft's Chief Security Advisor. Sarah shares her extensive experience in enhancing security strategies and capabilities for major organizations. Her impressive background includes leadership roles in business resilience, crisis management, and cybersecurity at renowned companies such as Microsoft, the London Stock Exchange Group, Fujitsu, EY, and AXA. She has been on the front line of major incidents, including IT failures, data breaches, and fraud. We delve into taking a holistic approach to business continuity, crisis management, and incident response, and explore the importance of prioritizing people impact, critical actions, decision-making processes during incidents, the shared responsibility model in cloud services, and the need for organizations to remain accountable for their data and user activities. This is a must-listen Master Class that will help leaders enhance their organization's security strategies through lessons from real-world incidents.

________________________________

Resources

Effective Crisis Management: A Robust A-Z Guide for Demonstrating Resilience by Utilizing Best Practices, Case Studies, and Experiences: https://www.amazon.com/Effective-Crisis-Management-Demonstrating-Experiences/dp/9355512716

Understand the Cyber Attacker Mindset: Build a Strategic Security Programme to Counteract Threats: https://www.amazon.com/Understand-Cyber-Attacker-Mindset-Counteract/dp/1398614289

________________________________

Support:

Buy Me a Coffee: https://www.buymeacoffee.com/soulfulcxo

________________________________

For more podcast stories from The Soulful CXO Podcast With Rebecca Wynn: https://www.itspmagazine.com/the-soulful-cxo-podcast

ITSPmagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

Episode Transcription

Master Class: Learn Proactive Incident Response Techniques | A Conversation with Sarah Armstrong-Smith | The Soulful CXO Podcast with Dr. Rebecca Wynn

Dr. Rebecca Wynn: [00:00:00] Welcome to the Soulful CXO. I'm your host, Dr. Rebecca Wynn. We are pleased to have with us today Sarah Armstrong-Smith. Sarah is the Chief Security Advisor for Microsoft, working with strategic and major customers across Europe to help them enhance their security strategies and capabilities. Prior roles included being the Group Head of Business Resilience and Crisis Management at the London Stock Exchange Group, the Head of Continuity and Resilience, Enterprise and Cybersecurity, at Fujitsu, and leadership roles at EY and AXA.

She has been on the front line of many major incidents, including IT failures, data breaches, and fraud. She's a Fellow of the British Computer Society, recognized as one of the most influential and inspiring women in technology, and a highly sought-after speaker. She writes numerous articles and has written two books: Effective Crisis Management: A Robust A Through Z [00:01:00] Guide for Demonstrating Resilience by Utilizing Best Practices, Case Studies, and Experiences.

And her latest book, Understand the Cyber Attacker Mindset: Build a Strategic Security Programme to Counteract Threats. Sarah, it's great seeing you again. Welcome to the show. 

Sarah Armstrong-Smith: Oh, thank you, Rebecca, for inviting me. 

Dr. Rebecca Wynn: You have a vast background. How in the world did you even get started in cybersecurity? What did that journey look like?

Did you study cybersecurity in college? 

Sarah Armstrong-Smith: No, not at all. Um, I really didn't know what I wanted to do when I was at school. And like, when I came out, I didn't go to university. A lot of people think, Oh, you must've just had this career trajectory that went through university and you had it all planned out.

So my very first job actually was working in a fraud department. So I was actually working on fraud on fuel cards, funnily enough. And it really just got me interested in insider threats. So that's how I started on the people side of security: why people do what they do. But it didn't pay the bills, so I ended up working for a water utility company, [00:02:00] and it just so happened to be the late nineties, and I was invited to work on this new IT program, and that just happened to be the millennium bug. My role in that job was looking at what tests had to be done on the stroke of midnight, so all the recoding and redesigning of IT systems.

And I could just ask: what if? Yes, but what if the call center doesn't work? What if you can't get people into the office? What if the IT systems fail? And it's a water utility, so what if we can't get water to people? I just kept asking this what if, what if, what if, and I didn't realize at the time that that was business continuity.

It just felt like common sense. When someone said, this is a subject, I thought, this is it, this is my career. And that's kind of where I say I started my career. So even though I'd had a few roles beforehand, this is my kind of line in the sand, if you like. And so my role became about how do I protect the call center?

How do I keep it running? Thinking about all the networks, all the [00:03:00] services. And the funny thing was, on the stroke of midnight, nothing happened. It's like, well, was that a waste of money? Was that good planning? And I went on to look at putting plans in place for engineering and laboratories, but I was always drawn back to the IT.

I was always interested in that. So in 2001, I joined AXA, a financial services company, and my role pivoted from business continuity into disaster recovery. And within a few months of me joining, 9/11 happened. So that all changed again. We'd gone from backup tapes to data replication, and things just started speeding up.

Um, and I was headhunted in 2005 into EY. My very, very first day on the job, believe it or not, was the 7/7 London bombings. And so I'd then pivoted from business continuity and disaster recovery into crisis management, and all of a sudden, exec-level exercises and all of these things happened.

And then in 2007, I joined Fujitsu, and that was my kind of entry point [00:04:00] into cybersecurity, in essence. Fujitsu, they're a large service integrator, lots of hosting contracts, and sometimes companies would outsource their entire IT. My role was to really think about what are we here to protect?

So bringing in all of that business continuity, disaster recovery, and crisis management, and bringing it into the cybersecurity world. So I always describe myself as working on the business side of cyber, for that very purpose. It's really understanding the people, understanding the crown jewels, and understanding all of that together.

As I said, I was at Fujitsu for 12 years, and that meant multiple incidents, multiple failures, multiple cyber attacks. And then I actually joined Microsoft in 2020. And true to form, with these major incidents that keep following me around everywhere, as I joined Microsoft, we were just going into lockdown.

It's a global pandemic. So I spent the first 18 months of my career there working online on Teams, [00:05:00] but it was so interesting. I work across multiple sectors and multiple countries across Europe. So I was kind of working side by side with them as they were pivoting to hybrid working and mass acceleration to the cloud.

And all the way through, the attackers were absolutely relentless. And as we've come out the other side of the pandemic, we then have the Russian invasion of Ukraine. We've seen huge, large-scale cyber attacks. And it's just really pivoted from there. So as I sort of say, I started over 20 years ago, and it's just been this continuous journey, if you like.

Dr. Rebecca Wynn: It's interesting when we talk about incident response, disaster recovery, and business continuity in companies: they always want to treat them separately. But I think they all tie in together. And it sounds like you think that way too, or know that way too. How do you recommend companies take a look at that?

Cause without business continuity, there is no true enterprise risk management, and there's no true, I would say, long-term recovery from incidents and disaster recovery. How should [00:06:00] they be viewing that more holistically, to get out of them quicker and actually be able to save your company in a lot of instances?

Sarah Armstrong-Smith: Oh, I think the real key thing is that people seem surprised when we have major incidents. So I look at, I remember 9/11, people talk about black swan events: these events are so big, of such magnitude, they could never have been predicted. No, I don't believe that. I don't believe in black swan events whatsoever.

And actually, one of the things is that even when I was a kid, I started reading public inquiry reports, and the first one I got interested in, when I was 12 years old, was the public inquiry as a result of Piper Alpha. Very similar to Deepwater Horizon: a big oil rig in the North Sea that exploded and killed people.

Um, the public inquiry report was very interesting to me in that, you know, these major incidents don't just happen from nowhere. There's always a story, there's always a trajectory, [00:07:00] missed warning signs, reports that have gone wrong. It all kind of hinges on the culture of the organization. And it kind of just really dawned on me.

With regards to the fact of: how bad does it have to get before we take proactive action? And you mentioned my first book, Effective Crisis Management; it's really looking at that. So I picked some of the worst events in the last 20 years: 9/11, Colonial Pipeline, Deepwater Horizon, and even the recent pandemic itself.

This is not the first pandemic we've had; we had one 10 years ago, and that was the H1N1 pandemic. But we're always surprised. We're always surprised. Questions get asked: how did we get here? And so it's really important to think about some of these major incidents, really learn the lessons, and ask what we have to do to make a proactive change.

And that's the real bit that I'm truly trying to emphasize to people. Because if you keep repeating the same problems, you're just going to have the same [00:08:00] mistakes, the same incidents, but arguably they're going to escalate. They're just going to get bigger and bigger. Unless we break the cycle, we're going to be here a few years down the road.

It's going to be another industrial accident, another cyber incident, another huge-scale incident, where we're going to be back to square one again, asking the same questions. 

Dr. Rebecca Wynn: One of the things I like to do here in the United States is look at the congressional hearings and read through those.

And a lot of people don't, but I read the ones that are from the House and from the Senate, because they usually have to do a very deep dive. And one of the common threads I see is where, you know, people didn't want to believe somebody, or they were afraid to speak up and say something because of, you know, the whole victim blaming.

I tell people, even though people want to say the root cause is a human, I like to look and say, why didn't that human speak up? Why was there not a chain of command? Were they fearful that they'd lose their job if they said something? Was it basically, you know, beaten out of them, for lack of a better term, [00:09:00] that speaking up doesn't get rewarded?

And I think you should reward people for speaking up, and people should be freer to be able to say, guys, I see this, it just doesn't seem right. Cause I think your gut really tells you, more times than not, that something's off, but people don't have those lines to be able to say that, or feel like they can say that.

Do you see that as well too? 

Sarah Armstrong-Smith: Absolutely. I think there's two parts to that: there's victim blaming, and scapegoating as well. So I get really wound up when, even in cybersecurity, people say the users are the weakest link. It's very easy to blame people, whether it's blaming the person that clicked on the link, or we have a major incident and let's throw the CISO under the bus as a scapegoat.

Actually, by doing that, it doesn't change anything. And I think that's a real challenge: we've got to really have a good, hard look at ourselves and have introspection. And introspection is not a very easy thing to do. But [00:10:00] actually, you know, when we think about it, if someone did make a mistake, they clicked a link, whatever they did, it's because the process does not work or the technology behind it doesn't work.

So in this day and age, we have to assume compromise. We have to assume failure. We have to assume that, with the best will in the world, all the training and all of these things, people are going to make mistakes. Some people just don't care, you know, so expecting them to care really then comes down to how good or not your organizational culture is.

And I think a really great example is what you just alluded to, Rebecca: the fact that, you know, what happens if something doesn't feel right? Someone's applying pressure, I've noticed something that's wrong, I've done an audit and there's all these reds and ambers, but it's a repeat of the same one I did six months ago, a year ago. What is your attitude

when people give you that information? Do you welcome it [00:11:00] with open arms and say, oh, thank you, this is a real opportunity to learn, a real opportunity to make change? Or do you just kind of go, oh, well, we'll just sweep it under the carpet with everything else and hope no one notices? Because people start to see that pattern.

And then, why would I hold my hand up? Why am I going to put my head above the parapet, only to be shot down or only to be ignored? And the problem then is not only the fact that these things scale into a major incident, but also that you're more likely to have a whistleblower. And it gets to a point where I'm so fed up.

I keep saying the same thing, I keep getting ignored, I know these problems are out there, and I don't feel like anyone's listening to me. So who am I going to tell? Now, we've seen really good examples of this historically, particularly in the military and various different areas. And that's incredibly dangerous, when they take it to social media, a journalist, even a regulator.

And I think this is the problem, and it really does stem from, as I say, the culture of the [00:12:00] organization, the culture as it stands today. What are we going to do about it? What's the culture we want, ultimately? 

Dr. Rebecca Wynn: Yeah, I always find that interesting too. And a lot of times, those who do speak up, to be honest, are the first ones that are on the layoff list.

And I tell you, those are the people that you need to keep. Those are the people who, to me, you know, you always have to look at their attitude, but most of it is they really want the greater good of the company. And those are the type of people who you want to see things, right? They're really the true analysts, right?

When they go ahead and see things along those lines. I know I was at a company doing a six-month consulting stint with them as a vCISO, as they tried to determine what they really wanted to do with their company. And basically almost my first day, they ended up doing that annual tabletop exercise, which I'm totally against: the annual, you know, check-the-box tabletop.

But all of a sudden, they didn't have any playbooks. They just went ahead and selected a bunch of people who basically had [00:13:00] titles to be their incident response team. And then all of a sudden they came up with these scenarios, like, we've got a ransomware attack, we have this, we do that, and all that kind of stuff.

And then it's, okay, Rebecca, you're the CISO, now you lead the incident response team. And I'm like, okay, well, who here is in charge of networking? Okay, what's happening on the network, and stuff like that? Then they got mad at me. And I'm like, your people in the room don't even have a playbook. You're not recording this.

They don't know who controls the architecture. What does the architecture look like? Is it even right? You know, where should we go first? My whole thing is that the first thing you might have to do is unplug, but I need to know who can pull the plug, right? So how do you suggest that people actually go about creating an incident response team, so that you create the team responsibly and set them up for success?

Sarah Armstrong-Smith: I think the main thing to think about is that some people have a plan, and they're so rigid to that plan. It starts at A and goes through to Z, and when it goes off [00:14:00] piste, they don't know what to do. They're so stuck, because they're in this rigid plan and it must work to this plan. And I think that's one of the dangers: having a really regimented thing that doesn't really help anybody.

So it has to be a guideline, first and foremost; that's the real important thing. So the first thing I would say is you have to have people impact above everything else. And I've looked at so many business continuity, crisis management, and incident response plans where the impact to people is like halfway down.

You look at the technical impact, the operational impact, the financial impact. Now, inadvertently, that says we value profit over people. And whenever I say that to someone, it's like, oh, absolutely no way, there's no way that we put profit above people. But when you put financial impact above people impact, it really does change the mindset.

So first and foremost, it doesn't matter what the incident is: a cyber attack, fire, flood, your power's gone out, whatever the case may [00:15:00] be. I need to know what the people impact is, whether that's our employees, our customers, our partners, whoever the case may be. So it's really important to have that mindset first and foremost.

The second thing is to really understand the critical actions that must happen. What I say to people is: every action has a chain reaction. And there's a big difference between no action and inaction. Inaction is doing nothing, and no action is a deliberate choice to do nothing. So whatever that is, whatever that action is that needs to happen:

Do you have the right information available to make that decision? If you don't, how are you going to get it? Because ultimately, you can only make a decision based on what you know at the time. Now, it may be that in hindsight, when you look back and do a post-incident review, that was absolutely the worst decision you could have made.

But again, there's a lot [00:16:00] of lessons to be learned there about who has the information, and are they aware of the information? So I'll give you a good example: a ransomware attack. You know, one of the key considerations: we're not going to pay, we've got backup tapes, we're going to go to backup tapes.

But just saying we're going to go to backup tapes is only part of the equation. I need to know: what backups have I got? What do they cover? When was the last time they were tested? Were they successful? Do they enable me to do a full rebuild? Because a lot of backup tapes will just be the odd file or the odd system.

Can I recover end to end? Are those backups protected? Are they part of the ransom? A lot of these what-if questions, I don't want to be figuring out when I have a ransomware attack, the threat actor is all over me, the media's on me, the regulator's on me, and I've got angry customers.

So I need to have done my due diligence in advance. I need to kind [00:17:00] of preempt what you're going to need to know, who needs to know, and who is making the decision, ultimately. So is this a hierarchical decision? Does it have to go all the way up to the top, someone makes a decision, and it goes all the way back down again?

Are we making it by committee, and therefore who needs to be involved in that? Now, as you said as well, if I take someone out of the equation, are they on holiday, are they on leave, are they sick? Who's empowered to make that subsequent decision, and do they understand that? So when we think about those decisions, we're not just thinking about the first decision.

We need to know what the knock-on effect is. So if we're about to cause a chain reaction, not only do I need to know the second action, I also need to know the third and fourth, and that's really important. And there may be some things that cannot start until this other thing has finished. And inadvertently, what we're doing there, if I understand this, is my due diligence.

I need to understand the critical path, and that's really key in decision-making. So when [00:18:00] I'm thinking about, you know, we're on the clock: if I'm going to justify shutting the entirety of internet banking down, for example, the minute I pull the plug, everyone's going to know about it. I'm going to have social media.

I'm going to have people who are angry. The regulator is going to know. It's going to be picked up by the media. So I then have to have all of my ducks in a row: am I prepared? Have I got all of that in play? Because once you've made the decision, it's very hard to take it back. You've started on this trajectory, you've started on this plan.

And so the actual pre-planning, as you were sort of saying, Rebecca, is having those exercises, taking people out of their comfort zone, and kind of throwing some of these weird and wonderful things at them. Because what I can definitely tell you from experience is that whatever you've planned for is never the incident that actually happens.

It's going to go so off piste that you're probably going to be relying on your plan B, C, whatever the case may be. [00:19:00] And so we're going to have to think about all of those different things combined and make sure that people are confident in their ability, and that they're empowered to make the decisions, ask the questions, and not just go to this rigid plan, as I said.

Dr. Rebecca Wynn: Yeah, we see that quite a bit. And the one thing that always gets me with, I'd say, a lot of startups and middle-sized companies is not even knowing your security architecture. It's hard to protect what you don't know. And I tell people, if you don't know your architecture, what ports, protocols, and services are open, what should be allowed in your network and outside your network, it's really hard to even start on that plan. And I see people put that off.

And I see people put that off. Regularly, is that what you see too? What would be like the top three things that you think companies really need to have in place so you can be better prepared for this incident to even know who to even call, right? 

Sarah Armstrong-Smith: Exactly. And I think there are so many lessons [00:20:00] learned from other organizations who have been through this.

And I think it's easy to kind of say, oh, why would they attack me? The answer is, why wouldn't they attack you? So the first thing is to have that assumed-compromise, that assumed-failure mindset that we were talking about. The best will in the world, the technology, the outsourcing you might have, all the training.

You still have to assume a threat actor is going to be able to get in. And if they can get in, what can they do? So if I break it down to its lowest denominator, the strategy is really simple, no matter what company, what size, what sector: the strategy is stop the access in and the exit out.

So when I talk about the access in, it's really every entry point into the organization, and then really understanding the vulnerabilities of any of those access points, and then being quite open and transparent about where the priority is, where I need to have my investment.

Now, that [00:21:00] priority investment will be determined by the other thing, which is the data, the exit out: how do I think about stopping the data exfiltration? And that comes back to what we were talking about: the critical data, the sensitive data, the crown jewels. And it's really about understanding your business inside out, as well as outside in. And I kind of get frustrated when a lot of people say that the attacker understands your business better than you do. How can that even be?

This is your organization, they're your processes, it's your technology. It's all of those things. And so you really then have to have that internal reflection about where you are today, where you need to be, where those gaps are, and what you're going to do about it. Because we don't have an infinite pot of money and infinite resources, as much as we'd like to have this magic wand and buy lots more cool [00:22:00] technology.

That's not realistic. That's not how we work in the real world. But it is really important that, if we assume they can get in, you think about that center, the crown jewels, if you like: how do I work backwards? Where is it? Who's got access to it? What are the controls that I need to have? And that's where we build those layers of defense, in essence.

And you can kind of think about the Tower of London, if you like. So the crown jewels are right in the middle, but before you can get anywhere near them, you have to go through umpteen walls of defenses: there's parapets, there's high walls, low walls, you know, all of these things. And the further you get toward the core, arguably the more monitoring, the more defenses you need, because if they get to that point, they are going to steal whatever they came in for.

So there's so many opportunities to stop that access, to stop them laterally moving, gaining a foothold, doing all of those things. But [00:23:00] if I really go to the lowest level: identity and data are the two core principles. 

Dr. Rebecca Wynn: We've talked about looking at every incident you have, even if it's a power outage or something like that, and what lessons you can learn.

And one thing I notice when I look at companies, when I look back at incidents and recoveries, is that I don't see that they do a lot of those after-action reports. And I say that in a mindful way: they might have one person who's like, hey, the power was off for three hours.

It came back on, whatever. And basically it's said just that way, instead of looking at, you know, how did we react as a team? Did we have the right people? Did the backups, who were there because someone was on PTO, know what to do in a timely manner? And really using them as a training session. And why do you think people

are constantly not doing that? Bigger companies might do that a little bit more successfully than smaller companies, but I tell people, you're missing it. Those are training you for the bigger event [00:24:00] that's going to happen at some point in time. 

Sarah Armstrong-Smith: Yeah, absolutely. I think it's more a case of wipe the brow, weren't we lucky, and move on. But what I say to people is that you need to treat a near miss as a gift. So even if the power did come back on and it wasn't as bad as we expected, we really need to treat it as if it did hit us, because this incident, whatever it was, has hit you in some way. Now, if we were going to take that to the next degree, let's use it in a brilliant exercise.

Let's take it and say, well, what if it didn't? What if this had gone on for one day, three days, a week? And if we think back to before the global pandemic, we would never have thought the global pandemic was going to go on for a couple of years. And so I don't think anyone's pandemic plan said, well, let's see how this is going to pan out two years from now; we always assume we're going to get things back up and running as quickly as possible.

And I've seen so many plans, and I'm sure you have too, Rebecca, that are so simplistic. Even some of these exercises, even like your scenario with the ransomware and my example of the backup tapes: people just assume, I have backup tapes, we're going to recover, everything will be back up and running tomorrow morning.

Happy days, the end. And as we know, that is not the case at all. And so really, it is incumbent on us to learn from those that have been attacked, to really understand what went well and what went wrong. In hindsight, if they look backwards, what were the telltale signs? And that really means us all being open, all being transparent, all being willing to share.

So when we do have these things, rather than going, I hope nobody noticed, weren't we lucky, phew, and sweeping it under the carpet with the rest, actually say, do you know what? We had a near miss, guys. Let's learn from it. Let's actually get all the people around the table, not just in our own organization, but as an opportunity to bring in others from the same sector, whatever the case may be. [00:26:00] Because, you know what, guys, they might be after you; you might not be as lucky as we were.

But the other thing which I think is really important for us to get across as well is that the reason why it was a near miss could be because your plans worked, your technology did its job. And I think it's really important to celebrate that: actually, we did a really good job here, guys. Really, really good. We need to celebrate that.

So I think we need to look at it from both perspectives: what do we learn from a near miss, but also take the victory as well, if things are working as they should be.

Dr. Rebecca Wynn: Yeah, I agree with you. I think, you know, it's interesting. One of the things that I come across way too often is that, you know, we do a lot of things in the cloud.

We partner with a lot of big companies, maybe it's Microsoft or someone like that, and they're just going to handle it for us. And I tell everybody, there is a shared model. You cannot rely on the fact that, just because you are working with a lot of other companies, you're going to be okay, because those other companies

also might have a cyber breach that takes them down. And we've seen that over time, when people have relied on Facebook and things like that: they've gone down, and that's how you handled all your customers. And now what? Because now you can't get business in, because they got attacked and they had to go offline.

How should people handle that more mindfully, as people are less often running their own data centers and instead have other people managing them, but have the attitude that, you know what, someone else is going to handle it? I think that's very dangerous today as well. 

Sarah Armstrong-Smith: Well, I think the beauty, as you said, Rebecca, of the cloud is that it is that shared responsibility model.

So you're moving away from being responsible for all infrastructure, all architecture, keeping it up to date. When you move into the cloud environment, the cloud service provider or the SaaS service provider takes on a level of responsibility. So at the infrastructure level, the kind of patching goes away.

I mean, [00:28:00] patching's still required, but it is done by the service provider. And as you go further up the chain, if you get into SaaS services, obviously the application becomes the responsibility of the service provider, in terms of coding and everything else like that. That being said, and it does come back to one of the things I said about what's really important:

however, looking at those cloud services and the service agreements that you may have, there are two things to bear in mind. You're always responsible for your data, and you're always responsible for what your users are doing with that data. So it really comes back to exactly what I just said: identity and data are really core.

Now, the cloud service provider or the application service provider will no doubt have a number of inbuilt controls that you can take advantage of, but it's really incumbent on you to understand how they should be set, how high, how low. And when something goes wrong, as you sort of said, if you're getting an alert or something's not working right, [00:29:00] something looks a bit strange, again, you still have to be accountable for what your business is going to do about that.

So yes, you know, you've got huge opportunities to take advantage of tech and emerging things that are coming out, AI, these cool new models and everything else, but you still cannot transfer that risk. As I say, you're still accountable ultimately for your people and your data, and for having the right process around that.

So there's things you can take advantage of, but I think that's really core to the conversation we're having today. 

Dr. Rebecca Wynn: Well, unfortunately, our time has totally run short. I want to thank everybody for joining us today. Please go ahead and make sure you like, subscribe, and share this, and give us your comments and tell me who else you might like to have on the show.

Also, please subscribe to the Soulful CXO Insights newsletter that is out on LinkedIn. Read through the description of the show, where you can find Sarah's contact information; you'll also find links to her books as well. Sarah, thank you so much for coming on the [00:30:00] show. You're an inspiration. I love having strong, great women here in technology.

And thank you for being a role model for all of us women. 

Sarah Armstrong-Smith: It's been an absolute pleasure. Thank you, Rebecca.