Soulful CXO Podcast

Top 3 Cyber Roadmaps for 2024-2034 | A Conversation with Dr. Georgianna "George" Shea | The Soulful CXO Podcast with Dr. Rebecca Wynn

Episode Summary

20 Years of Defending U.S. Prosperity and Security and Spearheading Cyber Initiatives: Dr. Georgianna "George" Shea's Journey

Episode Notes

Guest: Dr. Georgianna "George" Shea, Chief Technologist, Foundation for Defense of Democracies [@FDD], Center on Cyber and Technology Innovation (CCTI) and Transformative Cyber Innovation Lab (TCIL)

On LinkedIn | https://www.linkedin.com/in/drgeorgeshea

Host: Dr. Rebecca Wynn

On ITSPmagazine  👉  https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/rebecca-wynn

________________________________

This Episode’s Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

________________________________

Episode Description

In this episode of the Soulful CXO, Dr. Rebecca Wynn welcomes Dr. Georgianna "George" Shea, Chief Technologist at the FDD Center on Cyber and Technology Innovation and the Transformative Cyber Innovation Lab, who discusses her role in identifying cyber vulnerabilities and devising solutions for the U.S. government and private sector. She shares her extensive background spearheading cyber initiatives across government organizations, her expertise in cybersecurity testing and evaluation, and detailed predictions for the next 10 years in cyber.

________________________________

Resources

NIST Artificial Intelligence Risk Management Framework (AI RMF 1.0): https://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.100-1.pdf

The Operational Resilience Framework: https://www.grf.org/orf

The significance of quantum computing: https://www.fdd.org/in_the_news/2023/08/07/the-significance-of-quantum-computing/

Stakeholders see opportunities in CISA secure software principles, raise questions on implementation: https://insidecybersecurity.com/share/14549

________________________________

Support:

Buy Me a Coffee: https://www.buymeacoffee.com/soulfulcxo

________________________________

For more podcast stories from The Soulful CXO Podcast with Rebecca Wynn: https://www.itspmagazine.com/the-soulful-cxo-podcast

ITSPMagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

Episode Transcription

Dr. Rebecca Wynn: [00:00:00] Welcome to the Soulful CXO. I'm your host, Dr. Rebecca Wynn. We are pleased to have with us today Dr. Georgianna "George" Shea. George is the Chief Technologist at the FDD Center on Cyber and Technology Innovation and the Transformative Cyber Innovation Lab. In that role, she identifies cyber vulnerabilities in the U.S. government and private sector, devising pilot projects to demonstrate feasible technology and non-tech solutions that, if scaled, could move the needle to defend U.S. prosperity, security, and innovation. Before joining FDD, she spent 20 years spearheading cyber initiatives throughout the Department of Defense and other government organizations.

Most recently, she served as a subject matter expert and consultant to the Office of the Secretary of Defense, where she led multiple efforts to improve cyber resiliency and advance the practice of cybersecurity testing and evaluation by providing deep [00:01:00] cyber resilience analysis, guidance, and engagements with U.S. Army, Navy, Marine Corps, Air Force, and Space Force programs within the acquisition process. Before working on cybersecurity test and evaluation, she specialized in cyber operations, capacity development, implementation, and execution for the Department of Defense, the Department of Justice, and the Department of Energy.

Additionally, she serves as a working group member for cyber-physical systems resilience under the President's Council of Advisors on Science and Technology. George, it's so great to see you again. Welcome to the show. 

Dr. Georgianna "George" Shea: Great. Thank you for having me. I'm excited to be here. 

Dr. Rebecca Wynn: Your background is just amazing and diverse and across many different government areas.

How did you get from going to college to getting this fabulous role where you're such an impactful subject matter expert? [00:02:00]

Dr. Georgianna "George" Shea: I have to start before I went to college, because out of high school I enlisted in the Army, and I think that was a big step in the direction of where I ended up today.

I joined the Army and got out, and that set me up: I had a secret clearance at the time, plus I had experience, plus I had the drive for national security. I'd worked with some communication security systems. I got out of the Army, and then I went to college, where I was focusing on math.

But I didn't really know what I wanted to do with math, so I ended up switching from majoring in math to majoring in computer science. As I was graduating with my degree in computer science and a minor in math, and having the military experience, I became one of the first [00:03:00] employees of the PRCERT, the Pacific Regional Computer Emergency Response Team for the Army out in Hawaii, which was one of the first times we were standing up protection for critical infrastructure.

So they were looking for an initial team of a mathematician, an electrical engineer, people who knew something about computers, because keep in mind, way back then there weren't cybersecurity certifications. It wasn't really a big thing, and people didn't really know about it.

I had to explain what my job was all the time. So I went from the Army to college, and then back to supporting the Army. 

Dr. Rebecca Wynn: Thank you for your service. I definitely appreciate that. You've written extensively, especially this year but in prior years too, about how we can think better about protecting companies, as well as government, from insider threat.

We keep seeing that come up again and again in data [00:04:00] breaches. Not only is it putting proprietary information at risk, it's putting client information at risk, employees' data at risk, and obviously consumers' data at risk. Is there a better way forward, so we don't keep repeating the same mistakes of the past and failing to protect this critical data?

Dr. Georgianna "George" Shea: Yeah, I think so. The solutions I usually see focus on the technology: the controls or triggers that are in place to identify what's going on. But I think we can incorporate the people part of it, so you really start to bridge people, processes, and technology.

It's a better approach to both preventing insider threat and detecting insider threat. I'll give you an example. There was an incident [00:05:00] earlier this year with a young National Guardsman who became an insider threat. I've worked in military environments myself and in different SCIFs, so I'm familiar with how they do things.

If you have the right access controls, then maybe you're going to be restricted. But in this particular young man's case, it was his job to go through and look at the classified information, to handle the classified information.

So he becomes a vulnerable spot in that process. There were early indications of him copying down information and then printing things out. My thought on that: if you're going to print something out of a classified environment, inside a SCIF, inside a place with access controls, he has the access control piece.

He has the right to look at the information. But whenever you print something out: if I were to print something out in a SCIF, I'm going to need to take it to a SCIF manager and say, hey, I want to store this someplace, so they open a file and put it in a safe, because I have to physically store that paperwork [00:06:00] someplace if I've printed it. So I'm not seeing the trigger to ensure that.

Okay, it was printed; something now exists in a three-dimensional state, it's paper. So who owns it? Where is it going to get stored? How is that going to be looked at? It's on me, the person who printed it out, to then go find the FSO, the person who handles the SCIF area, and say, hey, can you put this in a safe or in a file, or lock it up for me?

But the control piece, that chain of custody: has it been ensured, has it been validated? Because I can see that things have been printed. Now, where are they? That's where it drops off, because it enters the physical security realm and the human realm, depending on those processes, which are lacking.

So it's easy to step out of the technical realm and jump into, again, the people and processes piece. And then it falls apart, because [00:07:00] so many things get compartmentalized in design and execution. So I think if you approached it holistically, it'd be a much better way. 

Dr. Rebecca Wynn: For those people who are not from the government sector, can you explain to them what a SCIF is?

Dr. Georgianna "George" Shea: Yes. A SCIF is a Sensitive Compartmented Information Facility. It's the secure room where you can actually handle classified information. You can't take classified, secret, or top secret information down the hallway into a room where people don't have clearances. It's a facility that's been developed, built, and engineered so that you can handle, store, read, and discuss classified information.

Dr. Rebecca Wynn: Thank you for that. 

One of the things you were mentioning, from a logical control standpoint, is being able to see that someone actually did print something and then having a check and balance: not only do you have that list, but who's going to check what happened with that [00:08:00] piece of paper? Was that piece of paper properly destroyed in the secure shred bins?

For companies out there, it's similar even if you're not government, right? People are handling proprietary information. What are they doing with that proprietary information? Are they just leaving it on a desk somewhere? Are they leaving it where someone who should not be looking at that information is going to see it? And that comes to need to know. Can you explain to people how that comes into play as well? 

Dr. Georgianna "George" Shea: Yeah. It sits between the technical side and the human process side. There are the human behaviors. For example, if you have a company where employees come in from nine to five, yet you have one employee who's staying until eight o'clock, and there's no reason for that person to stay until eight o'clock, that should raise a flag.

Or, as we were just talking about with printing, there's probably an [00:09:00] average amount of printing that goes on. If there's an average amount of printing going on and you have an outlier who's printing reams and reams of paper, why? Why is that person printing more? So you can start to develop these basic human metrics that you look at, define what the baseline is, and then go through and understand what that is.
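George's baseline-and-outlier idea can be made concrete with a small sketch. This Python example is illustrative only (the user names, page counts, and threshold are invented, not from the episode); it flags anyone whose print volume sits far above the group's median baseline:

```python
from statistics import median

def print_outliers(pages_by_user, factor=10):
    """Flag users whose print volume sits far above the group's median baseline."""
    baseline = median(pages_by_user.values())
    return [user for user, pages in pages_by_user.items()
            if pages > factor * baseline]

# Invented weekly page counts; one user is printing reams and reams
weekly_pages = {"alice": 40, "bob": 55, "carol": 35, "dave": 2200}
print(print_outliers(weekly_pages))  # prints: ['dave']
```

A real insider-threat program would baseline many such human metrics over time (badge hours, printing, downloads), but the flag-the-outlier logic is the same.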

But the other aspect you were talking about is least privilege, where the access is. I mentioned security clearances before, and if you're not in the military, I'll explain this. If I have a top secret clearance and you also have a top secret clearance, that doesn't mean I get to share my work with you just because we both have a top secret clearance.

You don't have a need to know; you're not working on my project. So it's not just the level of classification, but the actual specifics. Even in a non-classified environment, if I'm working in accounting for a company, it's not top secret or [00:10:00] secret, but it's no one's business outside of my role what I'm dealing with in accounting: people's pay and other things like that.
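The need-to-know distinction George draws, that clearance level alone is not enough, can be sketched as an access check requiring both conditions. This Python sketch is hypothetical (the level names and record shapes are made up for illustration):

```python
# Hypothetical clearance levels, lowest to highest
LEVELS = {"confidential": 1, "secret": 2, "top_secret": 3}

def can_access(user, resource):
    """Grant access only when BOTH conditions hold: a sufficient clearance
    level AND membership in the resource's project (the need-to-know check)."""
    cleared = LEVELS[user["clearance"]] >= LEVELS[resource["classification"]]
    need_to_know = resource["project"] in user["projects"]
    return cleared and need_to_know

alice = {"clearance": "top_secret", "projects": {"time-machine"}}
bob = {"clearance": "top_secret", "projects": {"payroll"}}
doc = {"classification": "secret", "project": "time-machine"}

print(can_access(alice, doc))  # True: cleared AND on the project
print(can_access(bob, doc))    # False: cleared, but no need to know
```

The design point is that the two checks are independent: raising someone's clearance never widens their project access.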

Dr. Rebecca Wynn: That comes down to training and your acceptable use policy. That's one thing I think the business sector can do better: putting those guardrails up like the government does. Obviously the government has had insider threats and other breaches as well, but the ramifications are very clear about what's going to happen when you do that. Have you seen that in your working groups, when people have talked about that in both the private and public sector?

You really need to be very clear on the repercussions that are going to happen if you go outside those frames. 

Dr. Georgianna "George" Shea: I would say on the DoD side, the government side, early on I saw that a lot. If you spill classified information, if you take it, if you divulge it,

you just [00:11:00] assume you're going to go to jail. I think that's where it is. So it's very ingrained, but I feel like it's become a little less ingrained. I hate to say it's a generational thing, but there is a distinction between generations in how they were raised with information.

I would say my generation is a little more conservative about putting all our information out there, about what we're going to share. But for the younger generation, that's what you do. You put it on Twitter, you put it on Facebook, you put it on Instagram. I know what you're eating.

I know where you are. I know who your friends are. I know what you're doing. It's a way of life. So I don't think the translation across generations of what it means if you actually get caught giving out information is the same, because the attitude is, we can push it, we'll see, because that's what you do.

You just put it out there; everyone needs to know. And [00:12:00] everyone's, I don't want to say an activist, but a believer in free information in the younger generations. That's a broad brush covering a lot of people, and they're not all that way, but generally speaking, that's how it looks to me. 

Dr. Rebecca Wynn: Yeah, I think part of it is transparency. I remember working for a health company, and all of a sudden I got questioned by the CTO on why everybody couldn't see everything that was on my calendar, for example, and why people didn't have access to my drives and things like that.

I'm like, what are you talking about? The attitude was, everything that you work on, everybody should have access to, even your thought files and things like that.

And to me, that's always seemed very strange, and it can lead to more leakage of information out there. 

Dr. Georgianna "George" Shea: Yeah. I haven't seen as strong a stance in the commercial world on ramifications for divulging information. There might be some type of [00:13:00] NDA, a non-disclosure agreement, that you sign when you start working someplace, or when you work on a project or talk to particular people in collaboration, but it's a 30-page document and you sign it.

All right, that's it. It's not like there's an annual class, or a reminder, or even a demonstration of what that means. You don't really see that happening. 

Dr. Rebecca Wynn: Yeah, I agree with you. There's been a real lack of that, and training has come down to a few minutes and things like that.

And one thing that I've always tried to train people on is: what is the ramification to all of us if this happens? If we end up having data that gets leaked, what does that mean for our company? What does it mean for our brand? What does it mean to lose customer confidence in us? That can also lose contracts, which can drive how many people you can hire and how you're moving forward.

And so by not being a watcher of that and not being a good steward of that, [00:14:00] potentially you could work yourself out of a job, just because the company has to downsize because it doesn't have the sales anymore. So part of it, I think, is us as professionals, at least in my field, translating the "so what": beyond potentially doing jail time or facing major fines, because especially in our world, with PCI and the like, you can have personal ramifications

financially as well. And individuals need to know that it's not only the executives who can be held liable. By not being a good steward of that data and letting things leak, you personally can find yourself in civil trouble. 

Dr. Georgianna "George" Shea: But that also gets into some of the business continuity and resilience aspects of companies: being able to identify which key data sets are prioritized.

What is the impact if they're leaked and end up in the wrong hands? And then, as we continue down this AI road and quantum road, where things become [00:15:00] faster and more aggregated, what does that mean when you have pieces of information that maybe aren't that important individually, but together become important?

A very simple example of this, and again I'll pick on the DoD: you could have, let's just say, a classified system or classified project, and they'll outsource to a contracting company. Then, from that contracting company's requisitions, when they're asking, okay, we want to hire people with these skills, you can take those skills and deduce, oh, what are they going to be working on?

Because there's a requirement to work on, I'll just say, the flux capacitor in the time machine. Oh, if you have to have an expert who knows the flux capacitor, maybe they're working on a time machine. Again, especially with AI and the aggregation of information, those types of things are going to be something else organizations need to start looking at: not just "we can't say we're working on time machines," but the pieces of information [00:16:00] out there that also need to be handled at the level of security required to mitigate that aggregation.

Dr. Rebecca Wynn: Since you brought it up: AI has been around for a long period of time, but we're talking about the AI of today. 

Dr. Georgianna "George" Shea: Yeah. It used to be that if you talked about AI, you worked in AI. Now you're talking to someone working on their eighth-grade paper,

and yeah, "I'm using AI." So it's much more commonplace now. 

Dr. Rebecca Wynn: For me, one of the things I always think about is the amount of data going out there. I read some metrics several months ago, and I know it's probably a little higher now, saying that for commercial companies out there, anywhere between eight and 12 percent of their intellectual property was getting zipped out.

What I mean by that is it's just being streamed out into some sort of generative AI program out there. And once it's out there, there's not an easy clawback.[00:17:00] People say, oh, we're going to claw it back, but there's no guarantee of that.

I think that's very dangerous. I use it to double check certain things too. 

What are your viewpoints on that? 

Dr. Georgianna "George" Shea: Yeah. So AI is really interesting, and I don't think companies have gotten real smart about what it means. It's great, you can produce things really quickly, but at the same time,

everything that you're putting into the query, into the question, into the model, into ChatGPT, is now going to the internet. So if I, for example, work for Coke, and I put our secret formula for Coke in there and ask, "What would be a good addition to this formula?"

Again, I'm making this up. I just gave away the formula for Coke, and it's now in the ChatGPT model. And I don't think people are understanding. Yes, your employees are putting information in there. You're just throwing stuff right out the window to the internet for it to be consumed by other people.[00:18:00]

So they have to understand again, what is important? How is it being handled? Who's allowed to handle it? What are the restrictions on that? 

Dr. Rebecca Wynn: You mentioned business continuity and things like that. It's amazing how many people I talk to, even CISOs, CIOs, and CTOs, and I'll ask them: do you have an acceptable use policy, and do you train people on acceptable use and the responsible use of generative AI in your company? Sure, we have an acceptable use policy.

And what does it say? Thou shalt use generative AI responsibly. I go, that doesn't tell people what they should be doing. 

I know we have the NIST AI framework, which is in early stages, and the EU's early framework, those kinds of things. But do you see something like MITRE ATT&CK, or something similar, coming in as a better AI framework for companies to use to protect themselves from systems trying to rip out their data, and to protect their data from being let out by [00:19:00] an insider who might not mean any harm? 

Dr. Georgianna "George" Shea: If you look at the NIST document that came out, I think in January of this year, the NIST Risk Management Framework for AI, I think it's AI 100-1, it goes over the life cycle of how AI is developed, and it starts with the design. So you look at the design, then the information model that's used, the testing that's used, the intent during the design, the test cases, and then the actual use and impact of it.

Each stage of that development process could be misunderstood by the following stage. So if you're the engineer designing an AI system, maybe you were thinking it's going to do ABC, that's the point of it. Say I have some AI system just for dogs, I don't know what it is exactly. And then it gets reused.

Now it's not being used just for dogs; it's being used for cats and people and birds. So there [00:20:00] might be some change along the development that could cause an impact. Or you might have test processes that weren't geared toward where it's going to be fielded or how it's going to be used.

It might not have been tested accurately. Same with the data model: if you didn't incorporate the right data based on how it's going to be used, or the intent, then it might be unethical, it might be misused, it might not have the impact you hope it's going to have, or it might have some other effects. In terms of frameworks, I wouldn't really call it a framework, but as the consumer you can look at efforts going on around what's called an AI BOM: an artificial intelligence bill of materials, very much like your software bill of materials or hardware bill of materials. It's about understanding what it is you're receiving, the components of it, and the history of its development, [00:21:00] so you know how it was developed, who developed it, where it came from, and what those other pieces in it are. So now you can holistically say, oh, okay.

This was the design of the initial AI thing that was supposed to do this, but it's been repurposed and now it does that. That might be a risk or an issue, or it might not be, but at least it gives you transparency into the entire purpose, where it is today, and how you incorporate that.
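The AI BOM idea George describes has no single settled schema yet; purely as a sketch, an entry might record design intent alongside current use, so a repurposing like the dogs-to-cats example above surfaces as a review flag. All field names here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class AIBOMEntry:
    """One illustrative AI bill-of-materials entry: provenance plus intent."""
    model_name: str
    developer: str
    training_data: str
    designed_for: str   # what the engineers built it to do
    deployed_for: str   # what it is actually used for today

    def repurposed(self) -> bool:
        # A mismatch between design intent and current use is a review flag,
        # in the spirit of the lifecycle stages in NIST AI 100-1.
        return self.designed_for != self.deployed_for

entry = AIBOMEntry(
    model_name="pet-classifier", developer="Acme Labs",
    training_data="dog photos only",
    designed_for="classifying dogs",
    deployed_for="classifying dogs, cats, and people",
)
print(entry.repurposed())  # prints: True
```

The value is the transparency George mentions: a consumer can compare the recorded design intent and training data against today's use before trusting the model.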

Dr. Rebecca Wynn: If you could give leaders out there three key things to look at as they move forward, what are some things they should really start considering for the business, so they can be better set up for cyber resilience and business continuity?

Dr. Georgianna "George" Shea: So first, when you ask what people should start looking at for the future: I've been working heavily on quantum, on a roadmap to quantum, and on helping ensure that commercial [00:22:00] organizations and government organizations are aware of what that means for the quantum computing development process.

If you look at the different papers being written right now, you'll find papers that say quantum computing will be here within a year, and other experts who say quantum computing will be here within 30 years. When you have such a wide spectrum of experts, a delta of 30 years, I believe the focus gets turned away from quantum.

We'll kick it down the road; we'll wait until it's important. However, the experts saying it's one to five years away are not necessarily wrong. There's a lot you need to understand with quantum. There are hybrid approaches to quantum computing, so there are different ways it's becoming more of a reality.

So I'm thinking the use of quantum is more around the five-year mark versus the [00:23:00] 30-year mark. And the government's already taken steps to identify the vulnerabilities in our current-day encryption and the need for post-quantum cryptographic algorithm development, implementation, and guidance.

So NIST has taken that on, and the guidance for the new algorithms that have already been selected should be out next year. With that, there's a timeline: NIST puts out the guidance for the new algorithms; all the different instances of crypto are identified within companies, organizations, and national security systems; and there's a timeline for when they change over to the new algorithms. That's more on the federal timeline of things. But if you're supporting the federal government, maybe as a supplier of one of their components, or you're going to feed into this,

you need to be aware of that so you're going to make those deadlines. It's interesting; I don't think the commercial side is really following it much. So to all the CTOs, CEOs, CIOs, [00:24:00] anyone handling the technology piece: I would say understand what that timeline looks like and understand how you fit into it.

I mentioned the AI BOM and the SBOM. There's also a CBOM, the cryptographic bill of materials. So within your organization: where do you have your crypto? What will need to be changed? If you're doing your equipment asset refresh, maybe you're buying equipment and you expect it to last for the next five years.

If there's a change in cryptography, then there's going to be a change in what's going to work and what's going to be acceptable. So you need to incorporate that in your planning now, and understand where that is and how you fit into that timeline with your services.
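The CBOM-style inventory George recommends can be sketched as a simple scan for quantum-vulnerable algorithms. This Python example is illustrative (the asset names and record format are invented), and it assumes the conventional guidance that RSA and elliptic-curve schemes will need replacing, while symmetric ciphers like AES mainly need longer keys:

```python
# Public-key schemes built on factoring or discrete logarithms are the ones
# a large quantum computer would break; symmetric ciphers are largely safe
# with bigger key sizes.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH"}

def flag_for_pq_migration(inventory):
    """Return assets from a CBOM-style inventory that use quantum-vulnerable
    algorithms and therefore need a post-quantum migration plan."""
    return [asset["name"] for asset in inventory
            if asset["algorithm"] in QUANTUM_VULNERABLE]

cbom = [
    {"name": "vpn-gateway", "algorithm": "RSA"},
    {"name": "code-signing", "algorithm": "ECDSA"},
    {"name": "backup-store", "algorithm": "AES-256"},  # symmetric: keep, grow keys
]
print(flag_for_pq_migration(cbom))  # prints: ['vpn-gateway', 'code-signing']
```

In practice the inventory itself is the hard part, as George notes: you have to find every instance of crypto before you can plan its changeover.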

So I would definitely say quantum is one of those things to start looking at. And I don't know a lot of people who have cybersecurity quantum experts on their staff. You hear about the shortage of cybersecurity people, and then you throw in, oh, you need to be a quantum expert too; that pool becomes a lot smaller.

I don't even [00:25:00] want to be a quantum expert, but at least understand what the horizon looks like, what the policy requirements are, what the compliance requirements are, and then what that means for your organization. So that would be one. The other, I think you mentioned, is resilience.

There's lots of different compliance that is always pushed down people's throats. You have GDPR, RMF, FISMA, HIPAA, whatever it is, whatever sector you're in, you follow those compliance pieces. And it's always going to change depending on what the next big attack is or the new reorg of the government.

I find it imperative that organizations look beyond what they have to do to comply with the law, and do what actually needs to be done. That's looking at resilience, because you can go through and meet all the compliance pieces, check all the boxes, put in the controls, and still have a [00:26:00] vulnerable system.

You still have the insider threat; that's always the trump card that gets past all the controls. Start preparing for and understanding what working through a compromise looks like. So if you do have an insider threat, and they're doing something to give away your crown jewels, how do you continue to work through that?

How do you continue to meet the mission? If you have an external attack, how do you work through that? A ransomware attack, how do you continue to work through that? You're not just going to stop and say, oh shucks, they got through our controls, they got through our security,

we're doomed. No, you want to develop your security system so that you plan on being attacked, you plan on being compromised, and you can plan on continuing business through that. And I will make a plug for the GRF, the Global Resilience Federation. They have an organization, the BRC, another acronym, the Business Resilience [00:27:00] Council, who put together yet another acronym, the ORF, the Operational Resilience Framework, which takes resilience planning to, I think, another level. This became very evident during COVID, when people saw the issues with supply chains. When people talk about business continuity and resilience, I think they only look internally: what do we have to do internally? But the ORF, the Operational Resilience Framework, focuses not just on what you have to do internally, but on who you are depending on, and who is depending on you. What is the minimal viable product or minimal viable service that you have to be able to deliver? And you can get into the technical side, back to your CISO, CTO, CIO, and what their plans are for data backup and recovery.

If you have some type of attack or event, you don't necessarily have to recover everything right away if it's very low priority. Maybe that can wait [00:28:00] a week or a month, maybe two months. But maybe you have higher-priority things that have to be stood up right away: a much smaller

subset of data, a subset of processes, that will ensure you're meeting your minimal viable service and product for you and your customers, so that they can continue on with their mission as well. We've really moved to a much more connected society, through data, through operations, through dependencies, and you have to understand what that is and plan around it. Because, you know, I think in the old days it was, okay, I'm taking care of my stuff.

And then you find out, oh, okay, everything in my shop is good, but I can't access this widget because that company had a ransomware attack, so now I'm shut down. You need to understand that was a dependency you have. So how are you going to work around that? Maybe you stockpile those widgets

if you're not going to be able to get [00:29:00] them, or you have a backup supplier that you can also get those widgets from. And then the same thing going forward with who's depending on you.

Dr. Rebecca Wynn: George, our time has totally flown by. What is the best way for our audience to get ahold of you for advisory services and speaking engagements, and to learn more about your research and your organization? 

Dr. Georgianna "George" Shea: Well, they can go to FDD.org, Foxtrot Delta Delta dot org. I have a page on there; you can search by my name, and it has all of my products, my analysis, the projects I'm working on, and my bio. And then I'm also on LinkedIn.

Dr. Rebecca Wynn: Thank you again for being on the show. You are a soulful CXO. 

Dr. Georgianna "George" Shea: Well, thank you for having me. It's been a real pleasure. Thank you.