Soulful CXO Podcast

Proactive Privacy in the Age of AI | A Conversation with Dr. Ann Cavoukian | The Soulful CXO Podcast with Dr. Rebecca Wynn

Episode Summary

Join us as we uncover the potential benefits and risks associated with AI and emerging technologies and discuss the potential convergence of laws to facilitate global data exchange. Don't miss out on this insightful conversation that explores the future of privacy in the digital age.

Episode Notes

Guest:  Dr. Ann Cavoukian, Executive Director of the Global Privacy and Security by Design Centre

Website | https://gpsbydesign.org/

On LinkedIn | https://www.linkedin.com/in/ann-cavoukian-ph-d-3a78809/

On Twitter | https://twitter.com/anncavoukian

Wikipedia | https://en.wikipedia.org/wiki/Ann_Cavoukian

Host: Dr. Rebecca Wynn

On ITSPmagazine  👉  https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/rebecca-wynn

________________________________

This Episode’s Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

________________________________

Episode Description

In this episode of the Soulful CXO, Dr. Rebecca Wynn welcomes Dr. Ann Cavoukian, a world-renowned privacy expert, creator of Privacy by Design (PbD), and Executive Director of the Global Privacy and Security by Design Centre. We dive into technology's impact on privacy, explore the concept of Privacy by Design, and emphasize the importance of balancing privacy, security, and data utility. We also examine the challenges posed by new technologies like AI, smart cities, and quantum computing, and the legal measures needed to ensure data privacy. Join us as we uncover the potential benefits and risks of these emerging technologies and discuss the possible convergence of laws to facilitate global data exchange. Dr. Cavoukian's expertise and passion for privacy make this episode a must-listen for anyone concerned about protecting their personal information.

________________________________

Resources

Dr. Ann Cavoukian's Books: https://www.amazon.com/Books-Ann-Cavoukian/s?rh=n%3A283155%2Cp_27%3AAnn+Cavoukian

2024 Predictions: Impact of AI on GRC: https://www.linkedin.com/pulse/2024-predictions-impact-ai-grc-dr-rebecca-wynn-the-soulful-cxo-v7pdc/

The Artificial Intelligence Act: A Landmark Regulation for AI Systems: https://www.linkedin.com/pulse/artificial-intelligence-act-landmark-regulation-ai-dr-rebecca-rgahc/

Shaping the Future of AI: Key Takeaways from The AI Safety Summit: https://www.linkedin.com/pulse/shaping-future-ai-key-takeaways-from-safety-summit-dr-rebecca-oud6f/

The Future is Now: Mastering AI Management with ISO/IEC 42001:2023 Guidelines: https://www.linkedin.com/pulse/future-now-mastering-ai-management-isoiec-420012023-dr-rebecca-zx40c/

________________________________

Support:

Buy Me a Coffee: https://www.buymeacoffee.com/soulfulcxo

________________________________

For more podcast stories from The Soulful CXO Podcast With Rebecca Wynn: https://www.itspmagazine.com/the-soulful-cxo-podcast

ITSPMagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

Episode Transcription

Proactive Privacy in the Age of AI | A Conversation with Dr. Ann Cavoukian | The Soulful CXO Podcast with Dr. Rebecca Wynn

Dr. Rebecca Wynn: [00:00:00] Welcome to the Soulful CXO, I am your host, Dr. Rebecca Wynn. We are pleased to have with us today, Dr. Ann Cavoukian.

Ann is the Executive Director of the Global Privacy & Security by Design Centre, is recognized as one of the world's top privacy experts, and previously served an unprecedented three terms as the Information & Privacy Commissioner of Ontario, Canada. She is well known as the creator of Privacy by Design, which has been translated into over 40 languages. Privacy by Design has been incorporated into the EU's General Data Protection Regulation (GDPR) and into the privacy laws and regulations of many other countries and states. Additionally, she is the author of two books, "The Privacy Payoff: How Successful Businesses Build Customer Trust" with Tyler Hamilton and "Who Knows: Safeguarding Your Privacy in a Networked World" with Don Tapscott. She has also written numerous articles and op-eds, serves on many boards, is a highly sought-after speaker and advisor, and has received numerous awards in cybersecurity, privacy, data security, technology, AI, and leadership, as well as lifetime achievement honors, including the Meritorious Service Medal for her outstanding work creating Privacy by Design and taking it global.

Dr. Ann, my mentor and friend, welcome to the show. 

Dr. Ann Cavoukian: Thank you so much, Rebecca. It's a pleasure to be here. 

Dr. Rebecca Wynn: And for those in the audience who don't know, Ann was my very first guest when I first started the show, and now she's my 50th guest.

So thank you so much for returning for our second season.

Dr. Ann Cavoukian: Oh, I look forward to it always. 

Dr. Rebecca Wynn: For the audience who might not be familiar with Privacy by Design, the framework, and its importance in the world, can you explain a little bit of the backstory and why they should really pay attention to it?

Dr. Ann Cavoukian: One of the most important things: when I became Privacy Commissioner of Ontario, Canada, from day one of the three terms I served, one of the things I noticed... I'm not a lawyer, I'm a [00:02:00] psychologist; my PhD is in psychology and law, but I didn't practice as a lawyer. And when I joined the commission, there were brilliant lawyers who wanted to apply the law anytime there was a data breach or privacy infraction.

All very important. But I wanted something in addition to that. I wanted something proactive, something that could ideally prevent the privacy harms from arising. So it took me a little while. I literally spent three nights at my kitchen table at home developing Privacy by Design. Then I took it into the office and I sold it to the lawyers, and I explained to them, this doesn't replace privacy laws.

Of course, those are very important, but I'm trying to minimize the occurrence of privacy harms and data breaches to begin with. So it all worked out. It was, you know, proactive first, and legal afterwards. Privacy by Design is all about ideally preventing the privacy-related harms from arising. It's a set of seven principles, and it tells [00:03:00] people, both in government and the private sector:

here are the things you should address in terms of avoiding privacy harms. Personal identifiers without the consent of the data subject, the individual involved? No, get rid of them. That's what's going to get you into trouble all the time. So eliminate any personal identifiers that have not been consented to by the data subjects, and then restrict your use of the data to the purposes that have been consented to by the data subjects.

So there are a number of measures like that which you can address proactively before going public with the information. And it will save you so much trouble and so much grief, because you will avoid the privacy harms. You know, people might access the data, but the data has no privacy implications. So it's a completely different matter.

That's the essence of Privacy by Design: being proactive and ideally preventing the privacy harms from arising. [00:04:00] And I should tell you, because I'm very excited about this, just a few months ago Privacy by Design was embraced as a standard by ISO, the International Organization for Standardization, which is huge.

They're based in Europe, and so many people follow ISO that when they embraced Privacy by Design and included it as an ISO standard... I mean, I get contacted regularly anyway, but now I'm getting contacts from all around the world because of the new ISO classification. So I'm delighted.

Dr. Rebecca Wynn: Oh, that's, that's awesome.

And that's a big win. Can you explain a little bit? People might not understand that it's been, what, since about 2010 that you've been on this pathway, trying to make this global and get every country to see how important it is. That's the backstory on that, right?

That you've been working on this for like 13, 14 years. 

Dr. Ann Cavoukian: A long time. You're absolutely right. And you see, it takes a long time, because people are entrenched [00:05:00] in one mentality, which is that privacy laws are essential and we apply them to correct the situation. Very important. But I'm trying to avoid a situation that needs to be corrected.

And, you know, it takes a while getting that messaging across. Once the European Union included Privacy by Design in the GDPR, the General Data Protection Regulation, a number of years ago when they put it out, it was amazing that everybody started embracing it. And I got so many calls on a regular basis,

all about how do we do this. And, you know, I always lead with encryption, which is critical. If you don't have the consent of the data subject, or you still have personal identifiers, at the very least encrypt the data so it can't just be hacked and accessed by unauthorized third parties. And there are so many out there now, Rebecca.

Data [00:06:00] brokers abound, and they just grab data and sell it to whoever's interested, with little regard for the data subjects or whether consent has been obtained.

Dr. Rebecca Wynn: And we've seen a lot of great movement in the last several months, right? We've had several meetings with consortiums and groups like that over in Europe, the United States, Canada.

People in Europe are saying, hey, we need to adopt things like an AI framework and figure out how we're going to regulate all that. But what do you see as the future of privacy as we move forward? It seems like there are going to be a lot of challenges trying to regulate that, with technology and innovation going at the speed of light. What do you see along those lines, and what words of wisdom can you give us?

Dr. Ann Cavoukian: There are many challenges, there's no question, and especially in one area: law enforcement. The police regularly contact data protection authorities and commissioners and say, look, we have to have access to this [00:07:00] data for legal matters, it's very important, etc. What I always used to say to them when I was commissioner:

If you have a case for gaining access to personal identifiers, personally identifiable data, go to the court, make your case to a judge, and then get a warrant. If you succeed in getting a warrant, then of course you will have access to the data. But short of that, no, because I don't know the extent to which you'll be using the data for purposes that are unacceptable to data subjects and maybe unrelated to the law enforcement matter at hand.

Once you have the data and it's in the hands of third parties, I mean, anything can be done with it. So that's one of my concerns. Technology, of course, is growing like crazy. And I love technology, don't get me wrong. Privacy by Design is all about win-win, I want to emphasize that. It's not privacy versus security, or privacy versus data utility.

It's privacy and: how do you do both? That's why I often mention [00:08:00] encryption and other tools like that. Because now, with the amazing new technologies coming out like AI, artificial intelligence, which is just extracting data from multiple databases, hundreds, thousands of them, without any consideration of personally identifiable data being included and being obtained without consent, then you've lost it.

So I always tell people who are contemplating using AI, AI companies, etc.: you have to look under the hood. You have to make sure that the people who are going to be implementing this understand that they cannot just do this with no regard for individuals who have not consented to the release of their data.

Before you extract the data from databases, make sure all personal identifiers are scrubbed. Once you do that, then you have enormous freedom to do whatever you want with the data. But short of that, you don't, and commissioners like myself, when I was a commissioner, will come after you. So that's why I tell all [00:09:00] businesses and governments: scrub personal data, remove the personal identifiers, before leaving it in a database that a variety of different players can gain access to.

Dr. Rebecca Wynn: It seems like we're kind of chasing the horse, because it seems like a lot of these AI models went out and grabbed some information that they really shouldn't have grabbed, and it's in their databases. So how do we work with that going forward? Because the other thing that's built in is bias,

because you didn't structure that data. So what do we do with the current data? And what do we do going forward to make that better?

Dr. Ann Cavoukian: There's no question.

It's an uphill battle, and I mean that, and I am an eternal optimist. People always wonder how I'm still doing this. They say, look, you know, privacy is dead, that ship has sailed. And I always say, forget it, get another ship. You don't just give up on privacy, which forms the foundation of our freedom.

No, this is critical, so we will find new ways of doing it, [00:10:00] but I'm not trying to pretend that there aren't enormous harms that arise. What I try to do is meet with companies who approach me, who are very concerned about their customers', their data subjects', data, and actually want to do something to protect it.

Then, once I have that inroad, we can easily come up with ways to protect the data: scrub the personal identifiers, encrypt the data so that the personal identifiers can't be revealed, and prevent access by third parties like data brokers, where you have no idea what they're doing with your data, especially if it has personal identifiers. I mean, there are all kinds of things we can do, but there has to be a desire.

So there are a variety of measures that can be taken. I'm not suggesting it doesn't take time, or that it's a perfect solution. By far it's not, but it's something you can do if you want your customers to be comfortable with you as a company having their data. [00:11:00] In fact, I do Privacy by Design certification in partnership with KPMG.

And the companies that come to us wanting to be certified are the ones who care deeply about their customers. They want to preserve their customers' trust. And by obtaining Privacy by Design certification, all of a sudden, you know, the companies say to me, customers are now happy to give us information we've been seeking for a long time but they had been reluctant to provide.

And now they're giving it to us because we're certified. So there are measures you can take. Far from perfect, but it's something.

Dr. Rebecca Wynn: What can you do, though, as an individual? I know there are organizations out there that help with getting your data off the internet after it's there, but you even have things like incognito mode, which was supposed to be private.

It's not private. So, what can you do as an individual to be better empowered or to try and control your data? It seems like it's an impossible battle. 

Dr. Ann Cavoukian: It's not impossible. It is very difficult. I always tell people: [00:12:00] before you give your information, lead with your desire to have your data, your personal information, protected.

Let that be known to the company you're dealing with, the government department, whatever. It's something I do all the time. And what I find is, when I lead with that, someone will come back to me, either in an actual store when I'm buying something or online. In a store, for example, they'll say, Oh, can I have your postal code?

And I say, Oh, do you have privacy measures to protect that data? Usually the clerk doesn't have a clue, but the clerk says, Oh, you're interested in privacy, let me go get my manager. The manager comes and says, Oh, you're interested in privacy, we can do this, this, or this: encrypt the data, keep it in a separate location.

They have measures, but you seem to need to indicate your strong interest in privacy. And that's what I do online as well. Before disclosing any of my information, I ask them to inform me what [00:13:00] measures they are going to take to protect my data. So if you lead with that, at least you may get something in return.

I'm not saying it's 100 percent, but it is very worthwhile doing.

Dr. Rebecca Wynn: I know one of the things that I do, when you talk about stores and all that: I don't use my phone number, I don't use my address, and I'll be honest with you, I use a person who's no longer in existence, or I'll use something like Tinkerbell, or something along those lines. I just tell people: just because they ask you for a piece of information and they're blocking you from going forward doesn't mean that they're going to vet that information.

So I would tell people to also have a bunch of pseudonyms that they use for those things. Same thing with throwaway phone numbers, throwaway email addresses, and things along those lines.

Dr. Ann Cavoukian: That can help, yeah. Well, we have to take whatever measures we can to protect our personal data.

As I keep reminding people, personal data is your data. And if you don't want it shared or disclosed to third parties, you have every right to protect that data by whatever means you choose.

Dr. Rebecca Wynn: [00:14:00] Yeah, a quick question about that too. When we talk about terms and conditions, one of the things is that people will sign up for a service and read those terms and conditions.

And it seems like as soon as you sign up, they say, we've updated our terms. They're not always very good about calling out what they changed, and they seem to update every other nanosecond, and not everybody is good about re-reading those. So you might have signed up for a service where maybe they're using your data

in a respectful manner, but the new terms and conditions say, you know what, we can sell your data to anybody we want, and by the way, we're going to re-enable all the privacy settings that you disabled. How do you recommend that people reconcile that? For me, at least once or twice a year, I go through every single service that I've used and ask, am I using it? If I'm not using them, I dump out, and then I do the opt-out and the data-delete requests.

I actually copy legal and compliance as well, just in case, along with the privacy departments. But how [00:15:00] do you tell people to try and do that more effectively as a user?

Dr. Ann Cavoukian: And it takes so long, and most people won't do it. One of the things I do, before I sign up, is talk to whoever the organizer of the terms and conditions is, the site I'm on.

And I tell them, these are the terms and conditions I'm consenting to. Will you inform me when they're changed? Because if you're not going to inform me, I'm going to go to the privacy commissioner and I'm going to report you. Something to that effect. And trust me, they will inform you. That's been my experience.

So I always say, lead with your strong concern for privacy. Lead with the fact that you're consenting to these terms and conditions, but that's it; if they change, you want to be notified and you want to be able to withdraw your data, things of that nature. So I find that being very vocal about these measures goes a long, long way.

Dr. Rebecca Wynn: What do you see happening on the collaboration front? Do you see it moving more and more [00:16:00] towards what we saw in November and December, a global collaboration among bigger organizations and governments to try and help us not only as companies but as individuals? Or do you see it going to where, you know, we're going to have 225 different privacy regulations and AI regulations and things along those lines?

What do you see if you had a crystal ball? 

Dr. Ann Cavoukian: There will be a lot of diversity, there's no question. But increasingly now there seems to be a convergence. For example, with AI, a number of countries are developing their own AI and data protection laws and aligning with others. So the EU, the European Union, is now working with the U.S.; they're talking together about reconciling their respective AI and data protection laws. And other countries are doing this as well. So hopefully, what I'm seeing for the future is, just like when the GDPR came out, a lot of countries wanted to [00:17:00] be, you know, in line with the GDPR so that they could facilitate trade and data exchange, etc.

I'm hoping the same will happen with new technologies like AI, data privacy technologies, etc.: that there will be some convergence in the laws, in the legal measures being introduced, so that individuals don't have to, you know, follow dozens of laws all around the world, which is impossible, or constantly have to upgrade their sites.

Hopefully, there will be more convergence, which will make it a little easier for individuals to have their privacy protected. 

Dr. Rebecca Wynn: Are there other technologies at the forefront for you? I know one of the things I'm looking at is quantum computing and the new NIST standards that are coming out or have already come out.

One thing we see with AI, and with bigger computers being able to get through algorithms quicker, is that a lot of the old encryption models, going forward maybe three, five, seven years, are in flux. What people actually think depends on who you talk to, whether it's going to be three [00:18:00] years or thirty years before that encryption is out of compliance. But what do you think along those lines? What do you see? What emerging technologies are on your radar? I know quantum computing is one that's on mine.

Dr. Ann Cavoukian: Quantum computing is amazing. And let me be clear, the positives of these technologies, quantum computing, artificial intelligence, are enormous.

There are enormous benefits to these as well. That's why Privacy by Design is never privacy versus these technologies; it's how do you bake privacy in so that the two can operate together. And that's my hope, that in all of these measures there will be an understanding that some privacy-protective measures have to be built in, and that will in fact enhance the value of the new technologies being introduced, because countries all around the world are very concerned about privacy and data protection.

It's not just the US or Canada; it's Europe, the EU, Brazil, Australia, all around the world. It's interesting, Brazil just [00:19:00] introduced a new privacy law last year, and they've included my Privacy by Design framework in it. I do calls and media interviews all the time, weekly, and they've asked me to teach courses on this at a university down in Brazil, via Zoom, I should add.

One of these days I want to go there live. So there is a lot of interest in finding ways to preserve the emergence of amazing new technologies and wed privacy into the process. We can do this. We just have to put it on people's radar that this is doable. I tweet the stories of the day,

and I have a large Twitter following, and invariably someone will come on and tweet back to me and say, Lady, give it up. That ship has sailed. And I say, Get another friggin' ship! You don't give up on privacy, which forms the foundation of our freedom. And they tweet back and they say, Oh, is that possible?

Tell me how. So there is [00:20:00] interest once you disabuse people of the belief that it's impossible to do. Nothing's impossible. You never give up on freedom.

Dr. Rebecca Wynn: You've worked with smart cities and privacy, and recently we've seen in the news how AI is being used for policing, and instances where AI facial recognition has been wrong.

It's been sunglasses or something like that, and we've seen people get arrested and initially put in jail when it wasn't them, because of relying so much on AI recognition. And I think that does tie into smart cities and privacy and these cameras. What do you see along those lines?

And what concerns do you have? It certainly concerns me; I don't want to be arrested just because I walked down the street and I didn't do anything.

Dr. Ann Cavoukian: I totally agree with you. And it's not just AI facial recognition, it's facial recognition, period. In England, for example, they have more facial recognition cameras than anywhere that I'm aware of.

And the last time I looked, [00:21:00] 83 percent of the matches are wrong; they're identifying the wrong person from the facial recognition. Someone tweeted me recently and said, no, no, it's actually 86 percent that are wrong. Over 80 percent of the matches are incorrect. So they identify you and me as having robbed something when it's not us, because they have our facial images from these cameras on the streets, etc.

It's absurd. And now that AI is relying on that kind of facial recognition, it's crazy. That's why I want to alert people: facial recognition on a one-to-one basis is actually very strong. Meaning, I use facial recognition when I go to the airport; my face is compared to the face I have on my passport.

I go to the camera, it compares my face to my passport, one to one, with very high accuracy. It's the one-to-many, where they compare my face [00:22:00] with thousands, sometimes hundreds of thousands, of images, that of course they get wrong. So please, AI, keep out of it; facial recognition is bad enough already.

Dr. Rebecca Wynn: What do you see on the privacy front with your voice?

We've talked about privacy in the data that is created. We've talked about privacy and biometrics like facial recognition, but it doesn't seem like there are a lot of laws and regulations around your voice data, your voice being uniquely yours. And here in the United States, we saw that in New Hampshire with the calls faking, you know, pretending to be Joe Biden, and the negative effect that had. That is a concern.

So what do you see along those lines? Do you see new regulations coming about the use of your voice?

Dr. Ann Cavoukian: I think that will happen, Rebecca, because the voice is just emerging as a new area of concern. And as you said, using Joe Biden's voice and pretending to be him is really [00:23:00] bringing it to light and alarming people.

Incredibly so. I think that's going to be a new area where you're going to have to have some technology and concern associated with who can gain access to your voice and use it in ways that were never consented to. And voices have a lot of variation, depending on whether you have a cold or this or that, so the inaccuracy is going to be enormous.

So I'm hoping that'll be another area where legislation will be taking place.

Dr. Rebecca Wynn: Another question on data. One thing that I keep telling people I don't understand: you can send an email with an embedded key that expires the email, you can do that in a text message, and you can do it over a couple of other mediums.

Why is it that data, when it's created, doesn't have an expiration key? So, for example, if I allow you to use my resume to apply for a certain job, I can say, hey, you can keep it on file for six months, but after that I can expire it. Why do you think, [00:24:00] worldwide, upon data creation, they just don't go ahead and say, this is the normal amount of time, whether that is ten years, seven years, two days, one second,

and it has an expiration key? Why do you think that doesn't seem to be one of the global things that people are just doing?

Dr. Ann Cavoukian: It's a great idea, Rebecca, but companies and governments are not as interested in you and me and the protection of our data as we are, obviously. So, one of the reasons I mentioned Privacy by Design certification: those are the companies who are keenly interested in different ways of protecting their customers' data, and they get rewarded for that.

But I want to make it clear, they're in the minority. Most companies and government departments have got so much going on, they're not focused on that. And that's why, you know, you have to get out there and start speaking about it. That's why I'm so glad you're doing these kinds of interviews.

Because the more you speak about it, the more important it becomes. I [00:25:00] get invited to speak a lot, often to boards of directors, and when I walk in, I can see the members of the boards are not very happy to see me. They're thinking I'm gonna tell them to stop doing what they're doing.

And the first thing I say to them is, give me 10 minutes to tell you about Privacy by Design. If you're not interested after that, I'm out of here. Because Privacy by Design is all about protecting your customers' privacy and ensuring that you extract the data utility you need for your business operations.

Then all of a sudden they wake up and they go, Oh, tell us. And then I go on about it being positive-sum, not zero-sum, not either/or. I'm not trying to make them lose out. I'm in fact trying to do the opposite, by getting them more customers who will be more trusting of their operations. You've got to get the story out, but I'm not saying we don't have a long way to go.

Dr. Rebecca Wynn: I know one thing that I'm a proponent of: if you're going to make me personally responsible as chief information security officer, and I can do jail [00:26:00] time, why don't you do the same for the person you say is your chief compliance officer, your chief privacy officer? Because it seems that's always under general counsel or something like that, and it doesn't seem like they really have any skin in the game.

What are your viewpoints, or your peers' viewpoints, along those lines? If you're going to make me responsible as chief information security officer, make them do jail time too. The board now can potentially face ramifications. I think that would be a slight game changer, if not the game changer.

Dr. Ann Cavoukian: It would be a huge game changer.

It ain't gonna happen, I don't think. I wish it would. But short of that, because I really don't think that's gonna happen, when I talk to boards, what I tell them is: your chief privacy officer, please have them report directly to you, the head of the board of directors. Have them report to the CEO.

Because that in itself is a game changer. If they're reporting through various managers, it's all going to get lost, the importance of privacy and data protection. But [00:27:00] if they're reporting directly to one of the most senior people in your operations, everyone pays attention.

And a number of companies have done that after I've suggested that to them. So I think that will be easier to deliver than jail time. 

Dr. Rebecca Wynn: I do a lot of consulting, and one thing I'm consistently finding is that the chief compliance officer, or maybe the privacy officer, is under legal. But consistently, when I've talked to them, to be honest with you, they don't know what they should know.

And then they have to go to people like me, and I'm like, why are you putting that under general counsel instead of having general counsel be part of a committee? It seems like it should really be under the risk officer, or something along those lines. I think part of that is the reporting lines.

Where do you think that that should be? 

Dr. Ann Cavoukian: I agree with you, but the reason I say have them report directly to someone very senior: it doesn't have to be the CEO, but the CEO can designate their right hand. And once there is that kind of senior [00:28:00] connection, then whoever they're reporting to is aware of that, and all of a sudden privacy and data protection

is elevated in terms of its importance. So that's what I urge companies to do. It can't just go through the slower, lower groups, or it's going to disappear. Give it some oversight, some senior direction, and have them report directly to the top.

Dr. Rebecca Wynn: Well, thank you so much for your time. Our time has run short.

I want to thank our audience for joining us. Please check out the description, where we'll have all the contact information for Dr. Ann Cavoukian. Also, please make sure you subscribe to the Soulful CXO Insights newsletter, available on LinkedIn, and we'll see you next time on the show. Dr. Ann, thank you so much for your inspiration, for everything you do for the world and for Privacy by Design, and for sharing your insights with us today.

Dr. Ann Cavoukian: You're so kind. Many thanks, Rebecca. It's always a pleasure. [00:29:00]