Privacy is changing. Across the globe, new standards are recognizing it as a fundamental human right.
But between GDPR, CCPA, and all the other standards popping up, figuring out all your data privacy obligations can be quite the challenge.
- The history of data privacy
- How to meet your privacy obligations
- The role a Data Privacy Officer plays (and whether you need one)
To hear this episode, and many more like it, you can subscribe to The Virtual CISO Podcast here.
If you don’t use Apple Podcasts, you can find all our episodes here.
This transcript was generated primarily by an automated voice recognition tool. Although the tool is roughly 99% accurate, you may find some small discrepancies between the written content and the audio file.
You’re listening to The Virtual CISO Podcast, a frank discussion providing the best information security advice and insights for security, IT and business leaders. If you’re looking for no BS answers to your biggest security questions, or simply want to stay informed and proactive, welcome to the show.
John Verry: (00:25)
Hey there, and welcome to another episode of The Virtual CISO Podcast. As always, I’m your host, John Verry and with me as always the Amy Poehler to my Tina Fey, Jeremy Sporn. Hey, Jeremy.
Jeremy Sporn: (00:37)
Hey, John. Hello, everyone.
John Verry: (00:40)
I think I would have gone Alec Baldwin to Tina Fey, but okay. You want to go [crosstalk 00:00:45].
Jeremy Sporn: (00:47)
Oh, that is just a shot at Amy Poehler. They made a movie called Sisters.
John Verry: (00:53)
I was thinking more of… What was it called? 30 Rock.
Jeremy Sporn: (00:57)
Oh, [crosstalk 00:00:58]. Okay.
John Verry: (00:58)
I would know the thing about their pairing in 30 Rock, and I am not an Alec Baldwin fan. I don’t think he’s a good person in general, but he was really good in that show. But we digress, which we normally do.
John Verry: (01:11)
So let’s get back on track. What did you think of my conversation with Dyann?
Jeremy Sporn: (01:15)
Wow, just wow. You had warned me that Dyann would be dynamic, you had said she would be sharp. But she still blew me away. She understands data privacy, and can explain it, in a way that I had just never heard before. And I think part of that is she just has such a clear view of the future of data privacy.
Jeremy Sporn: (01:40)
There were a few times in the conversation where she was explaining things and you’re like, “Wow, that’s aspirational,” which I think is accurate, based upon where we are today. I just think it’s such a cool thing that she has this understanding and this view of the future of where data privacy is going, and is almost like a steward to help get us there. It was really amazing.
John Verry: (02:01)
Yeah. She’s really good. And you know the reason, right?
John Verry: (02:11)
Clearly. That was my takeaway from it, is that I’m going to be a lot smarter now that I’m buying the same brand of Prosecco that she is. No, on a more serious note. I think the reason you haven’t heard it exactly that way, and one of the reasons it resonated so differently, and perhaps better with you, is that you hang around with a bunch of information security guys and we talk about privacy in a much different way, right? If you were hanging out at a local law firm, I think perhaps you may have heard it similar. That being said, she’s also incredibly knowledgeable and talented. And she really is a very good communicator. She was able to take, I think, some pretty complex topics and communicate them in a pretty easy-to-understand way.
Jeremy Sporn: (02:51)
Agreed. And it was interesting, especially this time around, since she sits in a different world than you. These domains of information security and privacy have converged. Officially, I think it’s safe to say. You guys did a really great job complementing one another from your individual domains.
John Verry: (03:08)
Yeah, which is honestly the exact reason why I was excited to have Dyann on.
Jeremy Sporn: (03:13)
Yup. And so anyone listening out there, if you’re a leader in your organization, or you’re a part of the legal or compliance team, or information security, this is going to be a really cool one for you. Dyann is an absolute powerhouse. Expect to walk away with that clear understanding of how and why we are in this world of data privacy concerns, and how you can address those concerns, and most importantly, what the future of the data privacy world will look like. This is just a must-listen episode.
John Verry: (03:42)
Yeah, and I think as we sit here on the precipice of CCPA really taking root, and a number of other frameworks rapidly expanding around the world, the timing for this particular show is really good. So, without further ado, let’s get to the show.
John Verry: (04:03)
Dyann, how are you today?
Dyann Mills: (04:05)
I’m very good. Very good, John.
John Verry: (04:07)
Thank you for coming on today. Looking forward to our conversation.
Dyann Mills: (04:10)
Likewise, thank you very much for the invitation.
John Verry: (04:14)
No worries. So let’s start super simple. Tell us a little bit about who you are and what it is that you do every day.
Dyann Mills: (04:21)
Okay. So I am CEO of HewardMills, and we are a global Data Protection Office. So effectively, we act as a data protection officer, a DPO for a number of organizations across different sectors globally. And we also provide consultancy services in the area of data protection and privacy.
John Verry: (04:40)
Cool. And I would assume that makes you an attorney.
Dyann Mills: (04:44)
Yes, or a lawyer in the UK because [crosstalk 00:04:47].
John Verry: (04:51)
If anyone hasn’t noticed from the accent, Dyann is from Atlanta, Georgia. Dyann is from the other side of the pond. She’s from England. So I thought in England you guys called lawyers like barristers or something, right?
Dyann Mills: (05:08)
Yes. So we have two streams, barristers and solicitors, and I’m a qualified barrister. And traditionally, barristers are the specialists who tend to do the advocacy and go into court, or have a specialism in a particular area. So the area that I specialized in is data protection and privacy.
John Verry: (05:27)
So you must be glad you made that choice at this point because we’ve hit a nutty point, which is obviously why we’ve got you on the show here today. So before we get down to business, I always like to ask people, what’s your drink of choice?
Dyann Mills: (05:41)
I’m glad you asked that. Well, it’s Friday evening in the UK, John, as you will know. And so my drink of choice on a Friday evening is Friday Fizz, which is a Prosecco. Cheers.
John Verry: (05:59)
Yeah, I’m not drinking Prosecco. Most English people would be having a pint right about this point, right?
Dyann Mills: (06:04)
No, I’m not one to pull a pint on a Friday, I’m afraid. I prefer the bubbles.
John Verry: (06:09)
I do. And what’s the brand of the Prosecco that you typically have?
Dyann Mills: (06:14)
So it’s an Italian Prosecco. So I’m clinging on to [crosstalk 00:06:18]. So I’m not sure of the particular brand but I just know it’s a sort of Italian dry Prosecco. It’s very nice. It goes down very well.
John Verry: (06:30)
Yeah, when we were in Italy, very often in a lot of the nicer restaurants they would give you a complimentary small glass of Prosecco when you come in to sit down, while you’re looking at the menu. So I did develop a taste for Prosecco as well.
Dyann Mills: (06:43)
It has a nice touch, right?
John Verry: (06:45)
Oh, it absolutely is. So great jumping off point. So from my perspective, privacy has just absolutely exploded in the last three, four years starting of course, with GDPR. How did we get here? Why is this happening? And why is this happening now?
Dyann Mills: (07:01)
Yeah. I guess, in Europe, we’ve been thinking about this for a very long time, haven’t we? So with the sort of explosion of technology, and this ability for supercomputers to process huge volumes of data about individuals, the Europeans started to think about the impact on kind of the human rights, the rights and freedoms that this may have. Going back to the days of the Second World War, where certain information about individuals could be used to discriminate against them, and wanting to put in place safeguards in this respect. So the European approach has been very much considering data protection and privacy as a fundamental human right.
Dyann Mills: (07:50)
The US takes a slightly different approach, it’s more sectoral. So looking at financial services or healthcare, and considering the impact individuals may suffer if information [inaudible 00:08:04] in that particular context: in education, in financial services, in relation to health.
Dyann Mills: (08:10)
But I think it’s really been the explosion of social media sites, and the understanding, if you like the awareness, of just how much information organizations can hold on us. It becomes very personal. It’s about when you browse a website, the information in terms of your personal habits that can be picked up; where you go shopping, and you use a card; how that information can be used to profile you. And I guess, just how that can impact your day-to-day life. And just as you say, in the last few years there have been so many media stories, it has become something so central to us that there has been this awareness. It’s almost like an awakening of, goodness me, there is a huge opportunity in all of this, but huge potential for harm also, if the right [inaudible 00:09:13] are not put in place.
John Verry: (09:15)
Right. So in short, the internet created a vast accumulation of personal information that organizations were using in a manner to drive revenue, right? To drive income, which wasn’t being communicated to the people whose information it was, and often would potentially hurt them, right? Effectively, is that a decent summary?
Dyann Mills: (09:42)
Yeah, I guess the focus, and the awareness, was really around the lack of transparency and accountability. So going back to what I said about the European approach, and this recognition that actually governments and some public authorities had so much power through the data that they had, that there had to be accountability and frameworks in place that protected the rights of individuals, so those rights weren’t abused. And it’s [crosstalk 00:10:12]. Sorry. Go ahead.
John Verry: (10:13)
So on your side of the pond, it was the government that you were protecting yourselves against. On our side of the pond, it was capitalism.
Dyann Mills: (10:24)
It starts with governments and public bodies, but then it flows to private organizations because, in a sense, it was thought, “Well, if we’re holding our governments to account, when you look at private organizations, they have as much if not more data, more visibility of what we do in our private lives day-to-day, and actually they should be held to the same standards.” And in the US there’s that recognition as well. It’s this idea that consumers have to be protected against abuses that may take place if there aren’t safeguards, and the degree of transparency that’s required around the handling of data. And if governments are held to account, certainly private organizations should be held to account in the same way.
John Verry: (11:10)
Yeah, that was incredibly well said. And I got a funny little story that you’d appreciate being a privacy person. I was talking with a person who worked for an organization. And they prided themselves on taking anonymized data from multiple sources, and then combining it back and deanonymizing it to a point where they could identify the individuals, which is nutty. And then they were selling that data.
John Verry: (11:34)
And the guy gave an example of one of the things that we’re most proud of where he said that, “Based on this data, we can determine when couples are heading towards divorce. And we sell that information to divorce attorneys to market to them.” So I think to your point, that is a clear abuse of our personal information, and preying on people at a point which is a very difficult time of their life.
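The re-identification John describes, combining “anonymized” datasets until named individuals fall out, can be sketched in a few lines: you join the datasets on shared quasi-identifiers such as ZIP code and date of birth. Everything below (the toy data, the field names, the two-dataset setup) is invented for illustration; it is not that vendor’s actual method.

```python
# Toy sketch of quasi-identifier linkage. Dataset A is "anonymized"
# (names removed), but dataset B carries names and overlapping fields.

# "Anonymized" purchase records
purchases = [
    {"zip": "07302", "birth_date": "1975-03-14", "item": "divorce self-help book"},
    {"zip": "10001", "birth_date": "1980-07-02", "item": "garden hose"},
]

# A second, name-bearing list (e.g. a public or marketing dataset)
voter_roll = [
    {"zip": "07302", "birth_date": "1975-03-14", "name": "J. Smith"},
    {"zip": "10001", "birth_date": "1991-12-25", "name": "A. Jones"},
]

def link(records_a, records_b, keys=("zip", "birth_date")):
    """Join two datasets on shared quasi-identifiers, re-attaching identity."""
    index = {tuple(r[k] for k in keys): r for r in records_b}
    matches = []
    for rec in records_a:
        hit = index.get(tuple(rec[k] for k in keys))
        if hit:
            matches.append({**rec, "name": hit["name"]})
    return matches

reidentified = link(purchases, voter_roll)
# One "anonymous" purchase record is now tied back to a named individual.
```

The uncomfortable part is how little is needed: a handful of quasi-identifiers is often enough to make two “anonymous” datasets identifying when combined.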
Dyann Mills: (12:03)
Indeed, and there’s so much information that can be discerned from our shopping habits, from our browsing habits, from the [crosstalk 00:12:11].
John Verry: (12:11)
If someone has, like, a problem with drinking a lot of Prosecco, that might be something that somebody could pick up on.
John Verry: (12:19)
If anyone’s Friday night Prosecco consumption tends to max out for the week, things like that, maybe?
Dyann Mills: (12:29)
Certainly. Yes, in the area in which I live, that’s definitely a yes, an increased consumption towards the end of the week. But [crosstalk 00:12:35]. Joking aside, there have been incidents, haven’t there? Where, for example, I think it may have been even Walmart, right? The store was able to determine that an individual in the household was pregnant before the parents of that individual were made aware, and the store was sending discount vouchers for nappies, and baby food and the like, based on the purchasing habits of individuals.
Dyann Mills: (13:04)
So there is an impact. And of course, we can all relate to that. There are certain things that you want to be able to do in private, and you don’t want to have a third party have access or to be privy to those activities. And so there’s a sphere of privacy that needs to be respected, it really comes down to the core of who you are as an individual. And we tend to use the expression at HewardMills of sort of protecting kind of data dignity, and respecting that.
Dyann Mills: (13:36)
And actually, organizations are entrusted with a lot. If I give my information to an organization, I’m entrusting that organization to respect that information, and value me and my dignity as a human being. And that’s the difference, I guess in approach. But actually, interestingly, we’re starting to see a sort of an acceptance of this set globally, right? [crosstalk 00:14:04].
John Verry: (14:05)
We’ve all been subject to… And you’re talking about the rights, but then there’s also the impacts associated with that. The creepy feeling that somebody is watching you. But also, I don’t know how many times my credit cards have been stolen; I literally had someone two weeks ago file unemployment claims using all of my personal information. So now I’ve spent endless hours filing the proper paperwork necessary, legally, to ensure that I’m not held liable for any of this, to make sure the checks aren’t being issued. So yeah, this is a huge issue.
John Verry: (14:42)
And what’s interesting to me, so I’m an information security guy, you all know that. Privacy three years ago was something for the guy in the corner office labeled legal and compliance. Go talk to him, right?
John Verry: (14:55)
And really, information security, we didn’t care about privacy. And really, what’s happened now is that it’s shifted from a legal and compliance domain to legal and compliance and information security, right? Two paths that never crossed are now inextricably intertwined. Why is that happening? That’s two different skill sets, two different experiences, two different educational backgrounds. Talk a little bit about that, what you’re seeing there, what some of those challenges are, and how organizations can deal with that particular challenge.
Dyann Mills: (15:22)
Yeah. So I think that’s a very interesting development. And yes, we are seeing that convergence between privacy and security. It makes a lot of sense to me because, actually, they are board-level issues. Security became a board-level issue. And you mentioned this with data breaches, and the inconvenience that they cause you. But for organizations, a cybersecurity breach, a data breach, is a board-level issue. It’s going to impact your reputation, it’s going to impact your bottom line.
Dyann Mills: (15:52)
And privacy is the same thing. It’s something that touches upon all functions, and something that boards need to be absolutely across, in terms of: if they don’t get it right, if they don’t have an understanding, if they don’t set standards and the tone from the top, that potentially can impact the reputation of their organization. And there’s a common saying as well: “You can have security without privacy, but you cannot have privacy without security.”
Dyann Mills: (16:24)
And typically, organizations move to a situation of keeping all the information secure. But what privacy does is ask questions around, “Well, what’s your basis for having the information in the first place? And have you been transparent about it? And have you a legitimate right to be holding it and processing it in the way in which you’re doing, or you’re intending to do? And fundamentally, have you communicated with the individuals who will be impacted, in a way that they understand, to get their buy-in for what you are doing with their information? Is the trust there? Have you built that trust with the individual?” And so these nuances need to be addressed by organizations, and be done in an integrative way. You can’t work in silos. You cannot have the compliance team working on these issues without the input from the security team, because often those security standards have to be agreed and assessed in a very, very context-specific way.
John Verry: (17:33)
Mm-hmm (affirmative). So it’s a do what you say, and say what you do? Is that a-
Dyann Mills: (17:43)
Absolutely, because if you’ve got a privacy notice on your website, speaking out to the world about what you’re doing in relation to the data that you hold internally, and there is a gap or a disconnect with that, there is a credibility issue that starts to arise, and an integrity issue. And that will come back to bite you, whether it’s through an individual like a consumer raising issues; or an employee within your organization challenging the gap between what you say and what you actually do, in terms of whistleblowing issues and other associated risks; or other external parties just looking at the notice and then actually challenging whether you as an organization are living up to what you’re stating, what you’re committing to externally. So I think it’s a fundamental issue to get right within a business.
John Verry: (18:47)
Right. And the bigger challenge that we have now is that those privacy policies for many organizations have existed, at one level or another, for a long period of time. But now, starting with our good friends in the EU, which you guys will eventually not be part of, but we won’t go there yet.
John Verry: (19:09)
I noticed you have a drink right away. I mentioned Brexit, the drinking starts. Little worried? Well, let’s not go there yet.
Dyann Mills: (20:14)
Yeah. And that’s a huge challenge for multinationals, right? So there are some common themes, and we’ve touched upon those earlier on: this idea of putting the individual at the center, and recognizing the dignity that you need to attach to that individual’s information because it relates to a core aspect of who they are. So once that’s understood, how do you do that? And there are certain requirements around transparency, giving notice, ensuring consent of individuals to what’s being done around their data, ensuring adequate security over information, ensuring that there is accountability within the organization. And this is the idea of the tone from the top, and also an acceptance of responsibility for what’s happening within the organization, right from the top down.
Dyann Mills: (21:09)
So those threads of accountability, transparency, notice, consent are common through lots of different privacy frameworks. But ultimately, what also has to be understood is the local nuances. So we have a saying at HewardMills: “Think globally, act locally on privacy.” That is, have a global approach, but understand that there will be local requirements that you also need to be cognizant of.
Dyann Mills: (21:44)
And even across Europe, the cultural differences mean that in specific jurisdictions, there are different approaches taken by the supervisory authorities. And what the GDPR is, is a baseline.
Dyann Mills: (21:59)
Right? And there will be certain jurisdictions that will go over that baseline in terms of requirements. You need to understand what they are, and meet compliance with them.
Dyann Mills: (22:08)
And the typical example and one obviously, that we’re very familiar with is obviously the requirement to have a data protection officer, a DPO. So that’s set out in the GDPR. And certain organizations will have a mandatory obligation to have one, but the criteria for appointing may vary across different jurisdictions.
Dyann Mills: (22:28)
So in Germany, if you have over 20 employees, then the obligation could be triggered based on the types of processing that you’re engaged with, that will differ across European jurisdictions, and you need to have an understanding of what those local requirements are, if you’re operating across various jurisdictions.
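Dyann’s “think globally, act locally” point, a GDPR baseline plus per-jurisdiction triggers, lends itself to being encoded as data. A minimal hypothetical sketch: the German 20-employee threshold comes from her example above, while the baseline rule, the function, and all names are illustrative assumptions, not legal advice.

```python
# Hypothetical sketch: per-jurisdiction DPO-appointment triggers encoded as
# predicates over an organization profile. The DE threshold (20 employees)
# is taken from the conversation; everything else is an illustrative assumption.

DPO_RULES = {
    "DE": lambda org: org["employees"] >= 20,  # German threshold mentioned above
    "EU-baseline": lambda org: org["large_scale_special_category_processing"],
}

def dpo_required(org):
    """Return the jurisdictions whose local rules would trigger a mandatory DPO."""
    return [name for name, rule in DPO_RULES.items() if rule(org)]

org = {"employees": 25, "large_scale_special_category_processing": False}
print(dpo_required(org))  # a 25-person German operation trips the DE rule
```

The design point is the one Dyann makes: keep one global baseline, but make the local deltas explicit and checkable rather than buried in tribal knowledge.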
Dyann Mills: (22:52)
But ultimately, taking a high standard like the GDPR across your organization, if you’re a multinational, is a good strategy.
John Verry: (23:04)
Question for you. So you talked about transparency, you talked about consent; we didn’t touch on data subject access requests and everything. You kind of touched on the front side of that, establishing everything necessary so people know what data you’re gathering and what you’re going to do with it. Now that you’ve got that data, right? These people have a lot of rights under most, if not all, of these laws and regulations. So let’s talk about those rights. And let’s talk about what most people are scared to death of, right? That data subject access request.
Dyann Mills: (23:39)
Right. And so you’re accessing the back end, right? Because ultimately, you’re holding yourself out as meeting all these requirements. They relate to individuals, and individuals have a right to investigate whether you’re actually doing what you say you’re doing, and also a right to understand what exactly organizations are doing with respect to their own personal information. And these sorts of fundamental rights are really what ensure that the accountability principle is being met.
Dyann Mills: (24:18)
If an individual makes a request to an organization, and can be met with the information that the organization holds about them, an explanation as to where they got the information from, what they’re doing with the information, and the grounds on which they process that information, that holds the organization to account. And there’s a lot, actually, that can be done operationally to facilitate that exchange of information, to encourage the transparency. So an example I can give is in relation to employees. If you’re very transparent about the way in which you’re processing employee data, firstly, you build that trust with your employees, and it starts with your internal data protection policy, where you set all of that information out.
Dyann Mills: (25:09)
And then you can have a means or mechanism for employees to access their own data. So they don’t actually need to ask you for it, if you facilitate that opportunity for individuals to go to a dashboard or a portal, or whatever it is that they’re able to then access information directly from. Again, it’s operationally efficient, and it builds that relationship of trust, and transparency. Similarly, with consumers, too. You can have a way in which you can facilitate self-service while [crosstalk 00:25:47].
John Verry: (25:46)
But you’re sprinting, and most of the people I know are not yet crawling, Dyann. You’re talking about the Holy Grail. So I use the term right to present, right? So that’s what you were just referring to, right? The right to present. So any data that I have about you, I have an obligation, right, to be able to present to you. I have an obligation to tell you, like you said, how I got it. And then you also have the right to request changes to that data. Correct? And you also have the right to request that I effectively forget or delete all of that data as well. Correct?
Dyann Mills: (26:21)
Yeah. If it’s inaccurate, or it’s no longer relevant. Absolutely. Because again, it’s about that right to be forgotten.
Dyann Mills: (26:32)
And not having things of your past sort of follow you digitally because they’re forever recorded.
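The rights John and Dyann just walked through, present (access), rectify, and erase, map onto three small operations. A toy in-memory sketch follows, with all names and the storage model assumed for illustration; a real implementation also needs identity verification, audit logging, and erasure across backups and processors.

```python
# Toy in-memory store illustrating access, rectification, and erasure.
# Everything here (class name, fields, provenance model) is an assumption
# for illustration, not any particular framework's required design.

class SubjectDataStore:
    def __init__(self):
        self._records = {}  # subject_id -> {"data": {...}, "source": provenance}

    def collect(self, subject_id, data, source):
        """Record data about a subject along with where it came from."""
        self._records[subject_id] = {"data": dict(data), "source": source}

    def access(self, subject_id):
        """Right of access ('right to present'): everything held, plus provenance."""
        return self._records.get(subject_id)

    def rectify(self, subject_id, field, value):
        """Right to rectification: correct inaccurate data on request."""
        self._records[subject_id]["data"][field] = value

    def erase(self, subject_id):
        """Right to erasure ('right to be forgotten')."""
        self._records.pop(subject_id, None)

store = SubjectDataStore()
store.collect("u1", {"email": "j@example.com"}, source="signup form")
store.rectify("u1", "email", "john@example.com")
store.erase("u1")
# store.access("u1") now returns None: the record is gone.
```

This is also the shape behind the self-service portals Dyann mentions: `access` exposed directly to the individual, instead of a manual request queue.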
John Verry: (26:40)
Yeah, there’s a few out there that I’d like to get rid of. No, it is funny you say that. A crazy story, just to give you an idea of how nutty all of this is. In the late ’90s, when the internet was fledgling, I liked to play poker. And I was on a poker website, Two Plus Two, it used to be called. I don’t know if it’s still out there. And I asked a question about how to play a certain hand in a hold ’em game. And about 20 years later, we get shortlisted for this massive engagement with a government organization. And we’re in final interviews, and get invited into the room, and there’s 12 people, and they’re all sitting there. I felt like it was a congressional hearing, if you’ve ever watched one in the US. They’re all seated on the other side of the table. And I’m nervous because this is like this great opportunity for us as a company. The first thing the guy says to me is, “How often do you gamble? Specifically, how often do you play poker?”
John Verry: (27:38)
That was the first question; like, whether helping them manage information was going to be a risk. So I answered the question and the whole thing, we end up winning the project, and I caught up to the guy afterwards. I’m like, “What was that about?” He goes, “Oh, yeah. Our research indicated that we wanted to make sure you didn’t have a gambling problem, because you posted a question on a gambling website in 1994.” And I was like, “Oh, my gosh.” So to your point, these footprints are out there, and people who have the skill or tools or knowledge can find them. And that would be a great example of the right to be forgotten, if I don’t want people to always ask me that question, thinking I have a gambling problem.
Dyann Mills: (28:13)
Absolutely. And you can see how it’s completely justifiable for that to not follow you, right? And actually, increasingly, think about the digital footprints being left by our children, where they don’t have the maturity [crosstalk 00:28:32].
John Verry: (28:34)
You’re killing me. I have a 21-year-old daughter, so I know.
Dyann Mills: (28:39)
And they’ve grown up in this digital world, right? And that’s where they express themselves where we would write on walls-
John Verry: (28:46)
And your journal.
Dyann Mills: (28:49)
… and in your journal.
John Verry: (28:49)
And worry that mom might find it under your pillow.
Dyann Mills: (28:55)
[crosstalk 00:28:55] then be tracking you into your professional life, when you start out wanting to have a career for yourself. Again, that’s where it comes down to your fundamental rights, right? To exist and to be able to express yourself, and to not have your digital footprints track you, or have them raised in circumstances where it’s just not legitimate.
Dyann Mills: (29:21)
And it’s not appropriate.
John Verry: (29:22)
So I want to drill down on that one part of the question that I asked you before, because a lot of times I ask questions I already know the answers to. But this one, I really want to know the answer to.
John Verry: (29:36)
So these fundamental tenets of privacy that we talked about, right? The do what you say, say what you do, transparency of disclosure, consent, data subject access requests, right to forget, right to edit, right to delete kind of stuff. If I was someone in the US here, and I was dealing with CCPA and I implemented that, well, am I 90% covered with GDPR? Am I 90% covered in APAC? If I go to all of the different… I know there are newer ones, right? Brazil’s got one, Mexico’s got one. Someone else just introduced one. How common are they? Because one of the things that people are going to ask me is, “Okay, how do I make sure what I did covers me against everything?” Because a lot of times, if you’re an eCommerce company, you can’t really know or control who’s coming to your website and giving you information, or who might come knocking on your door.
Dyann Mills: (30:31)
Yeah. I think, of course, that’s one of the most challenging areas: how do you implement a process and a system that’s fit for purpose globally? And actually, if you have done that work in terms of your GDPR implementation, it should assist greatly in efforts around CCPA and other frameworks. As we mentioned, GDPR is the high standard. So generally, where you meet that standard, it should assist in other jurisdictions, regions, and for other frameworks. But there is no magic bullet. It just doesn’t exist.
John Verry: (31:11)
Darn. I was hoping you were going to tell me, “There is one.” Well, look, I think the good news is you said and it was my opinion that GDPR and CCPA were fairly close. And that seems to be the bulk of the questions that, at least the folks that we deal with on an everyday basis are largely sweating.
Dyann Mills: (31:30)
Mm-hmm (affirmative). It’s typically around the rights requests and handling those rights requests. But again, you’ve got to have a really solid foundation to be in a good position to not have this operational challenge around responding to requests. John, I think you and I agree on this: if you don’t know the information that you have within your organization, if you have not done that work to have a good understanding of the information that you hold, where it sits, where it flows to… we would use the expression “records of processing activities” from a European and GDPR perspective.
John Verry: (32:05)
ROPA. That horrible word that we all dread.
John Verry: (32:10)
Show me your ROPA. No, not the ROPA. Ask me for anything but.
Dyann Mills: (32:13)
The ROPA has to be good. [crosstalk 00:32:16].
John Verry: (32:16)
We’re going to tagline this episode. There it is, Jeremy. Your ROPA Better Be Good. That’s the name of the episode.
Dyann Mills: (32:27)
Your ROPA has to be good, and the data mapping supporting that ROPA also has to be solid.
John Verry: (32:31)
So I wasn’t going to go there yet, but you brought it up. So let’s talk about that; that’s where the rubber meets the road. And that’s where, fundamentally, we’ve got the window dressing on the front side, and then we’ve got the ability to meet the requirements on the back side to service these DSRs. And that piece in the middle, right? That’s the data mapping. Explain to someone what… and you can take it one of two ways. We do data maps, so I know how we do them. But how would you describe a data map, and/or how would you describe a record of processing activities? And I’m assuming that from your perspective, that would be GDPR, what is it, Article 30?
Dyann Mills: (33:09)
Yeah. [crosstalk 00:33:10].
John Verry: (33:09)
So can you talk a little bit about that from your perspective? And by the way, I’m going to just lean back and grab this bottle of mine [crosstalk 00:33:19].
Dyann Mills: (33:19)
You need to top up. This is getting heavy now, right?
John Verry: (33:21)
You need to top up on your Prosecco, or you’re okay? Is that an IV? You’re looking at me like you don’t know what an IV is.
Dyann Mills: (33:26)
I have no idea.
John Verry: (33:30)
Oh, an IV, as in intravenous? I was joking.
Dyann Mills: (33:37)
Oh, okay. I see.
John Verry: (33:37)
I was joking that you were so hardcore that you were IVing your Prosecco.
Dyann Mills: (33:40)
Not at all.
John Verry: (33:45)
That’s [crosstalk 00:33:45].
Dyann Mills: (33:45)
I’m going to have another sip of my drink. So I think you need this when you start to talk ROPAs and data mapping, right?
John Verry: (33:50)
I got to be honest with you. I think most people need that when they talk with me. My poor wife, I come in the house, she goes and grabs a glass of wine because [crosstalk 00:33:59] is alcohol, I believe. So, it’s not unusual.
John Verry: (34:03)
Yeah. I think it’s me. I really do. All right, so let’s get to this ROPA data mapping. If you could kind of talk people through it: it’s a term that they hear a lot about, but I don’t think most people really understand it.
Dyann Mills: (34:15)
Yeah. So I guess a ROPA really is that sort of view of the information that you have within your organization: the basis on which you hold and process that information, the third parties with which you might share that information, the security standards that you’ve applied to the information, and also, where required, the risk assessment in terms of any impact that the processing might have on individuals. So you may have a Data Protection Impact Assessment that you then feed into your ROPA. Once you’re at that stage, you know that you’ve got a pretty good understanding of your information universe and infrastructure. I’m simplifying it a lot because essentially-
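As a minimal sketch, one entry in a ROPA might capture the elements Dyann lists here. The field names and example values below are hypothetical illustrations, not the prescribed Article 30 wording:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RopaEntry:
    """One processing activity in a record of processing activities (ROPA)."""
    activity: str                                               # e.g. "payroll"
    lawful_basis: str                                           # basis on which the data is held and processed
    data_categories: List[str] = field(default_factory=list)    # what personal data is involved
    third_parties: List[str] = field(default_factory=list)      # who the data is shared with
    security_measures: List[str] = field(default_factory=list)  # standards applied to the data
    dpia_reference: Optional[str] = None                        # link to a DPIA, where one was required

# A made-up example entry
payroll = RopaEntry(
    activity="payroll",
    lawful_basis="contract",
    data_categories=["name", "bank details", "salary"],
    third_parties=["payroll bureau"],
    security_measures=["encryption at rest", "role-based access"],
)
```

The point is less the data structure than the discipline: every processing activity gets a documented basis, recipients, and safeguards, with a DPIA attached where the risk warrants it.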
John Verry: (35:04)
It doesn’t sound simple.
Dyann Mills: (35:06)
It’s essentially: how do you get that view? And you may have to do this in an iterative way. One way of getting that view is obviously speaking to individuals who have functional responsibility for the data. And there is going to be a degree of interviewing, recording, documenting what happens within your organization, but then that also has to be validated, right? And normally, the way in which you validate the information that you’ve gathered through that sort of mapping process is by carrying out a more in-depth review, often using technology: a more in-depth review of the systems and the data that you hold within the systems of your organization. And that’s where technical experts such as yourself, John, will come in and do that sort of data mapping, shadow IT review, get deep into the systems and the pipes to ensure that whatever is being said at the higher end is accurate.
Dyann Mills: (36:18)
Because you might speak with a marketing team who says, “Right, we have these systems, we share data with these third parties. And that’s it.”
Dyann Mills: (36:27)
But then you may go and do an in-depth review, and you find that there are all these cookies that you have on your systems, there are all these applications that individuals have downloaded and are using, and they haven’t been captured because perhaps the individuals at the front end and the functional teams weren’t aware of the extent of data use within the organization.
John Verry: (36:52)
You’d be surprised. So I agree with you. I think the technical tools are interesting because of what they can catch. So the way I look at a data map, and tell me if you disagree, is you are trying to tie together the individual discrete elements of personal information. And we should really talk about how the definition of personal information has changed, hopefully in a second as well. So put a pin in that one, and don’t let me forget it. How does each of the individual components of personal information flow into your organization, right? What are the processing activities that act on those? And then what are the assets, the systems, the applications, the databases, the SaaS tools that store that information? Because that’s going to give me that ability… that’s the map of personal information to the people that touch it, and the activities that act on it, to the systems that store it. That gives me the ability to service a data subject access request. What information do you have? Where did it come from? My data map tells me that. Correct?
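The element-to-activity-to-asset mapping John describes can be sketched like this. The rows, element names, and system names are made-up examples, not a real inventory:

```python
# Hypothetical data map rows: each ties a discrete element of personal
# information to the processing activity that acts on it and the asset
# (system, application, database, SaaS tool) that stores it.
data_map = [
    {"element": "email address",   "activity": "newsletter signup", "asset": "MailTool (SaaS)"},
    {"element": "email address",   "activity": "support tickets",   "asset": "helpdesk database"},
    {"element": "billing address", "activity": "order fulfilment",  "asset": "ERP system"},
]

def locate(element):
    """For a data subject access request: which activities touch this
    element, and which systems store it?"""
    return [(row["activity"], row["asset"])
            for row in data_map if row["element"] == element]
```

With that map in hand, servicing a DSR for, say, an email address becomes a lookup (`locate("email address")`) rather than an organization-wide scramble.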
Dyann Mills: (37:45)
Indeed, and not just for subject access requests, actually, it’s something that some of the regulators will proactively ask for.
John Verry: (37:52)
Yes, I agree. My internal use for that is that it tells me where my data is and why it’s there. And who put it there.
Dyann Mills: (38:01)
Indeed. And then there may be some other strategic benefits as well, of having that overview, right? For organizations. There’s this sort of realization, “Oh, we do that. Is that something that we should continue doing? Is it something that we should stop doing? Or maybe there’s a better way of processing this information?”
John Verry: (38:18)
Yeah. And I think, you talked about using technology to validate that and catch the exceptions, the things people don’t realize are happening. I would argue that actually, sometimes the processing interviews do the same thing in a way that we can’t catch with technology. Because when you think about it, if I’m scanning with technology, I’m largely living on their infrastructure.
John Verry: (38:39)
But if they’re putting a bunch of things up into the cloud, right, I have no way of knowing. Or if they’re sending data to a third party. So great [crosstalk 00:38:47].
Dyann Mills: (38:47)
So you need both. You absolutely [crosstalk 00:38:49].
John Verry: (38:50)
And to your [crosstalk 00:38:51] the iterative nature of it. We were doing interviews once with a law firm. They were processing these cases, and we’re going to genericize it. And what happened was that the CIO of the organization was shocked. He sat in on the interviews when this person said, “Then we package all the information up and we send it to X.” And he’s like, “I don’t know who that is.” “Oh, it’s an offshore organization in India.” And so they’re sending inordinate amounts of personal information. “How long has that been going on?” “About three and a half years.”
John Verry: (39:29)
So I think, to your point, yeah, those conversations are absolutely critical, as is the validation on the technology side. But you’ll find an amazing number of things through the interviews that you can’t pick up with technology. Although I’m a technologist, I’m more of a fan of the interviews because I think that’s where you’re going to really… I think you have to do interviews. Some people try to do it just with technology: just show me every place in a database or every place in a file share where I can find personal information, and now I know where everything is. No.
Dyann Mills: (40:01)
Yeah. And importantly, it’s about having an infrastructure and process in place to keep that ROPA evergreen, right?
John Verry: (40:17)
Listen, from your lips to God’s ears. But getting to an initial ROPA is hard enough; now, like you said, putting something in place that would keep that updated dynamically? That is definitely aspirational, I think, in most organizations.
Dyann Mills: (40:32)
It is so necessary. And that’s where your governance infrastructure is so important in terms of who are your foot soldiers who have responsibility for this, right, across your different functions? And how are they empowered and trained to understand exactly what needs to be done here?
John Verry: (40:51)
Listen, I think on your side of the pond, you guys are ahead of us by a bit. So I would say here, so many people looked at GDPR and said, “We’re pretty far away from them.” And we once overthrew the king over there anyway.
John Verry: (41:07)
I didn’t mean to bring that up. I’m not gloating or anything like that. But CCPA changed, I think, people’s opinion of privacy, and GDPR as well. At the same time they think, “Well, I got to do CCPA, so I might as… ” So I think we’re behind the times relative to most of the folks over in the EU. So I do think that we’re still getting to an initial ROPA, and many of our orgs are not getting to a point where our ROPAs are dynamically updated. So that’s why I smiled and laughed when you said that.
John Verry: (41:36)
One quick question for you. I did want to put a pin in that and come back to it. So in the old days, personal information, what we call PII in the US, had a very-
Dyann Mills: (41:46)
Oh, the US PII term.
John Verry: (41:49)
I know. Sorry about that. But we had very specific things that were PII. So one of the huge things that GDPR and CCPA have done is redefine what’s personal information. So what’s the definition of personal information? And what does that mean in the real world?
Dyann Mills: (42:05)
So I think this concept of PII, and having specific categories, name, number plates, certain identifiable [crosstalk 00:42:16]. Security number [crosstalk 00:42:18].
John Verry: (42:20)
“Privileged identifier” would be a general term that we would use, right?
Dyann Mills: (42:26)
Indeed. There’s some certainty in that. And obviously, that helps, especially security guys, right? They know, right, this information falls under the PII category. So we’re treating it in a particular way within the organization. Personal data is a much broader concept. And it goes back to this idea of Europe considering the protection of personal data as being a fundamental human right. So the way in which they define personal data is any information that can identify an individual or be used to single out an individual is personal data. Because if you can single out somebody, then you can attribute certain things to that individual. And once you are able to make that attribution, actually, you can impact that individual’s life in a positive or negative way. And that’s what gives rise to their rights and freedoms and their human rights potentially being compromised.
John Verry: (43:26)
Mm-hmm (affirmative). And that can be almost anything, right? That’s your sexual orientation, your religious affiliation, political party, dog’s name. Like, literally, almost anything could be used to identify [crosstalk 00:43:38] especially in combination with other components, right?
Dyann Mills: (43:39)
Yeah. [crosstalk 00:43:42] with other things.
John Verry: (43:43)
Right. Exactly. And then in the US with the CCPA, they even go so far as to extend that to your household, right? So it’s the individual or their household. Anything that can be used to identify the household, which means now you’re into your children, now you’re into your pet. So literally, I tell people, “If you have any information about somebody, consider it personal information.”
Dyann Mills: (44:08)
And that’s a safe way to approach it, but then also have in place an opportunity to do your risk assessment, right? So the legislation doesn’t prevent you from processing data. Actually, part of what the legislation is designed to do is to facilitate the flow of data. But you need to do the risk assessments and, where you identify risks to individuals, put in place ways to mitigate that risk.
John Verry: (44:38)
Gotcha. So, a question first. You mentioned it earlier, and you’re a DPO for many organizations: GDPR requires a DPO under certain contexts or constructs. How about the other regulations? So if I want to be CCPA compliant, do I need a DPO, or someone who’s specifically assigned that title? Same thing with APAC. Talk a little bit about the DPO. And then also, I think most people don’t understand what a DPO’s role is. There’s, I think, a prescribed role, some of that stuff about liaising with the data privacy agencies and things of that nature. And I think there are the non-prescribed roles that are the ones we struggle with, right?
John Verry: (45:28)
Clients come to us and say, “Oh, the law says this; does this meet the law?” Right. So a great example that I think a lot of people listening, or reading this, would recognize is, “Oh, a client walked up to me at a trade show, and they gave me their business card. Is that explicit consent?” Or, “We send a newsletter out, and it says, ‘If you want to unsubscribe, click here.’ The fact that they didn’t click here, is that explicit consent? And does that meet the requirement?” Right. So there’s this interpretation of laws, regulations, things of that nature, new pronouncements, or whatever the right words are. Talk a little bit about that DPO role: why you have to have one under certain laws, and then what the value prop is to having a DPO.
Dyann Mills: (46:11)
Okay, so here’s the thing. We know that this awareness of privacy has exploded. And we touched upon it right at the outset, individuals care about the information that is held about them within organizations. So this is going to be one of the kind of the defining issues going forward as we accelerate digital, if you like, progress because we’re all going online now. We’re ordering things online, and all our lives have become very virtual. So that’s the direction of travel, it’s not going away. There are new laws emerging in different regions around this topic.
Dyann Mills: (46:55)
How do you manage all of that within an organization? Understanding that, yes, this is a board-level issue. And you have different functions that bring different perspectives. So we talked about the security function, the compliance function, maybe the legal function, the product teams. You’ve got all these different functions that will have a particular view and perspective. What’s really helpful is having an individual or a team who have specialism in this area, actually advising on compliance in a very objective, independent way.
Dyann Mills: (47:36)
And so ultimately, that’s the role of the DPO. And it’s hardwired into the GDPR. And it was recognized as one of the ways in which you can, as an organization, demonstrate your accountability by having this independent specialist, team or individual who can advise on compliance. And as you mentioned, interface with individuals, whether they’re employees or customers and the supervisory authority.
Dyann Mills: (48:12)
Now, what about the other legislative frameworks? The CCPA doesn’t mandate that you have a DPO. But ultimately, if you’re a global organization, and you’re operating across different markets, Europe, US, APAC, Africa, having a DPO who’s cognizant of global requirements is only going to benefit you as an organization. And fundamentally, you have to do your analysis on whether you have a mandatory obligation to appoint one. If you’re going after the European market, right, so you’re targeting individuals in Europe, you need to think about, “Well, actually, who are my clients of the future, and do I have an obligation to appoint a DPO?”
Dyann Mills: (49:11)
If you operate in the tech space, health tech, FinTech, if you’re processing huge volumes of information about individuals, then you have to do your analysis of whether you have an obligation to appoint a DPO. And even if you don’t fall under that mandatory bucket, given the direction of travel, given all these requirements, you perhaps have to seriously think, “Does it make sense to have a specialist individual or team that can advise on what our obligations are?” And also, support us in triaging on some of the gray areas that you mentioned.
John Verry: (49:49)
Right. Two things strike me when I hear you speak. First off, for anyone listening: “if you’re a global organization” doesn’t mean that you’ve got people on seven continents; it means that your products or services are consumed globally. So many of our customers, startups, 10-person, 20-person, 50-person, 100-person organizations, are global, right? Because they’re providing goods and services globally. Especially if you’re a technology company: any Software as a Service vendor is a global organization. So don’t hear Dyann say the term global and think, “Oh, we’re not a global organization, we’re based out of Massachusetts.” No, you are a global organization if your customers are global. So that definition of global is: are you addressing a global market? Correct?
John Verry: (50:41)
Okay. The second thing that I’d be curious about, and where I see the struggles, is, A, the kind of questions I asked you before, about whether that’s explicit consent or not. The other thing is, and I don’t know what the right legal term is, but with the CCPA, the California Attorney General is constantly releasing, I don’t know what they call them, clarifications, pronouncements, these addendums. And these clarifications… the original law is 156 pages, and the clarification is longer than the original law. And it tends to muddy the waters. So as an example, they had, I don’t know, 18 pages on validating an individual or something crazy like that. Validating who an individual is before you service a DSR, right?
John Verry: (51:29)
And what’s the right way to do that? So my understanding would be that my DPO is the person who I’m going to go to, as the technologist, or as the COO, within the organization, and say, “What does this mean? And how do we implement this in our organization in a way that keeps us safe?” Is that fair?
Dyann Mills: (51:50)
Yeah. And I think the DPO is there to advise, is there to advocate on your behalf, and is there to be very pragmatic, right?
Dyann Mills: (51:58)
They understand your business, they understand your services, and they have to help you to operationalize these requirements, as you say, in a way that makes sense for you. Now, often, the DPO is providing that oversight role. So they’re not actually the ones making the decisions around the processing of information. One of the key requirements is that the DPO has to be sufficiently independent in order to be able to properly fulfill their duties and responsibilities. And you mentioned that there is a specific statutory role that the DPO plays. And then there is the slightly more nuanced role around advocacy on behalf of organizations, horizon watching, advising an organization. But ultimately, they’re there to ensure that you meet compliance with your legal obligations. And they have to do so in a way that is free of conflicts of interest.
Dyann Mills: (53:03)
And I use the analogy of a referee, a coach and a player, right? As a player, you can’t be the referee because effectively, there will be, optically, an element of bias; you’ve got two sides playing, and a player on one side cannot be the referee. If you’re a coach, that’s a very, very different role from being a referee, and I would associate a coach with being the privacy office or the chief privacy officer within an organization. You’re looking to see what the strategic opportunities are for the organization, and how you’re able to meet those strategic opportunities around data, but you’re not necessarily acting as a DPO. The DPO effectively is the referee. The DPO needs to blow the whistle to say, “Well, actually, no, there’s a red line here. This is something that would be contrary to your obligations, and it’s a no go.”
Dyann Mills: (54:05)
Now, as an organization, you can accept that position, or you can challenge it. You can say, “Well, we’re going to take a risk-based approach.” Provided that you document that, and you’re able to demonstrate the considerations that took place in arriving at that position, fine. And really, there are opportunities for a DPO to report to a supervisory authority. [inaudible 00:54:33] is very strictly laid down, and it would be quite rare, actually, that you will find a DPO not coming to some form of agreement with the organization for whom they’re acting as a DPO. So you can record your… And this is where processes and, if you like, documentation like your Data Protection Impact Assessments are really helpful. You set out the processing that you will be engaged in, you set out the risks associated with that, you set out the DPO’s advice, and you can either choose to follow it or not. But that’s all very clear, very transparent, documented. And that gives you the basis to move forward as an organization.
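The documentation trail Dyann describes (processing, risks, DPO advice, the organization’s decision) could be sketched as a simple record. The function and field names here are hypothetical, just to make the shape of the paper trail concrete:

```python
from datetime import date

def record_dpia_decision(processing, risks, dpo_advice, decision, rationale):
    """Return a transparent, documented record of a DPIA outcome.

    The point of the process described above is not that the DPO's advice
    must always be followed, but that the processing, the risks, the advice,
    and the organization's decision are all written down."""
    assert decision in ("follow advice", "risk accepted")
    return {
        "date": date.today().isoformat(),
        "processing": processing,
        "risks": risks,
        "dpo_advice": dpo_advice,
        "decision": decision,
        "rationale": rationale,
    }

# A made-up example record
entry = record_dpia_decision(
    processing="share applicant CVs with an offshore screening vendor",
    risks=["transfer outside the EEA", "no deletion clause in vendor contract"],
    dpo_advice="do not proceed until transfer safeguards are in place",
    decision="follow advice",
    rationale="safeguards to be agreed before go-live",
)
```

Whether the decision is “follow advice” or a documented risk acceptance, what matters is that the considerations are recorded and can be produced later.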
John Verry: (55:21)
Yeah. I never really thought about the value proposition of having a DPO during that DPIA. For anyone listening, that’s a Data Protection Impact Assessment, which is what we’re referring to. And if you’re not a privacy person, it’s the equivalent of a risk assessment that’s focused on personal information. Information security people do risk assessments; we don’t do DPIAs. But it’s basically a risk assessment for personal information.
Dyann Mills: (55:48)
It’s actually the same thing, yeah.
John Verry: (55:48)
Yeah. It’s essentially the same thing. And as you might know, we do a lot of work in ISO 27001, and in ISO 27001 we have the same exact concept, which is really interesting to me. We have this concept of: you conduct a risk assessment. ISO says, “You should have these 114 controls, and they should be implemented in accordance with the risk and context.” And you can say, “I don’t need that one.” An auditor might disagree with you, but if you’ve documented it in your risk assessment fairly and reasonably, you followed the right process, and you’ve got the evidence to support it, then you should be okay.
John Verry: (56:24)
So it sounds like it’s the same situation here. And the DPO would have the added value of being able to… if you’re going to accept a risk, a knowledgeable individual would look at your decision and say, “That’s not consistent with prevailing wisdom or good practice.” The DPO would be that person. So I can come to you with my DPIA and say, “Here’s what we’re going to do.” And you would say, “Check, check, check, check. You know what? Not having locks on the door and accepting that risk? I know that would save us from having to spend money on a locksmith, but probably not a good idea. And if you did have an action by a DPA, we’re going to end up in the hoosegow for that one,” right? Hoosegow, by the way, is a US term for jail. For you English people, I figured I’d translate.
John Verry: (57:17)
So that’s another value proposition, right? I would imagine you’d be very, very helpful while we’re reviewing a DPIA.
Dyann Mills: (57:24)
Absolutely. And actually, there is an obligation to consult the DPO and seek their opinion when completing a DPIA, because you have to involve them in decisions around processing of data where there is a risk to the rights of individuals. So ultimately, DPOs must and do input into the DPIA process.
John Verry: (57:47)
And this would probably be a reasonable time to just say something. Dyann and HewardMills are starting to help Pivot Point work with our clients, because what we’re increasingly finding is that some of these challenging legal questions come up. We found that there’s a good delineation: Dyann and her team are legal experts and can offer those opinions and provide that legal advice and real-world experience. We’re coming from the opposite direction, as information security people. We know how to implement the information security controls appropriately to achieve the privacy requirements. So, if you like, Dyann is offering a virtual data privacy officer service, and she’s working with a number of our clients for that reason. We think she’s great.
Dyann Mills: (58:29)
Thank you, John. One thing I would say is that as a DPO, obviously, we provide regulatory and compliance advice as opposed to legal advice. But we work side by side, as you say, with you guys to ensure that there is robust interpretation of the legal obligations that organizations have.
John Verry: (58:51)
I’m going to use that term. No, seriously, I think that was well said. Robust interpretation of the legal side, because that’s really where we struggle; we’re not lawyers. We can look at the regulation and say, “Well, the way we interpret it is this,” but the way a non-lawyer and a lawyer interpret legal language is very different. For anyone listening, you do not want your IT people, or infosec people, interpreting legal language.
Dyann Mills: (59:18)
Yeah. And I think it’s about the complementary skill sets, isn’t it? It’s knowing what we each do well, and working together to achieve a good result for our clients. We certainly very much value your technical expertise and your IT experience when we come to dealing with those sorts of issues. And in particular, we talked about it, didn’t we? In the breach situation, where you need both the interpretation, the understanding of when a breach is reportable or not, and also some of the security expertise in looking at reports or drafting reports. You need that very technical expertise. So it is, in my view, a very complementary skill set, and both are needed around the table. Especially [crosstalk 01:00:06].
John Verry: (01:00:06)
And you can’t find both in the same person, right?
John Verry: (01:00:12)
It’s just not possible, because you spend 24/7, right, staying on top of your side of the equation, and we spend 24/7 on ours. You can’t do both. Anyway, enough said. We’ve beaten the hell out of that one. So hold up your glass. I just want to make sure you’ve had enough of that Prosecco [crosstalk 01:00:33].
Dyann Mills: (01:00:32)
Oh, I’ve had plenty.
John Verry: (01:00:33)
Is it done? [crosstalk 01:00:34]. That’s not enough. Should I ask the Brexit question now? Because I know it gets you upset. Or should I let you finish the glass?
Dyann Mills: (01:00:41)
I’m definitely going to take a sip.
John Verry: (01:00:45)
So here’s one of the things that I’m wondering about. So if Brexit happens… and you guys are back negotiating with them, aren’t you?
Dyann Mills: (01:00:52)
When it happens. We’ve literally got till the end of this month.
John Verry: (01:00:55)
It’s “when”? I thought I saw in some of the conversations that there was maybe a last-minute reprieve. But if Brexit happens, and assuming Brexit happens, you’re no longer under GDPR. What would that mean? Does that mean that if organizations are working with citizens of Britain, we’re going to have another regulation that we have to deal with?
Dyann Mills: (01:01:18)
Yeah. When we fall out of the European Union, I think we need to just deal with the realities of what we’re facing.
John Verry: (01:01:27)
I was trying to be nice to you. I know you get upset when we talk about it.
Dyann Mills: (01:01:31)
So obviously, we don’t know what the extent of the deal that’s going to be struck with Europe will be. And between now and when that’s clarified, everyone’s speculating. There are all sorts of possibilities, from a very hard Brexit to a softer Brexit, and those hard negotiations are happening now. But ultimately, as the UK falls out of the European Union, it will no longer be subject to the European GDPR. What the UK Government has is a provision that is the UK’s own version of the GDPR.
John Verry: (01:02:17)
Please tell me you have photocopied it and it’s identical.
Dyann Mills: (01:02:22)
It’s pretty much a copy and paste. And the good news is that the UK certainly will recognize Europe as being a safe region to transfer data to, but that’s not necessarily going to be reciprocated. We’re watching very closely to understand whether the UK will receive what’s called an adequacy status from Europe, and will be deemed a safe country to transfer European data to. Those are essentially some of the moving parts that we’re having to wrestle with at the moment, alongside other requirements.
Dyann Mills: (01:03:07)
For example, if an organization is targeting or operating in the UK, is there a need to appoint an Article 27 representative? Do they have an establishment in the UK? And there are the same considerations for those organizations that don’t have a footprint in Europe, for example, but are offering goods and services and targeting individuals in Europe. So there’s this consideration of: are they obliged to appoint an Article 27 representative?
Dyann Mills: (01:03:39)
So essentially, as we were discussing earlier, yes, there are global requirements, there are regional requirements, but you also have to be cognizant of the local and emerging local requirements. And as of 1st January, we’ll have a clearer idea of what the requirements are for the UK and for meeting compliance with the UK’s data protection regime.
John Verry: (01:04:04)
Okay, so you realize I’m from America, and we don’t really care what happens to anyone but us. So if it’s tough on you guys in Britain, you made your own bed, lie in it. So let’s talk about what matters to us.
John Verry: (01:04:20)
So you talked about this adequacy. And I think it’d be fair to say that the way many US companies tried to prove that adequacy was the Privacy Shield, which has been invalidated, or at a minimum has disappeared. So what are the implications for US companies of the fact that Privacy Shield no longer exists? And then I’d like to follow that question with: we’re fans of the ISO 27701 standard, if you’re familiar with that. We’re excited that we just got our first couple of customers ISO 27701 certified, and we’re viewing ISO 27701, in the absence of the Privacy Shield at this point in time, as something where you can have an independent, objective validation of your privacy program to prove some level of adequacy. So could you opine on what that invalidation means, and whether or not you’re a believer in 27701 being somewhat of an alternative?
Dyann Mills: (01:05:19)
Yeah. So I guess one of the main implications of the Privacy Shield being invalidated is just a huge amount of uncertainty around transfers of data to the US and what alternative mechanisms can be put in place to legitimize the transfer of data to the US. So I think a lot of organizations have been doing a lot of work to understand their data flows. It goes back to this whole idea of data mapping and having a good record of processing activities, a good ROPA, in place, and then also understanding the risks associated with the transfer and what alternatives can be used.
Dyann Mills: (01:06:00)
So, the standard contractual clauses are mainly the go-to mechanism, the alternative mechanism used to legitimize transfers. And the European Data Protection Board, the body that provides guidance and further instructions, has issued details of what other additional measures organizations need to consider and put in place when transferring data to third countries, including the US. So we’ve got all that information, and that’s being worked through.
Dyann Mills: (01:06:33)
I think, ultimately, what you are pointing to is being able to demonstrate that you have certain standards in place, that are applicable globally, that demonstrate you to be a good organization, and one that can be trusted in handling information. So I think that’s essentially what the ISOs do. And so yes, of course, as a DPO, I’m a strong believer in meeting those standards and international standards.
John Verry: (01:07:05)
That is what I was hoping to hear. Because if you think about it, and you probably struggle with the same thing, you have clients calling up, and they’re saying, “Hey, I’m getting these DPAs, Privacy Shield is gone. How do I prove to somebody, short of me saying, ‘Hey, I’m doing a great job, trust me,’ that we’re doing the right thing?” And to me, third-party attestation, an independent, objective third party opining on the adequacy of your controls, is really the only way to do that. There’s really no other way. I guess you could do it with the SOC 2 with their privacy principle, which I think is also a solid way of doing it. Although I think that privacy principle is a little bit older, where the ISO standard is brand new, and the ISO standard directly maps… I think it used GDPR as the basis, right? Because it covers all of those key tenets that you laid out prior, right?
John Verry: (01:08:00)
It covers transparency, it covers consent, and it covers that whole flow from data map to DPIA to DSR on down, right?
Dyann Mills: (01:08:11)
Absolutely. And I think, look, frameworks are important. We have some clients who still subscribe to the Shield, not that they're relying on the Shield to validate their data transfers, but they believe in the sort of rigor and the framework that the Shield provides to them. So they want to be able to demonstrate that they're still meeting those standards. So standards are important. And the good thing about the ISO, as you say, is it's very relevant, it incorporates the GDPR requirements, and it's global. And there are others too. So one of the other alternative transfer mechanisms is Binding Corporate Rules.
John Verry: (01:08:52)
Oh, I didn’t know about that. What does that mean?
Dyann Mills: (01:08:54)
So that's a standard that organizations can use to demonstrate that their privacy programs are robust. And it's a standard that is reviewed by the regulator. So not only does it allow you to transfer data freely within your corporate group, but you have to demonstrate that you actually have a robust program. So you have to have good infrastructure and good policies in place, you've got regular audits, you're able to demonstrate that you've got the right notices in place, you've got the right procedures in place, and that's validated by the European regulators. So it gives you that security that you've actually met a very high standard and met the requirements of the European regulators.
Dyann Mills: (01:09:45)
The challenge around it is the time and the resource investment required. But increasingly, a bit like ISO and other standards, it's seen as something to definitely aspire to and look to secure, especially if you've done a lot of work and have invested heavily in implementing a robust GDPR program. A lot of the requirements that you would be expected to meet for Binding Corporate Rules, for BCR, will be met if you've had a very robust GDPR program.
John Verry: (01:10:19)
Is that BCR a third party audit, or is it a self-attestation?
Dyann Mills: (01:10:25)
So it’s reviewed and approved by the authority. [crosstalk 01:10:30]?
John Verry: (01:10:29)
So it’s third party?
John Verry: (01:10:32)
Cool. Does BCR only exist in the EU, or could a US company go BCR as well?
Dyann Mills: (01:10:37)
Yes, so it's used for global companies. So what it allows you to do is to transfer European data freely within your group. So if you're, say, a US-headquartered organization, but you have operations across Europe, and you want a global, externally approved standard, Binding Corporate Rules is something that you could potentially look into. Because where you might have a patchwork or matrix of standard contractual clauses that allow you to transfer within your group, that might be quite difficult to maintain, especially when you have new entities or new processes. How are you able to update your standard contractual clauses in a way that gives you confidence that you are meeting compliance? And especially now, where there is so much focus on the substance of your standard contractual clauses, you can't just sign those clauses and put them in a cupboard and expect that that's enough. You're increasingly required to demonstrate how you meet compliance with your standard contractual clauses.
Dyann Mills: (01:11:40)
So with all that additional scrutiny, many organizations are looking at a holistic global standards such as Binding Corporate Rules, that not only allows for the transfer of data globally, but also demonstrates that you have met high standards in terms of your internal policies, your procedures, your audits, your corporate governance, around data protection and privacy. It really does demonstrate you to be best-in-class when it comes to data protection and privacy compliance.
John Verry: (01:12:13)
Thank you for that. Question for you. So one of the things which we’re seeing a lot more is the term Privacy by Design. What does that mean?
Dyann Mills: (01:12:23)
You do have a way of just asking these very [crosstalk 01:12:26] in a very simple way.
John Verry: (01:12:31)
Why do you think I'm the host instead of the guest? It's much easier to be the host, I can tell you that. I don't have to know anything. I can say one pithy little thing, I sound really intelligent, and then I put you in that seat. So welcome to the hot seat.
Dyann Mills: (01:12:48)
Thank you very much. I think it should be simple because if it’s over complex, then actually organizations won’t buy into it, and individuals won’t understand it. To me, Privacy by Design is building privacy into the culture of an organization. And thereby ensuring that privacy is embedded in all new technologies and processes that the organization could be engaged in. And this is ensuring that privacy is the default standard, not something that an individual consumer needs to try and find, and implement. It’s there by default.
Dyann Mills: (01:13:31)
And there is an opportunity for organizations to really consider what it means for them. So you can have all the tools, and all the guidance, and all the technical standards and procedures. But really, in terms of, how does this play out for us? And how can we make this relatively straightforward for our product teams, for our HR and people teams, for our compliance teams? How do we start to build that culture of privacy within our organization? And typically, as we've alluded to, it starts from the top. So the top guys-
Dyann Mills: (01:14:15)
If the top guys believe in security by design, guess what? That becomes embedded within the culture of an organization. If the tone at the top is to believe in and champion Privacy by Design, then that becomes the culture of the organization. And I tell you what? In the age of the pandemic, treating individuals fairly and building trust, concepts like Privacy by Design, and being able to demonstrate that you are a good organization in terms of respecting the rights of individuals, are going to become so critical.
John Verry: (01:14:53)
So I think that's a really good thing to begin our wrap up on, because I think what you just said is so germane to so many of the things that we're seeing happen in the marketplace right now. In the US, we have the Cybersecurity Maturity Model Certification. And really, what is that about? It's about the defense industrial base saying to the primes that you can trust us because we're doing things right.
John Verry: (01:15:19)
And really, what you just said about Privacy by Design is: you can trust us as an organization to do business with, to provide your data to. We're going to do the right thing, we're going to tell you what we're doing, we're going to share that data with you. And so many organizations, like in the defense industrial base, when they know it's going to cost them 100 grand plus to achieve conformance, or when we're talking with a client and it's going to cost them 50 or $60,000 to achieve baseline initial conformance with some of the privacy regs, they don't look at it that way. They look at this as money that has no return on investment.
John Verry: (01:15:55)
And realistically, what you just painted was the perfect picture of why it's not. The return on investment is going to be that if you're not doing this... We're at a point now where almost no one is trustworthy from a privacy perspective, very few companies, right? And we'll get to a place where a fair number are, and if you get behind that curve, there's going to be a financial implication. Yes, there's a financial implication to getting there, but there'll be a financial implication that's bigger than that to not getting there, right? So in the defense industrial base with the CMMC, if you do not get certified, okay, you can no longer do business. And privacy is probably going to sit on the same arc, right? People are not going to give you their data.
Dyann Mills: (01:16:37)
You just won't get it, and you'll be sued. You'll have action, and individuals are going to be armed and, rightly, are going to be able to pursue their rights where those rights have been infringed.
John Verry: (01:16:53)
Yeah. So this has been awesome. Thank you. I’m looking over the list of things that you and I had chatted about prior. Yes, I do prepare for this, people. You don’t think I just show up and [crosstalk 01:17:06].
Dyann Mills: (01:17:05)
You’re making this so effortless.
John Verry: (01:17:10)
Exactly. So anything we missed, or anything that you think we should kind of just hit on before we wrap up?
Dyann Mills: (01:17:15)
So you asked about, and I don’t know whether this is part of the wrap up of like future topics to-
John Verry: (01:17:22)
Oh, I’m going to get to those. I’m going to-
Dyann Mills: (01:17:23)
All right. Okay.
John Verry: (01:17:24)
You're not getting off the hook on the DPO question. I just meant, is there anything in our main block of stuff we wanted to talk about that we haven't gotten to yet? I think we beat it up pretty good, to be honest with you.
Dyann Mills: (01:17:35)
Yeah. No, I think we did well with Privacy by Design, Brexit, cyber security issues, definitions. We covered a lot tonight.
John Verry: (01:17:45)
We did. I think you did well. Not me, but that's another story, guys. It's all right. [crosstalk 01:17:49]. Night for you, day for me, although it's rapidly becoming night here, too. So did you do your homework? Because I'm going to ask you the question, and I hope you did your homework. So what fictional character or real person do you think would make an amazing-
John Verry: (01:18:03)
Oh, she didn’t do her homework folks.
John Verry: (01:18:05)
[crosstalk 01:18:05] an amazing, or a horrible, DPO, and why?
Dyann Mills: (01:18:08)
Okay. No, I thought about this-
John Verry: (01:18:09)
I thought you’d say Boris Johnson. If Boris Johnson’s your answer, we’re going to strike it from the record.
Dyann Mills: (01:18:16)
I’ve gone for a lady, of course because I’m a champion of women too. And so my DPO of choice would inevitably be Michelle Obama.
John Verry: (01:18:26)
Wow, very cool. Why?
Dyann Mills: (01:18:28)
I think she's got all the right attributes. She understands the law. Obviously, she has the legal training. She is a very effective communicator. I think she's very clear, full of integrity, and she's fun. You want your DPO to be someone that you want to spend time with.
John Verry: (01:18:49)
Listen, I don't think you can underrate fun for information security and IT, because so many of the people in this field are dreadful to deal with, right? And it's serious stuff, but you're right. So I love the fun element. That was a good answer. Well done. You did prepare for it.
John Verry: (01:19:08)
Anything else? So you chat every day about privacy. Is there anything else that you think would be an interesting topic for another episode that we didn’t cover in today’s episode?
Dyann Mills: (01:19:17)
So we touched upon the protection of children, right?
Dyann Mills: (01:19:23)
And I think that it lends itself to a longer discussion, right? Because this is an area where, obviously, we all care greatly about the future, and just the risks that our children are potentially being exposed to right now, and how pervasive technology is in their lives. Increasingly so, actually, in light of the pandemic. So they have online classes, they've got technologies that they have with them continuously.
John Verry: (01:19:58)
Yeah, it's dreadful. TikTok. My daughter is a brilliant young lady. So by the way, I'm a huge advocate of STEM, women in Science, Technology, Engineering and Math. My wife is an engineer, my daughter is going to be an engineer. So I'm a huge advocate and supporter of that, and I applaud you for the Michelle Obama pick. And it scares the hell out of me to know that we've got 12-, 13-, 14-, 15-year-olds... I think back to all the stupid things that I did, and if that stuff was sitting somewhere as a matter of public record, I wouldn't be in the seat that I'm sitting in now.
John Verry: (01:20:34)
I did some absolutely stupid things.
Dyann Mills: (01:20:36)
We all have. Let’s find a human being that hasn’t made a mistake. But the fact that this is now being tracked as part of their digital footprint is frightening, and we need to do something about it.
John Verry: (01:20:46)
Right, so that's actually really interesting. Now, I'm just curious as to your thought process on that. Is it a generic conversation on that? Or is it how it impacts business? Or should it be how it impacts some form of public policy? I'm curious as to where you're going with that, because I think it's a really interesting topic. But where are you going with it?
Dyann Mills: (01:21:06)
So I think policy is obviously a very important aspect, and we need to influence that and shape it and quickly, right?
Dyann Mills: (01:21:14)
Because these technologies are out there, and they're all-pervasive. But I think businesses also have a responsibility, in the same way as being respectful of any category of individuals that are vulnerable. [crosstalk 01:21:26]. And to me, children are vulnerable.
John Verry: (01:21:28)
Well said. That's incredibly well said. So I was just going to say the same thing: we can't treat privacy for that class of people the same as we treat privacy for adults, right? Because like you said, "They're more vulnerable." And maybe there are other... It'd be interesting, I never really thought about this, but there might be other vulnerable subpopulations, right? Children being a subpopulation of the broader thing. I know about it just because some of our clients have it as a reg, and I know a little bit about it. But do you have the equivalent of the US law, COPPA, the Children's Online Privacy Protection Act, in the GDPR?
Dyann Mills: (01:22:07)
So effectively, the GDPR does also cover protection of vulnerable individuals, a category children fall into. So it's a sort of umbrella framework, but the supervisory authorities have also issued guidance specifically around the protection of children, because they're considered to be vulnerable individuals. So yes, GDPR is the all-encompassing framework, but there are also specific provisions by supervisory authorities on children.
John Verry: (01:22:41)
I like that. Thank you. Maybe I can convince you in six months or a year to come back on, and have that conversation.
Dyann Mills: (01:22:46)
Well, I think it’s a deal.
John Verry: (01:22:48)
Great, it sounds good to me. So last question before we say goodbye. If anyone listening is convinced that they want to know who Dyann Heward Mills is and what your organization does, how would they get in contact with you if they want to chat?
Dyann Mills: (01:23:03)
So I’m on LinkedIn so they can reach out to me on LinkedIn or email@example.com. So just drop an email, and we’ll definitely respond and connect that way too.
John Verry: (01:23:16)
Excellent. So thank you so much. I know it's late there by this point. What time is it there? 9:30? 10:30 at night?
Dyann Mills: (01:23:23)
Yes, it’s just gone 9:30.
John Verry: (01:23:24)
Yeah. So basically, you’re ready to head out on the town. So I really appreciate you especially on a Friday night at 9:30, wasting your time with me. I appreciate it so much. So thanks for coming on, Dyann.
Dyann Mills: (01:23:38)
I’ve really enjoyed it. Thank you so much. And it’s been a very positive experience as I expected it to be. And I’ve loved the topics that we’ve covered. So you’ve been a great host, John. Always good to talk to you.
You’ve been listening to The Virtual CISO Podcast. As you probably figured out, we really enjoy information security. So if there’s a question we haven’t yet answered, or you need some help, you can reach us at firstname.lastname@example.org. And to ensure you never miss an episode, subscribe to the show in your favorite podcast player. Until next time, let’s be careful out there.