TRANSCRIPT | PRIVACY & CONSENT WITH KATIE STEVENS

Identity at the Center is a weekly podcast all about identity security in the context of identity and access management (IAM). With a combined 30+ years of IAM experience, hosts Jim McDonald and Jeff Steadman bring you conversations with news, topics, and guests from the identity management industry.

Male

You’re listening to the Identity at the Center podcast. This is a show that talks about identity and access management and making sure you know who has access to what. Let’s get started.

 

Jeff

Welcome to the Identity at the Center podcast. I’m Jeff, and that’s Jim. Hey, Jim.

 

Jim

I wanted to mention we met a new colleague the other day: Andrew. He pointed out that he went into the Apple podcast search bar and searched for “access management,” and we were the first podcast that was returned. I’d say we are definitely in the top 10 now. My plea to our great, fantastic guests who have been so awesome about reaching out to us and connecting on LinkedIn: Keep doing that. Also, if you think it’s worth it, go out, subscribe to the podcast if you’re not already, leave a five-star review — that would be fantastic. That’s such a shill for the podcast. I don’t like doing that, but at the same time, we want to get the podcast out to more people. I think a lot of people reach out to us and say, “I didn’t even know your podcast existed.” If more people know about it, then I think we’d get more listeners. That’s really about getting the best practices and getting the word out there.

 

Jeff

The only thing that I will correct on that is, I read this morning that Apple is changing their verbiage to no longer be “Subscribe” but instead “Follow” for podcasts, because there is this perception sometimes that a podcast costs money because of the subscription element to it. Follow us on Apple, Spotify and wherever. Listen, share. Obviously, the reviews and stuff certainly help from a search ranking perspective, and things like that.

That’s enough of a commercial for us. There are dozens of people out there who might be interested in that. We should talk about a couple of things that we’ve been mentioning — things like Identiverse 2021. That’s coming up June 21–23. It’s both in person in Denver and virtual. I could say pick your poison, but that’s kind of forward, right?

 

Jim

I’d like to go in person, because I need some new T-shirts, and I’m hoping to get to the vendor hall and get some free T-shirts.

 

Jeff

Yes. Okta, the last couple of years, has had the T-shirt machine, where you can customise it. They learned their lesson after the first couple of years of doing that, where maybe some people — I won’t say who — were submitting text that probably wasn’t appropriate.

 

Jim

I think at the last in-person Gartner, they had hats where you could pick your own patch and where you wanted it placed. I thought that was super cool.

 

Jeff

Yes. I found my “IAM Hero” Okta hat recently. It was a couple of days ago. Maybe at some point, if we ever go on video, that will be a nice background prop.

 

Jim

Yes, absolutely.

 

Jeff

We’ve also got Identity Management Day — that is April 13. That’s something that we’ve been talking about. It’s a cool thing that the Identity Defined Security Alliance has been putting together over at IdentityManagementDay.org. Hopefully, folks will celebrate with us on that day. I got an email from them asking for tips for best practices and things that organisations struggle with. I’m curious to see what they’ll do with that information as they collect it from the folks who are taking part. You can get more information about it, again, at IdentityManagementDay.org, or you can go to our website, where I’ve placed a very handy little click button so that you can go right to their website because, yay, internet.

Enough talk here. I know we want to talk about privacy and consent today. That is a topic that is more and more becoming intertwined with IAM in a whole bunch of different ways. I’ll be honest: It’s not an area that I am super strong with. I’m very happy that we’re able to get one of our experts on here. Her name is Katie Stevens. She’s a director with Protiviti’s Security and Privacy practice. She’s also the lead for privacy as a service. I want to welcome you to the show, Katie.

 

Katie

Thank you, Jeff. Thank you, Jim. I’m very happy to be here.

 

Jeff

Yes, thank you so much for joining us. One of the first questions that we always ask our guests is, how did you get into the infosec space, the IAM space? Obviously, we’re going to talk through security and privacy at some point, too. Take us through your journey and then how you’ve gotten into this. Is it something that you chose, or did it choose you?

 

Katie

I definitely chose identity access management, but privacy came completely accidentally for me. I was exposed to identity access management very early in my career, when I started my first job at Sprint. I joined their software development team, and my job was to be the production gatekeeper. I was responsible for moving code from the test environment to production. As part of that role, I also managed what we used to call the firecall process. I think you’re familiar with that term: It’s when you grant developers temporary access for emergency situations. Back then, we didn’t have tools like CyberArk and had to do all this manually. I remember myself carrying a pager 24/7, manually provisioning access and removing access. It was quite a nightmare, but this is the first time when I learned about segregation of duties and why restriction of access is important. That was when I first became passionate about identity access management.

Just to fast-forward a little bit, my career then took off. I went to JPMorgan Chase, and I spent about eight years there doing identity management, information security and data protection. At that time, I got exposed to privacy laws like the Gramm–Leach–Bliley Act, the GLBA, which controls and governs financial services. I didn’t think that privacy was on my radar early on in my career, but as I started to learn more about the privacy regulations in Europe — my background is, I grew up in Russia, and in Russia, we had very little privacy — I got very interested. When the General Data Protection Regulation in Europe was passed into law in 2016, I really jumped at the opportunity to help operationalise our model and help our clients. Now, I lead our global rollout of the managed privacy services.

 

Jeff

You piqued my interest about growing up in Russia. You have to dive into that briefly. What was that like?

 

Katie

I have lots of stories. I moved to the United States when I was 15. I moved from Russia to Iowa. My parents had a job with Iowa State University. It was a huge culture shock. Russia was very simple, a very simple life: lines everywhere in stores, people wearing very similar clothes. Now that I know what privacy is, I can see that a lot of the messaging that was delivered to us in school, and to the population broadly, was very controlled. Again, this is going from very little privacy to what we have today with GDPR — it’s like night and day.

 

Jeff

Very interesting. I don’t know if we’ve had anyone that has that Russia experience on the show. Thank you for bringing it. You mentioned privacy as a service. I always think that’s an interesting approach. What does that actually mean? What is privacy as a service?

 

Katie

Privacy as a service is a managed service that helps our clients quickly stand up and sustain ongoing privacy operations. What we’ve seen early on, when GDPR first came to light, was that a lot of companies — especially U.S. companies — practiced this wait-and-see approach. They weren’t sure what was going to happen and how the GDPR was going to be enforced globally, so now, those companies are struggling to catch up with the new and enhanced privacy laws. This is when privacy as a managed service really comes in. We can quickly deploy the programme, maintain it, manage it, report to the DPO, and be the arms and legs for the organisation to execute their privacy needs and privacy obligations.

 

Jim

I thought the Russian angle was interesting as well. I think that we’re probably around the same age and grew up in that time frame where you had movies like Top Gun and Rocky IV, where Ivan Drago is the bad guy. It’s interesting that you’ve morphed into the world of privacy. You mentioned there was not much privacy expectation in Russia. I was at a conference five or six years ago, and one of their board members was the former CEO of Sun Microsystems, Scott McNealy. He made the statement “You have zero privacy — get over it.” The folks that are deciding what to do with your data are in the technical world, or the computer world, and your data is now in the system and is being pushed around.

One thing that I find extremely frustrating is when I have to use a website. I have to do online banking, or I have to have an account with my electric utility or something like that. They say, “This is our privacy policy. Do you accept or not accept?” If I do not accept, I cannot use their website. Now, how is that fair? How does that really give me an option? I’m actually going to pose the question at a very simple level, but feel free to freestyle: What is privacy and consent? Because to me, that’s not consent. I’m not really consenting to give my information. I’m not really agreeing to their privacy policy. I’m being forced to. That’s not freedom.

 

Katie

You’re absolutely right, Jim. When we talk about consent, we have to keep in mind that the privacy laws are designed for individuals, and for those individuals to gain better control and transparency over collection and use of their personal data by companies. This means that in most cases, individuals must have a choice to be able to prevent the collection and usage without impacting the service.

Consent is simply a mechanism for how you capture that, but there are two distinct consent types: The first one is the affirmative consent, what we call the opt-in. That’s where you actually have to opt in in order for them to collect the data. The second type is implicit consent. It’s implied. If you accept that privacy notice, it’s implied that you’ve consented. Now, with the General Data Protection Regulation in Europe, the implicit consent is not legal. It does not exist. One of the significant differences is that some of the privacy laws in the world do not require that explicit consent, and the GDPR does. We’re starting to see companies that are global companies go through this transformation effort to obtain consent.

By the way, they can’t just accept a blanket consent. If they’re going to share data with third parties, they have to obtain specific consent. If they’re going to use it for marketing purposes, they have to obtain specific consent. Every purpose requires you to consent, opt in, and then allows you the right to opt out and withdraw your consent, which means they have to go back and delete that information upon you executing that right. You’re absolutely right, Jim. I think we’re going to see this practice evolve — U.S. regulations such as the California Consumer Privacy Act, which is now CPRA, the new Virginia privacy law — I think we’re going to see some more maturity, but we’re not there yet.
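To make the purpose-specific, opt-in model concrete, here is a minimal sketch in Python (hypothetical names, not tied to any particular consent platform) of a per-subject consent record where every purpose needs its own affirmative opt-in, and a withdrawal removes it, leaving the deletion obligation to whatever sits downstream:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical processing purposes; under the opt-in model each needs its own consent.
PURPOSES = {"service_delivery", "marketing", "third_party_sharing"}

@dataclass
class ConsentRecord:
    subject_id: str
    # purpose -> timestamp of the affirmative opt-in (no implicit consent is recorded)
    opted_in: dict = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.opted_in[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose: str) -> bool:
        """Remove the opt-in; the caller then owes deletion of data collected for it."""
        return self.opted_in.pop(purpose, None) is not None

    def allows(self, purpose: str) -> bool:
        return purpose in self.opted_in

# Usage: marketing is only allowed after an explicit opt-in, and not after withdrawal.
record = ConsentRecord(subject_id="subject-123")
record.grant("marketing")
assert record.allows("marketing")
record.withdraw("marketing")
assert not record.allows("marketing")
```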

 

Jeff

You mentioned a few different new laws, at least here in the U.S., that have come up recently. Maybe we should start with the GDPR, because I feel like that is the big one that most people at least have awareness about. They may not have all the details, but they know that the GDPR is a thing. We have to account for it from a usability standpoint but also from a data-collection standpoint. Then, what does that mean for users and being able to manage their data? Let’s start with the GDPR, and then we can talk about California and Virginia and the U.S. What is the GDPR, in a nutshell?

 

Katie

If you think about the GDPR, it is perhaps the most sweeping law that came out in the last 20 years. It really set the standard. The reason for the change: In Europe, very similar to what we’re going through in the United States, each country had a data protection regulation that was slightly different, and it was very difficult to comply, so the European Data Protection Board came together with all the member countries, and they decided to pass this General Data Protection Regulation, which is actually much stricter than what they had before. Let me tell you why.

In Europe, in order for you to collect the data, you must have a legal basis. What this means is, you can’t just go and collect data from individuals. You need to either have a contract with them and have a service, you need to have a regulatory requirement to collect the data, you need to obtain consent or even use legitimate interest. There are all these legal mechanisms that you have to apply before you collect the data. If that data does not rely on a legal basis, that data is illegal and must be deleted. Some of the latest and probably the largest fines that you’ll see in Europe that have been issued related to the GDPR are because data was collected without that legal basis. It becomes very important. Unfortunately, we don’t have that yet in the United States. We’re starting to do purpose limitations, but the formal legal basis is not in place.

The second significant change was the right to be forgotten, and having access to data. If the individual wants to exercise their rights, now they are able to go to companies and request to be deleted. Now, there are some exceptions that apply. I always tell my clients, “If you want to have no privacy requirements, go delete all the data,” so companies can’t just go and delete data right and left. If the data is necessary to finish processing of a transaction, if it’s required to be kept for regulatory reasons, retention policies — there are some exceptions when that deletion does not apply. Another significant requirement: The data protection officer is now required for most organisations in Europe. This concept “Privacy by design and by default” — have you heard that term before?

 

Jeff

Yes. I heard it. Maybe you can explain it for our folks who are listening that aren’t familiar.

 

Katie

Yes. Privacy by design and by default is this practice of building privacy into the design and support of systems. It applies very simply in two areas: Number one, when companies are implementing new technology or changing technology, they have to incorporate privacy up front, very similar to how we now refer to DevSecOps. We had to do that with security many years ago. Privacy is starting to really get embedded into the initial evaluation of technology because if you don’t build privacy in, and you go to production and you can’t fulfill some of those requirements, it’s a significant fine. Companies, number one, have to embed this into their DevOps and SDLC. The second piece about privacy by default is all about data minimisation. How many companies do you know, Jeff and Jim, that delete data upon their retention policy?

 

Jeff

None.

 

Katie

That’s right. I don’t know a lot of those companies. Historically, at all the large corporations that I’ve ever worked for, it was “The more data we have, the better.” That is shifting because now, if you have a retention schedule, you can keep the data for that long, but then, anything beyond the retention schedule has to be deleted. It’s a very significant challenge for organisations: the retention policies are written by legal, and for IT to take that retention policy and operationalise it at the data layer is very difficult.
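As a rough illustration of what operationalising the retention policy at the data layer can look like, here is a minimal sketch, with hypothetical record and schedule shapes, that decides whether a record is past retention while honouring the kinds of exceptions Katie mentions (a legal hold or an in-flight transaction):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Record:
    record_id: str
    category: str                   # e.g. "marketing_lead", "transaction"
    collected_at: datetime
    legal_hold: bool = False        # regulatory / litigation exception
    open_transaction: bool = False  # still needed to finish processing

# Hypothetical retention schedule as legal might write it: category -> days to keep.
RETENTION_DAYS = {"marketing_lead": 365, "transaction": 7 * 365}

def must_delete(rec: Record, now: datetime) -> bool:
    """True when the record is past its retention period and no exception applies."""
    if rec.legal_hold or rec.open_transaction:
        return False
    limit = timedelta(days=RETENTION_DAYS[rec.category])
    return now - rec.collected_at > limit

now = datetime.now(timezone.utc)
stale = Record("r1", "marketing_lead", now - timedelta(days=500))
held = Record("r2", "marketing_lead", now - timedelta(days=500), legal_hold=True)
assert must_delete(stale, now) and not must_delete(held, now)
```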

 

Jeff

There are a couple of things that I find really interesting about the GDPR. The first and foremost, probably the most public one, is the fine structure, or the penalty. It is, as I understand it, a percentage of profit or revenue for the organisation, which can be very substantial if you’re a global organisation. We’ve seen fines in the millions, maybe tens of millions, maybe hundreds of millions, for potential violations and things like that. Do you have any perspective on how that might translate over to some of the U.S. regulations that are coming up?

 

Katie

Yes, and it’s a great question. Even with the GDPR, if you talk to some of the privacy practitioners, we didn’t see those fines come out early on, because the data protection authorities really took their time to evaluate and give companies a chance to fix it. Some of the fines that you see that are coming now, those have been in the works, and they are very large fines. The reason why I’m telling you this is because legal frameworks, they’re tested in court, they’re tested in real life. I think that what we’ve seen enforced in Europe is going to happen here as well. The CCPA actually has no cap on liability. If you have a million records — I think it’s $7,500 per record — essentially, your fines could be significant, but also, a California resident can then take private right of action and sue you for damages. Again, at least in Europe, they put a cap on it. I know it’s a large one, but here in California, it is unlimited liability.

 

Jim

Katie, I think there’s a perception out there that I’ve heard people state over time, which is, if we look at the GDPR and figure out a way to comply with it, we’re good with the CCPA, and we can assume that rolls on to the new Virginia law as well. I know you’re going to talk about those a little bit. Is that a commonsense approach, or is that just too simple?

 

Katie

Again, I’ve worked for JPMorgan Chase and large financial services. That’s the approach they take at a global level. The GDPR is a very good standard to follow. Again, the CCPA is a great example because it started out at maybe 50%, 60% of the GDPR. Now, the CPRA is going to be closer to the GDPR. However, there are some differences. The way that we approach privacy for global organisations, before the NIST privacy framework came out, was to build a model that aligns to the GDPR and then also identifies areas that are special to a certain jurisdiction.

For example, in California, we have requirements related to the sale of data. The GDPR doesn’t have that, so that would be an example where it’s an add-on to the GDPR baseline. Now, when you think about that approach, it becomes very pricey for companies to do that. Depending on your operations in Europe versus the U.S., you may decide to take a different approach. I’ve had clients that when the GDPR first came out, their business was so small in Europe that they actually stopped doing business in Europe because it was too expensive for them to comply.

 

Jeff

Yes, I remember there was the apocalypse, with websites not allowing visitors from certain countries and parts of the world because of that. You talked about California and Virginia laws. There is the CCPA, and then there is now the CDPA. Can you talk about those two? I know you briefly mentioned California. Maybe we can talk in the context of how they are the same and what the key differences are between the two?

 

Katie

Sure. The CDPA — again, it’s very new. It’s a Virginia law that was recently passed. The CDPA applicability is a little bit different than the CCPA. In California, the CCPA applies to any organisation that does business in California and is a for-profit organisation and meets certain thresholds. It applies to organisations not even located in California. It applies to organisations that are located all over the United States, as long as they’re doing business there, which means they bring in a portion of revenue from the state of California.

For Virginia, the applicability is a little bit different. You have to be doing business and registered in the state of Virginia. The scope for the Virginia law is much smaller than the CCPA. That’s one difference. Another difference is, the CCPA is now going to the California Privacy Rights and Enforcement Act, the CPRA, which will be much closer to the GDPR, and that will include data minimisation, which is data retention. It will include privacy by design, and it will include some of these additional controls that are currently not in the CCPA. The Virginia privacy law — again, Jeff, I’ll be honest with you: I’m still trying to get up to speed too. It’s like every week, there’s a new law that’s popping up. My understanding is, the Virginia law is very focused on transparency and consumer rights, very similar to where the CCPA started.

 

Jeff

I think this highlights a challenge that at least in the U.S., some people are facing, especially in the IAM space: We have this patchwork of local laws. There’s nothing at the federal level that governs over all this. It makes it very difficult for organisations to figure out, “What am I supposed to comply with? Who does it apply to? What are we supposed to do about it?”

I feel like this routes back into proper IAM. At the end of the day — and feel free to tear me apart if my thinking is wrong on this, because I’m happy to grow on it — all of these privacy laws are designed around the appropriate use of the data for the user themselves. If you have proper identity and access management controls in place, regardless of the laws that are getting put together, you should be in a pretty good state of compliance. There might be a couple of extra-mile things that you might have to do, but I think of things like SOX — the good old days of Sarbanes-Oxley, it’s like old hat now: Is the access appropriate for the person, with appropriateness liberally defined? The manager says it’s OK. As long as the manager says it was OK, it’s fine. You need to be able to basically prove that the access was granted through the process that was mandated by the organisation. As long as you’re following the basics of IAM (do I have a record of the access, when it was given and who it was given by; if there was an approval process, who approved it; and is that access removed in a timely manner, again subject to perhaps a liberal definition of what timely means), you should be in a relatively good state from a compliance standpoint. Now, I know a lot of organisations struggle with that. That’s why IAM is a thing. Does that thinking of “Let’s at least get proper base controls in place” take me a long way toward meeting some of the compliance requirements, no matter where they come from?
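A minimal sketch of the basics Jeff lists (a record of who got access, when, who granted and approved it, and whether it was removed in a timely manner) might look something like this, with hypothetical field names, purely as an illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class AccessGrant:
    user_id: str
    entitlement: str
    granted_at: datetime
    granted_by: str
    approved_by: Optional[str]               # evidence of the approval process
    removal_due: Optional[datetime] = None   # e.g. termination or role-change date
    revoked_at: Optional[datetime] = None

def audit_findings(grant: AccessGrant, grace: timedelta) -> List[str]:
    """Return the compliance gaps for one grant; an empty list means it holds up."""
    findings = []
    if grant.approved_by is None:
        findings.append("no recorded approval")
    if grant.removal_due is not None:
        if grant.revoked_at is None:
            findings.append("removal was due but access is still active")
        elif grant.revoked_at > grant.removal_due + grace:
            findings.append("removal was not done in a timely manner")
    return findings

grant = AccessGrant("jdoe", "payroll_admin", datetime(2021, 1, 4),
                    granted_by="iga_system", approved_by="manager_42",
                    removal_due=datetime(2021, 3, 1))
print(audit_findings(grant, grace=timedelta(days=7)))
# ['removal was due but access is still active']
```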

 

Katie

Yes. You highlighted two main points. If you think about the scenario you outlined, I would break this out into two different scenarios. One is your employees and access for your employees. For one, employees also have rights, and they can also exercise them. With employees, if you can identify them, which most organisations can, but also restrict their access only to systems that are authorised, that’s a big data protection control, which will help prevent exposure of the data. Now, the second piece that you highlighted, if you think about all the external identities, organisations are good at managing customers and consumers, hopefully. Where we get into a challenge is, think about marketing data. Have you ever had to implement an identity management system on marketing data?

 

Jeff

Briefly, and it sucks.

 

Katie

Or vendors? If your vendors have access to your systems, you know them, but there are potentially vendors that may not have access to your systems and that provide a subprocessor-type service. I think that you’re absolutely right. The identity management controls are foundational. Having those mature and operating at the right level of effectiveness is very important, because the data protection piece of all privacy regulations relies on that foundational control.

For those organisations that manage customer identities, that digital identity of your consumer experience is going to be very important with privacy, because as individuals exercise their rights, they can also use their online accounts and portals where they can start setting those privacy preferences. They can just go in and out and decide what they can and cannot do, so managing those identities becomes very important. The third piece is identifying all the identities for which you’re collecting data but for which you may not have any tools in place to validate them.

 

Jim

Katie, I’d like to bring up a related topic. I’m not sure if it really falls into the space of privacy and consent, but it’s certainly something that a lot of organisations are struggling with, which is data residency and determining when regulations require that they store identity data in-country, within the walls of a certain country. The ones that I see pop up the most recently are Russia and China, where, rather than storing a person’s identifiable information in one database wherever in the world — the U.S., somewhere in Europe — those countries have requirements to keep it local. I know in the past we talked about Germany and France. Primarily, at least from my experience, those came up when I was doing workforce identity implementations and having to keep that data resident within that country’s borders. As just a general question, are you seeing that the regulatory landscape is moving more toward data residency requirements, or away from that? Where do you see that those requirements are the strongest?

 

Katie

Excellent question. Again, we deal with a lot of those questions from our clients. And now, I’m sure you’ve heard that with the Privacy Shield invalidation, it’s becoming even more difficult. Let me first explain two points. Data residency is a practice where a company decides where its data is. With data residency, they decide, “We’re going to keep our data here, and we’re going to use contractual means and obligations to make sure that data doesn’t leave the country.” Now, what you mentioned about China and Russia, those are data localisation laws. The data that originates in that country or region — sometimes, for China, it’s a region — must stay in the same territory. The reason why I’m distinguishing between those two is because with data residency, again, it’s a strategy: Where do you go?

Let me talk about that one first. When we talk about data residency, when you decide where the data is, you want to prevent cross-border data transfers. Just picture this example: Let’s say that — you mentioned Germany. Germany and France, they do have special localisation requirements that are not as strict as China’s. Let’s just play out the scenario that you decided that your data is going to be in Germany, for example. If company engineers are in India and they are accessing that data from India, they’re not physically transferring the data. They’re only accessing that data in Germany. In that scenario, it’s still considered a move, and the residency is broken.

When you think about data residency, you really have to ask yourself, “Am I doing it for compliance reasons? Am I doing it for protection of the data, making sure you know where it goes, or something else?” I always tell my clients that if you can keep your data in Europe, it’s much easier for you to comply. Realistically, with this global footprint and how we’re all accessing data, that virtual transfer happens very easily unless you put a perimeter around your European data center, where nobody can get in and out. That’s one point.
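Katie’s point that a “virtual transfer” can break residency even when the data never physically moves can be checked with something as simple as this sketch (a hypothetical log shape; a real control would sit on actual access and network logs):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AccessEvent:
    user_id: str
    dataset: str
    accessed_from_country: str   # where the user's session is located

# Hypothetical residency rule: dataset -> countries access may come from.
RESIDENCY = {"customer_data_eu": {"DE", "FR"}}

def virtual_transfers(events: List[AccessEvent]) -> List[AccessEvent]:
    """Flag accesses that pull resident data across a border without moving it."""
    return [e for e in events
            if e.dataset in RESIDENCY
            and e.accessed_from_country not in RESIDENCY[e.dataset]]

events = [AccessEvent("jim", "customer_data_eu", "US"),
          AccessEvent("anna", "customer_data_eu", "DE")]
print([e.user_id for e in virtual_transfers(events)])   # ['jim']
```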

The second point: For data localisation laws — and I’ll start with China because it features the most comprehensive list of strict requirements, and I know they are changing slightly — that law defines two operator types: the network operator and the critical infrastructure operator. Those definitions are so broad. Most of the time, they apply to all foreign organisations. With this requirement, all personal information that’s collected in China must be stored on servers located in China. If you want to transfer that data outside of the country, you have to receive government permission, and you have to undergo a security review. It’s a pretty strict requirement. There are some other ones in China where telecommunications providers and internet firms must also provide encryption keys to the Chinese government, and there are some limitations on financial services data.

Now, with Russia, it’s a little bit less strict. Again, with Russia, it’s a pretty new requirement, but, based on their personal data law, all data operators that collect data in Russia must store the first copy of the data in Russia. Then, once you store the initial data in Russia, you can take that and move it to other countries. The Russian government just wants to be able to have access to that information and have it on-premise because it is data collected about Russian citizens. Telecommunications companies and internet service providers have some additional requirements where they also have to work with the government to make sure that the government approves the operations. Other than that, it’s a little bit easier in Russia.

 

Jeff

It’s a pretty complex topic. It makes it challenging for people to be experts in all things. This is probably a good situation to divide and conquer, to pull in the right people when you’re ready to have these conversations with different organisations. Are there any best practices or things that people who are listening can take away to at least start the conversation with the organisation, if they haven’t already, or if they have to keep in mind as they start down this journey?

 

Katie

Yes. To understand your data residency requirements, you have to know what data you already have and where that data currently resides. If your organisation has not done a data inventory and does not understand where your data is, that’s the first step; that’s where I would start. Then, if your organisation has already done that data inventory, the next level is looking at transfers: How is the data transferred? A lot of times, we actually overlook access controls. This is where identity management comes in. In order for us to determine who’s accessing the data, we have to do a recertification of access across all the servers, the servers in Europe, so we can see that Jim and Jeff are located in the U.S. If we look at those access controls, we will quickly identify the data transfers that may be inappropriate or could be shut down to reduce the scope of applicability.
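As a starting point for the data inventory Katie recommends, here is a minimal sketch, with hypothetical fields, of an inventory entry that ties a system to the data it holds, where it is stored, and where it is accessed from, which is enough to surface the transfers worth reviewing:

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class InventoryEntry:
    system: str
    data_categories: List[str]            # e.g. ["name", "email", "payment"]
    stored_in_country: str
    accessed_from_countries: Set[str] = field(default_factory=set)

def cross_border_systems(inventory: List[InventoryEntry]) -> List[str]:
    """Systems whose data is accessed from outside the country where it is stored."""
    return [e.system for e in inventory
            if e.accessed_from_countries - {e.stored_in_country}]

inventory = [
    InventoryEntry("crm", ["name", "email"], "DE", {"DE", "US"}),
    InventoryEntry("billing", ["name", "payment"], "US", {"US"}),
]
print(cross_border_systems(inventory))   # ['crm']
```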

 

Jeff

Yes. Reducing scope is a tactic we’ve seen before — things like PCI and in-scope systems for SOX, financials, those sorts of things. That makes a lot of sense.

I know we’re getting short on time here, but I want to mention that you and I are actually both doing a couple of webinars for Protiviti coming up. They’re very wordy. Mine is Privacy Is Personal: Digital Identity’s Central Role in Consumer Privacy. I’m doing that with my friend Paul. You’re doing one called The Future Is Cloud: Holistic Safeguards to Secure Your Data and Protect Privacy. Hopefully, folks will be able to check those out. I’ll put links into our show notes, because I don’t expect people to remember those names very easily. Real briefly, though, what is your webinar about?

 

Katie

We’re talking about cloud. Data residency will be one of the topics as well. With cloud, how do you manage that, and how does it apply? We’ll talk about the liability. With cloud providers, we need to understand, what is the liability of the cloud provider versus your organisation as a data controller, and do they even have access to the data? We’ll talk about that. Then, we’ll focus on data protection: What are the key security and privacy controls that need to be applied to the cloud as a lot of organisations are making that move to the cloud? Everybody is going to the cloud. Some companies don’t even know what that means, but they’re going to the cloud.

 

Jeff

It’s just a fad. It’ll pass just like the internet, right?

 

Katie

I think so.

 

Jeff

Any final words of wisdom for us, Katie?

 

Katie

I really enjoyed being here. One key takeaway that I want to share is that a lot of times, we start talking about privacy and privacy compliance, compliance, compliance. I think that if organisations take a step back and understand that privacy helps individuals manage boundaries on how they can protect themselves and how they can negotiate who they communicate with, who they interact with and how they communicate with the world, respecting that consumer privacy can really go a long way, because we’re starting to see that approach establishes a very strong trust between the organisation and the consumer. Instead of just focusing 100% on “Am I compliant?” I recommend organisations take a step back and really look at their relationships and try to accommodate those individuals.

 

Jeff

That is great advice. Jim, how about yourself?

 

Jim

I think that’s a great point. That was what I was going to ask Katie about. Privacy is growing in terms of mindshare. From a corporate standpoint, it’s “How do we become compliant?” We have to ask ourselves a question: Why does it take government regulations to guarantee our privacy rights? I think that’s the bigger question that I’ll even throw back to Katie. Do you have any thoughts? Why do we need these regulations in the first place?

 

Katie

I have so many. Jim, I shared with you, I lived in a country where there is very little privacy. In my mind, if you think about it, privacy is a limit on power. The more someone knows about us, the more power they really have over us. Personal data is now used to make very important decisions in our lives. Personal data is used to validate our behaviors — sometimes, to change our decisions. We’ve seen that happen in recent political events. Privacy is so important. If there’s not a rule or regulation, it’s going to be overlooked, and it will not be enforced. Unfortunately, you do have to have a governing body that’s monitoring and policing it, because otherwise, it’s going to be very difficult to have a standard and to enforce the same standards. Again, speaking from personal experience, privacy is all about power.

 

Jim

I think you’re spot on. We’ve seen, unfortunately, instances where people’s private data was abused, and it was done without transparency about how private data was being collected, sold to third parties, and used to influence people’s political decisions. I don’t think anybody feels good about that. I started the podcast with my example: You want to sign up to pay your bills online, and all of a sudden, you have to agree to a privacy policy, whether you want to agree to it or not. You can make the case that it’s an open market, and the open market will dictate that. But the bottom line is, most people don’t even read the privacy policies. They just agree to them and move on. Then, all of a sudden, their data is being sold to a third party. Nobody cares about it until it makes the front page of the news.

Anyway, Jeff, the point that I wanted to make is that privacy, just like our recent show on penetration testing, is not specifically IAM, but we’re trying to bring more of these strongly related topics into the podcast because this is a forum for education and learning. I’m learning as we go, with great guests like Katie sharing this information.

Hopefully, our listeners are getting a lot out of it. We’d love feedback in terms of what other topics you’d like to hear about. I really feel like becoming a strong IAM practitioner requires more than just understanding workflows and access management. It’s understanding how business operates with some of these major issues that go into architecting IAM systems so they meet the needs of the business and meet the needs of society in general — we’re getting into societal stuff today. We’d love that feedback. As always, we invite people to connect with us via LinkedIn. Network with us. Send us messages. Let us know that either you like what we’re talking about or you want to hear some other things. This is your podcast as well, and we want this to be something that people get a lot of value out of.

 

Jeff

I do think it is important to have topics like this, because they are so adjacent to IAM even when there isn’t a direct overlap — although I think there is an overlap in this case. You need to be well-rounded from an identity perspective to be able to understand the perspectives that different parts of the organisation are bringing to the table. You need to do things like privacy by design and secure by design, and take those concepts to be able to drive stakeholder involvement, whoever they may be within the organisation. I think it is important.

I’m very happy, and I’m glad we were able to get you on the show, Katie. It really starts the conversation around privacy. Obviously, there’s only so much we can do in about 45 minutes or an hour, but I think it’s worth starting the conversation and at least beginning to educate and drive awareness toward it so that we can move toward this holistic strategy around making sure that, at the end of the day, the right person has the right access at the right time and for the right reasons. That’s the simple statement of what IAM is, but there are a lot of different meanings to that and a lot of different tie-ins. I certainly appreciate you being part of the show.

We’ll go ahead and wrap it up for this week. Just to remind again, Identity Management Day, April 13. You can hit IdentityManagementDay.org to find out more. You can visit our website at IdentityattheCenter.com. You can also follow us on Twitter @IDACPodcast. With that, we’re going to go ahead and wrap it. Thanks, everyone, for listening. We’ll talk with you all in the next one.

 

Male

Thanks for listening to the Identity at the Center podcast. If you like what you heard, don’t forget to subscribe, and visit us on the web at IdentityattheCenter.com.
