Beyond the Checkbox: Privacy Engineering and the Fight for Data Trust with Kim Wuyts
- Mar 12
- 17 min read

Is privacy becoming a luxury of the past? In this episode of Signal Shift, Raakhee sits down with Kim Wuyts, a world-leading expert in privacy engineering and the creator of the LINDDUN threat modeling framework. From smart glasses that unknowingly record intimate moments to robot vacuums capturing family photos, our most private spaces are increasingly under surveillance.
Kim explains why privacy is no longer just a "legal checkbox" but an engineering challenge that must be embedded into the very architecture of our digital world. We explore the concept of Privacy by Design, the role of AI in amplifying data risks, and how "gamification" is helping developers understand the "cringe" of privacy violations. Kim also provides essential "cyber hygiene" tips for the average consumer to help protect their digital footprint.
Where to Find More Information:
"Context and Cringe" - coming soon to https://cybersecgames.com/
The Current Landscape: Privacy Violations in Safe Spaces
Kim highlights that many privacy violations occur through devices designed to make our lives easier, often in places we consider most private.
Smart Glasses and Wearables: A recent story involving Meta AI glasses revealed that recordings of credit card numbers and intimate moments were being outsourced to a third-party company in Kenya for processing.
In-Home Robotics: Robot vacuum cleaners (like Roomba) and smart cars (like Tesla) have had instances where employees or contractors could view sensitive camera images from inside people’s homes or vehicles.
LLM Profiling: Large Language Models (LLMs) like ChatGPT and Gemini are now effectively building complete personal profiles by connecting context from various chats over long periods of time.
Privacy Engineering and "Privacy by Design"
Kim advocates for moving beyond "checkbox compliance"—where privacy is merely a legal hurdle—toward privacy engineering.
Definition: Privacy engineering embeds privacy concepts directly into a product's design, architecture, and code, rather than treating it as an afterthought.
Privacy by Design: This paradigm, coined in the mid-90s, requires organizations to include privacy considerations at the very start (the ideation phase) of any product or process.
The LINDDUN Framework: Created by Dr. Wuyts, LINDDUN is a threat modeling framework that helps engineers identify specific threats like Linking, Identifying, Non-repudiation, Detecting, Disclosure of Information, Unawareness, and Non-compliance.
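The LINDDUN categories above can be made concrete with a small, hypothetical sketch: a toy rule set that flags which categories a given data flow might raise. The trigger rules and field names here are illustrative assumptions, not the official LINDDUN method, which works through detailed threat trees.

```python
# Toy LINDDUN-style threat elicitation (illustrative only, not the official tooling).
# Each rule below is a simplified assumption about what triggers a threat category.

def elicit_threats(flow: dict) -> list[str]:
    """Return LINDDUN categories plausibly raised by a described data flow."""
    threats = []
    if flow.get("has_identifier"):          # stable IDs allow correlation of records
        threats += ["Linking", "Identifying"]
    if flow.get("signed"):                  # signatures prevent plausible deniability
        threats.append("Non-repudiation")
    if flow.get("observable"):              # traffic observable by outsiders
        threats.append("Detecting")
    if not flow.get("encrypted"):           # plaintext data can leak in transit
        threats.append("Disclosure of information")
    if not flow.get("user_notified"):       # subject is unaware of the collection
        threats.append("Unawareness")
    if not flow.get("retention_policy"):    # undefined retention is a compliance risk
        threats.append("Non-compliance")
    return threats

# Example: an unencrypted flow carrying a stable user ID, without notice or retention rules.
flow = {"has_identifier": True, "encrypted": False, "user_notified": False}
print(elicit_threats(flow))
```

In real LINDDUN analysis each category is refined through its threat tree against a data flow diagram of the system; this sketch only shows the shape of the reasoning.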
Gamifying Privacy
To make academic concepts more accessible to developers and non-technical stakeholders, Kim has turned privacy engineering into games.
LINDDUN Card Game: A gamified version of her framework used to reason about system threats.
"Context and Cringe": A new game she co-created to help people feel the visceral impact of data misuse. Players pair sensitive data (like sexual preferences) with apps (like dating apps) to see how changing context can make a situation "cringe" or inappropriate.
Basic Privacy Hygiene for Consumers
While organizations hold the most responsibility for back-end data management, Kim offers several tips for individuals to protect their digital footprint:
App Permissions: Be wary of "red flags," such as a weather app requesting access to your photos or constant location updates.
Data Minimization: Do not fill out non-required fields in forms and use specialized email addresses for newsletters.
Clean Up: Delete apps you no longer use, as they may continue to track and send data in the background.
Tools: Depending on your comfort level, use privacy-respecting browsers, VPNs, ad/tracker blockers, or even a Tor browser for maximum protection.
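The "red flag" heuristic from the permissions tip can be sketched as a tiny check: compare what an app requests against what its category plausibly needs. The category baselines and permission names are illustrative assumptions, not any platform's real permission model.

```python
# Hypothetical "red flag" check: permissions requested beyond what an app's
# category plausibly needs (baselines below are made-up for illustration).

EXPECTED = {
    "weather": {"coarse_location", "network"},
    "camera": {"camera", "photos", "network"},
}

def red_flags(category: str, requested: set[str]) -> set[str]:
    """Return requested permissions that exceed the category's baseline."""
    return requested - EXPECTED.get(category, set())

# A weather app asking for photos and contacts is the kind of mismatch to question.
print(red_flags("weather", {"coarse_location", "photos", "contacts"}))
```

On a real device the requested permissions would come from the platform's app settings screen; the point is only that a category/permission mismatch is worth questioning.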
*Disclaimer: The text in this post is AI-generated from an original video podcast - applicable data sources, references and/or the episode transcript are provided below.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Episode Transcript:
Raakhee: (00:00)
Hello and welcome to Signal Shift with me, Raakhee. We've spoken about data privacy in various contexts before, including a very illuminating episode about the privacy we sign away when we purchase cars and the kind of tracking our vehicles are legally able to conduct on us. Facial monitoring is becoming so much more prevalent, and it's making life easier, but we also hear stories like the mistaken identity at Sainsbury's in the UK, where somebody who was incorrectly profiled was humiliated, right?
And it's not the first of such stories. There are many such stories we're hearing. And then, of course, there are things like LLMs building more complete profiles on us. You know, ChatGPT and Gemini will now take context from other chats you've had with them and bring it into conversations much more effectively. In fact, better than certain humans do. So, for example, I may have discussed fashion and food in completely different chats with ChatGPT, and it will now take that context and bring it in when I'm planning a holiday, even if it's months later.
And it's doing that pretty effectively. So we are at a really interesting time in terms of privacy, right? And the line between public and private almost feels like it's at its thinnest, or we're at a deciding point. How far do we go? How far do we want to go or not?
So today we are speaking to an exceptional guest, Dr. Kim Wuyts. Kim is a world-leading privacy engineering expert, and she's the creator of LINDDUN, a very popular privacy threat modeling framework that she developed while she was a researcher at KU Leuven.
She is currently Manager of Cyber and Privacy at PwC Belgium. Kim's mission is to raise privacy awareness and encourage organizations to embrace privacy engineering best practices. She's a guest lecturer and an experienced speaker at international privacy and security conferences around the world. Kim, a very warm welcome to the podcast. Thank you for your time.
Kim: (02:14)
Thank you. Thank you for inviting me.
Raakhee: (02:16)
I was hoping, could you paint a picture for us of what the current landscape is like? What are the risks to privacy that we actually face today?
Kim: (02:25)
I think we all have kind of a feeling of what privacy violations are: the bad impact of collecting, sharing, storing, and processing personal data, your data, and the impact that has on you as an individual. That is basically what privacy is all about. Just last week there was an interesting story about the Meta AI glasses. They record everything that you do, that you see, that belongs to you. Now there was news that some tagging, some processing of all the images, of all the recordings, was being outsourced to a company in Kenya. And those people leaked that they were seeing credit card numbers, because, well, people look at their credit cards. So they see credit card numbers. They see people on the toilet. They see people in very intimate positions. So everything that you do, that you share with, for instance, smart glasses,
well, it doesn't just stay on the glasses. It doesn't just stay on your phone or your laptop. It's being forwarded, and it's kind of out of your control. And that happens with all the data that is being collected about us. We live in an age where, well, everything is basically in some way or another being collected. So there's a lot that can go wrong and that will feel like a violation of our privacy.
Raakhee: (04:05)
Yes, I heard about that story. I mean, that's pretty, you know, almost unbelievable, right? That these other humans are sifting through that data and have access to that data. And I guess, again, it was in the fine print when people signed up for those glasses, right? Yeah.
Kim: (04:23)
It's not the first story; the same has happened before. You mentioned smart cars: the same happens there, where Tesla employees can see the camera images from people's cars. Same story for Roomba, the robot vacuum cleaner that was taking pictures, where, again, outsourced workers could see kids and people sitting on the toilet as well.
Every smart device that you bring into your home, which feels like a safe place, can become a violation of your privacy, because that's where you feel most at home, safe, and in private.
That's what we as privacy engineers, as data protection professionals, try to minimize. We can't avoid it completely, because then you'd need to live in isolation, basically, you know, put a box around yourself and not talk to people. But you can try to contain some of the risks.
Raakhee: (05:24)
Yeah, yeah. And I think, again, that's where your work becomes so critical at this point in time. This was a quote from one of your posts that I really loved: "Privacy isn't a policy checkbox. It's an engineering challenge, especially now as AI amplifies privacy risks like never before." So can you tell us a little bit about your work? What is privacy engineering? And also this concept called privacy by design.
Kim: (05:52)
Yeah, so let me first start with the first part, checkbox compliance. Especially in Europe with GDPR, privacy and data protection have been mainly applied from a compliance kind of perspective. So we have DPOs, we have lawyers, we have more governance kind of people saying, okay, we need to do this, and we're gonna write you a 50-page report. And then the project owner typically says, we hear what you say, but we're gonna do it anyway.
It doesn't always go as poorly as that, but sometimes that's how it goes. You can still check that box of compliance because we did the analysis, but the whole point of privacy and data protection is to take the systems that are using, that are processing, that are collecting, that are sharing our data, and make those secure but also privacy-respecting. We can clearly not do that by just, you know, doing an analysis and filing it away somewhere in the basement. We want it implemented, embedded in the system.
So just like you have software engineering and security engineering, you also have privacy engineering, where you're gonna put those privacy concepts not just in the overall organizational process, but also embedded into the design, the architecture, the code of your product. So that is privacy engineering. And to me, that is an essential part of making privacy by design, or data protection by design, become reality, because
the privacy by design paradigm was coined in the mid-nineties by Ann Cavoukian, the Ontarian privacy commissioner. And it's also part of, for instance, GDPR, which says, okay, you need to include privacy by design: starting at the ideation phase of a project, product, or process, and making sure that privacy remains in there in each step of that development lifecycle.
We want to bring it in there, so privacy engineering becomes the enabler that makes privacy by design actually possible.
Raakhee: (08:13)
I guess inherent in that is that we're saying that systems can be designed to be more private, to be more secure. We shouldn't be having the kinds of data leaks we are having. We shouldn't be having the kind of data that's getting manipulated and thrown around. Is it fair to say, then, that these are not outcomes we simply have to live with, and that we can make things better?
Kim: (08:39)
Do you want the optimistic or the pessimistic answer there? I think it's kind of somewhere in the middle. It can be way better than it is. But saying we can fully secure something or make it fully privacy-respecting, I think that's an unrealistic ideal image. People who say "my system is 100% secure,"
I don't trust them, because you really don't know. So we can do better, because especially from a privacy perspective, there's so much more work that we can do.
Raakhee: (09:18)
Yeah, yeah, exactly, exactly. And that's, you know, I think that's a fair answer. And jumping then into LINDDUN, I mean, what is that? What is a threat modeling framework? How does it work? How do you use it?
Kim: (09:30)
What we want to do is make sure that privacy and security come into each of those phases of the development lifecycle. And what is threat modeling, in essence? Threat modeling kind of gives you a structured frame to do that.
Adam Shostack summarized it in four simple questions: What are we working on? What can go wrong? What are we going to do about it? And did we do a good enough job? That's basically the essence of threat modeling. So at the start of a project, in the ideation or architecture phase, we look at what we're working on. The more we know about it, the more detailed, of course, the analysis is going to be. Then we can look at threats.
What can actually go wrong with the thing we're trying to do here? That is where LINDDUN comes in. LINDDUN is inspired by STRIDE, probably the most popular security threat modeling approach, created by Microsoft 25 years ago or so. LINDDUN was inspired by that and is kind of its privacy sibling. It has a very similar way of working, so that also means that if you're familiar with STRIDE, LINDDUN should be easy to adopt as well.
And LINDDUN is an acronym. It stands for Linking, Identifying, Non-repudiation, Detecting, Disclosure of information, Unawareness, and Non-compliance. For each of those categories, LINDDUN has a set of threat trees, and those have threat characteristics that will help you reason: if I have these things in my system, ha, that means I probably have a linking threat or an identifying threat. So I started developing that in 2010, I think, and
my colleagues at the university have taken over and are developing it further. I think they're working on an extension specifically for AI. A couple of years ago, we also decided to create a card game of it, because we noticed that people thought the stuff that was out there, the threat trees, was kind of academic and hard to use. I want people to actually use this, so let's figure out how to make it more practical. So we put everything on cards that can be used as a reference, but also in a more gamified way.
Raakhee: (12:04)
I love the LINDDUN card game that you developed, right? And I read about that, and I read the article that referenced some of your work as well, the Forbes article about gamification in privacy engineering. And I thought it was such a fantastic article. I mean, some of the things mentioned there, and I'm going to read this, but this is so interesting: there was mention of a six-week festival called Disneyland for Privacy, a fraud detection hackathon.
Something called Play Secure, an event where you could explore how play could be used to educate and protect people. There was a privacy musical, an artist gallery, and even a book titled The Hitchhiker's Guide to Privacy Engineering. So there's obviously really cool, creative stuff happening in this sphere as well. And that leads me to maybe a different aspect of our conversation today, but it's really around:
privacy engineering as a career, at this time when, you know, people are re-questioning so many kinds of jobs. This seems like such a burgeoning area, with, I think, such potential for what it can be. So, yeah, tell us a little bit about that, and about gamification in this arena as well.
Kim: (13:17)
I'm going to start with the gamification, because very recently I'm kind of living and breathing gamification of privacy; we're finalizing a new game called Context and Cringe, which I created together with my co-trainer, Avi Douglen. We do a lot of training for privacy, threat modeling, and privacy by design, too. And we noticed that while you can explain concepts to people, it lands best during exercises, when people go "oh!" So we were thinking we want to go for that visceral feel.
And we came up with a way to make people go "oh" in a repeatable, fun way. So we have a card game where you basically create application scenarios with feature, app, and data cards, and you can block opponents and do all kinds of stuff. And that makes people react: you have a very cringy card, like sexual preferences, but then you give it to your opponent and they pair it with a dating app, and it's like, well, of course, that makes sense. So changing context is something that you need to feel, and having that in a gamified form made a lot of sense to us. I see more and more of this gamification being applied in team building and at conferences.
If you go to CyberSec Games, that's a company that basically specializes in selling all kinds of these games for threat modeling and other things.
Career-wise, well, yeah, it's also very interesting. I actually started as a security researcher and moved into privacy, so I'm seeing both; I have friends in both. I think it's very interesting, especially now with AI, where things are moving so fast. We see how AI can help, but also make things worse. We talked about the amplification: before, you had some data hidden in a corner somewhere and it wasn't that big of a deal. But if you use those AI systems, that data hidden in a corner is gonna surface really easily.
And so suddenly, all the things that you weren't doing really well become a problem, because people can easily find it, or AI systems, AI agents, LLMs can easily find it. So it's gonna be really interesting to see how we can secure those AI systems.
I think before it gets better, it's gonna get worse first. But if you see how quickly it evolves, it's a very interesting domain to be part of, and to help make more trustworthy, more secure, more privacy-respecting. And I think we still have our work cut out for us as a community. Yeah, every day there's something new to learn, but you can contribute to the domain. To me, it's absolutely a very fun one to work in.
Raakhee: (16:32)
It's such an incredible time we are living in, right? And your work is really at the center of such a critical aspect of all of that. And the card game, is that something I could go and purchase? It sounds so incredible and so fun.
Kim: (16:48)
It's gonna be released in a couple of weeks, I hope. It can be purchased on the CyberSec Games website.
You don't need to be technical to play it. It's about feeling how systems and combinations of applications and data that might make sense suddenly don't make sense anymore.
Raakhee: (17:09)
It sounds like something up our alley, just like you said, even for somebody outside of that sector, to get the context around some of those questions, right? And your mindset and thinking around some of this, and how we even use our data. Bringing it back then to the average person like myself, the average consumer, I guess there are two sorts of questions there.
The first one is: when we sign up for an app or a website, or download something, we all kind of just hit the tick box, like, yep, I agree to the terms and conditions. None of us really reads them; they're far too lengthy. But is there something we should be looking out for? Should we be checking certain things? Should we be scrolling through the terms? Yeah, what should we be looking out for when we download or sign up for something?
Kim: (17:57)
Yeah, well, of course I read all of them. Like, no, I don't read them either, though I should. You can skim them and look for what data is being collected, how long it's being stored, and who it's being shared with. That's already quite an overhead, I know. When you download an app, it typically gives you some pointers on what data is going to be used, and when you install it, it asks you for permissions.
Those should be red flags: if a weather app is asking for access to your photos, why would it need that? Also, how fine-grained is it? I know that weather apps will want your location, but do they need to know exactly where you are, and how specific does it need to be?
Also, some apps will request new access later on. That should also be a red flag. And when you don't need an app anymore, for instance, you traveled to some other country and needed an app for that, well, delete it when you get back home, because there's no point in keeping it. Maybe it's still sending your location every five minutes to I don't know which country.
So it's a bit like basic cyber hygiene, but for privacy: just thinking, okay, am I okay with having my data shared with this company? Consider also the reputation of the company. Do you know it? Has it been in the news for leaks?
Common sense is currently the best we can do. I know researchers are working on more transparency about privacy issues for specific apps, either before you download or even at runtime, to give you pointers like: are you aware that your weather app is sending your location every five minutes, maybe even to data brokers? It happens more than you realize or want to know.
But it's all so hidden and we're not aware of it. So it needs to change, but it's hard as a consumer to fix it.
Raakhee: (20:15)
And then the second question, exactly what you mentioned: cyber hygiene. The part we do control as average consumers is our digital footprint, what we put out there and how much we say. Any guidance there? What should we be doing, or what shouldn't we be doing?
Kim: (20:34)
Don't fill out the non-required fields. Some required fields are also sometimes ridiculous, but I mean, if you wanna use the service, you basically have to go for it. You can have special email addresses or something for the spam of newsletters and all of that.
If you don't need something anymore, remove it, because, well, we mentioned earlier on that we can't be sure a data breach is never gonna happen. It's always possible.
Practice your own minimization, basically, also towards your friends. Don't just share stuff about your friends. It's like social media; I know it's designed for it. But it's not my job to post a weird picture of my friend being sort of drunk at a party. That's basic stuff you shouldn't do. And there are tools that are designed more specifically for privacy: browsers that are more privacy-respecting, ad and tracker blockers. You can use a VPN if you're on an unprotected network. You can use the Tor browser if you really want your communication protected.
So depending on how far you want to go, there are more tools that you can use. But as a consumer, besides just requesting more privacy, we're kind of stuck with what is offered to us, which is kind of depressing.
Raakhee: (22:19)
There's only so much we can do. But I think, you know, all of us have to take that responsibility, do as much as we can, and take some of it into our own hands.
Kim: (22:29)
Yeah, that's also why I think privacy engineering is such an important part, because we need to push the organizations to implement this in their systems. As an individual, as a user, as a consumer, there's only so much you can do. It's what happens in the back end: how they are processing our information, how they are storing it, how long they are storing it, who they're sharing it with. Basically, once you send your information to an organization, to a product, to a system, it's out of your control.
Raakhee: (22:58)
Yeah, it's scary, I think. But knowing the work you are doing, and so many of your peers and colleagues, definitely instills a bit of hope. Thank you for sharing with us today more about this world of privacy engineering, your work, and the really amazing stuff you're doing. I'm really hopeful that, like you said, things may be a little rough at the start, but we'll learn and we'll get better.
Kim: (23:27)
Basically, we need some really big breaches or really big problems before people will start acting on it. And that's a depressing thought, but let's hope at least then organizations will pick it up and start really embracing it.
Raakhee: (23:42)
Kim, thank you for being here. I really, really appreciate your time. This was a great conversation. Any final thoughts or anything else you want to share?
Kim: (23:52)
I like to talk about privacy because all of us, you know, have that visceral feeling of: I don't want this to happen to me. I think we need to work together as a community to raise that also within organizations, within development teams, to not just go for "let's make money,"
but also make sure that we give value to our users and our consumers, and that they are able to trust us. So we need to have privacy as a first-class citizen in our products. It's not just doing good; it's also bringing value.
Raakhee: (24:30)
Yeah, absolutely. Privacy as a human right. It's critical. Kim, thank you so, so much. I really appreciate you being here. To everybody who's been listening, thank you so much for listening and watching, and we will catch you again next week. Thanks for being here, and bye for now.