Who does data privacy anyway?

Simon Mulquin
16 min read · Apr 18, 2023

--

Data privacy advocacy and the way we struggle to engage people in practices they don’t like.

This image shows two doors leading to the same building. This will be referred to later in the article.
Photo by Hans Eiskonen on Unsplash

Have you ever heard anyone claiming they don’t care about data privacy?

Well, if you did, you may have ended up in a circle of growth marketing amateurs, and that is not what this article will cover.

What you may have heard are people claiming their “company” doesn’t care about data privacy.

Then you definitely ended up in a forum of frustrated data privacy professionals, and that is actually what I’ll cover here.

🕊️ Making the internet a safer place also means we have to protect ourselves from buying opinions as facts, and from losing our ability to deal with complex ideas when exposed to too many simplifications.
I really believe in what I write here, but it is still no more than an opinion, and I will sometimes simplify or extrapolate ideas to fit a readable format.
Please take care and own your own ideas. 🫴🏽

What is a company that doesn’t care about data privacy?

The answer is actually far from obvious.

You may argue that if they hired a data privacy expert with good references, they probably care.

Well, this would have become common sense a long time ago if the growing population of these experts had kept professional tensions quiet and respected their NDAs.

But there is a limit to any NDA and to how people can keep silent when they don’t feel aligned with their company’s culture. This limit is social exposure or, let's say, entry-level activism.

You know that guy who has had a little too much beer, gets asked about his work at a family dinner, or simply needs a bit of dopamine from writing a post on LinkedIn and getting support and social validation from people who can relate to his situation?

Yes, that specific guy; sometimes me, sometimes you, sometimes the pain in the ass you don’t really know how to deal with. We’ll see how later. 🙂

This guy, and all of us, have stood up against cheap compliance and advocated for rigorous, ethical, civilization-defining data privacy.

And once we did it, there was no possible call for company-level compliance anymore.

We, stubborn self-proclaimed heroes of future generations, took over your company’s dreams of compliance and nipped them in the bud.

The Tower of Babel by Pieter Bruegel — Wikipedia

So I guess you want to say thank you? 🙂

Well, if you do, your company is still not compliant and you won’t escape purgatory so easily; sorry, but we have a mission here to build a civilization that is safe for individuals. 😈

And this is where we need to stop and face our own megalomania. Your company isn’t compliant but this is perfectly okay.

Actually, being compliant as a company was never a smart goal.

Compliance would never have helped you improve your operations, your sales, or your engineering at all, but rather put them under even more pressure. You don’t put a badge on a uniform to improve someone, but to incentivize that person to develop themselves and others.

This is exactly what compliance is from my perspective: if compliance is a badge for the company, it is an incentive for the people in it.

I want to dig a little further into these ideas in the next section using the more or less popular concept of social construct.

I will then come back to the original subject of this article, which is to rethink how we engage people in data privacy practices.

What are companies and compliance?

Let’s just make it simple: a company is not a person; it does not have a unified way of thinking and making decisions.

Compliance is no guarantee either that people have your best interests in mind, or that they are able to protect you from their own goodwill.

The people inside a company can share a vision, processes, documents, and any other organizational artifacts; it will still rely on people’s ability to work together to achieve work of great quality, and on a shared understanding of the company’s vision and values.

This said, there is no such thing as a trustworthy company, nor a compliant one, as a company has no consciousness of its own and therefore doesn’t really exist as an entity able to make decisions by itself.

As social constructs, companies are little more than a bunch of assets of any kind, physical or not, which enable people to work together and often aim to reduce the standard deviation between the work of people sharing the same role in the company.

The role itself is an asset, as it defines your responsibilities and limits, but this asset alone doesn’t prevent you from going beyond those limits or disregarding those responsibilities; it merely assumes that you are actually able to carry them.

Of course, you can reduce the standard deviation even further as your company grows more complex. You will make sure to have good skills management, so that people are able to fulfill their roles and your company never misses the critical skills it relies on.

You will make sure to have strong contracts, processes, fine-grained permissions, and so on…

You’ll end up with a more or less sexy homemade monster in which every single part has been built by individuals making decisions and trying to align other individuals on them, most likely without any written, let alone readable, history.

So how do you make such organized chaos predictably become “compliant” and when do you start doing it?

What can you build on top of that to ensure the Grail of compliance?

This is the question data privacy experts answer in a pretty good way, bringing ever more tools to help companies take care of their customers and employees through the way we design, secure, and sustain our products and services.

The issue is, once you have the tools, you still need people to use them properly.

How a company spreads the practice is a much harder question than how it brings in new tools.

By giving people tons of questionnaires and reports, you pile more meaningless work onto people who may not even understand the purpose of what they are doing, or what harm can come from not doing it properly.

People will influence each other in many ways, and leaders may bring some order to that, but they won’t raise awareness as well as the people you directly live or work with.

You would rather adapt by inspiration, seeing someone use a tool you misused in a way that works a lot better for them, or by integrating what all the other people around you seem to acknowledge as “the good way” or “common sense”.

This is where compliance comes in handy from my perspective, as it does not only assess a company for what it has already achieved and how it plans the near future, but rather defines the company in a way the people inside it can adopt as their “common sense”:

I work for a company that is compliant with this label and therefore meets these requirements, so I have to meet these requirements in my day-to-day work or I’ll fail to do what my company expects

Or

If I want to show my skills and develop my career here, I have to improve how I comply with these requirements

Or

It is legitimate for me to tell my colleagues about the mistakes they make, as we are supposed to comply with this requirement and they actually don’t

Once you get your holy book of compliance, you can really start building a strong infrastructure of values and awareness.

It doesn’t mean you won’t rewrite the holy book, or find a new way of reading it that completely changes its nature; it just means that, for a moment, a technical requirement will come with a social or even economic incentive, as it is now not only “ethical” but “what the holy book says”.

There are plenty of artifacts that can serve a holy book’s purpose, and compliance certificates are surely not the best, but it would be unnecessarily mean to the people who invest in them to say they are only about social privilege and lobbying aimed at engineering new elites based on a company’s offer or a university’s program (yep, this is quite a popular opinion 🙂).

The average against the elite

We often entrust companies with our privacy based on a historical or social perspective, as if there were “Good” and “Bad” companies, subjectively categorizing them using this scale (Bullshit alert ⚠️):

  • Evil: We know they have “Bad” intentions
  • Willful: We know they really try, and they will take responsibility for whatever happens. Sadly, we also know bad things will happen to some extent, and we do them a favor by accepting that risk.
  • Average: We know they have some expertise and it’s unlikely anything bad will happen, but the world is a dangerous place they are not yet ready to enter.
  • Elite: We trust them with our lives; these organizations hire brilliant employees and are led by the most recognized people in the world. They show great aptitude in technical fields, have faced many challenges, and advocate for a better world where we will all be able to live in peace and harmony with sustainable, ethics-driven technology.

This linear scale is natural, but it is just so bad at defining how skilled organizations are at dealing with our personal data in the ways that concern us.

The reason why I listed four levels rather than merging “Willful” and “Average” into one “Average” level is that there is no meaningful middle point on such a scale: it is absolutely fake, and not reliable for measuring anything more than how people perceive your company.

This is more likely a sales or reputation metric than one you could leverage to improve any privacy concern in the company. Still, it may help you get the budget you need. 🤫

The “Evil” companies

So let’s imagine you categorize the GAFAM as “Elite” rather than “Evil”, which I cover later.

These companies are incentivized to keep strong control and cannot be as fully transparent as those I would categorize as “Willful”.

This is because being too transparent while facing a crisis is more likely to make the crisis even bigger; they will think they are more able than you are to protect your interests and, unlike “Willful” companies, they are.

But what if your main interest is to know there may have been a breach? You may rely on speed to mitigate the impact of such a breach, and cannot afford to wait for official communications, or for the incident to be dealt with on the company’s end, before you react.

Some organizations will communicate about a risk as soon as they sense it, while others would rather keep it quiet and analyze it before they accurately report any incident to you.

This is the situation I imagined when I picked the beautiful photo of two doors by Hans Eiskonen on Unsplash:

Let’s say you live in a small apartment, you don’t own anything people would really steal, and you trust the people living in your building.

This apartment is quite strange because it has two doors, one to enter and one to leave.

There is one thing, hidden in the apartment, that is very valuable to you but you hardly ever look at it.

One day, the realtor gives you two options: you may either buy a very secure lock that will keep everyone out, or one that will let anyone enter but announce the time of the last entry.

I don’t know about you, but in that context, I would go for the super fancy speaking lock, for the very simple reason that I don’t mind if some curious people come into my place or even steal from me.

My main concern is actually that they don’t find out about my treasure without me knowing about it, so I wouldn’t go for the super secure lock that may be picked one day, but rather for the fancy one (let’s just assume it can’t be hacked either).

This would actually lead me to entrust the “Willful” company over the “Elite” one, as it would show more suitable behavior in that fictional scenario. (Cause yes, I assume GAFAM are better at access management and observability than most SMBs 🙈)

Big “Evil” companies are by nature the most incentivized organizations when it comes to security, as they run the biggest risk of the world discovering how they manage data and assessing the impact that data can have on the world.

There is a world between conspiracy and pragmatic reality where these organizations exist and do things you would be very unlikely to accept if you didn’t get something great in return, and this is not only the world of GAFAM. It may not even be this world at all. (Unpopular opinion 🤐)

Small organizations are the most incentivized to break the rules, facing a day-to-day bargain between what they do with data to be able to operate, develop, and sustain themselves, and what they think they give in return.

From the owner of a small online business to the volunteers in a nonprofit, including small and mid-sized businesses that put people’s employment and lives at high risk if they fail to make them profitable,

These three innocent little pigs are most likely to behave like big bad wolves when dealing with your data, which will usually fall under “We do our best but we lack the resources to do it well”. This compliance level actually pushes people to believe they lack resources and to focus on something else, producing a non-aware organization that will hardly learn anything about data privacy practices at all.

To be clear, not every small organization is like that, and I am not saying we should blindly entrust GAFAM with any data because they have robust infrastructure and a sustainable organization either.

But it’s important to show how non-linear the scales of privacy quality are, and to highlight their relation to our capitalist, mostly liberal society.

To put it simply, don’t judge a book by its cover.

Learning by failing

While this very meritocratic concept became popular in tech and entrepreneurship for a lot of bad reasons, it seems to struggle to make its way into the data protection discussion.

I really don’t know why, and I don’t think it is worth trying to understand, as we don’t need more mainstream thinking than what we already deal with every day.

Why is it worth mentioning then?

Because I think this is where we fail to understand how to assess whether a company succeeds in growing a culture of data protection and privacy awareness.

While most of our work is abstract to anyone not interested in it, there are things spectacular enough to cause an actual culture change in a company and eventually survive generations of turnover.

This image from the fable “The Three Little Pigs” shows the big bad wolf blowing down the pigs’ house. While they are pushing the door to keep the wolf out, the house flies into the air and only the pigs and the door remain.
The three little pigs — Disney

You can spend hours… Sorry, actual months, teaching people around you about good practices and raising privacy awareness, and you will feel like you are really doing the job but struggle to have any impact.

You may either feel you are not good enough at engaging people, or that they don’t have any drive to make actual data privacy happen. Neither leads you anywhere.

Well, the best thing that could happen to relieve you of that pressure is nothing more than a quite simple, dramatic, probably expected, disaster.

An award-winning drama that people will remember for years, leaving its marks deep in people’s brains and value systems.

You may not feel good about relying on such a thing, and you should definitely not trigger it; you absolutely don’t want to become some kind of privacy messiah or dictator, do you?

You obviously don’t need to trigger such a thing, or to be the person saying “I told you so”; once it happens, people will probably have forgotten you told them anyway.

What do we do then?

I am sure it is not mine to say but, well, you’ve read quite a lot already, so here is my opinion.

These spectacular events have already happened and already marked the people you work with; they were either benevolent accomplices or victims of data leaks or bad management.

Public services and the companies we rely on in day-to-day life, such as banks, energy, internet, and water providers, and even your badminton club or your children’s school, all make a pretty fucked-up use of new technologies, and have still used them enough to forget how to deal with paper organization.

We had great civilization and intelligence before IBM, but that is another story; the stories that matter here are the stories people have to tell.

🕊️ I apologize for all the extrapolations I already wrote in this article; the goal is to make a point rather than insult your ability to understand ideas. Please feel free to release a PowerPoint compilation of your favorite extrapolations and spread it around. 🤐

Privacy is about people’s stories

If spectacular events help to remember, you have to ask people about the events they faced during their life and connect it (in a realistic way) with data privacy practices.

By doing that, you aim to move from a “We meant no harm” culture to what I would call a “Caring” one.

The question “When did we fail to protect people’s privacy in our company?” is great, as many can relate, but it means young organizations won’t have any experience to start building on, and you will delay engaging people.

On the contrary, questions such as “When did a company fail to protect your privacy?”, “When did a company take a decision that had an impact on your life based on inaccurate data?”, and “When did something really bad happen to you because of a process you were not aware of?” will usually find a lot of answers and carry both emotion and spectacle: everything any social being needs to start personally caring about data protection.

Once you get a “Caring” culture you are ready to move to a “Taking care” culture where tangible and improvable stuff will happen.

In case this should be said: you don’t actually step from culture A to culture B. It is most likely that the new culture starts with a small group of people who will spread it and eventually reword it into some kind of A-to-B transition, or a trend, that will prepare the field for further opportunities to make culture B dominant.

Take your time, and the hardest part, take theirs. 🙂

Now, is it for the aesthetics, or do I really need to mention it in this article? I am not sure, but if privacy is about actual people’s stories, it is also about the stories yet to happen.

Once you have set the context and described the data processing activities, privacy impact assessment is all about this question:

“What is the story that may happen to the people who trust us and they wouldn’t like to happen?”

Ask that question first, and only then assess how likely this story is to happen and how much it puts people’s quality of life in danger.
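To make this concrete, here is a minimal sketch of what a story-first assessment entry could look like. All names, the 1-to-5 scales, and the likelihood-times-severity heuristic are my own illustrative assumptions, not part of any official DPIA methodology:

```python
from dataclasses import dataclass

@dataclass
class StoryAssessment:
    """One privacy impact entry, written story-first.

    The 1-5 scales and the risk formula below are illustrative
    assumptions, not a standard DPIA methodology.
    """
    activity: str    # the data processing activity being assessed
    story: str       # what could happen that people wouldn't like
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    severity: int    # 1 (annoyance) .. 5 (life-changing harm)

    def risk_score(self) -> int:
        # A simple likelihood x severity product, a common heuristic
        # for ranking which stories to mitigate first.
        return self.likelihood * self.severity

# Usage: write the story first, and only then rate it.
entry = StoryAssessment(
    activity="Cart-abandonment reminder emails",
    story="A gift surprise is leaked to the wrong person in a shared inbox",
    likelihood=3,
    severity=2,
)
print(entry.risk_score())  # 6
```

The point of the structure is the ordering: the `story` field forces the narrative to exist before any number is attached to it, which is exactly the question-first approach described above.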

Asking people to put risks in a questionnaire is not something that will motivate them; letting them call on their creativity and take time to introspect about their own work will. (Well, maybe not if you word it that way… you would do that only if you use the 🙏 emoji a lot inside your company indeed)

⚠️ Profiling people based on the use of emojis is an arguable practice. 🙏

Our safety relies on people

While it is more than necessary to set up the company in a way that is secure and succeeds in protecting people from anyone inside or outside the company, it is important to remember that the first factor in bad privacy protection is people’s lack of awareness.

This can be due to an underlying cause, such as a lack of training, incentives, or features, but also to small occurrences such as an unusual situation at the office, during a commute or travel, or after a bad night…

You can’t overcome everything simply by being prepared, but the best way to prepare a company to deal with data privacy challenges is to work on people, and not only on systems.

You’ve got to be the system designer sometimes, but most of the time you’ll be a coach, an instructor, a knowledge manager.

You’ve got to face the concerns and struggles of the people working for your company and link them to your mission as a data privacy expert of any kind.

As a victim of bad personal data management, I was often shocked by how people were unable to repair their mistakes, or even to notify their colleagues or managers about serious issues in their systems.

Data management is not only about moving data here and there in a secure way; it is also about questioning the design, about how people or elements of a system with rightful access to this data will have an impact on people’s lives or on the company.

This is what business process managers forget the most once they come up with these perfectly shaped swim lanes: there is a real world outside the swim lane, and a wrongly defined node in a process may make that world hell.

While no arrow in the diagram will ever point to the outer world, people will, and you need to listen to them when they raise the alert. 🤷‍♂️

A professionally dressed guy drawing a call-to-action mailing workflow to bring people back to an online shop to buy what was left in their cart
Photo by Campaign Creators on Unsplash

Yes, the guy in this photo will make money and eventually leak a gift surprise to the wrong person, but who doesn’t enjoy a whiteboard session and marketing automation?

These experiences may have led me to care deeply about data privacy and to give it a very high level of importance that someone who hasn’t yet faced these issues, or didn’t associate them with data privacy, would not reach.

This is why I want people around me to take it easy. Don’t get me wrong, it is important, but it is so important that everyone has to contribute, and no one can pretend to define a whole company’s data management policies and individually assess all data processing activities.

I want people to have their own stories of why they care about data privacy and how this affects the decisions they make every day and how they contribute to improving each data processing activity inside the company.

As a data protection officer, developer, or project manager, I have often faced the issue that people couldn’t really relate to the criticality of protecting people’s privacy while building the future.

I couldn’t help but notice the gap between my ideal situation and the status quo among hundreds of people who had access to harmful data with hardly any sense of the responsibilities that come with it and I felt like I was contributing to designing an unsafe future.

Yes, I was this misaligned guy more than once.

I will be again, and I hope people around me will be too.

This is the job.

Misalignment is nothing more than an opportunity for alignment.

This is how you keep the momentum going further than a compliance sprint, a meeting with overly expensive consultants, or one very aware and skilled generation of people 🙂


Simon Mulquin

Full-time curious guy; freelance in public interest IT; passionate about human sciences, territory development, and privacy; I write in French or English 🙂