“Stop manipulating us, and give us real choices,” says Katarzyna Szymielewicz, a technology and human rights expert, lawyer and activist who advocates for people to have more control over how their data is processed and used.
Companies are building digital profiles of us, made up of data collected by thousands of trackers in mobile apps or on the web. They gather information about us practically whenever we are connected to the internet. Data brokers sell this data to whoever is willing to pay the price. It changes hands between countless companies without our knowledge.
Data about us is sorted into categories we often can’t see and analyzed by algorithms we often don’t know about – and then used to make decisions that could impact our lives, for better or worse.
But what if we could take guessing out of the equation, and just tell companies who we are? Would they respect our answers?
Katarzyna Szymielewicz is the co-founder and president of Panoptykon Foundation, a digital rights organization in Poland. In January 2019, Panoptykon filed a complaint against Google under the new European General Data Protection Regulation, alleging the company had violated the regulation’s requirements to provide users with access to data held about them.
To help a broader audience visualize how little control we currently have over our digital profiles, Szymielewicz has developed a metaphor of three “layers” of data: what we share ourselves, what is observed about our behavior, and what machines infer about us.
Three layers of our digital profile
Q: Are our data profiles inaccurate?
A: Who knows? Without transparency and access to the full profiles that tech companies generate about us, we cannot really tell. I am sure users themselves would be the best auditors of these datasets, because they have real (often economic) incentives not to be judged on the basis of incorrect or incomplete information. But they are not given the chance to do so.
I came up with this layered metaphor to explain the complexity (and dangers) of how online data profiles work after hearing for the hundredth time: ‘What’s the problem if we choose to share and publish our data ourselves?’ The thing is that we do not make these choices ourselves. We are lured into sharing more data than we would accept, observed and qualified by machines in ways we can hardly imagine. Not surprisingly, they detect sensitive characteristics we may prefer to keep private.
Q: Why should we want to see our data?
A: The only way to regain full control over our profiles is to convince the companies that do the profiling to change their approach. Instead of hiding our data from us, they should become more transparent. We need to open these opaque systems to the scrutiny of users.
On the other hand, instead of guessing our location, relationships, or hidden desires behind our backs, companies could simply start asking us questions and respecting our answers. I even see this as a real opportunity for marketing companies to build trust and make targeted ads more relevant and fair.
In the European Union, we have a legal framework that facilitates greater openness and access. The General Data Protection Regulation (GDPR) now gives Europeans the right to verify data held by individual companies, including marketing and advertising profiles. Companies can still protect their code and algorithms as business secrets, but in theory they can no longer hide personal data they generate about their users. I say in theory – because in practice companies don’t reveal the full picture when confronted with this legal obligation. In particular, they hide behavioural observation data and data generated with proprietary algorithms. This must change, and I am sure it will, once we begin to see the first legal complaints result in fines.
Q: How could we make radical transparency a reality?
A: Well, no doubt we have to be prepared for a long march. We need to work together as a movement and test different approaches. Some of us will continue to test legal tools and fight opponents in courts or in front of Data Protection Authorities. Others will advocate for (still) better legal safeguards, for example in the upcoming European ePrivacy Regulation. Others will build or crowdfund alternative services, or push big tech to test new business models, and so on. I am sure it will be a long road, but as a movement, we are at least heading in the right direction. The main challenge for us now is to convince or compel commercial actors to come along.