Because you can never have too many directions…
Nicola Osborne explains data protection and GDPR to me (we should all understand it by now I know!)
We’re all worried about data protection rights, but what direction are they really going in?
Well since we are loosely in World Cup season I’m going to say it’s a game of two halves…
If you look at the amount of data we’ve gotten used to sharing on Facebook, Google, etc. and the way we use phone apps and home voice control systems like Alexa, then it looks pretty bad. We are totally used to sharing our name, date of birth, our pictures, our emotional state, our current real-world location, tracking our weight, our runs, our lives… and never really reading the terms of service.
At the same time, I think most people are getting much more thoughtful about how they share data, and who that data is shared with. Even as we are adopting more potentially privacy-invading technologies, I think we are becoming more demanding about our data and much more aware of the impact that sharing data can have.
Where will data protection be in 5 years?
Well, we could be in a great place with data protection…
At the University of Edinburgh and EDINA we’ve been undertaking research on how people manage their digital footprint – tracks and traces that are left behind online whether on purpose or by accident – for the last five years. When people start to think deeply about how they would want to present themselves online you hear questions like “but how do I get rid of something I posted that I now regret” or “what if someone else posts something about me” or “how do I know how my data is being shared across sites and apps?”. We’ve had lots of suggestions of course, but as of 25th May 2018 we have a whole new set of ways of exercising our rights…
The new General Data Protection Regulation (or GDPR) might sound like the least exciting piece of legislation, but I think it could be the new superhero power to help us make those demands, exercise our rights over our own data, and really start questioning how our personal data is used and shared. It covers all kinds of data – including “metadata” like our location, IP address, etc. – so a really broad and inclusive sense of “personal” data. And the regulation requires real transparency over how data is used, stored, retained and shared, and bulks up consumer rights to understand what data others have about them, how it’s stored, how it is used, who it is shared with, and also how they can request its removal.
Is that good?
It’s brilliant! It gives us a chance to step back and curate our digital footprint more thoughtfully. Although GDPR includes lots of protections that were already in the Data Protection Act, they now come with much sharper teeth. A really good example of this is that the Information Commissioner’s Office has recently announced an intention to fine Facebook £500k for two data breaches associated with the Cambridge Analytica scandal. Facebook makes that much money every 2.5 minutes. Under GDPR – so for anything that happened after 25th May – they could be fined 4% of their global turnover, which would be £1.4bn; or responsible individuals could face a (maximum) 2-year prison term. That’s quite a ramp up in super powers!
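To get a feel for the scale of that change, here is a back-of-the-envelope sketch in Python. The turnover figure is an assumption – it is simply the number implied by the article’s own claim that 4% of global turnover comes to £1.4bn – and the real GDPR maximum is the greater of €20m or 4% of worldwide annual turnover, set case by case:

```python
# Illustrative comparison of the pre-GDPR maximum fine vs a GDPR-scale fine.
# ASSUMED_TURNOVER is hypothetical: it is the £35bn implied by "4% = £1.4bn" above.

OLD_REGIME_FINE = 500_000            # £500k, the pre-GDPR fine mentioned above
ASSUMED_TURNOVER = 35_000_000_000    # assumed £35bn annual global turnover

gdpr_max_fine = 0.04 * ASSUMED_TURNOVER   # 4% of global turnover
ratio = gdpr_max_fine / OLD_REGIME_FINE

print(f"GDPR-scale maximum fine: £{gdpr_max_fine:,.0f}")  # £1,400,000,000
print(f"That is {ratio:,.0f}x the £500k maximum")         # 2,800x
```

In other words, on these assumed figures the ceiling jumps by a factor of 2,800 – which is why the new regime has “teeth”.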
Can/should we change the direction of data protection rights?
Yes! And the best way to do that is to question what data we want out there about ourselves, how we want that data used, and then really take advantage of these new rights and legal obligations. If we use them, we could really reshape how our data is used, and the kind of privacy we want in the future. If we don’t use those rights and make it clear that how our data is used matters, we make it harder to have conscious choices about our data and the way it is used. That isn’t just about having a great online presence for potential employers, or looking really cool for our personal social media audiences – though that’s part of it – it can be much more serious than that.
Personal data, including social media postings and location data on phones, is already being used in refugee validation/deportation processes; leaked personal data and “doxxing” can have serious consequences for employment, mental health and physical safety; health tracking data can have significant influence on insurance prices or availability (for good and bad); there is some evidence that pregnancy apps have been selling data to political campaigns (the ICO is looking at that one at the moment); and in China a “social credit system” now in trial combines people’s bill-paying, spending patterns, social interactions, “compliance”, etc. to give a score that influences access to jobs, schools and mortgages.
Where do you want to see data protection rights end up?
Most of the businesses using data in, let’s say, sketchy ways didn’t mean to do anything evil – they just had a cool idea they wanted to build – but data selling and data analytics are a really easy way to finance and tailor products, especially if privacy doesn’t matter too much to you (more likely when you are young, privileged and part of a fairly homogeneous group). The result is that our personal data is now big business, but taking a more informed and proactive role gives us much more opportunity to rethink how and why we use the digital tools we do, to challenge potential inequalities, to avoid some of those risks to our right to a private life, and to really tell our own stories.
We are at a pivotal moment for personal data: we are all just about used to pervasive access to the internet and smartphones, but a lot of the business models that support social media, online content, etc. aren’t mature yet, and some of the most exciting and potentially most privacy-invading technologies around big data have a long way to go. Now is the time to really engage and start shaping our own futures before someone else does it for us!
So come join me at CODI to hear some more scare stories, and plot some of the ways to start demanding your data and your data rights!
Vive la personal data revolution!