WELCOME to another edition of Connected Rights, your weekly blast of digital rights news and analysis. Why is it now coming out on Wednesdays instead of Thursdays? Because I’m still playing around with the format, and I suspect it may be more useful in the middle of the week than in the slide towards the weekend. As always, let me know what you think!
A YEAR FROM TOMORROW, people in the EU will get a world-leading set of online privacy protections. Thanks to the General Data Protection Regulation (GDPR), which comes into effect on 25 May 2018, people there will be able to tell companies to stop profiling them unless there’s a good reason for doing so. They’ll be able to ask companies which personal data those companies hold on them – from names and photos to likes and messages – and demand that it be deleted or handed over, so they can switch to competing services. Companies will have to get real consent from people before they can start processing their personal data. And if they break the rules – wherever they’re based – there will be serious financial consequences.
The new regulation isn’t perfect, but it’s a necessary step towards where we need to be if we’re going to maintain our fundamental rights in the online age. Our personal data is us in digital form, and we need to be in control of it, just as we maintain power over our physical selves. That may sound abstract to some, but it’s one of the greatest challenges of our times.
And what of those who don’t live in the EU? Luckily (in this case), the bloc is influential. It has half a billion people in it and is one of the world’s preeminent trading forces. We’ve already seen countries around the world align their own privacy laws with those of the EU, to varying degrees, to ensure that their companies can keep doing business with the union – and because they too recognise how important these issues have become. There’s every reason to assume that the ideas embodied in the GDPR will spread. As long as these privacy principles are always balanced with protections for free expression and the public interest, that would be a very good thing indeed.
ONCE UPON A TIME there was a global push to give people the chance to tell companies they don’t want to be tracked all over the place as they surf the web. It was called Do Not Track – a switch built into modern web browsers that does what it says on the tin. At least, in theory; barely any web firms agreed to respect people’s wishes not to be tracked. One of the handful that did was Twitter, but no longer. As of 18 June, Twitter will by default track you across any sites that have tweet or follow buttons, or embedded tweets. As the Electronic Frontier Foundation put it, “these are not the actions of a company that respects people’s privacy choices”: http://bit.ly/1KSCtpZ
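The mechanism itself is trivially simple: browsers with the setting switched on send a “DNT: 1” HTTP header with every request, and it’s entirely up to each site whether to honour it. Here’s a minimal sketch of what honouring it could look like on the server side – hypothetical code using Flask purely for illustration, not Twitter’s (or anyone else’s) actual implementation:

```python
# Hypothetical sketch: how a web service could honour the Do Not Track signal,
# which browsers send as a simple "DNT: 1" HTTP header. Illustrative only.
from flask import Flask, request

app = Flask(__name__)

def serve_widget(track: bool) -> str:
    # Placeholder for the real widget response.
    return "widget (tracking {})".format("on" if track else "off")

@app.route("/tweet-button")
def tweet_button():
    # Browsers with Do Not Track enabled send the header "DNT: 1".
    if request.headers.get("DNT") == "1":
        # Honour the signal: serve the widget without setting tracking
        # cookies or logging the visit against a profile.
        return serve_widget(track=False)
    return serve_widget(track=True)

if __name__ == "__main__":
    app.run()
```

The point is that nothing enforces that “if” branch – which is exactly why so few companies ever bothered with it.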
GET READY for encryption to become a live issue in the UK again, very soon. The Sun reports that the Conservative government is preparing to demand that communications services give it access to encrypted messages, as soon as next month’s election is over: http://bit.ly/2qUZRRm. Services such as WhatsApp and Apple’s iMessage use strong, end-to-end encryption that leaves them without the keys – so unless they lower the security of their products and infuriate privacy-conscious users around the world, it’s impossible for them to comply with the British government’s demands. We’ve already seen the issue play out in Brazil, where the government repeatedly blocked WhatsApp across the country before courts scrapped the bans as disproportionate. Is Theresa May’s government ready to incur the wrath of her nation of WhatsApp users?
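To see why the demand can’t be met without weakening the product, it helps to remember what end-to-end encryption means: the keys are generated on the users’ devices, and the service in the middle only ever relays ciphertext. A rough sketch of the idea, using the PyNaCl library purely for illustration – this is not WhatsApp’s or iMessage’s actual protocol:

```python
# Illustrative sketch of end-to-end encryption using the PyNaCl library.
# NOT WhatsApp's or iMessage's real protocol -- just the core idea: keys live
# only on the endpoints, so the provider (and anyone who compels it) only
# ever sees ciphertext.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message to Bob's public key on her device.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"See you at 8")

# The service in the middle relays only the ciphertext. Without Bob's
# private key, it has nothing meaningful to hand over.

# Bob decrypts on his device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"See you at 8"
```

Handing the government plaintext would mean redesigning the system so the provider holds (or can obtain) the keys – which is another way of saying it would no longer be end-to-end encrypted.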
Although strong encryption is absolutely necessary to make online life as safe as possible for all of us, the fight to defend it is sometimes difficult. This is particularly true when we are faced with terrorism, as we are now. Jim Killock, the head of the UK’s Open Rights Group, has penned a lengthy piece in the wake of this week’s horrific Manchester bombing: http://bit.ly/2rgO6b6. Here’s a key point: “Terrorists can… adapt their behaviour to avoid surveillance technologies, by changing their tech, avoiding it altogether, or simplifying their operations to make them less visible. This does not mean we should give up, nor does it mean that technology can play no role in surveillance. It does however mean that we should not assume that claims of resources and powers will necessarily result in security.”
WHAT ARE THE UK PARTIES’ POLICIES about online rights? With the election looming, I wrote an article for the International Association of Privacy Professionals about that very subject: http://bit.ly/2rzBQSR
HATE SPEECH is a big problem online, and one that Germany is trying to combat with a new law that would force Facebook et al to automatically filter out objectionable material. Digital rights groups point out that technology can’t yet determine what is and isn’t legal, because algorithms can’t understand context as well as people can. They’re pleading with the European Commission to make sure the German law doesn’t undermine fundamental rights to free speech and privacy (automated filters only work in tandem with automated surveillance of social media platforms, which is currently illegal in the EU): http://bit.ly/2qcjuDv
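The context problem is easy to demonstrate: a crude filter that matches banned phrases can’t tell the difference between someone spreading a hateful claim and someone quoting it in order to rebut it. A toy illustration – hypothetical code, not any real platform’s moderation system:

```python
# Toy illustration of why keyword-based filtering fails on context.
# Hypothetical example only -- not any platform's actual moderation system.
BANNED_PHRASES = ["refugees are criminals"]  # hypothetical blocklist entry

def naive_filter(post: str) -> bool:
    """Return True if the post would be removed by a simple phrase match."""
    text = post.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

# A post spreading the claim is caught...
print(naive_filter("Refugees are criminals, deport them all"))            # True

# ...but so is a journalist or activist quoting it in order to debunk it.
print(naive_filter('The claim that "refugees are criminals" is false'))   # True

# And a trivially reworded version of the same message slips through.
print(naive_filter("Refugees? Criminals, every last one of them"))        # False
```

Real systems are more sophisticated than this, but the underlying problem – machines judging legality without understanding intent or context – is what the rights groups are warning about.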
FACEBOOK’S GUIDELINES for content moderation are now public, thanks to the Guardian: http://bit.ly/2r6h5yb. There’s a lot of complex stuff to grapple with in there, such as Facebook’s decision not to censor livestreams of people self-harming, to allow some (but not all) footage of child bullying, and to delete statements such as “Someone shoot Trump” while accepting the likes of “To snap a b*tch’s neck, make sure to apply all your pressure to the middle of her throat”. The core problem here is the volume of human activity that now takes place on Facebook, and the fact that one company gets/has to decide how to control it.
BIOMETRIC SECURITY is supposed to be the natural alternative to problematic passcodes – instead of faffing around with strings of characters that must be simultaneously easy to remember and hard to guess, why not just use fingerprints or your unique iris? Because fingerprint and iris scanners can be fooled. Germany’s Chaos Computer Club, which four years ago showed how the iPhone’s fingerprint scanner was vulnerable to trickery, has now done the same with the iris scanner on Samsung’s flagship Galaxy S8. And all the trick needs is a bog-standard camera with night mode: http://zd.net/2qQFKGf