WELCOME to Connected Rights, your knock at the door of digital rights news and analysis.
FAREWELL THEN, HEIKO MAAS. Germany’s erstwhile justice minister is off to become foreign minister in the new coalition government: http://bit.ly/2p6A3SO
While Maas’s opposition to far-right extremism is an obvious plus, from a digital rights perspective it’s nonetheless good riddance.
Maas was the driving force behind Germany’s strict social media rules, pushing platforms to take down content at ever-increasing speed, and thereby risking bad decisions (for more on this topic, read Jennifer Baker’s excellent piece for The Next Web: http://bit.ly/2Gi0Hj1).
He was also partly responsible for Germany’s 2015 data retention law, despite having once opposed this sort of surveillance. And it was also Maas who last year hugely expanded the number of offences which the German police can investigate by hacking into people’s phones and computers (by sneaking the amendment into a separate law about driving bans, just before the Bundestag shut up shop for the pre-election period).
Let’s hope Maas’s successor, Katarina Barley (her dad was from Lincolnshire, in case you’re wondering), does a better job.
…SPEAKING OF THE FIGHT AGAINST “FAKE NEWS”, here’s a Politico piece arguing that Facebook ought to “give researchers, academics and journalists access to data it collects on people’s individual Facebook pages”, in order to “track, analyse and predict how waves of online misinformation… are spread on the world’s largest social network”: http://politi.co/2DlMFd5
The idea, of course, is to provide access to anonymized datasets. Facebook, however, is arguing that EU privacy laws make it impossible to hand over such data. Is that correct? I’d say not: EU privacy law only covers data relating to identifiable people, so if the data is anonymized to a degree where it is truly impossible to re-identify individuals, it falls outside those rules altogether.
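For the more technically minded, the hard part is deciding when re-identification really is impossible. One rough test researchers sometimes apply before releasing a dataset is k-anonymity: every combination of quasi-identifying attributes has to be shared by at least k people. A minimal sketch of that idea (the column names, records and threshold here are invented for illustration, not anything Facebook or the researchers have proposed):

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears at least k times in the dataset (a crude re-identification test)."""
    combos = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in combos.values())

# Hypothetical records: engagement figures grouped by coarse attributes only.
records = [
    {"age_band": "30-39", "region": "Bavaria", "shares": 12},
    {"age_band": "30-39", "region": "Bavaria", "shares": 3},
]

# True here, because both rows share the same quasi-identifier combination.
print(is_k_anonymous(records, ["age_band", "region"], k=2))
```

Even a check like this is only a starting point; unique or rare combinations would have to be generalised or dropped before anyone could plausibly call the data anonymous.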
…AND HERE’S THE FINAL REPORT of the “High Level Expert Group on Fake News and Online Disinformation” convened by the European Commission (and referenced in the Baker piece I linked to above): http://bit.ly/2p4qjZr
The group advised the Commission “against simplistic solutions. Any form of censorship either public or private should clearly be avoided.” (The group, it should be noted, includes top policy people from Facebook, Google and Twitter, albeit in a diverse mix that also includes media, academia and consumer advocates.)
Instead, the group said, aims should include more transparency of online news, better media literacy, more tools for journalists and users “to tackle disinformation and foster a positive engagement with fast-evolving information technologies”, more diversity in the European news ecosystem, and more research on the reality of online disinformation.
…ALSO, YOUTUBE HAS JUST ANNOUNCED that it will link to Wikipedia and other “fact-based” websites under videos that promote conspiracy theories. http://cnb.cx/2FQrC7D
To support my work, please consider visiting my Patreon page or buying my book, Control Shift: How Technology Affects You and Your Rights. Here are the links to the book’s British, American and German Amazon pages.
WHATSAPP HAS PROMISED NOT TO SHARE people’s personal data with the Facebook mothership until it is sure it can do so in compliance with the upcoming General Data Protection Regulation. The news was announced this morning by the UK’s Information Commissioner’s Office: http://bit.ly/2Hx8vwB
The commissioner, Elizabeth Denham, investigated WhatsApp’s notorious (and arguably promise-breaking) data sharing with Facebook and found that WhatsApp had identified no legal basis for doing this. It also wasn’t giving users clear information about the sharing, and it was breaking purpose-limitation rules by collecting the data for one purpose (providing the WhatsApp service) and repurposing it for another (better targeting of ads on Facebook).
“I reached the conclusion that an undertaking was the most effective regulatory tool for me to use, given the circumstances of the case,” Denham wrote. She wasn’t able to fine the company for breaking the existing UK Data Protection Act, as it claimed it hadn’t shared UK user data with Facebook (but if it had, it would have broken the act).
TIM BERNERS-LEE GOT A HUGE AMOUNT OF COVERAGE for his open letter this year, issued to mark the web’s 29th birthday. That’s because he said big tech firms may need to be regulated to stop their platforms being “weaponised”: http://bit.ly/2tBzJQH
TimBL’s regulation call was fresh, but he has set out the underlying concerns many times – disinformation, data theft and so on. What’s changed? The broader mood has caught up with him. People and regulators are more open than ever before to the idea of regulating Silicon Valley in new and profound ways. And for that, we can probably thank Russia and Trump.
RESEARCHERS AT THE UNIVERSITY OF ARIZONA RETROACTIVELY SPIED ON STUDENTS by tapping into the timestamped location logs generated by their student ID cards. They did so in order to predict which first-year students were likely to drop out, and the system seems to be reasonably effective: http://for.tn/2tNCEWK
The researchers tracked people’s movements to see whether they had routines and interacted with other people a lot. If not, that apparently indicates a greater likelihood of dropping out. The data is supposedly “anonymized”, though it isn’t really: it’s pseudonymized, which only hides identities from the researchers’ view.
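To illustrate the distinction: pseudonymization typically just swaps the real ID for a token, and whoever holds the lookup table can reverse the swap, so the timestamped trail remains personal data. A minimal sketch of that pattern, with hypothetical field names (this is not the Arizona team’s actual pipeline):

```python
import secrets

pseudonyms = {}  # student_id -> token; whoever holds this table can re-identify everyone

def pseudonymize(student_id):
    """Replace a real student ID with a stable random token.
    This is pseudonymization, not anonymization: the mapping table
    makes re-identification trivial for anyone who keeps it."""
    if student_id not in pseudonyms:
        pseudonyms[student_id] = secrets.token_hex(8)
    return pseudonyms[student_id]

# The researchers only see the token, but the location trail stays linked to one person.
swipe = {"card": pseudonymize("S1234567"), "door": "Library-East", "time": "2018-03-14T09:02"}
print(swipe)
```

And even without the lookup table, a detailed movement trail is often unique enough to identify someone on its own.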
Taking personal data intended for one thing and using it for something else? Profiling people without meaningful consent? I don’t think this sort of thing would fly in Europe.
THE BRITISH GOVERNMENT HAS DELAYED ITS AGE VERIFICATION PLANS for those visiting pornographic websites in the country. It pulled the plan two weeks before the new rules were due to be introduced, in order to “get the implementation of the policy right”: http://bit.ly/2pf6gqu
Well, indeed. This is going to be a tough one to implement, given the data protection risks associated with creating lists of porn enthusiasts. As Open Rights Group legal director Myles Jackman put it in the piece linked to above: “This is a chance for the government to rethink the absence of safeguards for privacy and security, but it is frightening to consider that this policy was two weeks away from launch before it was pulled.”
If you’d like me to write articles for you about digital rights issues, speak at your event or provide privacy advice for your business, drop me an email at david@dmeyer.eu.
CONGRATULATIONS TO ED FELTEN FOR BEING NOMINATED by Donald Trump to the Privacy and Civil Liberties Oversight Board: http://bit.ly/2HzNQYV
The Princeton professor is very well regarded in the tech policy world, having argued in the wake of the Snowden revelations that yeah, all that phone record data can be pretty revealing. He served as deputy chief technology officer under Barack Obama. As someone commented on Twitter: “So for once, when [Trump] claims to have the BEST people, he’d be right!”
A NEW “AI” ALGORITHM IS CAPABLE OF CLONING SOMEONE’S VOICE after being given a mere 3.7 seconds of audio. Well, sort of. Baidu’s algorithm results in some pretty weird-sounding “voices”, as you’ll hear from the clips in this piece: http://bit.ly/2FMwLhb
Still, if someone were to end up perfecting this sort of technique, who knows what havoc they’d be able to wreak…
THIS NEWSLETTER DEALS WITH A LOT OF TECH-ETHICAL ISSUES, BUT this story definitely presents some new ones. A Silicon Valley startup called Nectome promises to freeze your brain so that your memories can one day be uploaded, but in order to (supposedly) do this, it needs to kill you. If that’s okay with you: http://bit.ly/2Dm5YCV