WELCOME to Connected Rights, your home on the range of digital rights news and analysis.
APPLE HAS SCRUBBED DOZENS OF VPN APPS from its Chinese App Store. Virtual private networks (VPNs) allow people to reroute their online activities in a way that makes it hard to tell where they are, and to read content that would otherwise be blocked on a normal connection. The Chinese government doesn’t like them, and requires providers to have special licenses. So Apple has kicked out the unlicensed VPN apps: http://bbc.in/2vkz5ai
Apple’s full statement: “Earlier this year China’s [IT ministry] announced that all developers offering VPNs must obtain a license from the government. We have been required to remove some VPN apps in China that do not meet the new regulations. These apps remain available in all other markets where they do business.”
Well, Apple could have refused, which would probably have doomed the iPhone in China. As Apple fan John Gruber has noted, this would deprive Chinese people of Apple’s encrypted communications services, which haven’t yet been nixed like WhatsApp has: http://bit.ly/2vfzTg7. Gruber ultimately blames Apple’s app-distribution model for the mess, writing: “They’d rather deal with the negative consequences of the App Store as a choke point than give up the benefits (including the profits) of maintaining complete control over all software on the platform.”
Apple CEO Tim Cook also weighed in yesterday, saying: “We would obviously rather not remove the apps, but like we do in other countries, we follow the law wherever we do business… We strongly believe in participating in markets and bringing benefits to customers is the best interest of the folks there and in other countries as well.” http://for.tn/2uiRSxJ
OK, but… this was not a good decision. VPNs are good for most people, apart from those who want to control people. This was a decision that will leave a lot of people less safe, regardless of how good Apple’s own communications apps are. As ExpressVPN, one of the affected providers, wrote, Apple’s call will “threaten free speech and civil liberties”: http://bit.ly/2uSghOn. This is a company choosing money over what’s right, whichever way you spin it.
Meanwhile, Vladimir Putin has signed a Russian law that also bans VPNs and other tools that people can use to get past censorship and surveillance.
NSA whistleblower Edward Snowden is sheltering in Moscow under Russian asylum, but that didn’t stop him sounding off over the new law(s). “Whether enacted by China, Russia, or anyone else, we must be clear this is not a reasonable ‘regulation,’ but a violation of human rights,” he thundered over Twitter, also calling the Russian law a “tragedy of policy”. My story: http://zd.net/2tW5bFa
If you’d like to support the creation of this newsletter and help me develop further Connected Rights resources, please check out my new Patreon page.
MORE ENCRYPTION FACE-PALMERY, this time courtesy of British home secretary Amber Rudd, who is still determined to get around the fact that encryption only works at all if it works for everyone. Rudd said the British government supports strong encryption and wouldn’t ban it, but would also like companies to stop using it: http://bit.ly/2u1cC1D
The idea that end-to-end encryption stops people from snooping on conversations “might be true in theory”, the not-a-mathematician wrote, but “real people often prefer ease of use and a multitude of features to perfect, unbreakable security… who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family?”
Rudd sounds very much like someone who’s enthusiastically grasping the wrong end of the stick. The tension between security and usability is a real thing and, until the aftermath of the Snowden revelations, there weren’t many options for staying secure in a super-user-friendly way. But now there are. WhatsApp’s encryption, for example, doesn’t come at the expense of usability – you don’t even have to think about it to use it. There’s no longer anything to be gained from being less secure. This ship has sailed.
WhatsApp owner Facebook still thinks strong encryption is swell, of course. Chief operating officer Sheryl Sandberg has a pitch for why law enforcement agencies should give up their crackdown quest: “The message itself is encrypted, but the metadata is not. If people move off those encrypted applications to other applications offshore, the government has less information, not more.” http://bit.ly/2f4TBoE
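Sandberg’s point is easy to see in miniature: in an end-to-end encrypted messenger, the service still handles a plaintext envelope (who, to whom, when) around an opaque body. A minimal Python sketch, with a deliberately toy XOR “cipher” standing in for real encryption and invented field names, just to show which parts the provider can still read:

```python
import hashlib
import json
import os

def toy_stream_cipher(key: bytes, data: bytes) -> bytes:
    """Illustrative XOR stream keyed by SHA-256 blocks -- NOT real crypto."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    # XOR is symmetric: the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, out))

def make_envelope(sender: str, recipient: str, body: str, key: bytes) -> dict:
    # Only the body is encrypted; the rest is metadata the service can log.
    return {
        "from": sender,           # visible metadata
        "to": recipient,          # visible metadata
        "timestamp": 1501545600,  # visible metadata
        "body": toy_stream_cipher(key, body.encode()).hex(),  # opaque to the service
    }

key = os.urandom(32)  # known only to the two endpoints
msg = make_envelope("alice", "bob", "meet at noon", key)
# The service sees who talked to whom and when, but not what was said:
print(json.dumps({k: msg[k] for k in ("from", "to", "timestamp")}))
decrypted = toy_stream_cipher(key, bytes.fromhex(msg["body"])).decode()
```

That visible envelope is exactly the metadata Sandberg says stays available to law enforcement even when content encryption is unbreakable.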
REMEMBER WHEN THE ROOMBA COMPANY said it wanted to sell maps of people’s homes to the highest bidder (it wasn’t long ago: http://bit.ly/2hlf90E)? Well, iRobot now says it will “never sell your data”. “Information that is shared needs to be controlled by the customer and not as a data asset of a corporation to exploit,” it said: http://zd.net/2tQ6hpL.
As for the idea that it wanted to sell mapping data derived from the robotic vacuum cleaners, this was “a misinterpretation”. Indeed, the original Reuters story that kicked off this fuss has now been changed to replace “sell maps” with “share maps for free with customer consent”: http://reut.rs/2uWjdcX
If you’d like to hire me to write articles about digital rights issues, speak at your event or provide advice for your business, drop me an email at firstname.lastname@example.org.
GERMAN RESEARCHERS POSED as a marketing company in order to get the “anonymous” browsing histories of over three million German citizens, which they received for free from a data broker.
The researchers – a journalist and a data scientist – were able to de-anonymise many of the people involved, reportedly learning of “a judge’s porn preferences and the medication used by a German MP”: http://bit.ly/2uTOonk. As the Guardian article says, the identification can come from web addresses (URLs) that have people’s usernames in them, or by correlating as few as 10 URLs.
There are two things to take away from this. Firstly, the data broker industry is a shadowy morass that really should become a regulatory target at some point soon – our information goes all over the place and we have no way to even know what’s happening, let alone stop it. Secondly, “anonymising” data is rarely as simple as it sounds. If it’s possible to retrace your steps by matching up the many fragments of your activities, you’re exposed.
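The correlation trick the researchers describe can be sketched in a few lines: take a pseudonymous browsing history and intersect it with sets of URLs already publicly tied to known people (links they tweeted, posts they wrote). All names, URLs and the threshold below are invented for illustration – this is a hedged sketch of the idea, not the researchers’ actual method:

```python
# Public URL "footprints" for known individuals (hypothetical data).
public_footprints = {
    "judge_a": {"example.com/judge_a/profile", "forum.example/u/judge_a",
                "news.example/article-1", "blog.example/post-9"},
    "mp_b": {"example.com/mp_b", "news.example/article-2",
             "petition.example/sign/443", "blog.example/post-9"},
}

# One "anonymous" history from the data broker, user IDs stripped.
anonymous_history = {"example.com/judge_a/profile", "forum.example/u/judge_a",
                     "news.example/article-1", "shop.example/cart"}

def best_match(history, footprints, threshold=3):
    """Return the known person whose public URL trail overlaps the history most."""
    scored = {name: len(history & urls) for name, urls in footprints.items()}
    name, score = max(scored.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None

print(best_match(anonymous_history, public_footprints))  # judge_a
```

With real histories running to thousands of URLs per person, a handful of distinctive matches – the Guardian piece says as few as 10 – is enough to collapse the “anonymity” entirely.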
GOOGLE’S YOUTUBE CLAIMS THAT ITS ALGORITHMS are “in many cases” better than humans at flagging videos that need to be taken down for their extremist or terrorism-related content, thanks to advances in machine learning technology: http://bit.ly/2hjCP5F
That’s interesting, because Google’s senior director for European public policy, Nicklas Lundblad, said not long ago that the technology wasn’t ready yet. “Is it possible within the existing computer science knowledge to build a system that can automatically do content regulation and decide that this little piece of content here is hate speech and this little piece of content over here should stay up?” Lundblad asked at the re:publica conference in Berlin in May.
“I think that we are still far, far away from being able to do that. I think that the notion of context and the ability to detect context is very hard. It’s an open computer science problem.” http://bit.ly/2hmC242
A MASSIVE DRUG CASE IN AUSTRALIA has collapsed because the police agency involved refused to tell anyone – even the court – how its surveillance techniques work. Judges called the police and prosecutors’ actions “difficult to understand”: http://bit.ly/2uWACjC