
What should social media giants do to protect children?

This week, the technical leaders at GCHQ and the National Cyber Security Centre intervened powerfully in an incredibly controversial debate: what should social media companies be doing to keep children safe on their platforms?

But the intervention was not received in that spirit by all parties. Some heard something very different: tired arguments against end-to-end encryption, dressed in new clothes but disguising the same assault on privacy with the same excuse law enforcement always reaches for.

From our story:

Tech companies should move forward with controversial technology that looks for images of child abuse on users’ phones, the technical leaders at GCHQ and the UK’s National Cyber Security Centre have said.

With so-called “client-side scanning,” service providers such as Facebook or Apple would develop software that monitors communications for suspicious activity without having to share message content with a central server.

Ian Levy, the NCSC’s technical director, and Crispin Robinson, the technical director for cryptanalysis – codebreaking – at GCHQ, said the technology can protect children and privacy at the same time. “We have found no reason why client-side scanning techniques cannot be safely implemented in many of the situations that will be encountered,” they wrote in a new discussion paper.
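The excerpt above describes the core idea of client-side scanning: content is checked on the device itself, against a list of known images, rather than being shipped to a server for inspection. Below is a minimal sketch of that idea in Python. It is illustrative only and not drawn from the paper: it uses plain SHA-256 for exact matching, whereas real systems use perceptual hashes (such as PhotoDNA or Apple’s NeuralHash) so that resized or re-encoded copies of an image still match. All function and variable names are hypothetical.

```python
# Hypothetical sketch of on-device matching against a list of known-image hashes.
# Nothing leaves the phone: only files whose hash appears on the list are flagged.
import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def scan_on_device(photo_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Flag local files whose hash appears in the known-image list."""
    return [
        p for p in photo_dir.iterdir()
        if p.is_file() and sha256_of_file(p) in known_hashes
    ]
```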

You may remember the client-side scanning debate a year ago. To quote myself:

Apple is taking a big step into the unknown. That’s because its version of this approach, for the first time from a major platform, scans photos on users’ hardware rather than waiting for them to be uploaded to the company’s servers.

By normalizing on-device scanning for CSAM [child sexual abuse material], critics fear, Apple has taken a dangerous step. From here, they argue, it is simply a matter of degree how far our digital lives are monitored, online and offline. Expanding scanning beyond CSAM is one small step in one direction; expanding it beyond simple photo libraries is one small step in another; expanding it beyond perfect matches of known images is one small step in a third.

So why is Levy and Robinson’s intervention important? To me, it’s a sincere attempt to address the concerns of these critics, to show the benefits of client-side scanning in combating specific categories of threats — and to suggest sensible solutions to common fears.

The devil is in the details

To take an example from the 70-page paper, the two seek to counter fears that the lists of images scanned for could be expanded beyond known CSAM to include, for example, images of a political nature. To put it plainly: what would stop China from requiring Apple to include the famous images of Tank Man in its scanning apparatus, forcing the company to flag any iPhone containing that image as potentially criminal?

Robinson and Levy propose a system designed to prevent exactly that. They suggest the list of images be compiled by child protection groups around the world – organizations like the US’s National Center for Missing and Exploited Children or the UK’s Internet Watch Foundation (IWF). Each of these groups already maintains a database of “known” CSAM, which they work together to keep as comprehensive as possible, and the scanning database would be built only from images that appear on all of the groups’ lists.

The groups can then publish a hash, a cryptographic signature, of that database when they hand it to tech companies, and your phone can display the same hash for the copy loaded onto it. Even if China could force its domestic child protection group to add Tank Man to its list, it would not be able to do the same to the IWF, so the image would never make it into the shared database; and if Apple were forced to load a different database for China, the hash would change accordingly, and users would know the system could no longer be trusted.
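To make the mechanism concrete, here is a hedged sketch of the two steps described above: the shared database is built only from entries that every group agrees on (their intersection), and a single published digest lets a device check that the copy it received matches the public one. The function names and the use of SHA-256 over a sorted, newline-joined list are my own illustrative choices, not details from Levy and Robinson’s paper.

```python
# Hypothetical sketch: intersection-built scanning list plus a publishable digest.
import hashlib


def build_shared_database(group_lists: list[set[str]]) -> list[str]:
    """Keep only image hashes that appear on every group's list."""
    shared = set.intersection(*group_lists)
    return sorted(shared)  # canonical order so the digest is reproducible


def database_digest(entries: list[str]) -> str:
    """Publishable fingerprint of the whole database."""
    return hashlib.sha256("\n".join(entries).encode("utf-8")).hexdigest()


def device_accepts(local_entries: list[str], published_digest: str) -> bool:
    """A phone only trusts the database if its digest matches the published one."""
    return database_digest(local_entries) == published_digest
```

On this design, a single group adding an extra image changes nothing (the entry is not on the other lists, so it never enters the intersection), while swapping in a different database changes the digest and is immediately visible to users.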

The point is not that the proposed solution is the best possible way to solve the problem, Levy and Robinson write, but to demonstrate that “details matter”: “Discussing the issue in generalities, using ambiguous language, or exaggerating almost certainly leads to the wrong result.”

The fear and anger is real

In a way, this is a powerful rhetorical move. Insisting that the conversation focus on the details is an insistence that people who reject client-side scanning in principle are wrong: if you believe that the privacy of private communications is, and should be, an inviolable right, then Levy and Robinson are effectively arguing that you should be left out of the conversation in favor of more moderate voices willing to discuss compromise.

But it’s frustrating that much of the reaction has been the same generalities that accompanied Apple’s announcement a year ago. Technology news site The Register, for example, ran an irate editorial that said: “The same argument has been used many times before, usually against one of the four horsemen of the infocalypse: terrorists, drug dealers, child sexual abuse material (CSAM), and organized crime.”

I’ve spent enough time speaking to people who work in child protection to know that the fear and anger at the harm being done by some of the world’s largest corporations is real, whether or not you think it is well founded. I do not pretend to know Levy and Robinson’s motivations, but this paper represents an attempt to start a conversation, rather than continue a shouting match between two irreconcilable sides of an argument. It deserves to be treated as such.


It’s not ‘yours to trade’

What Minecraft is is mine. Photo: Chris Bardgett/Alamy

Minecraft is big. You may have heard about it. So when the game makes a moderation decision, it matters rather more than Bungie deciding to nerf scout rifles in Destiny 2. Especially when that decision is:

Minecraft will not allow the use of non-fungible tokens (NFTs) on the popular gaming platform, with the company describing them as antithetical to Minecraft’s “values of creative inclusion and playing together”.

Minecraft represented an attractive potential market for NFTs, with a user base – estimated at more than 141 million as of August 2021 – already engaged in sharing unique digital items developed for the game.

But Mojang, the Microsoft-owned studio that develops Minecraft, has put an end to speculation that NFTs could be allowed in the game. In a blog post on Wednesday, the developers said blockchain technology isn’t allowed because it contradicts Minecraft’s values.

Minecraft’s incredible success is due to its extensibility. As well as the built-in creative aspects of the game – often described as the 21st-century answer to Lego – users can modify it in ways big and small to create new experiences. That flexibility proved enticing to NFT developers, who seized on the idea of creating new features in Minecraft and selling them as digital assets.

In theory, it’s the perfect NFT opportunity: a digital-native creation with an actually viable use case and a proven market. Startups flocked to the field: NFT Worlds sells pre-generated Minecraft landscapes on which people can build experiences and resell them for profit; Gridcraft runs a Minecraft server with its own crypto-based economy.

Or rather, they did. It seems NFTs have become such a toxic phenomenon that even passive acceptance of them is too much for a company like Mojang. If you want to make it in this world, you have to make it on your own.

The wider TechScape
