Give Ofcom emergency powers to investigate Facebook encryption plans, says NSPCC chief

However, campaigners warn that Ofcom may not have its powers until 2024, by which time Meta's encryption plans will likely be complete.

Following the NSPCC’s call, Antigone Davis, Global Head of Safety at Meta, said: “We do not tolerate child exploitation on our platforms. We agree on the need for robust safety measures that work with end-to-end encryption, and have developed a clear approach to incorporating them into our end-to-end encryption plans.

“We are focused on preventing harm from happening in the first place by restricting adults on Facebook and Instagram from sending messages to children, and by defaulting under-18s’ accounts to private or ‘friends only’ settings.

“We also give people more controls to protect themselves from harm, and respond quickly to user reports and valid requests from the police.

“The vast majority of Brits already rely on encryption to keep them safe from hackers, fraudsters and criminals, and any solutions we develop need to ensure that protection stays the same.

“We will continue to work with outside experts to develop effective solutions to combat such abuse because our work in this area is never done.”

A government spokesperson said: “Children will be at the center of our new online safety laws with tough penalties for social media platforms that fail to protect young people from harm.

“This ground-breaking legislation will give Ofcom additional powers, with the most significant penalties imposed on companies that do not comply.

“We believe it is possible for companies to implement end-to-end encryption in a way that is consistent with public safety, and does not preclude action against child abuse.”

Meta can no longer be judge and jury over its own behaviour while children’s safety is on the brink

Written by Sir Peter Wanless, CEO of NSPCC

It’s been nearly two years since the NSPCC led a global coalition of 130 child protection organisations in writing to Mark Zuckerberg.

We asked him to halt plans to roll out end-to-end encryption across Facebook, Instagram and its messaging services until the company recognised that direct messaging is the front line of child sexual abuse, and demonstrated that it has systems in place to disrupt it.

Since we wrote to them, Facebook, now Meta, has met a conveyor belt of safety scandals with obfuscation and denial, responding to our questions and concerns with unsatisfactory answers.

What is clear is the scale of abuse children face on their sites.

Each year, Instagram alone is used in about a third of reported grooming crimes on social media. These are crimes that would go undetected under Meta’s blanket end-to-end encryption plans.

It was encouraging to read in The Telegraph that the company will pause the rollout until 2023 to consider the implications for child protection.

As we have always said, Meta should only move forward with these measures when they can demonstrate that they have built technical mitigations that can ensure that children are not at greater risk of abuse.

But on closer reading, Antigone Davis offers nothing new.

Her statement was strong on rhetoric but light on detail, and made it difficult to conclude it was anything other than a move to play for time while the tech giant weathers tough headlines.

Ms Davis cited WhatsApp as an example of action against abuse in end-to-end encrypted environments, but it is not the silver bullet Meta likes to suggest.

The numbers speak for themselves.

In 2020, the National Crime Agency received about 24,000 reports of child abuse from Facebook and Instagram, but only 308 from WhatsApp.

WhatsApp’s own data shows that fewer than 15 per cent of the accounts it suspends for child abuse lead to actionable reports to police. Meta knows the abuse is happening, but it cannot see it and cannot act on it.

Meta could have announced that it would follow Apple’s lead in developing child safety measures that can operate in end-to-end encrypted environments.

However, Will Cathcart, head of WhatsApp, called Apple’s plans “disturbing” and flatly rejected a similar approach.

By sticking to the status quo and continuing to promote what are, at best, sticking-plaster solutions, Meta still has no clear plan to protect children. It is disingenuous to say otherwise.

Mark Zuckerberg can take steps today to restore confidence. In May, Facebook’s board of directors successfully blocked a shareholder proposal to risk-assess the effects of end-to-end encryption on child abuse.

They must admit they made a mistake and commit to a full and independent risk assessment.

Actions speak louder than words.

As whistleblower Frances Haugen’s revelations show, transparency is key.

Meta’s latest Community Standards report revealed a record number of child abuse-related removals in the past six months.

Nearly 50 million items of child abuse material were removed from Facebook and Instagram, more than triple the figure for the previous six months.

Meta has attributed the massive increase to improvements in its “ability to detect”, but it remains unclear whether the company is playing catch-up after apparent technical issues last year, or whether the risk of child abuse is growing.

It is in this context that end-to-end encryption sits. We know it can strip away protections for children and have a huge impact on identifying child grooming and abuse material.

But because no agency has the power to ask questions, we have no idea how bad the tsunami could be.

Meta often says it would welcome regulation to help guide its response to abuse. But we cannot wait another two years before we can even start demanding answers.

That is why we urgently call on the government to fast-track Ofcom’s investigatory powers in the Online Safety Bill. Give the regulator the power to start asking the necessary questions and to look into the inner workings of Meta without delay.

The encryption controversy and the whistleblower’s revelations highlight that Meta can no longer be judge and jury over its own behaviour while children’s safety is on the brink.

We cannot tell whether Meta’s announcement signals a fundamental reset of its plans or is just another tactic from its PR machine.

The government can take the lead by giving Ofcom the power to demand answers.
