Facebook risks becoming a ‘one-stop grooming shop’ with encrypted messages

Facebook risks becoming a ‘one-stop grooming shop’ if it presses ahead with plans to encrypt its messaging services, NSPCC warns

  • End-to-end encryption on Facebook would bring greater privacy to messages
  • But the NSPCC says this would give offenders a ‘place to lie in the shadows’
  • The children’s charity identified more than 4,000 child abuse instances that took place on Facebook, WhatsApp or Instagram in 12 months

Facebook has been warned it risks becoming a ‘one-stop grooming shop’ by the NSPCC over the tech giant’s controversial encryption plans.

In a statement, the NSPCC’s head of child safety online said the social network will be giving offenders ‘a place to lie in the shadows’ if it brings end-to-end encryption to Facebook and Instagram.

Facebook’s founder Mark Zuckerberg announced plans for end-to-end encryption on Messenger at the start of May. 

Under the proposed plans, Facebook users would be afforded the same level of privacy as users of its WhatsApp service.


Facebook (stock photo) has come under fire from Home Secretary Priti Patel over its plans for end-to-end encryption 

WHAT IS END-TO-END ENCRYPTION?

End-to-end encryption ensures only the two participants of a chat can read messages, and no-one in between – not even the company that owns the service.

End-to-end encryption is intended to prevent data being read or secretly modified when it is in transit between the two parties.

The cryptographic keys needed to decrypt the messages are held only by the two people in each conversation. 

Without end-to-end encryption, messages are accessible to the company running the service – which means governments can compel access to them for law enforcement purposes.
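The key-exchange idea the box above describes – both participants deriving a shared secret that the relaying server never learns – can be sketched with a toy Diffie-Hellman exchange. This is purely illustrative (a small Mersenne prime and an XOR keystream, not real cryptography); production systems such as WhatsApp use the Signal protocol with elliptic-curve keys and authenticated ciphers.

```python
# Toy sketch of end-to-end key agreement: each endpoint keeps a private
# key, only public halves travel through the server, and both ends
# derive the same encryption key. NOT real cryptography.
import hashlib
import secrets

# Toy public parameters (real protocols use elliptic curves, e.g. X25519).
P = 2**127 - 1  # a Mersenne prime
G = 3

def keypair():
    """Private key stays on the device; only the public half is sent."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv, their_pub):
    """Both sides compute the same secret; the server relaying the
    public keys cannot, because it never sees a private key."""
    secret = pow(their_pub, my_priv, P)
    raw = secret.to_bytes((secret.bit_length() + 7) // 8, "big")
    return hashlib.sha256(raw).digest()

def xor_encrypt(key, data):
    """Toy XOR keystream derived from the shared key (illustration only).
    The same function decrypts, since XOR is its own inverse."""
    stream = hashlib.sha256(key + b"stream").digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob exchange only public keys through the server.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
key_a = shared_key(a_priv, b_pub)
key_b = shared_key(b_priv, a_pub)
assert key_a == key_b  # both ends hold the key; the server holds neither

ciphertext = xor_encrypt(key_a, b"hello")
print(xor_encrypt(key_b, ciphertext))  # b'hello'
```

The point of the sketch is the last assertion: the relay in the middle only ever handles public keys and ciphertext, which is why encrypted messages cannot be read by the platform – or handed over to investigators.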

Facebook-owned WhatsApp is already encrypted, and now Mark Zuckerberg is looking to do the same with Facebook Messenger and Instagram Direct. 

But the children’s charity believes the move would enable more child sex offenders to exploit their victims and avoid detection from law enforcement agencies. 

‘If Facebook fails to guarantee encryption won’t be detrimental to children’s safety, the next government must make clear they will face tough consequences from day one for breaching their duty of care,’ said Andy Burrows, NSPCC’s head of child safety online policy.

The charity has also highlighted the scale of child abuse images and online sexual offences via messenger services in England and Wales that have been identified by police.

Out of 9,250 such instances, more than 4,000 were carried out on Facebook, WhatsApp or Instagram.

The data – obtained by the charity from freedom of information requests to police forces, covering April 2018 to April 2019 – shows that 22 per cent were reported on Instagram and 19 per cent on Facebook or Facebook Messenger.

Only three per cent, or 299 instances, were from Facebook-owned WhatsApp, which has had end-to-end encryption since 2016.

The NSPCC said this figure highlights how easily some child abuse cases can go unnoticed when encryption is enabled.

Only 32 of the 43 police forces that were approached provided information, the NSPCC said, and the charity fears the total number of sex abuse cases via messaging services could be much higher.

‘For far too long, Facebook’s mantra has been to move fast and break things, but these figures provide a clear snapshot of the thousands of child sex crimes that could go undetected if they push ahead with their plans unchecked,’ said Mr Burrows.

Facebook has said it has been proactively removing more harmful and abusive content than ever but the NSPCC fears end-to-end encryption on Messenger could be highly dangerous (stock photo)

The charity is also pushing for Facebook to delay plans to encrypt adult accounts until the tech giant proves it can swiftly detect and remove child abuse on the platform.  

Last month, the NSPCC accused Facebook of using ‘selective big numbers’ to promote its latest transparency report.

The social network said it had detected 11.6 million pieces of content related to child exploitation and abuse from July to September this year.

Around 99 per cent of child exploitation content had been detected and removed, Facebook claimed.

But the NSPCC responded by saying the social network’s statistics ‘underplayed’ the experience of vulnerable young people who see such images.

Home Secretary Priti Patel also said earlier this year that Facebook’s plans for greater encryption will hurt the fight against terrorists and paedophiles.

‘Where systems are deliberately designed using end-to-end encryption, which prevents any form of access to content, no matter what crimes that may enable, we must act,’ she wrote.

A Facebook spokesman previously said: ‘There is no place for grooming or child exploitation on our platforms.

‘We use technology to proactively remove it and are developing further ways to detect patterns of harmful behaviour in order to ban and report those responsible.

‘We work closely with child protection authorities in the UK, and we’re consulting with experts on the best ways to implement safety measures before fully implementing end-to-end encryption.’