Safety law for messaging apps: what does it really require?

The Technology Secretary has defended a controversial part of the Online Safety Bill that would require messaging apps to provide access to the content of private messages if ordered to do so by the regulator, Ofcom.

She said it was a sensible approach to protect children from abuse.

But several tech companies, including WhatsApp and Signal, have threatened to leave the UK if they are forced to weaken their messaging security.

The bill is expected to be passed in the fall.

Michelle Donelan spoke to us during a visit to University College London, where she announced £13 million of funding for artificial intelligence projects in healthcare.

The tech and cybersecurity community has criticized the government's proposal that the contents of encrypted messages should be accessible when they are deemed to pose a risk to children.

Currently, messages sent with end-to-end encryption can only be read by the sender and the recipient, not by the tech companies themselves.

Several popular messaging services, including Meta's WhatsApp and Apple's iMessage, enable this privacy feature by default.
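To illustrate the principle, here is a minimal sketch of public-key message encryption between two parties, assuming the third-party PyNaCl library (a wrapper around libsodium). It is a toy example, not the Signal protocol that WhatsApp actually uses; the point is simply that only the holders of the private keys can read the message, so the service relaying it sees only ciphertext.

```python
# Toy illustration of end-to-end encryption: only the sender and recipient
# hold keys, so a relay (the messaging service) sees only ciphertext.
# Assumes the PyNaCl library (pip install pynacl); this is NOT the real
# protocol used by WhatsApp or iMessage, just the underlying idea.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server relays `ciphertext` but cannot decrypt it:
# it never holds either private key.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```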

But critics argue that once a way in exists, it will not just be the good guys who use it, and some companies say they would pull their services out of the UK entirely rather than compromise the security of their service as a whole.

Ms. Donelan said the government is not against encryption, and that access would only be required as a last resort. “Like you, I want my privacy because I don’t want people to read my private messages. They will be bored but I don’t want them to,” she said.

“However, we do know that on some of these platforms, they are sometimes hotbeds of child sexual abuse and exploitation.

“And we need to be able to access that information should that issue arise.”

She also said tech companies will invest in technology to solve this problem.

“Technology is being developed that would allow you to have encryption and still access this specific information, and the safeguards we have in place are very clear that this can only be used in relation to child sexual exploitation and abuse.”

The current favorite for doing this is called client-side scanning: it involves installing software on the device itself that scans content and sends out alerts when it is triggered. But the approach has yet to prove itself. Apple paused its own client-side scanning plans after a backlash, with the technology dubbed a “spy in your pocket”.
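For a sense of how client-side scanning works in principle, the hypothetical sketch below checks outgoing content on the device against a blocklist of known-bad fingerprints before it is encrypted and sent, and raises an alert on a match. Real proposals, such as Apple's shelved system, used perceptual image hashing that tolerates small edits rather than exact cryptographic hashes, but the overall flow is similar.

```python
# Simplified sketch of client-side scanning: content is fingerprinted on
# the device and compared against known-bad hashes BEFORE encryption.
# Hypothetical example only; real systems use perceptual hashes.
import hashlib

# Hashes of prohibited content, shipped to the device by the provider.
# (This example entry is the SHA-256 of the bytes b"test".)
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def report_match(fingerprint: str) -> None:
    # In a real deployment this would notify the provider or authorities;
    # here it just prints.
    print(f"match reported for fingerprint {fingerprint}")

def scan_before_send(payload: bytes) -> bool:
    """Return True if the payload may be sent, False if it is blocked."""
    fingerprint = hashlib.sha256(payload).hexdigest()
    if fingerprint in BLOCKLIST:
        report_match(fingerprint)
        return False
    return True

if __name__ == "__main__":
    print(scan_before_send(b"test"))           # False: matches the blocklist
    print(scan_before_send(b"holiday photo"))  # True: passes
```

Critics' core objection is visible in this structure: the scanning happens on the device before encryption, so the privacy guarantee of end-to-end encryption no longer covers everything the user sends.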

The children’s charity the NSPCC says its research shows that the public is “very supportive” of efforts to tackle child abuse on encrypted platforms.

“Tech companies should demonstrate industry leadership by listening to the public and investing in technology that protects both the safety and privacy rights of all users,” said Richard Collard, head of child safety online policy at the NSPCC.

But Ryan Polk, director of internet policy at the Internet Society, a global nonprofit focused on internet policy, technology and development, is skeptical that the technology is ready.

“The government’s Safety Tech Challenge Fund, which was supposed to come up with a magical technical solution to this problem, failed to do so,” he said.

Mr Polk said scientists from the UK’s national research centre on privacy, harm reduction and adversarial influence online had found serious problems with the technologies put forward, including that they invade privacy and undermine the end-to-end security needed to protect the security and privacy of UK citizens.

If the UK government cannot see that the Online Safety Bill would in effect ban encryption, he argued, it has deliberately blinded itself to the dangers ahead.