The United Kingdom’s passage of the so-called “Online Safety Bill” in the House of Lords last Tuesday continues to garner mixed reactions, along with expert dissent.
Framed as an effort to tackle some of the toughest longstanding issues in Internet regulation, including child sexual abuse materials (aka CSAM), violence or animal cruelty, and content related to terrorism, the bill empowers Ofcom, an official U.K. regulator, to police the Internet in new ways.
“This new legislation seeks to protect people from illegal content and activity online, with a critical focus on child protection so that they do not encounter content and activity that could be harmful to them,” Stewart Room, global head of data protection and cyber security at DWF Law LLP, wrote in Forbes.
Violations could come with a hefty price tag of up to 10% of annual turnover or £18 million (approx. $22 million), whichever is higher, according to a TechCrunch report.
The bill in context
Proponents of the measure celebrated its passing as a step towards greater public safety. “Today is a major milestone in the mission to create a safer life online for children and adults in the U.K. Everyone at Ofcom feels privileged to be entrusted with this important role, and we’re ready to start implementing these new laws,” Melanie Dawes, CEO at Ofcom, said in a statement quoted by TechCrunch. “Very soon after the Bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism.”
However, not everyone was celebrating the bill, with some experts and industry figures expressing their concerns in no uncertain terms.
“The regulated parties are the providers of user-to-user online content, which captures the big social media platforms and the like, the providers of search services, which include search engines and, to a lesser extent, publishers of online pornography,” Room wrote. “Broadly speaking, the platforms will have to perform risk assessments to understand the extent to which their users will encounter illegal or harmful content, block access to this content and arbitrate between other interests that might be engaged by blocking, such as journalistic interests and freedom of expression.”
An official explainer video from U.K. regulator Ofcom, explaining its mission and mandate. Video credit: Ofcom.
Controversy: defining “harmful content”
A lack of clarity on exactly how private tech firms are expected to comply with the sweeping mandate, Room added, along with broad provisions on what constitutes “harmful content,” has given rise to a backlash from some experts over the implications for privacy-focused software such as the end-to-end encryption offered by global firms like Apple, Meta and others.
“Plainly, this is a wide array of content and it is broad enough to capture many shades of opinion and controversy,” Room said. “For example, no one would sensibly argue against trying to curtail the spread of terrorism or child abuse content, but where do we draw the line on stunts, which, conceptually at least, will capture shows such as Jackass and countless YouTube channels? Is there a bright-line test to distinguish between the performance of a stunt and the encouragement of it?
“There are real and substantial freedom of speech issues here and, no doubt, the legislation will trigger both macro and micro legal disputes, ranging from landmark court cases through to discrete complaint and claims that the regulated service providers will have to manage in a judge and jury sense.”
Section 122 aka “the Spy clause”
“Reading section 122 at face value, it certainly poses a threat to communications encryption, which in turn creates two risks,” Room said. “Firstly, there is the obvious privacy risk: why should private sector companies become a tool of law enforcement and be empowered to survey everyone’s communication? Isn’t this mass surveillance by the back door? Secondly, there is the risk to security itself, with the argument being that once you break encryption, it’s broken for everyone, which in turn helps the bad guys. The reaction to these powers has included threats or insinuations by various technology companies that provide encrypted communication channels (e.g., end-to-end encryption) that they will shut up shop in the U.K. Many privacy activists have been horrified.”
“Clause 122, known as the ‘spy clause’, could see the private sector being mandated to carry out mass surveillance of private digital communications,” Rasha Abdul Rahim, director of Amnesty Tech at Amnesty International, said in an Amnesty International report on the issue. “It would leave everybody in the U.K. — including human rights organizations and activists — vulnerable to malicious hacking attacks and targeted surveillance campaigns. It also sets a dangerous precedent.
“It remains undeniably true that it is not possible to create a technological system that can scan the contents of private electronic communication while preserving the right to privacy,” Rahim continued. “Encryption is a crucial enabler of the rights to privacy and freedom of expression, and also has a significant impact on other human rights. UK lawmakers must urgently address Clause 122 and ensure the Online Safety Bill upholds the right to privacy before it is signed into law.”
Exactly how the bill is to be interpreted and enforced remains to be seen.
The full text of the Online Safety Bill, posted publicly by the U.K.’s Parliament.
Daniel Brown is the editor of Digital Signage Today, a contributing editor for Automation & Self-Service, and an accomplished writer and multimedia content producer with extensive experience covering technology and business. His work has appeared in a range of business and technology publications, including interviews with eminent business leaders, inventors and technologists. He has written extensively on AI and the integration of technology and business strategy with empathy and the human touch. Brown is the author of two novels and hosts a podcast. His previous experience includes IT work at an Ivy League research institution, education and business consulting, and retail sales and management.