Facebook's global head of safety hasn't fully read UK's Online Safety Bill

Tech executives and lawmakers around the world all seem to agree — social media regulation is critical and it is coming. One of the first pieces of legislation to come into play will probably be the UK’s Online Safety Bill, the draft text of which is being examined by a parliamentary committee.

That bill will help set the tone for safety regulation around the world, as other countries also seek to ensure citizens are protected from harmful content, and the draft legislation has been available since May. It might be reasonable, then, to assume that key executives from social media companies — such as Facebook, which has been facing intense criticism over the risks it poses — would have scrutinized it in detail by now. That’s not necessarily the case, apparently.

On Thursday, Parliament’s Draft Online Safety Bill committee took evidence from Facebook’s head of safety, Antigone Davis. Asked whether she would be the person in charge of submitting company risk assessments to the UK regulator, Davis responded: “I don’t know the details of the bill.”

Members of Parliament expressed their concern that Davis was attending the session without having read the draft bill she was providing evidence on. “I just have to say I’m deeply, deeply shocked that you aren’t on top of the brief about what this bill is all about and what it means not just to us, but to the whole of the world as well,” said MP Suzanne Webb.

“I actually am familiar with the bill,” responded Davis.

When asked to clarify whether or not she had read the bill, Davis replied: “I’m familiar with parts of the bill,” implying that she had not read the bill in full.

The 145-page Online Safety Bill, previously known as the Online Harms Bill, would put UK media watchdog Ofcom in charge of regulating tech companies in Britain. Ofcom would have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.

Chris Yiu, Facebook’s director of public policy for Northern Europe, who was also present at the hearing, said he had read the bill, including the explanatory notes.

Facebook didn’t immediately respond to a request for additional comment.

Following years of criticism that it doesn’t do enough to protect people’s privacy or to eliminate hate speech and misinformation, Facebook has been hit with renewed allegations that it puts profits over user safety. Internal documents leaked by whistleblower Frances Haugen led to a flurry of stories in recent weeks from The Wall Street Journal and a consortium of US and international news outlets about the company’s policies, practices and decision-making.

Last week, another Facebook whistleblower, Sophie Zhang, giving evidence to the same parliamentary committee, said she had read the bill in full.

“It seems like basic politeness to me that if I’m asked to testify regarding an upcoming bill, I should actually read the bill in question,” said Zhang on Twitter on Thursday.
