Tech firms must ‘tame’ algorithms under Ofcom child safety rules
Social media firms have been told to “tame aggressive algorithms” that recommend harmful content to children as part of new Ofcom safety rules.
The child safety codes, introduced under the Online Safety Act, set strict new rules for how internet companies must treat children. They call on services either to make their platforms safe for children by default or to implement robust age checks to identify children and give them safer versions of the experience.
For these age-checked services, Ofcom will require algorithms to be adjusted to limit the risks to younger users. Sites such as Instagram and TikTok would have to ensure that suggested posts and “for you” pages explicitly take the age of children into account.
They will also have to make extra efforts to suppress the spread of harmful content, such as “violent, hateful or offensive material, online harassment and content promoting dangerous challenges”.
More seriously harmful content, including that related to suicide, self-harm and eating disorders, will need to be completely excluded from children’s feeds, as will pornography.
Implementing the new requirements will be a challenge. Algorithmic curation is often described as a “black box”, with some companies unsure how their own systems decide which content to promote or suppress. But Ofcom is confident its enforcement will be effective, according to Gill Whitehead, the regulator’s head of online safety.
“We’ve spoken to 15,000 children over the past two years, and they’ve told us the types of harmful content they see, what it looks like and how often they see it. We also have very strong information-gathering powers to require technology companies to provide us with this data.
“The big change is that the most harmful content [children] see must be filtered out so they don’t see it at all. And then harmful content, such as violence, harmful substances or dangerous challenges and stunts, should be downranked so they see it much less often. So those potent combinations of volume and intensity won’t be as prevalent or as harmful to children as they are today.”
The draft code is open for consultation until 17 July, before it is finalized and presented to parliament. Services will then have three months to carry out their own child risk assessments, which must be completed before implementation begins.
The Ofcom chief executive, Melanie Dawes, said: “We want children to enjoy life online. But for too long their experiences have been marred by seriously harmful content they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. This needs to change.
“In line with the new online safety laws, our proposed codes firmly place the responsibility for greater child safety on tech companies. They will need to tame aggressive algorithms that push harmful content to children in their personalized feeds and implement age checks so that children get an experience that is appropriate for their age.
“Our measures, which go far beyond current industry standards, will provide a step change in the online safety of children in the UK. Once these come into effect, we will not hesitate to use our full range of enforcement powers to hold platforms to account. This is a promise we make to children and parents today.”
The UK technology secretary, Michelle Donelan, said: “The government has tasked Ofcom with enforcing the act, and today the regulator has been clear: platforms must introduce the kinds of age checks young people experience in the real world, and address algorithms that all too readily mean they come across harmful material online.
“Once implemented, these measures will fundamentally change the way children in the UK experience the online world.”
The online child safety campaigner Ian Russell, the father of 14-year-old Molly Russell, who took her own life in November 2017 after viewing harmful material on social media, said more needed to be done to protect young people from online harm.
In his role as chair of the online safety charity the Molly Rose Foundation, Russell said: “Ofcom’s job was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm.
“The regulator has proposed some important and welcome measures, but the overall set of proposals needs to be more ambitious to prevent children encountering the kind of harmful content that cost Molly her life.”