
Instagram launches new parental controls in UK

Image: A teenager looks at her phone (Getty Images)

Instagram owner Meta is launching new parental controls across the platform in the UK on 14 June.

They include the option of setting daily time limits of between 15 minutes and 2 hours, after which a black screen appears on the app.

Parents can also schedule break times and see which accounts their child has reported, and why.

In addition, the tech giant is rolling out a parent dashboard on all Quest virtual reality headsets worldwide.

Parents can now invite their children to activate the supervision tools - previously these could only be initiated by the young person.

The new VR controls include purchase approval, app blocking and the option to view their child's friends' lists.

Another Instagram feature being trialled is a "nudge" tool, which prompts teens to look at different subjects if they repeatedly search for the same thing.

The Instagram tools were introduced in the US in March.

Image: A selection of the new screens around time limits and supervision (Instagram)

Anxiety and depression

Instagram is officially for young people aged 13 and over, and Meta says its Oculus VR content is also designed for teens and above - although there are younger children using both platforms.

In 2021, Instagram paused plans to create an Instagram platform for children below the age of 13, following a backlash.

Also in 2021, the Wall Street Journal reported that Meta - which owns Facebook and WhatsApp as well as Instagram - had conducted research which found that teenagers blamed Instagram for increased feelings of anxiety and depression, and had kept the findings secret.

Instagram said the story focused "on a limited set of findings" and cast the company "in a negative light".

In 2017, 14-year-old Molly Russell killed herself after viewing self-harm and suicide content on the platform.

At a pre-inquest review in February 2021, the coroner heard that she had used her Instagram account more than 120 times a day in the last six months of her life.

In a statement, Instagram said it "does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind".