Amidst a surge of lawsuits over social media’s alleged harm to children and privacy violations, there are increasing calls – locally and internationally – for stricter regulation, particularly regarding age restrictions on the platforms.
Officially, the popular social platforms – Meta's Facebook, Instagram, and Threads; X (formerly Twitter); YouTube; and TikTok – have set 13 as the minimum age for user account registration.
Despite these age requirements, the platforms do not verify users' identities during registration, allowing children below the minimum age to create accounts.
The Human Rights Commission of Malaysia (Suhakam) commissioner, Prof Datuk Noor Aziah Mohd Awal, urged setting a minimum age of 16 for accessing social media.
She stressed that younger children are particularly vulnerable to being manipulated and exploited online.
They may not be aware of lurking dangers such as paedophiles and grooming, which involve building trust with a child to engage in sexual abuse.
“I think 13 or 14 is too young. It is so easy to influence children because they are so vulnerable and at the age of trying and looking for things.
“That’s the reason why we need to have some sort of control over all these accounts. There must be control and limits on children’s usage of social media apps.
“Children have the right to information, but even then, some information could be restricted or limited according to age.
“It is true that we are in a digital age, but we still have to protect children,” she said in the report.
The Malaysian Communications and Multimedia Commission (MCMC) has yet to implement specific age restrictions for social media accounts, though the issue is on its agenda. In April, the agency announced that it is working with TikTok and Meta to address social media account ownership by children under the age of 13.
The Deputy Communications Minister, Teo Nie Ching, said in a report on April 30 that social media platforms leave account creation entirely to users, without any mechanism to prove that their age meets the minimum requirements.
“The Communications Ministry has had a series of meetings with platform providers such as Meta and TikTok to discuss ways to tackle the issue of social media ownership among children under 13.
“The MCMC will work closely with social media platforms to prevent the setting up of accounts by children under 13, as well as to shut down existing accounts owned by children under 13,” she said.
She further advised parents and guardians to educate themselves on the platforms’ parental control tools and safety features, while also staying up to date on the latest trends and risks to provide better guidance.

Raising the bar
Stricter regulations for the minimum age to use social media, along with more effective verification tools, have been a hot topic globally.
France enacted a law in June of last year, establishing a “digital majority” at the age of 15, which mandates parental approval for social media use by children under that age. This also ensures that parents retain control over their children’s access to the digital space until they reach this designated age.
Companies that fail to comply will be subject to a fine of up to 1% of their global revenue.
In April, French President Emmanuel Macron pushed for the law to be adopted European Union-wide.
This call was echoed by Denmark’s Prime Minister Mette Frederiksen last month, who emphasised the importance of effective age verification tools to enforce these regulations.
She wrote in the Danish news daily Politiken that the EU’s Digital Services Act, which took full effect in February, does not go far enough in addressing the challenges posed by social media platforms. She proposed banning addictive design features and advertising aimed at minors, and requiring notifications that tell users how much time they have spent on a platform.
Meanwhile, the United Kingdom’s Office of Communications (Ofcom) plans to implement strict age checks for social media, with platforms required to perform identity and age checks using personal identification documents.
The fine for failure to comply is hefty, at up to 10% of a social media firm’s global turnover. The code of practice would also ban self-declaration of age, which currently allows underage users to create social media accounts.
In the United States, states like Florida are ahead of the curve, having already codified age restrictions into their legal frameworks. Florida enacted a law in March requiring children under 15 to obtain parental consent to have a social media account; the law takes effect next year.
Ohio has also enacted restrictions on child social media use, setting a slightly higher threshold: users under 16 require parental consent. Similar policies are under development in states such as Arkansas, New Jersey, California, and Utah.
In an op-ed published in The New York Times, the US surgeon general, Vivek Murthy, called for the US Congress to require social media platforms to include warning labels similar to tobacco products.
Murthy cited figures indicating that spending more than three hours a day on social media doubles the risk of anxiety and depression. He also cited data showing that in the summer of 2023, adolescents’ average daily use was nearly five hours.
“A surgeon general’s warning label, which requires congressional action, would regularly remind parents and adolescents that social media has not been proven safe,” he wrote.
“Evidence from tobacco studies shows that warning labels can increase awareness and change behaviour,” he added.
He further drew a comparison to how mandates for seatbelts, airbags, and crash testing were used to address vehicular accidents and make cars safer, also describing social media as a key contributor to the mental health crisis plaguing young people.