Growing Concerns Over Young Children's Social Media Usage in Australia
New report reveals alarming statistics about the prevalence of social media use among Australian children, prompting potential regulatory changes.
In a striking new report, Australia's eSafety regulator has revealed that more than 80% of children aged 12 and under used social media or messaging services intended for users aged 13 and over in the previous year. The platforms most frequented by this young demographic were YouTube, TikTok, and Snapchat. The finding comes as the Australian government prepares to implement a ban on social media use by those under 16, expected by the end of the year.
The report scrutinized several major companies: Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok, and Twitch. None responded to requests for comment on the findings. While most of these platforms require users to be at least 13 to create an account, some exceptions exist: Google offers supervised Family Link accounts, and YouTube Kids is a dedicated children's app; both were excluded from the report because of their supervised nature.
According to eSafety commissioner Julie Inman Grant, the report's findings will help shape the next steps in regulation. Describing children's online safety as a "shared responsibility," Inman Grant emphasized the roles of social media companies, tech developers, parents, educators, and lawmakers in safeguarding young users.
The report drew on a survey of more than 1,500 Australian children aged 8 to 12, 84% of whom said they had used at least one social media or messaging service in the past year. More than half accessed these services through a parent's or guardian's account, while a third had accounts of their own, 80% of which were set up with help from a parent or caregiver. Alarmingly, only 13% of the children with their own accounts had those accounts closed by the companies for being underage.
The report also highlighted inconsistencies in how social media platforms verify users' ages, noting the absence of robust checks to stop underage users from providing false information about their age at sign-up. While Snapchat, TikTok, Twitch, and YouTube say they use age-detection technologies, the report observes that these tools rely on ongoing user engagement and can take time to flag underage users, leaving children exposed to risks in the interim.
As Australia prepares to tighten regulation of minors' social media use, the report raises significant questions about the current state of child safety online.