Australia’s online safety regulator, the eSafety Commissioner, has raised concerns about whether social media platforms operated by Meta, TikTok, Snapchat, and YouTube are complying with the country’s ban on users under 16. In a report published on Thursday, the regulator flagged potential violations of the ban and warned that court enforcement may follow if the platforms fail to comply.
The ban, which took effect in December 2025, prohibits social media platforms from allowing users under the age of 16 to access their services. It was introduced to protect young users from online risks such as cyberbullying, sexual predators, and exposure to inappropriate content. The watchdog’s report, however, suggests that some of the most popular platforms in Australia may not be fully complying.
According to the report, the first of its kind since the ban was introduced, underage users can still access these platforms too easily. The report also points to a lack of effective age verification, which could allow children to bypass the ban entirely. Given the platforms’ enormous user bases, the watchdog considers this a serious risk to young users.
The report also notes that underage users may supply false or inaccurate birthdates to gain access, a common tactic for circumventing age restrictions. This could expose young users to harmful content or to interactions with older users, leaving them vulnerable to online exploitation.
The watchdog has stressed that platforms must fully comply with the ban to protect young users, and has called for stricter enforcement of age verification measures. If platforms fail to comply, it warned, court enforcement could follow to hold them accountable for the violations.
The report also criticises the platforms’ lack of transparency about their compliance with the ban. Without that information, the watchdog says, it is difficult to assess the effectiveness of their age verification measures or to hold them accountable for non-compliance.
In response, TikTok said it takes the safety of its users seriously and has implemented strict age verification measures to comply with the ban. Snapchat said its platform is designed for users aged 13 and above and that it will continue working with the watchdog to ensure compliance. YouTube emphasised its commitment to keeping young users safe and said it will continue to improve its age verification processes.
Meta, which owns Facebook and Instagram, has not yet commented on the report. The watchdog has nonetheless reminded all social media platforms, including Meta, of their responsibility to keep young users safe.
In short, the watchdog’s report highlights potential violations of Australia’s under-16 platform ban by services operated by Meta, TikTok, Snapchat, and YouTube, and calls for stricter age verification and greater transparency to protect young users from online risks. The watchdog has made clear that platforms that fail to act face the prospect of court enforcement.

