Principal Pushes for Stricter Age Limits on Snapchat for Teen Safety

November 20, 2024

Dan McMahon, a principal with over 30 years of experience in the education sector, has voiced strong concerns about the need for stricter age restrictions on Snapchat, citing its potential misuse and harmful impact on young people. Snapchat is widely used by young Australians to send photos, videos, and messages. Unlike platforms such as Facebook, Instagram, or X, it does not feature a public posting board, a distinction that has fuelled debate over whether it should be classified as a messaging service rather than a social media platform. Such a classification could allow Snapchat to bypass the Albanese government’s proposed social media ban for children under 16. McMahon’s call to action emphasizes the urgency of addressing the unregulated space Snapchat occupies, particularly given its influence on young users.

The Debate Over Snapchat’s Classification

Communications Minister Michelle Rowland has acknowledged that Snapchat could evade the ban if it successfully argued that it is primarily a messaging service. McMahon, however, points out that Snapchat is a prevalent tool among online bullies and is commonly used to inflict harm. He cited cases in which students tragically lost their lives after online bullying carried out through Snapchat, underlining the serious concerns about the app’s impact on young users and the need for stricter regulation. The current framework, which does not necessarily recognize Snapchat’s distinct role within the social media landscape, makes it harder to create a safe online environment for teenagers. Classifying Snapchat as a social media platform would ensure the app is fully covered by the new regulatory measures designed to protect the most vulnerable online users.

Government’s Planned Legislation

In response to rising concerns, the Albanese government plans to introduce legislation for a social media age ban built around a broad and robust new definition of social media, tailored to cover the range of services perceived as harmful to young people. Minister Rowland has assured that the definition will be comprehensive enough to capture platforms traditionally viewed as social media. Platforms will be assessed against transparent criteria, with input from experts, parents, young people, and the social media companies themselves. This evaluation is intended to determine objectively whether a given service falls within the category of social media and should therefore be covered by the ban, providing a firmer foundation for regulatory action to shield young people from the adverse influences of digital platforms.

Opposition’s Stance on Loopholes

David Coleman, the opposition communications spokesman, has called on the government to close any loopholes that might allow platforms like Snapchat to evade the restrictions. He warned of the public outrage that would follow if Snapchat were exempt from social media age limits, given the app’s detrimental effects on young Australians. Coleman’s stance highlights the need for comprehensive, airtight legislation that regulates all potentially harmful platforms. Scrutinizing and closing loopholes would guard against platforms sidestepping the rules and help ensure that all forms of harmful digital interaction are addressed under the law, reinforcing a unified approach to youth online safety.

Digital Duty of Care Legislation

Alongside age restrictions, the Albanese government intends to legislate a digital duty of care for major digital entities such as Meta, Google, and TikTok. This legislation aims to mitigate the negative mental health impacts posed by social media. Drawing on similar provisions in the UK and European Union, the proposed laws will require social media companies to proactively monitor their platforms for potential risks and take reasonable steps to prevent foreseeable harms. This could include obligations to address issues such as young people’s mental health, problematic internet use, and harmful algorithms. The goal is to create a safer online space by holding companies accountable for their role in either contributing to or preventing mental health crises. This shift from content regulation alone to a proactive, systems-based prevention approach is expected to foster healthier interactions within digital environments, prioritizing user safety and well-being.

Shift to a Systems-Based Prevention Approach

Rowland has emphasized that the new framework will shift the focus from traditional content regulation toward a systems-based prevention approach, aiming to create safer and healthier digital platforms. This strategy incentivizes companies to actively prevent harm rather than merely respond to instances of harmful content. The proposed age restrictions and digital duty of care legislation, scheduled for debate when parliament reconvenes, have garnered bipartisan support. These efforts form part of a broader government initiative to protect young people from online harms, marking a significant departure from reacting to individual cases of harmful content in favour of a preventive mindset. The involvement of a wide range of stakeholders underscores a collective commitment to reforming digital platforms and fostering a safer online atmosphere.

Comprehensive Legislative Efforts

This expansive legislative endeavor reflects a pronounced shift in regulatory focus, from responding to harmful content after the fact to establishing a preventive system that safeguards young users. The Albanese government’s firm stance aligns closely with the concerns of parents, children, and the broader public, striving to ensure a safer online environment for young Australians. The proposed broad definition of social media platforms and the digital duty of care together signal a consistent approach to mitigating the adverse impacts of digital platforms on young people. By adopting a preventive, systemic methodology, the reforms move beyond reactive content control and place responsibility on digital platforms to proactively ensure user safety, marking a transformative shift in online safety protocols.

Commitment to Online Safety

These measures underscore the shift toward a preventive, systemic approach, moving away from purely reactive content regulation in pursuit of a healthier digital ecosystem. They demonstrate a significant commitment to addressing the complex, nuanced issues that arise from young people’s use of digital platforms and to protecting their online safety and overall well-being. The resulting legislation aims to hold social media companies accountable for taking affirmative steps to prevent foreseeable harms, marking a pivotal change in the landscape of online safety regulation. With this stance, Australia seeks to align with global initiatives to create safer, more supportive digital environments for future generations, so that young users can navigate the digital space without undue risk.
