Australia: The scope of the Online Safety Act
The Online Safety Act 2021 ('the Act') commenced on 23 January 2022, replacing a patchwork of online safety legislation with a clearer, more consistent regulatory framework. The Act gives new powers to the eSafety Commissioner, Julie Inman Grant, and aims to protect Australians from online harm and bullying. Katherine Sainty, Director of Sainty Law, provides an overview of how the Act works and what it aims to achieve.
Who does the Act protect?
The Act creates a world-first Adult Cyber Abuse Scheme to protect Australians aged over 18. The eSafety Commissioner can now fine or penalise individuals who post cyber-abuse material that targets adults.
The Act also introduces a broader Cyberbullying Scheme to protect Australian children from harm occurring on a wider range of online services, expanding on the previous regime, which addressed only harm on social media. The eSafety Commissioner may order service providers to remove illegal and restricted content, such as child sexual exploitation material and terrorist content, on any online service. This includes online game chats, websites, and direct messaging platforms.
What is considered adult cyber-abuse?
The Act sets a much higher threshold for what counts as cyber-abuse of adults than for cyberbullying directed at children.
According to the eSafety Commissioner, the abuse must be both 'intended to cause serious harm' and 'menacing, harassing or offensive in all the circumstances'. This could include making realistic threats, placing individuals in real danger, or being excessively malicious or unrelenting. Offensive, defamatory, or disagreeable comments alone will not be considered adult cyber-abuse under the Act.
The Government created this high threshold to address concerns of censorship and free speech. However, the eSafety Commissioner will provide support, information, and advice to individuals impacted by online bullying where the content does not meet this high threshold.
What does the Act do?
The Act makes social media platforms, websites, and online services more accountable for the online safety of their users. The eSafety Commissioner can now receive reports of bullying content or the non-consensual posting of intimate images where a platform has failed to remove the content after being asked. If a platform fails to remove the bullying content or image within 24 hours of being notified by the eSafety Commissioner, it may be fined up to AUD 555,000 (approx. €349,500).
The eSafety Commissioner may also require internet service providers to block access to content that 'promotes, incites, instructs in or depicts abhorrent violent conduct'. This could include content involving rape, torture, murder, attempted murder, and terrorist acts.
What should I do if I am being bullied?
Under the Act, you should first ask the website or platform to remove the bullying content. If the website or platform does not remove the content, you can report it to the eSafety Commissioner. An investigation will commence, and if your complaint is upheld, the eSafety Commissioner will direct the website or platform to remove the content within 24 hours or risk a fine.
However, the Act only allows the eSafety Commissioner to order the removal of content and impose fines. If further action is required, you should contact the police.
If you want to find out more about what the new Act means for you or your business, you can visit the eSafety website.
Katherine Sainty, Director
Sainty Law, Sydney