Internet intermediary liability for defamatory third-party content in Australia: the next chapter
Most Australian States and Territories look set to introduce defamation law reforms dealing with the liability of internet intermediaries for third-party content by 1 July 2024. If you host third party content online, review your practices before the new laws arrive.
For a number of years in Australia, the liability of internet platforms and intermediaries for content posted by users of their platforms has been a complex and ever-changing area of law. A series of recent cases, drawing on well-established defamation principles on publication, confirmed that search engine operators and page hosts are considered publishers of the content they host (for example, Fairfax Media Publications Pty Ltd v Voller [2021] HCA 27 and Google Inc v Duffy [2017] SASCFC 130). The High Court decision in Google LLC v Defteros [2022] HCA 27 seemed to be a retreat from the high watermark of earlier cases in the area, with a majority in that decision finding that Google was not a publisher of content made available via hyperlinks in certain search results.
In 2018, the Council of Attorneys-General Defamation Working Party was convened to review defamation law. Emerging from that process was "Stage 2, Part A": a set of proposed defamation law reforms aimed at clarifying the position on internet intermediary liability for third-party content (following the initial "Stage 1" reforms introduced in July 2021). On 22 September 2023, the Standing Council of Attorneys-General announced that the internet intermediary liability defamation laws had been approved by a majority of States and Territories, which aim to implement the reforms by 1 July 2024. These reforms, the Model Defamation Amendment (Digital Intermediaries) Provisions 2023, will amend Australia's "uniform" defamation laws, and will affect most providers of online services that accommodate posts by third parties, from internet start-ups to social media page administrators.
Online services and digital intermediaries
Underpinning the new laws is the concept of a "digital intermediary". A digital intermediary, in relation to the publication of digital matter (for example, online content such as a social media post), means a person, other than the author, originator or poster of the matter, who provides or administers the online service by means of which the matter is published.
"Online service" means any service that enables a person to use the internet. It includes a service that allows a person to access or connect to the internet (for example, an ISP), send or receive content, search for content (for example, a search engine), share content, interact with other persons, or store content (for example, a cloud service). The new Provisions give the following examples of online services:
- an internet-based social media platform.
- a forum created or administered by a person using a facility provided by an internet-based social media platform that enables users to share content or interact with other users about a topic.
- a website or other internet-based platform that enables knowledge to be shared by or with its users.
Exemptions and defences
The new Provisions categorise different types of digital intermediaries according to their relationship to the content that they host. At one end of the spectrum are "passive" service providers, or "mere conduits". These are caching services and conduit services (for example, ISPs or cloud storage providers). Those service providers are exempt from defamation liability in relation to the content they host (new section 10C).
In the next category are search engine providers. They are exempt from liability in relation to defamatory search engine results they provide to users (save for any sponsored search results or autocomplete predictive searches) (new section 10D).
Finally, there is a defence for all other digital intermediaries, modelled on the innocent dissemination defence, set out in new section 31A. To obtain the benefit of this defence:
- The digital intermediary must have an accessible complaints mechanism for the plaintiff to use, such as a dedicated webpage for lodging complaints, an email address or a direct message address.
- The digital intermediary must not have been actuated by malice in providing the online service. For example, a forum administrator who creates a page specifically inviting people to post defamatory content on the page might be found to have been actuated by malice in providing the service, and will not be able to benefit from the defence.
- If the plaintiff gave the defendant a written complaint under this section about the publication, the digital intermediary must take "reasonable access prevention steps" in relation to the relevant content on their service, before, or within 7 days after, the complaint was given. An "access prevention step" means a step, reasonable in the circumstances, to remove or block access to the relevant content. For example, if a forum administrator receives a complaint about a post on their forum, they have 7 days in which to consider the complaint and remove the defamatory content. If those steps are taken, the administrator would have a defence.
A written complaint under section 31A must notify the intermediary of the following:
- The name of the plaintiff.
- The matter and where it can be located.
- That the plaintiff considers the matter to be defamatory.
The explanatory note to the new Provisions makes clear that a complaint containing only the basic information required under section 31A would be insufficient to constitute a concerns notice, because concerns notices require more detailed information (although a concerns notice would operate as a sufficient complaint if it included the basic information required in a written complaint under section 31A).
If the digital intermediary does not have a defence under section 31A, they are not precluded from seeking to rely on other defamation defences (such as justification).
In light of the above reforms, the Standing Council of Attorneys-General has also announced that the Australian Government will prepare an exemption for State and Territory defamation laws from section 235(1) of the Online Safety Act 2021 (Cth). That provision creates a broad safe harbour from liability arising under State and Territory laws for content hosted by Australian hosting service providers and internet service providers. It and its forebear, clause 91(1) of Schedule 5 to the Broadcasting Services Act 1992 (Cth), have historically received minimal judicial consideration in the context of defamation law.
Other key defamation reforms
- Digital intermediaries with potential exemptions from liability under sections 10C or 10D can seek early determination of whether they have an exemption (the provision is similar to that governing the early determination of serious harm) (new section 10E).
- When a Court is considering preliminary discovery orders to assist in the identification of posters of defamatory internet content, the Court will be required to consider privacy, safety or other public interest considerations (new section 23A).
- If a plaintiff obtains judgment against a defendant in a defamation matter, the Court may make orders against a third party digital intermediary to remove the defamatory matter from the intermediary's online service (new section 39).
"Uniformity" lost?
The Model Defamation Provisions are agreed between the States and Territories and passed individually in each jurisdiction. South Australia, however, has said it does not agree to all parts of the reforms and will undertake further review of these proposals, giving rise to further divergence in defamation law between the States and Territories of Australia (the earlier suite of defamation reforms, Stage 1, is yet to be implemented in Western Australia or the Northern Territory).
Key takeaways
In an August 2022 discussion paper on the proposed reforms, it was suggested that the new defence for digital intermediaries might incentivise the removal of content, because the relevant intermediary is often ill-equipped to assess the defensibility of the content it hosts, and such an assessment is difficult to make without access to original source material verifying the information. Nevertheless, the new Provisions are intended to clarify who an intermediary is in the context of online defamation and to give digital intermediaries certainty as to when they may become liable for content hosted on their services.
If you or your business host third-party content online, you will need to review your practices relating to the hosting of that content before the new laws arrive. Consider whether your organisation acts as an intermediary in respect of any online content and intends to seek the benefit of any relevant defences. Review technical processes (such as whether you have an adequate process for receiving and assessing complaints within the relevant timeframes). Please contact Clayton Utz if you have any questions or require assistance.