But we didn’t write that! Are we at risk for third-party posts on our public social media pages?
Online trolls, bullies and stirrers may give businesses a new headache following a recent decision of the NSW Supreme Court in Voller v Nationwide News and Ors [2019] NSWSC 766.
The case is the first time an Australian court has had to consider whether a business can be held responsible for defamatory third-party comments on its public Facebook page, but the decision has ramifications for users of other forms of social media.
An article on Mr Voller attracts some attention
Dylan Voller, a former detainee at the Don Dale Youth Detention Centre, sued the owners of the Sydney Morning Herald, The Australian, Sky News and others over comments made by third parties on news articles the owners had posted to their public Facebook pages.
The court was asked to determine a preliminary, and novel, question in two parts:
- whether the news owners met the "publication element" of the defamation test under the NSW Act (the Defamation Act 2005 (NSW)); and
- whether the news owners could rely on the "innocent dissemination defence".
Broadly, the news owners argued that they were a passive medium and that it was not reasonable or practical for them to moderate all third-party comments before they were posted. They claimed to have no control over, or prior knowledge of, the contents of comments before users posted them, and could not require users to seek approval prior to posting.
In the case of Fairfax, the court was told that each post could receive anywhere from 100 comments to thousands of comments, posted at any time of the day or night. Monitoring all comments was further complicated by the use of sub-threads of comments, particularly because any comment, whether on a sub-thread or otherwise, may be posted many days after the Administrator's initial post. Users could also circumvent keyword moderation, for example by slightly misspelling words (eg. "Facebo0k"), and could still post pictures and memes, which a keyword filter cannot read.
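To illustrate the point, here is a minimal sketch of a naive keyword filter of the kind this evidence describes (the blocklist, function name and logic are our own illustration, not Facebook's actual moderation tooling, which is configured through page settings rather than code):

```python
# Illustrative only: a naive keyword filter of the kind described in evidence.
BLOCKLIST = {"criminal", "thug"}  # words the page administrator wants to catch

def is_hidden(comment: str) -> bool:
    """Hide a comment if any blocklisted word appears verbatim."""
    words = (word.strip(".,!?") for word in comment.lower().split())
    return any(word in BLOCKLIST for word in words)

print(is_hidden("He is a criminal"))       # True:  exact match is caught
print(is_hidden("He is a crimina1"))       # False: one swapped character slips through
print(is_hidden("[meme image, no text]"))  # False: pictures and memes have no words to match
```

An exact-match filter catches only the precise strings it is given, so a single substituted character, as in "Facebo0k", is enough to defeat it, and image-only comments contain no text to match at all.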
The news owners claimed that it would be "physically impossible", or "would require [a] disproportionate amount of effort", for their teams to monitor every comment. (There was evidence that some posts receive up to 1,800 comments.) The continual monitoring, blocking and/or hiding of Facebook comments on their public Facebook pages would, they said, require an effort disproportionate to the actual number of occasions on which users would be blocked or comments deleted or hidden.
Is Facebook a bus shelter, and other mysteries of defamation law
The Court looked to other cases for comparisons and said this case was to be distinguished from:
- cases of persons putting up defamatory posters at council bus shelters, because councils do not build bus shelters for hosting posters and do not approve, agree to, or encourage the posting of posters;
- Google search results, because Google merely summarises an article published elsewhere on the internet. By contrast, a business can either vet comments prior to publication using the broad filter strategy (described below) or disable comments entirely; and
- third-party comments posted to a private Facebook page, where there was no "invitation" to the public to post comments and the evidence was that, unlike on public Facebook pages, the page owner is unable to exercise meaningful editorial control.
What was happening here, said the Court, was more analogous to an online forum where forum owners play an active role in encouraging postings, create channels for users' different interests, aim to attract as many users as possible, and derive an income from advertising.
Three types of moderation strategies
What about moderation of the comments? The Court found that there are essentially three ways to moderate comments on businesses' Facebook pages:
- disable all public commenting;
- allow comments, but remove unlawful comments periodically or in response to complaints (this was broadly the news owners' strategy); or
- hide all comments automatically by applying a filter of very common words, then approve the comments that are not defamatory (in our words, the broad filter strategy; see the sketch after this list).
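To make the third option concrete, the logic of the broad filter strategy can be sketched as follows (a minimal sketch under our own assumptions: the common-word list and function names are illustrative, not Facebook's actual settings or API):

```python
# Sketch of the broad filter strategy: hide every comment by default by
# filtering on very common words, then show only human-approved comments.
COMMON_WORDS = {"a", "an", "the", "is", "it", "to", "of", "and", "you"}

def auto_hidden(comment: str) -> bool:
    """Nearly every English comment contains at least one very common word."""
    return any(word in COMMON_WORDS for word in comment.lower().split())

def visible_comments(comments: list[str], approve) -> list[str]:
    """Return only the comments cleared for public display.

    `approve` stands in for a human moderator's judgement call.
    """
    visible = []
    for comment in comments:
        if auto_hidden(comment) and not approve(comment):
            continue  # stays hidden unless a moderator clears it
        visible.append(comment)
    return visible

# Usage: only the comment the "moderator" clears becomes visible.
print(visible_comments(
    ["What a great article", "You are a criminal"],
    approve=lambda c: "criminal" not in c.lower(),
))
```

The cost of this strategy sits entirely in the manual approval step: every auto-hidden comment needs a human decision, which is where the staffing estimates discussed below come from.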
Why the news owners were liable in defamation for third-party comments
With that technical background, the Court's key findings were that:
- a comment that is compiled or authored by a third-party user on a public Facebook page "is not published until such time as the owner of the public Facebook page allows persons to read it, or, in the words of the High Court in Gutnick, it is the owners of the public Facebook page that would render the material comprehensible and allow it to be downloaded";
- it is possible to produce a public Facebook page that does not allow any comments;
- the news owners could, with sufficient staff, use the broad filter strategy to hide all comments for prior review and approval. The evidence suggested that it takes no more than an average of 10 seconds to review a comment, in which case 7,000 comments per day (the greatest number of comments reported on any of the defendants' pages on any day) would require the equivalent of an additional 2.5 employees, assuming, contrary to the evidence, that no work hours were currently expended on the task (the arithmetic is set out in the sketch after this list);
- the news owners' original posts would predictably excite negative comments about individuals. Indeed, the most important purpose of their public Facebook pages was probably to generate comments, increasing the page's popularity and generating income; and
- if the news owners considered the defamation risks of their original posts, they would have likely concluded that those posts would give rise to "nasty and defamatory comments".
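The Court's 2.5-employee estimate can be checked with simple arithmetic (the 10-second review time and the 7,000-comment day come from the evidence; the 7.6-hour working day, a standard 38-hour week, is our assumption):

```python
# Back-of-envelope check on the Court's "additional 2.5 employees" figure.
seconds_per_comment = 10      # average review time, per the evidence
comments_per_day = 7_000      # busiest day reported for any defendant's page
hours_per_workday = 7.6       # our assumption: standard 38-hour week

review_hours = comments_per_day * seconds_per_comment / 3600
extra_staff = review_hours / hours_per_workday

print(f"{review_hours:.1f} staff-hours/day ≈ {extra_staff:.1f} extra employees")
# -> 19.4 staff-hours/day ≈ 2.6 extra employees
```

That is roughly 19.4 staff-hours of review per day, or around two and a half full-time staff, consistent with the figure the Court accepted.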
Because the hosts of a public Facebook page could control and supervise the material posted to their sites, the news owners could not say that they were incapable of being aware of specific defamatory posts. Hence the Court regarded them as primary publishers of the third-party comments, not mere conduits. Further, the defence of innocent dissemination does not apply to primary publishers.
Presumably, the parties did not consider that Schedule 5, cl 91(1) of the Broadcasting Services Act 1992 (Cth) applied to the facts (eg. because the news owners were not hosting internet content in Australia), as the provision is not mentioned in the judgment.
Conclusion
Media reports suggest that at least one of the defendants has indicated an intention to appeal the decision. Calls have also been made for defamation law reform (timely, as submissions have recently been made in response to the Council of Attorneys-General's Review of Model Defamation Provisions Discussion Paper).
However, appeals and law reform take time. In the meantime, businesses (in particular news organisations) will need to decide whether the risk of defamation claims warrants engaging the staff needed to adopt the broad filter strategy, or whether to accept the risk of the occasional defamation claim and address any claims on an ad hoc basis.
Each case will of course turn on its own factual matrix, and in Voller there were a few distinguishing factors that made it harder for the particular defendants. These included:
- The news organisations were in an informed position to assess whether a post was going to be controversial and monitor accordingly;
- News organisations benefit from controversial comments and engagement with controversy, as a large portion of digital site traffic and subscriptions can be driven by Facebook page referrals and click-throughs. The more comments and engagement a post attracts, the more prominent its placement under the platform's ranking algorithm; and
- The news organisations were, in most cases, of a size where they could, with sufficient staff (eg. an additional 2.5 employees), hide all comments for prior review and approval. Small businesses and non-corporates might assess their capacity to respond differently.
Where a business considers it faces risks similar to those of the news organisations, it may be prudent, pending the outcome of any appeal, to adopt the broad filter strategy discussed in the Voller case, or to use the equivalent tools available on other forms of social media.