
For better or worse, social media has a big impact on our lives now. In Oamaru, Facebook pages and groups have become popular platforms for discussions about local issues. But what happens when debate turns into abuse? And who is responsible for what gets posted? Gus Patterson investigates.

It started as a discussion about ideas on a local Facebook page.

But then the comments became personal attacks.

Oamaru woman Lisa Scott, who was the subject of online abuse by people commenting on a post on Oamaru Today, found the comments “extremely harmful”.

“Any kind of personal attacks from somebody who is a random stranger, in the first instance it’s bewildering because they don’t know you,” Ms Scott said.

“In the second instance, it’s very hard to put yourself in their shoes and think that you would write something so vile about a complete stranger without it even occurring to them what it might do to you if you read it.”


Some of the comments were personal attacks from people she had never met.

“It was in a town that I had felt quite safe in,” she said.

“In a small town, it’s far from anonymous.”

Unfortunately, Ms Scott’s story is far from uncommon.

“It’s a channel that’s on the turn and we can either turn it into something good, a community noticeboard where everyone is welcome, or it’s just going to get worse,” Ms Scott said.

Waitaki Mayor Gary Kircher said social media was having positive and negative impacts in the district.

“Compare it to the old days of writing a letter to the editor in the paper, by the time they wrote it . . . there was time for them to really consider what they had written,” Mr Kircher said.

“People write things [online] they would never say to another person face-to-face, but for some reason because it’s online, they feel they have the right to be abusive.

“It takes away from the positive ability to share views and inform people, and turns into disparaging anyone’s views that don’t match their own.”

Victims of online abuse or defamation could take legal action under the Defamation Act 1992 or the Harmful Digital Communications Act 2015 (HDCA), Berry and Co solicitor Louise Laming said.

Defamation allegations were pursued in the civil jurisdiction, often in the High Court, while the HDCA was intended to be an alternative, Ms Laming said.

“There are differences between what constitutes a defamatory statement and harmful communication,” she said.

“However, in many cases a communication [or] statement will fall within both definitions.

“The disadvantage of proceeding under the HDCA is [that there is] no ability to claim damages and/or financial losses resulting from harmful communications.

“The advantage is that claims will be dealt with and resolved quickly.”

Legal precedent was still being established as to who could be found liable under New Zealand law, she said. A legal case in Australia found a group of media companies liable for defamation over comments made by individuals in the comments sections of their Facebook pages, even though the media companies did not write the comments themselves, Ms Laming said.

However, a similar case was yet to be tested in the New Zealand courts, she said.

If a person considered themselves a victim of harmful digital communication, they should report it to the police, she said.

Oamaru Today founder and administrator Allan Dick said he was aware he could be liable for comments made on his page, but he had never been contacted in relation to the HDCA.

Before setting up the Oamaru Today Facebook page in 2013, Mr Dick had a career in radio broadcasting.

“I run the page like a radio talkback show. That is basically what I do,” Mr Dick said.

“I like discussion and debate; I don’t like abuse.”

Discussion starter . . . Oamaru Today founder Allan Dick says he likes debate on his page, not abuse. PHOTO: GUS PATTERSON

Mr Dick said he moderated the comments, removing about one a month. He had also banned people from the page.

He had been approached about removing comments and posts in the past, and said he would make a judgement call at the time.

“You judge that on how important the issue is.

“If there is a genuine point, and [a post] just embarrasses somebody in a position who is answerable, or it is important to society, then [it stays].

“I am genuinely interested in the community.”

The tone of comments on Oamaru Today had changed over the past seven years, and there were a lot of people who had no understanding of what was acceptable to post, he said.

“Social media has destroyed a lot of the old norms, and old values,” Mr Dick said.

“[People] don’t know the legal or the moral boundaries.

“I find younger women the most aggressive in demanding so-called free speech, and demanding the right to say anything they want.”

Mr Dick said comments on his posts were “more serious” now.

“The crap has gone elsewhere, and I’m pleased about that.”

Stephen Carter, founder of Facebook page Waitaki Voice, said he had been approached by individuals about removing comments, but never by Netsafe or the police.

“I don’t [moderate comments] very often, but once or twice I have seen some of them and removed or hidden them,” Mr Carter said.

“I’m fairly liberal, so it has got to be pretty nasty before I’ll remove it, but from time to time I have.”

Mr Carter said some of the comments could be “very negative”, whether in general, directed at other people or directed at him as the administrator.

“Sometimes I wonder why I continue with it, but the enjoyment comes from people putting stuff out there and getting people putting decent feedback through from time to time.”

Mr Carter was aware he could be held liable for posts and comments made on the page, but he was not concerned he could face legal action.

“None of the comments, I believe, are that bad.

“I think some people are a little bit precious, to be honest.”

Netsafe chief executive Martin Cocker said the simplest course of action was to use the reporting tools provided on the social media platforms to flag offending comments.

“For people who are facing ongoing abuse, or abuse on many platforms, then we recommend engaging Netsafe,” Mr Cocker said.

“We will sometimes negotiate with authors or admins to have remedial action taken. That usually means having content removed, but it can also mean an admin might issue an apology, or simply allow a person a right of reply.”

Mr Cocker said annual Netsafe surveys showed that the number of people experiencing online harm had risen each year.

In its 2018-19 annual report, Netsafe found one in 10 adults had personally experienced online harm, and 11% had been a victim of online hate speech.

Harmful Digital Communications Act 2015

  • Under Section 22, a person breaches the Act if they post a digital communication intending to cause harm to someone, the post would cause harm to an ordinary reasonable person in the victim’s position, and it does cause harm to the victim.
  • The court will consider: the extremity of the language used, the age and characteristics of the victim, whether the digital communication was anonymous, whether it was repeated, the extent of circulation, whether it was true or false and the context in which it appeared.
  • Section 24 of the Act provides protections for digital platform owners. It grants immunity to the administrators of digital platforms provided they take the offending post down within 48 hours of being notified.
  • If convicted, a person could face up to two years in jail, or a fine of up to $50,000.