Image: Facebook is trying to manage abuse and hate speech. mkhmarketing/Flickr, CC BY

It took some persuading, but Facebook has agreed to join an international social media task force to help combat online hate in the wake of anti-refugee xenophobia on its pages.

It’s a good outcome for the German Chancellor, Angela Merkel, and her Justice Minister, Heiko Maas, who last week called on Facebook to remove racist comments in line with German law. This came after users complained that Facebook was not responding to their reports of racist abuse and threats.

It’s also a relative win for the tech giant, which recently boasted of reaching one billion active users in a single day and has a market value of US$245 billion.

Facebook is keen to avoid any new legislative limits on its operations and to minimise direct censorship. The company said it preferred to allow “robust” debate and discussion, rather than deletion.

But Germany has now joined the Israeli, French and Australian governments in asking Facebook to remove dangerous, offensive or illegal content.

So the pressure is mounting for it to develop more open and responsive ways of dealing with these problems.

To ban or not to ban

The escalating debate about who Facebook should protect, ban or report to local authorities, and how fast it should intervene, is a reaction to the way social media companies are carving out their own transnational, libertarian policies.

To a large extent, this imposes a US free speech paradigm on countries used to more interventionist media regimes, even though the legal limits of that paradigm are being thoroughly tested by hate speech.

Facebook would much rather we policed the pages and posts we make and read than have to regulate other people’s bad behaviour itself. Safety, it says, is “a conversation and a shared responsibility”. Users are advised to keep themselves safe by hiding or deleting offensive comments and blocking abusers.

Where content breaches local laws but not its community standards, Facebook says “we may make it unavailable only in the relevant country or territory”.

But neither strategy stops hate posting; both merely reduce its social visibility.

Another way the free speech push plays out is with Facebook’s policy on public figures. Its community standards say the company will act on complaints of harassment and direct threats against private individuals, but it allows more critical discussion of public figures.

The company’s definition of a public figure is worryingly broad:

We permit open and critical discussion of people who are featured in the news or have a large public audience based on their profession or chosen activities.

This would include academics, journalists and community spokespeople.

The presumption seems to be that people who enter public debate should expect abuse, or that they are better equipped to deal with it than average users. That premise was demolished by the suicide of Australian celebrity Charlotte Dawson, who was the focus of sustained abuse on social media.

Facebook’s standard partly explains why it didn’t immediately act on explicit, sexualised threats made recently against journalist Clementine Ford, after she sledged the breakfast TV program Sunrise by posting a selfie that included some explicit language written on her bare chest.

Ford claims moderators moved to temporarily close her account because she had breached the community standards. Facebook denies this.

As the company is not publicly accountable for the policing of its standards, we do not have a clear account of what actually happened.

Regulating alone or together?

Ford’s experience, and that of UNSW after its site was hacked twice recently, illustrate the problems that Facebook has in managing and accounting for its procedures for tackling online violence.

Facebook’s infographic (below) shows how complicated the workflow is for responding to a complaint.

Image: Facebook’s infographic on how it deals with reports of abuse. Facebook

There’s frustration among those Facebook business partners who find they can’t get a quick resolution to reports of anti-social or illegal activities. The ABC struggled for several months to get vigilante sites taken down after presenter Jill Meagher’s murder.

There’s no doubt that Facebook is investing in research, policy and education measures to combat online violence. Its psycho-social strategies, suicide prevention tools and other safety measures demonstrate this.

But promoting self-protection is a small part of a larger equation. Facebook needs more open, collaborative approaches to tackle violence online.

At the recent SWARM 2015 conference of Australian online community managers, conference co-founder Venessa Paech noted that Facebook had yet to formally consult members of its network about the efficacy of its universal standards.

She said the community managers were keen to give feedback about the challenges of applying these standards across very different types of communities, many of which are built on Facebook groups or its commenting platform.

As one of the world’s largest digital intermediaries, Facebook is at the vanguard of a new industry sector that is confronted by violent online behaviour every day.

So while the company is rightly wedded to the free and open credo of internet communication, it has to recognise that collaborative policy development – with governments and professionals – is paramount.

It’s the principle of working with all your stakeholders, rather than on behalf of them, and it’s vital to our mutual investment in social media.

Fiona R. Martin is a Discovery Early Career Research Award fellow, whose ARC funded project 'Mediating the Conversation' (DE130101267) investigates best practice approaches to governing news commenting. She is also a co-convenor of the University of Sydney's Everyday Social Media Research group.

Jonathon Hutchinson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond the academic appointment above.

Authors: The Conversation

Read more http://theconversation.com/why-facebook-needs-to-do-more-to-protect-you-from-online-abuse-47268
