
3 in 4 people experience abuse on dating apps. How do we balance prevention with policing?

A 2022 survey by the Australian Institute of Criminology found that three in four app users surveyed had experienced online abuse or harassment when using dating apps. This included image-based abuse and abusive and threatening messages. A further one in three had experienced in-person or off-app abuse from people they had met on apps.

These figures set the scene for a national roundtable convened on Wednesday by Communications Minister Michelle Rowland and Social Services Minister Amanda Rishworth.

Experiences of abuse on apps are strongly gendered and reflect preexisting patterns of marginalisation. Those targeted are typically women and members of LGBTIQA+ communities, while perpetrators are commonly men. People with disabilities, Aboriginal and Torres Strait Islander people, and people from migrant backgrounds report being directly targeted based on their perceived differences.

What do these patterns tell us? That abuse on apps isn’t new or specific to digital technologies. It reflects longstanding trends in offline behaviour. Perpetrators simply exploit the possibilities dating apps offer. With this in mind, how might we begin to solve the problem of abuse on dating apps?

Trying to find solutions

Survivors of app-related abuse and violence say apps have been slow to respond, and have failed to offer meaningful remedies. In the past, users have reported abusive behaviours, only to be met with a chatbot. And blocking or reporting an abusive user doesn’t reduce in-app violence overall; it simply leaves the abuser free to target someone else.

Wednesday’s roundtable considered how app-makers can work better with law enforcement agencies to respond to serious and persistent offenders. Although no formal outcomes have been announced, it has been suggested that app users should provide 100 points of identification to verify their profiles.

But this proposal raises privacy concerns. It would create a database of the real-world identities of people in marginalised groups, including LGBTIQA+ communities. A leak of that data could cause untold harm.

Read more: Right-swipes and red flags – how young people negotiate sex and safety on dating apps

Prevention is key

Moreover, even if the profile verification process were bolstered, regulators could still only respond to the most serious cases of harm, and only after abuse has already occurred. That’s why prevention is vital when it comes to abuse on dating apps. And this is where research into everyday patterns and understandings of app use adds value.

Often, abuse and harassment are fuelled by stereotypical beliefs about men having a “right” to sexual attention. They also play on widely held assumptions that women, queer people and other marginalised groups do not deserve equal levels of respect and care in all their sexual encounters and relationships – from lifelong partnerships to casual hookups.

In response, app-makers have engaged in PSA-style campaigns seeking to change the culture among their users. For example, Grindr has a long-running “Kindr” campaign that targets sexual racism and fatphobic abuse among the gay, bisexual and trans folk who use the platform.

[Image: a mobile screen showing various dating app icons. Match Group is one of the largest dating app companies; it owns Tinder, Match.com, Meetic, OkCupid, Hinge and PlentyOfFish, among others. Shutterstock]

Other apps have sought to build safety for women into the app itself. For instance, on Bumble only women can initiate a chat, in a bid to prevent unwanted contact from men. Tinder also recently made its “Report” button more visible, and provided users with safety advice developed in collaboration with WESNET.

Similarly, the Alannah & Madeline Foundation’s eSafety-funded “Crushed But Okay” intervention offers young men advice about responding to online rejection without becoming abusive. This content has been viewed and shared more than one million times on TikTok and Instagram.

In our research, app users told us they want education and guidance for antisocial users – not just policing. This could be achieved by apps collaborating with community support services, and advocating for a culture that challenges prevailing gender stereotypes.

Policy levers for change

Apps are widely used because they promote opportunities for conversation, personal connection and intimacy. But they are for-profit enterprises, produced by multinational corporations that generate income by serving advertising and monetising users’ data.

Taking swift and effective action against app-based abuse is part of their social licence to operate. We should consider stiff penalties for app-makers who violate that licence.

The United Kingdom is about to pass legislation that contemplates prison time for social media executives who knowingly expose children to harmful content. Similar penalties, particularly ones that make a dent in app-makers’ bottom lines, may present more of an incentive to act.

In the age of widespread data breaches, app users already have good reason to mistrust demands to supply their personal identifying information. They will not necessarily feel safer if they are required to provide more data.

Our research indicates users want transparent, accountable and timely responses from app-makers when they report conduct that makes them feel unsafe or unwelcome. They want more than chatbot-style responses to reports of abusive conduct. At a platform policy level, this could be addressed by hiring more local staff who offer transparent, timely responses to complaints and concerns.

And while prevention is key, policing can still be an important part of the picture, particularly when abusive behaviour occurs after users have taken their conversation off the app itself. App-makers need to be responsive to police requests for access to data when this occurs. Many apps, including Tinder, already have clear policies regarding cooperation with law enforcement agencies.

Read more: Tinder fails to protect women from abuse. But when we brush off 'dick pics' as a laugh, so do we

Authors: Kath Albury, Professor of Media and Communication and Associate Investigator, ARC Centre of Excellence for Automated Decision-Making + Society, Swinburne University of Technology

Read more https://theconversation.com/3-in-4-people-experience-abuse-on-dating-apps-how-do-we-balance-prevention-with-policing-198587
