ASPI suggests government work with platforms to fight disinformation for hire

ASPI says there's growing evidence of states using commercial influence-for-hire networks and the problem can only be solved with cooperation from government and industry.
Written by Asha Barbaschow, Contributor

Political candidates should formally commit to treating campaigning as a mode that's distinct from engagement with citizens when in government, a report from the Australian Strategic Policy Institute (ASPI) says.

"A healthy online public sphere requires political will," ASPI's latest report [PDF], Influence for hire: The Asia-Pacific's online shadow economy, says.

"Transparency about government funding of public messaging when in office would allow citizens and civil society to engage with trust in the digital public sphere.

"Political representatives should commit to not using networks of inauthentic, fake, or repurposed social media accounts to manipulate political discourse."

But the onus isn't only on politicians, with ASPI recommending that platforms take on some of the accountability.

"Platforms could implement country-specific oversight committees to manage prominent account bans, to ensure the consistent application of content moderation policies to capture inauthentic behaviour, and to participate in mandatory transparency reporting," ASPI says.

There is also a case for government and industry to work together to develop policies and initiatives that offer digital entrepreneurs pathways beyond low-cost content-farm work and that reward ethical content creation.

"The influencer economy could be encouraged to self-regulate through the development of codes of conduct," the report says.

According to ASPI, commercial influence-for-hire services will continue to proliferate for as long as there's a market for them and cheap digital labour to deliver their services.

ASPI said this creates risks for societies that aspire to meaningful democratic participation, as well as opportunities for foreign interference.

"A manipulated information environment doesn't serve democracy well," it added. "It's particularly harmful to societies that are emerging from historically more authoritarian forms of governance, have weak democratic governance, fragile civil societies, or any combination of those factors."

In line with testimony provided recently by Facebook, ASPI said there was growing evidence of states using commercial influence-for-hire networks, such as PR firms.

It pointed to research [PDF] from the Oxford Internet Institute that found 48 instances of states working with influence-for-hire firms in 2019-20, an increase from 21 in 2017-18 and nine in 2016-17.

"A surplus of cheap digital labour makes the Asia-Pacific a focus for operators in this economy," ASPI added.

While much of the responsibility for taking action against the covert manipulation of online audiences currently falls to the social media companies, ASPI said solutions must also involve responsibility and transparency in how governments engage with their citizens.

"The technology industry, civil society, and governments should make that alignment of values the bedrock of a productive working relationship," it said. "Structures bringing these stakeholders together should reframe those relationships -- which are at times adversarial -- in order to find common ground."

Further recommendations made by ASPI to ensure that the information environment and digital economy best align with democratic forms of governance include multi-stakeholder "whole-of-society" approaches, which would require revisiting the existing "adversarial approach" between governments and the companies that provide the infrastructure for the digital economy.

"Democracies and industry must partner to fund capacity-building programs that bolster civil society organisations in emerging democracies in the Asia–Pacific region. Civil society organisations can work to apply transparency to state manipulation of the information environment," it wrote.

It has also suggested that the creation of an Asia-Pacific centre of excellence in democratic resilience could provide a vehicle for public-private multilateral partnerships designed to maintain the health of the region's online public sphere.

ASPI has been calling for the establishment of an independent statutory authority to oversee operations of all social media platforms that operate down under.

"We suggest an independent statutory authority that is empowered to observe and report on how the incentives, policies, algorithms, and enforcement actions of social media platforms are operating, with the ultimate goal being to maximise benefits and reduce harm for society and its citizens," ASPI wrote in a to the Senate Select Committee on Foreign Interference through Social Media last year.

ASPI hopes for such an authority to be granted explicit insight into how content is filtered, blocked, amplified, or suppressed, both from a moderation and algorithmic amplification point of view.

"Crucially, these obligations should be placed on all social media operating in Australia, including those companies that originate from authoritarian regimes and those fringe platforms servicing niche communities -- not just the dominant Western platforms such as Facebook, Twitter, Instagram, and Snapchat," it said.

"These transparency and oversight measures would go some way towards countering the default incentive towards sensational, provocative, and potentially polarising content."


Disinformation for hire: PR firms are the new battleground for Facebook

Facebook's head of security policy has testified before an Australian Parliamentary inquiry that his company has witnessed an increasing use of marketing firms or PR agencies that are essentially hired to run disinformation campaigns.

Australia warned to not ignore domestic misinformation in social media crackdown

Committee has been warned against outsourcing the job of deciding what is true or false in an Australian context to a handful of private US companies.

Countering foreign interference and social media misinformation in Australia

DFAT, the Attorney-General's Department, and the AEC have all highlighted what measures are in place to curb trolls from spreading misinformation across social media.
