It's time for a conversation about misleading information.
From climate change to COVID-19 to the war in Ukraine, we have seen how misleading information can creep into our dialogue, both online and in the real world, and we will likely see it again at the upcoming federal election.
This is not a new problem, but as online behaviours evolve, the potential for harm is increasing.
Too often, the debate around fixing this issue is framed solely through the prism of content removal, with regulators pointing to the removal of content or accounts as the answer.
Unfortunately, this oversimplifies the issue and risks causing further confusion or fuelling conspiracy theories. People seeking to exploit the online ecosystem to undermine elections and spread disinformation will not be deterred by simply having their accounts suspended.
It's time for the government, tech industry, academics, nonprofits, and civil society to start to work together to address the challenge in a meaningful way - not in silos or in opposition to each other.
We must all think about our collective role in offering context, de-amplifying content, ensuring political reach is earned and not bought through ads, providing advanced media literacy capacities, creating space for public diplomacy, and embracing transparency.
Effective content moderation is more than just leave up or take down.
Providing users with context - whether it's about an account, a piece of content, or a form of engagement - is key to effectively countering misleading information.
Since the last federal election, Twitter has taken a number of proactive steps to reduce the risk of misleading content causing harm to Australians.
Twitter recently announced a new beta feature specifically designed to address the challenge of misinformation, allowing people who use our service in Australia to flag content they believe to be misleading. Last year, a number of technology companies, including Twitter, committed to the Digital Industry Group Inc's new code of practice to reduce the risk of online disinformation and misinformation causing harm to Australians, which includes a new independent oversight service and public complaints facility.
The bigger picture of how people find content and how it is amplified on digital platforms is also important.
It's why Twitter is focused on labelling potentially misleading content around voting and COVID-19, and offering context for content that may have been doctored or could cause harm.
We use labels on tweets by public figures when they violate our rules - as outright removal of the content distorts the historical record and dilutes the public's right to express dissent or debate in response.
Since the war in Ukraine began, for example, Twitter has amplified trusted, reliable, real-time news and resources in multiple languages, which have been viewed more than 8.08 billion times as part of our ongoing global efforts.
We have to be deeply thoughtful about these issues and avoid simplistic framing of the argument.
One example of this is when people point to anonymous or pseudonymous accounts as fuelling the spread of misleading information. This oversimplified argument ignores the important role anonymity plays in providing people with the ability to express themselves freely and safely.
Contrary to what some have argued, there is a lack of conclusive evidence that requiring the display of real names and identities online will actually reduce online abuse. Rather, many studies have documented the problems removal of anonymity or pseudonymity actually creates.
Many communities rely on anonymity and pseudonymity - like political activists, journalists, and whistleblowers. If you're a young person exploring your sexuality or you're a victim of domestic violence who is looking online for help and support, pseudonymity is a safety measure. While pseudonymity has been a vital tool for speaking out in oppressive regimes, it is just as important in democratic societies like Australia.
Protecting and promoting the health of the public conversation is Twitter's top focus - and we cannot do it alone. Non-profits, governments, and the public must have a seat at the table. If we are to really tackle the challenge of misleading information, these issues cannot be viewed in isolation, and we need to stand together.