The dispute between news publishers, tech platforms, and the Australian government exploded last week, as Facebook threatened to pull news from its platform in Australia if proposed legislation on a new bargaining code is passed. Google published an open letter warning of the dangers of the new law.
The legislation would create a mandatory bargaining code between publishers and Facebook and Google, requiring the two platforms to host news from any publisher that participates in the code and to pay for that content at a price set by an arbitrator.
Matt Perault is the director of the Center on Science & Technology Policy at Duke University and an associate professor of the practice at Duke’s Sanford School of Public Policy. He was previously a director of public policy at Facebook.
Most of the reporting on the proposed legislation has focused on the requirement that platforms pay for news. That’s a controversial proposition: neither Facebook nor Google forces news outlets to share their content on Facebook or in Google News, both companies already share revenue with publishers, and news doesn’t make much money for tech platforms. In addition, the code envisions only one-way payments from platforms to publishers, even though in other markets like app stores and internet-based TV, content providers pay for distribution.
But the code’s biggest problem isn’t that it requires distributors to pay for content, which is the model in cable TV. What makes the proposed Australian regime so problematic—and unique among democratic countries—is that it requires platforms to carry the content of any Australian news organization that participates in the code at a cost set by mandatory arbitration.
Those two components create a state-sponsored media regime, where a government process determines the news that appears in News Feed and Google News and also sets the price. It would substitute a government-sanctioned arbitrator’s judgment for the platforms’, even on issues where judges and government officials have historically demonstrated a striking lack of expertise, such as ranking results in News Feed or search.
This regime would be a dramatic departure from other markets where distributors and content creators negotiate over the terms of distribution, but the parties can walk away when the terms aren’t workable. HBO Max, for example, isn’t available on Roku TVs because the two companies couldn’t reach an agreement. But in Australia, platforms would be prohibited from charging for the cost of distribution and couldn’t refuse to carry a publisher’s content, even if distributing it at the arbitrator-set price imposes steep costs.
The proposed law is deeply flawed industrial policy, with the government intervening in a business dispute between two industries that compete with each other for advertising. And it will likely distort the tech market as well, since the law applies only to Google and Facebook, even as companies like Apple and TikTok grow their offerings in news distribution.
More importantly, this regime would shift power away from the people who use these platforms, who currently are able to choose what publications to follow on Facebook and Google, and toward unelected regulators and arbitrators.
Of course, Facebook’s decision to withdraw news from Australia will impose real costs on Australians as well. The law makes it impossible for Facebook to block only Australian news content, so the company’s plan is to prevent Australians from sharing news from any news site, including international publications like The New York Times and The Guardian.
It’s also likely to hurt publishers, including smaller local publications, who benefit from traffic directed to their sites from Facebook. Facebook will still allow sharing of user-generated news, but it’s inevitable that there will be less high-quality content on its site and more financial pressure on publishers at a time when they’re already struggling.
Reaching this impasse was avoidable. Promoting a stronger news ecosystem is an important goal, but other policy mechanisms are more likely to achieve it. One option is to use voluntary codes between platforms and their users to establish stronger norms around consumer choice, misinformation, and subscription options when news is distributed on tech platforms. These types of codes have been effective in areas like human rights, where they’ve helped to establish best practices, provided transparency on company practices, and used audits to hold companies accountable.