Google's AI Tool For News Seems a Lot Like Plagiarism
The absence of clear sources or credible bylines attached to news will deal a severe blow to journalism as a check on power and purveyor of accurate information
News media are struggling around the world as their business models have collapsed, with the platforms that mediate audience attention cornering the lion's share of advertising revenue. This existential crisis for news media is likely to be exacerbated by the advent of AI.
According to reports, Google is beta testing a product that uses artificial intelligence to produce news stories. Google is reportedly paying some publishers a monthly fee to produce a fixed number of stories and other content with this tool. The tool will allow publishers to compile a list of relevant websites and publish news stories based on articles from those sites. These AI-generated news stories can then be approved and published without necessarily acknowledging that they were generated by AI.
There is limited information about this tool in the public domain; however, in a statement, Google said that “The experimental tool is being responsibly designed to help [...] produce high quality journalism using factual content from public data sources – like a local government’s public information office or health authority”. The statement adds that “publishers remain in full editorial control of what is ultimately published on their site. These tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating, and fact-checking their articles.” This seems disingenuous. An automated tool that merely spins plain-vanilla factual information from public sources, such as a weather or health bulletin, into news reports is unlikely to have much value for any news organization. Layering in any reporting context will necessarily require leaning on existing reports by other news organizations. The final news report will thus necessarily be a compilation and summary of reporting done by others.
If this seems like theft, it is. The core purpose of news is original, factual reporting that keeps citizens informed and thus provides a check on power. The value of news is entirely contingent on its quality, which is a function of its relevance and accuracy. The determination of what and how to report on any issue is a judgment call honed by experience and understanding. At the same time, original reporting, fact-checking and verification cost time and money. Bypassing this entire chain of experience and effort to simply repurpose existing reporting into an ostensibly new story, for the purpose of diverting traffic and audience, is intellectual theft. It is also monetary theft, given that online revenue models are largely contingent on traffic. What is remarkable is that a company like Google should be developing a tool to do exactly this.
Google is already under fire from news publishers for displaying news snippets directly in its search results, resulting in numerous zero-click searches and costing news publishers lost ad revenue. Adding an intermediary layer of publishers to do the scraping and summarisation for the company may help disembody news content from its original publishers. It has already been reported that Google News makes no distinction between “sites that rip-off other outlets by using AI to rapidly churn out content” and genuine news organizations that do painstaking original reporting. Another possibility is that Google is trying to recompense news publishers for lost ad revenue by helping them automate news content and cut reporting costs. Neither of these possibilities is to Google’s credit.
It would, however, be a mistake to make this only about Google. Given the capabilities of freely available large language models, it will be trivial to replicate the ability to generate “news reports” by summarizing related news. Social media, particularly messaging apps like WhatsApp, have already contributed significantly to detaching news content from its original source, and AI threatens to accelerate this trend of treating news as fungible content. The absence of clear sources or credible bylines attached to news will deal a severe blow to journalism as a check on power and purveyor of accurate information. The result will be an untenable situation of dubious content floating around in AI-generated forms, with no substantive accountability for either the news or the powers that be.