Future news: How generative AI can enhance newsroom practices and enable creativity
AI journalism is inevitable. How do we make sure that it doesn't destroy society?
In 2023, the sociopolitical environment surrounding journalism has grown ever more divisive. As Americans increasingly get their news from nontraditional sources and political bias colors news coverage, rebuilding trust in the media has become more critical than ever. A Gallup poll conducted in 2022 found that only 34% of Americans have a “great deal” or “fair amount” of confidence in media.
Furthermore, a crisis is spreading throughout America like wildfire: since just before the COVID-19 pandemic, over 360 newspapers have closed across the United States as of June 2022—roughly two shutdowns per week. Cash-strapped publishers are cutting staff and circulation, setting America up to lose one-third of its newspapers by 2025.
As the barrier to entry for producing harmful, politically extremist content falls, news publishers have struggled to keep up. Generative AI, however, can help mitigate this risk while providing vital accountability and essential local coverage.
While industries across the spectrum look to artificial intelligence to maximize efficiency and create boundless opportunities, the news media has been slow to react—and for a good reason. A jarring lack of regulation is a reasonable cause for concern, and without laws, bad actors can use generative AI to further the spread of misinformation.
However, with proper regulation and shared humanitarian principles, generative AI can open a world of possibility in journalism. It can reduce the human labor required to produce hard, breaking news, freeing journalists to pursue deeper, community-focused reporting on underrepresented topics. As the landscape of journalism shifts rapidly, the media should embrace the possibilities of generative AI to meet the needs of a changing society and continue to inform the public with the truth.
The purpose of news is to inform.
At its core, the objective function of news is to inform. According to the American Press Institute, the purpose of journalism is “to provide citizens with the information they need to make the best possible decisions about their lives, their communities, their societies, and their governments.”
Thus, journalism is an industry that, principally, must be rooted in truth. Sure, this may not always be the case in practice: in the wake of Fox News’s $787.5 million defamation settlement with Dominion Voting Systems, it is clear that even some of the most prevalent media sources have had a role in pushing conspiracies and misinformation. However, we as a society aim for truth in the news; therefore, facts are of utmost importance in the news media industry.
The human spirit is at the core of journalism.
At first glance, the news is not a creative field. However, the human condition is at the core of journalism: choosing which issues to cover and which ones to ignore.
Every publication and every journalist is different. Inherent biases at both the writer and organization level—influenced by upbringing, political leaning, etc.—impact coverage strategy.
In terms of content, human creativity is irrefutably important. Feature stories—soft news reports that usually cover people or human-interest topics—are inherently creative and fueled by human empathy. People care about other people, and writers convey these emotions in feature writing.
However, hard news is not, and should not be, creatively written. The purpose of hard, breaking news is to inform efficiently, and this process is incredibly formulaic. From a young age, we are taught the five Ws: who, what, when, where, and why. Hard news articles answer every one of those questions within the first paragraph. Journalists call this the “nut graf”: all the information a reader absolutely needs to know, in a nutshell. For these types of stories, there is virtually no creativity beyond the choice of which topic to cover.
Generative AI can create efficiency in the production of hard news.
Generative AI has the potential to automate the written prose of hard news. This is what I like to call “Tier 1” AI journalism: using language models to produce content when prompted with the 5 Ws, or the fundamental facts, of the situation.
This would be incredibly easy to implement. Consumers can already use OpenAI’s ChatGPT to produce full-on essays with a simple text prompt. In journalism, AI language models trained on decades of hard news publishing can write ledes and nut grafs for hard news stories. It would almost be foolish not to do this.
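To make the “Tier 1” workflow concrete, the prompt-assembly step could look like the minimal Python sketch below. Everything here is illustrative, not an existing newsroom tool: the function name, the prompt wording, and the sample facts are all assumptions. The key design point is that the reporter supplies the facts and the model is asked only to draft prose around them.

```python
def build_hard_news_prompt(five_ws):
    """Assemble a language-model prompt from reporter-supplied facts.

    The facts remain human-sourced; the model is asked only to draft
    prose around them, never to invent details.
    """
    order = ["who", "what", "when", "where", "why"]
    missing = [w for w in order if not five_ws.get(w)]
    if missing:
        # Refuse to generate rather than let the model fill a gap.
        raise ValueError("missing facts: " + ", ".join(missing))
    facts = "\n".join(f"- {w.upper()}: {five_ws[w]}" for w in order)
    return (
        "Write a one-paragraph hard-news lede using ONLY the verified "
        "facts below. Do not add any detail that is not listed.\n" + facts
    )

prompt = build_hard_news_prompt({
    "who": "city water department crews",
    "what": "repaired a ruptured water main",
    "when": "early Tuesday morning",
    "where": "on Elm Street near Fifth Avenue",
    "why": "after freezing temperatures cracked an aging pipe",
})
# `prompt` would then be sent to whichever language model the newsroom
# uses; the returned draft still goes through human fact-checking.
```

Refusing to generate when a fact is missing, rather than letting the model guess, is what keeps this workflow on the safe side of the hallucination problem discussed below.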
Generative AI should NOT be used to produce facts.
However, there is a slippery slope with using Generative AI in journalism. AI “hallucination” occurs when generative AI outputs incorrect information while presenting it as fact. This is one of the most significant issues facing the rapid growth of AI: how do we distinguish truth from fiction?
Generating any news story, especially a hard news story, should never be fully automated. A vital part of a journalist’s job is doing due diligence: checking the credibility of sources and choosing to speak to trustworthy ones in the first place. Moral and factual judgment is a capability that AI simply does not have.
Hypothetically, even if there were an AI model advanced enough to pull from live police scanners and local government feeds, this could interfere with one primary function of journalism: holding institutions accountable. There is no guarantee that information from any source is truthful—that is why journalists interview multiple sources about the same situation to get a complete understanding. This “Tier 2” AI journalism becomes incredibly dystopian. Of course, in an ideal world, there would be no doubt in the one undeniable telling of the truth. However, that is simply impossible.
Sure, generative AI can be used to produce less hard-hitting content. Sites like Buzzfeed that make lifestyle content could even benefit from AI-written pieces for more efficient content outflow and search engine optimization. However, in pursuing impactful, time-sensitive news, generative AI cannot and should not be trusted to give factual, verified information.
Generative AI has confined capabilities in enabling creativity.
Generative AI can enable creativity in the news to varying degrees across person, press, process, and product—the 4 P’s taxonomy of creativity.
“Person”
The “person” is the center of the creative process. Kurt Lewin’s behavior equation, “B = f(P, E),” holds that an individual’s behavior is a function of the person and their environment. A person or publication chooses to report on what they, themselves, care about. Implementing generative AI ethically in the ideation stage of journalism will be difficult. While researchers could quickly produce a language model that generates story ideas based on previous coverage, there is potential for a snowball effect of bias, in which AI perpetuates divisive, harmful ideas. Human judgment is critical here—we have the moral compass to decide what is factually and ethically correct. While everybody’s moral mileage will vary, generative AI lacks any type of moral judgment, full stop.
“Press”
The “press” is the environment in which the person operates. Returning to Lewin’s equation: the society in which the journalist operates dictates what type of news coverage gets produced. Human-generated news is already notoriously biased toward white Americans—a 2010 Pew Research study found that most Americans believed coverage of white and middle-class people was “generally fair,” while coverage of poor people, Muslims, and Black Americans was widely seen as “too negative.” Much of this bias is likely driven by ratings—tune in to virtually any local news channel, and one will find that the majority of crime coverage caters to affluent suburbs rather than lower-income, POC-majority neighborhoods.
An AI model trained on historical news coverage would only further perpetuate these biases. Moreover, AI models lack the foresight to enable any sort of democratized movement to increase equity in news coverage. With the “press,” journalists must unite in the pursuit of diversity in news—something that is incredibly difficult to attain, but that also cannot be helped by AI-generated ideation or prose writing.
“Process”
However, the “process” by which news is produced can benefit significantly from artificial intelligence. Of course, there are limitations—hard news answering the 5 Ws can easily be automated, but feature news requires a human level of empathy and sincerity. In theory, soft news prose can be automated too. Still, a publication that relies on AI for human-interest stories will find that gains in production efficiency cannibalize the organization, as human readers care less and less about uncreative, derivative writing.
While we humans cannot reliably differentiate between human and AI-generated text, a University of Pennsylvania study finds that humans can be trained to improve their decoding skills over time. As we increasingly encounter AI-generated writing in our daily lives, it may become easier to differentiate soulless writing from genuine passion.
“Product”
The “product” becomes key in discussing AI involvement in journalism. Hard-hitting investigative journalism cannot be produced by an automated AI model that can neither interview sources nor judge credibility. Quality and originality are fundamental in society’s most important news stories. Generative AI cannot replicate the curiosity that drives reporters’ work.
The decline of local news necessitates AI intervention.
Perhaps the most immediate ethical application for Generative AI in journalism can be found in the critical local news industry. As local news publishers continue to shut down at an alarming rate, it has become increasingly apparent that the industry simply lacks the funding to keep the robust reporting staff necessary to serve small communities. A massive one-fifth of the country’s population lives in a “news desert”—an area “with no local news organizations, or one at risk, with only one local news outlet and very limited access to critical news and information that can inform their everyday decisions or sustain grassroots democracy.”
Enter generative AI, which can free up precious time for already-understaffed newsrooms nationwide. Reporters could use a journalism-focused AI to get crucial information to their local communities quickly: prompt the model with the 5 Ws of the news situation, then proofread the output. As a result, fewer reporters are needed to provide this crucial resource to news deserts across the country.
In fact, the Associated Press is already in the process of implementing AI into its workflow. As the leading provider of hard, accurate, unbiased news, distributed to over 1,300 publishers around the world, the AP is the industry standard for clear, concise news coverage across virtually every locality. In 2014, the AP implemented “natural language generation” to automate its quarterly corporate earnings stories directly from financial data feeds.
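Earnings-style natural language generation of the kind the AP pioneered is typically template-based: structured numbers from a data feed are slotted into editor-approved sentence frames. The Python sketch below illustrates the idea; the template text, field names, and sample figures are invented for illustration and do not reflect the AP’s actual system.

```python
# An editor-approved sentence frame; the numbers come straight from
# a structured earnings feed, so there is nothing for a model to invent.
TEMPLATE = (
    "{company} reported {quarter} earnings of ${eps:.2f} per share "
    "on revenue of ${revenue_m:,.0f} million, {verdict} analyst "
    "expectations of ${est_eps:.2f} per share."
)

def earnings_story(feed):
    """Render one sentence of copy directly from a structured data feed."""
    if feed["eps"] > feed["est_eps"]:
        verdict = "beating"
    elif feed["eps"] < feed["est_eps"]:
        verdict = "missing"
    else:
        verdict = "matching"
    return TEMPLATE.format(verdict=verdict, **feed)

story = earnings_story({
    "company": "Acme Corp.",
    "quarter": "first-quarter",
    "eps": 1.42,
    "est_eps": 1.30,
    "revenue_m": 812,
})
```

Because the output is fully determined by the template and the feed, this kind of generation avoids the hallucination problem entirely, which is exactly why routine, data-driven stories were automated first.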
To keep up with the ever-growing amount of information enabled by the rise of social media and nontraditional news platforms, newsrooms must adapt to become more efficient while remaining accurate sources for the people. Everything starts at the root: our immediate communities.
Hard news AI can enable writers to pursue more creative or impactful endeavors.
A critical goal in implementing AI ethically is to create a symbiotic relationship between the reporter and the computer. By freeing up time spent writing hard news prose, journalists can spend their working hours on more impactful and creative endeavors. This opens up opportunities that did not exist before because of a lack of funding and resources.
At the local level, this ideal symbiotic relationship would enable reporters to spend more time investigating corruption and identifying root causes of important issues rather than wasting time writing an article about a water main break or a random power outage. While these stories are important to informing communities, these local newsrooms already lack the bandwidth to cover their bases and currently do not have the capacity to serve one major function: holding those in positions of power accountable.
One common fear with the introduction of disruptive technologies is that humans become lazy and complacent when relying on them. However, research indicates that AI can, in fact, improve human decision-making by prompting novel decisions that people would not otherwise have considered. This, in turn, increases the opportunity for human creativity to shine.
Human intervention is still necessary for AI-generated work. AI hallucination is too dangerous to ignore—therefore, it must remain the responsibility of reporters and their editors to tediously fact-check AI-generated content.
Investigative pieces simply cannot be replicated by an AI. An AI language model does not have the intuition to ask the right questions and pursue the right sources to answer critical questions about serious issues. Similarly, with less-serious, human-interest feature stories, an AI also lacks the creativity to pursue topics that are deeply impactful to readers. However, a journalist’s job is to ask great questions and tell stories. By enabling AI to do the dirty, boring work of glorified data entry in hard news, journalists can focus on topics they and their readers feel passionate about.
Generative AI will (hopefully) not reduce journalism jobs.
While one can point to the statistics that show that newsroom employment fell 26% between 2008 and 2020, this does not accurately depict the lack of demand for journalism jobs. As a matter of fact, a 2017 Texas Tech survey found that undergraduate enrollment in journalism programs dropped 16.3% between 2013 and 2015. The truth is that journalism, in its current form, is not an attractive career to enter for many young professionals. The U.S. Bureau of Labor Statistics reports that the median pay for a reporter was $48,370 per year. That’s roughly 31% less than the median U.S. income in 2021 of $69,717. Considering that most journalism jobs require at least a bachelor’s degree, potential journalists are opting for more lucrative careers.
As a result, we should not fear AI taking the place of journalism jobs when it is implemented in the ethical, symbiotic manner we have discussed. In practice, this technology can complement the struggling journalists currently in the occupation. The supply of journalism jobs is low, but so is the demand for them. And, as we have discussed, AI lacks the judgment and human connection necessary to produce impactful journalism. There will always be a need for journalists in our society; the question now is how to innovate the industry to support the creativity of news writers.
News companies must vow to ban AI-editorialized content.
Throughout this paper, I have discussed the implementation of generative AI through an ethical lens. However, it would be foolish not to acknowledge that strict, comprehensive legislation will be necessary to maintain the well-being of society. In the U.S., regulation of the big tech industry, in general, has been slow to implement, to say the least. In 2021, a Pew Research study showed that 56% of Americans supported more government regulation of major tech companies. We, as the public, must demand more action by voting in members of Congress who share the same deep-rooted concern and understanding about the damages an unregulated industry can cause.
In the meantime, it is the responsibility of news organizations to self-regulate. Media companies must vow never to use artificial intelligence to editorialize their content: in other words, AI should never be used to ideate or create facts for news stories. Allowing for this blatant abuse of AI would create a slippery slope for misinformation and lack of accountability.
Public regulation of generative AI may draw pushback from groups who argue that regulating AI impedes freedom of speech. However, freedom of speech only applies to humans! AI-generated text does not have the same rights or protections as human speech, and it can be regulated if it produces damaging or defamatory content. Regulation of generative AI is necessary to ensure the accuracy and accountability of AI-generated content.
Publishers should increase transparency around their AI practices.
Additionally, publications should take an open approach to publicizing their AI practices. Rather than hiding that AI is being used in their writing, publishers should explain exactly what roles AI plays in their work. Communication is key in this novel industry that much of the public has not learned to trust yet. Initially, there may be pushback from some groups about a news organization using AI at all. Thus, it will take the most trustworthy players in the news—The New York Times, The Wall Street Journal, National Public Radio, etc.—to normalize the use of generative AI through their transparent practices.
This communication can come in the form of disclaimers on pieces that have been AI-assisted. When executing these disclaimers, it is crucial to reassure the public that the content has been fact-checked to the same standards as any other news piece. These disclaimers can serve as a gesture of respect and compassion for the reader’s concerns as we transition to a world where AI-assisted news is normalized.
TL;DR: AI journalism is inevitable.
It is undeniable that AI will have a major role in the future of news. Now, it is our job as a society to ensure that AI is implemented ethically, in ways that help journalists do their jobs and do not further the spread of misinformation. The conversation around AI in journalism will evolve rapidly—this paper could be completely outdated in just a few years. But one principle will remain the same: at every level, from the reporter to the media company executive to the government regulator, the common goal of promoting accurate, valuable information that helps citizens understand their communities and governments is paramount. If implemented correctly, AI can help produce more creative and impactful news media.
Postscript.
Thank you for reading! This is such a nuanced and exciting topic, and I would love to hear your thoughts.
I wrote this piece as a part of an independent study course at the Boston University Questrom School of Business in Spring 2023. Special thanks to Professor Dokyun Lee and Eric Zhou for their input and expertise.