How widely is artificial intelligence already being used in journalism, and what opportunities does AI offer the media industry? An analysis. (Image © mast3r / Adobe Stock)
When people talk about AI or artificial intelligence in the context of journalism, they often mean “robot journalism”: algorithms that translate data into text. In 2019, The New York Times reported that about one-third of the content published by Bloomberg was already created with the help of automated technology. Bloomberg’s system, called Cyborg, can analyze a financial report the moment it is published and generate a news story from its most important facts. The Washington Post’s self-developed technology, Heliograf, has been in use since 2016 and wrote around 850 articles in its first year, including coverage of the Rio Olympics.
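Cyborg and Heliograf are proprietary, so their internals are not public. At its simplest, however, this kind of data-to-text generation amounts to filling a narrative template from structured figures. A minimal, invented sketch:

```python
# Minimal sketch of template-based "robot journalism": turning structured
# figures from a financial report into a short news item. This is a generic
# illustration with invented names, not the actual Cyborg or Heliograf
# pipeline.

def earnings_brief(company: str, quarter: str, revenue_m: float, prior_m: float) -> str:
    """Fill a headline-style template from the key figures of a report."""
    change = (revenue_m - prior_m) / prior_m * 100
    direction = "up" if change >= 0 else "down"
    return (
        f"{company} reported revenue of {revenue_m:.0f} million in {quarter}, "
        f"{direction} {abs(change):.1f}% from the prior quarter."
    )

print(earnings_brief("Example Corp", "Q2", 120.0, 100.0))
```

Real systems layer many such templates with varied phrasings so that the output does not read as formulaic, but the principle remains the same: the data dictates the sentence.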
But these are just two of the best-known examples of automation in journalism; the development goes much further. German media companies are relying on such technologies as well: Retresco’s software has already been used to produce election-results analyses for RP Online, the NOZ Media Group’s innovation lab “HHLab” is developing similar tools, and the Süddeutsche Zeitung also states that it uses AI technologies. As one of the pioneers among German media companies, the Stuttgarter Zeitung, together with the company Arvato, has developed a technology that automatically analyzes police reports and transfers the information they contain into predefined categories on a “crime map.” The overview, designed as a reader service, now contains reports dating back to 2014.
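The crime-map use case illustrates the shape of such a pipeline: free-text reports go in, predefined categories come out. How the Stuttgarter Zeitung's system actually works is not published; the following is only a deliberately simple keyword-based sketch with invented categories, to show the basic idea of mapping report text onto fixed categories:

```python
# Hypothetical sketch: sorting free-text police reports into predefined
# categories, as a crime-map pipeline might do at its simplest.
# Categories and keywords are invented for illustration.

CATEGORIES = {
    "burglary": ["burglary", "break-in", "broke into"],
    "traffic": ["collision", "accident", "hit-and-run"],
    "theft": ["stolen", "theft", "pickpocket"],
}

def categorize(report: str) -> str:
    """Return the first predefined category whose keywords match the report."""
    text = report.lower()
    for category, keywords in CATEGORIES.items():
        if any(kw in text for kw in keywords):
            return category
    return "other"

print(categorize("A bicycle was stolen from the station on Tuesday."))
```

A production system would add location extraction (for placing incidents on the map) and a trained classifier rather than keyword lists, but the input/output contract is the same.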
The decisive factor for the use of AI in journalism is often that such technologies can analyze even very large volumes of data efficiently and reliably, down to the smallest detail. AI is therefore increasingly used in business journalism to evaluate financial reports, or in political coverage to analyze election results. While these analyses often feed automatically created and published articles, it is still up to journalists to discover the story behind the data and develop it on a larger scale. Robot journalism is thus less about replacing editors, journalists and authors with automated text generation, and more about freeing them up for more in-depth journalistic work by automating routine tasks and handling the targeted, comprehensive analysis of large data sets. Of course, journalists who merely write up the results of soccer matches without commenting on them do risk falling victim to “robot journalism” in the long term.
AI does not just mean robot journalism
Beyond research and data analysis, the use of AI in journalism offers further opportunities that support the development of media companies – for example, analyzing reader behavior, personalizing content accordingly, or evaluating reader feedback and comments. Focus Online, for instance, has developed its own “Constructive Score,” which indicates how “constructive,” i.e. solution-oriented, the news site’s content is at any given time. Constructive, according to Florian Festl, editor-in-chief of the news portal, is anything that goes beyond pure reporting and offers or suggests solutions to problems. To build the metric, Burda says it analyzed 10,000 editorial articles, teasers and headlines and tested them for solution orientation. This analysis formed the basis for the corresponding AI technology.
“After being fed this content, the machine was able to learn over a period of months what effect individual words, word sequences or semantic combinations have on the constructiveness of an article,” says Burda. By combining the resulting score with live feedback, the editorial team can see at any time how solution-oriented an article is perceived to be and, where possible, optimize it accordingly. Whatever one thinks of this approach to solution-oriented journalism, the example shows the potential such technologies offer.
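Burda has not published how the Constructive Score works internally. The description, though, matches a familiar pattern: learn from labeled examples which words correlate with the “constructive” label, then score new text against those weights. A toy sketch of that pattern, with invented training data in place of Burda's 10,000 articles:

```python
# Toy illustration of learning which words signal "constructive"
# (solution-oriented) content from labeled examples. The data and the
# scoring rule are invented; this is the general word-weighting pattern,
# not Burda's actual model.

from collections import Counter

LABELED = [
    ("city presents plan to fix housing shortage", 1),   # constructive
    ("how one school solved its truancy problem", 1),
    ("initiative offers free repair workshops", 1),
    ("storm causes damage across the region", 0),        # plain reporting
    ("officials argue over budget figures", 0),
    ("company announces layoffs", 0),
]

def train(examples):
    """Count how often each word appears in constructive vs. other texts."""
    pos, neg = Counter(), Counter()
    for text, label in examples:
        (pos if label else neg).update(text.split())
    return pos, neg

def score(text, pos, neg):
    """Fraction of words that lean 'constructive' in the training data."""
    words = text.split()
    hits = sum(1 for w in words if pos[w] > neg[w])
    return hits / len(words)

pos, neg = train(LABELED)
print(round(score("plan to fix the shortage", pos, neg), 2))
```

A real system would use a proper statistical model and far more data, but the editorial loop is the same: score an article, surface the score, let the editor rework the wording.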
Another example of using AI for smarter, better-tailored reader service is Voitto, a technology from the Finnish media company Yle. Voitto is both a robot journalist that writes its own texts and a personalized news assistant on users’ devices. As part of the NewsWatch app, Voitto sits on the user’s lock screen and surfaces recommendations for new news content. These recommendations are not only tailored to the user but also guided by journalistic principles: users may, for example, be suggested two articles offering different perspectives on the same topic. In this way, readers are also encouraged to broaden their horizons.
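The two-perspectives idea can be expressed as a simple recommendation rule: for a topic, do not return only the best-matching articles, but articles with distinct stances. The following sketch, with invented data and field names, illustrates that rule in the spirit of Voitto (Yle's actual implementation is not public):

```python
# Sketch of a perspective-diverse recommendation rule: for a given topic,
# pick up to two articles with *different* stances rather than two similar
# ones. Articles, stances and titles are invented for illustration.

ARTICLES = [
    {"topic": "energy", "stance": "pro",     "title": "Why wind power pays off"},
    {"topic": "energy", "stance": "contra",  "title": "Hidden costs of wind power"},
    {"topic": "energy", "stance": "pro",     "title": "Communities embrace turbines"},
    {"topic": "sports", "stance": "neutral", "title": "Derby ends in a draw"},
]

def recommend_pair(topic: str, articles):
    """Return up to two article titles on the topic with distinct stances."""
    picked, stances = [], set()
    for a in articles:
        if a["topic"] == topic and a["stance"] not in stances:
            picked.append(a["title"])
            stances.add(a["stance"])
        if len(picked) == 2:
            break
    return picked

print(recommend_pair("energy", ARTICLES))
```

The design choice is the interesting part: a pure engagement-maximizing recommender would happily serve two like-minded articles, whereas this rule deliberately trades a little relevance for breadth of perspective.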
And a modern digital publishing platform like Sprylab’s Purple DS already integrates AI technology in various forms – from automated affiliate links to seasonal topic suggestions for editors. With the print connection tohoop, such a publishing platform can even be extended with AI-based creation of print layouts.
The use of AI in media companies does not spell the inevitable extinction of journalism or declining media reliability – on the contrary. AI technologies can support the work of editorial teams and raise journalistic quality, at least as long as they are integrated into editorial workflows in a meaningful way and guided by journalistic principles. This is all the more important because existing AI language models should not be underestimated: they can already generate highly authentic texts that are sometimes hard to distinguish from texts with real “authorship.” Caution is therefore advisable when dealing with such technologies – especially since there is, as yet, no labeling obligation for AI-generated text.