AIs confuse fact with opinion, study finds
DW found that 53% of the answers AI assistants gave to its questions contained significant issues, with 29% of cases involving the accuracy of the information.
Among the factual errors identified in the responses to DW’s questions was the assertion that Olaf Scholz was Chancellor of Germany, even though Friedrich Merz had taken office a month earlier. Another was identifying Jens Stoltenberg as Secretary General of NATO, when Mark Rutte had already assumed the role.
Artificial intelligence assistants have become an increasingly common way of accessing information around the world. According to the Reuters Institute’s 2025 Digital News Report, 7% of online news consumers use AI chatbots to get information, with this number rising to 15% among those under 25 years of age.
From these findings, the authors conclude that AI assistants systematically distort journalistic content of all types.
“The research shows conclusively that these failures are not isolated incidents,” said Jean Philip De Tender, deputy director general of the European Broadcasting Union (EBU), who coordinated the study.
“They are systemic, transnational and multilingual, and we believe this puts public trust at risk. When people don’t know who to trust, they end up trusting nothing – and this can discourage democratic participation.”
