Microsoft's AI News Blunders: Misinformation and Insensitivity Abound

JJohn November 5, 2023 11:01 PM

Microsoft's AI news algorithm is under fire for promoting fake news and insensitive content. While AI is increasingly used in journalism, Microsoft's missteps highlight the technology's inherent pitfalls and the potential dangers of replacing human editors with machines.

MSN AI's questionable editorial calls

In the era of 'fake news,' the accuracy and sensitivity of the content we consume have never been more important. A recent CNN report takes aim at Microsoft's MSN AI model for falling short on both counts. The AI was blamed for promoting stories that either spread misinformation or used offensive language. Instances include a false story claiming President Joe Biden fell asleep during a moment of silence and an obituary that described a deceased NBA player in derogatory terms.

Although Microsoft swapped out humans for algorithms in an attempt to modernize its content delivery, the results seem more akin to a social experiment gone awry than a revolution in news dissemination. Despite its promise, the system has repeatedly failed to catch problematic content, a task human editors would likely have handled with ease. This calls into question the readiness of AI to take over critical roles in journalism and content curation.

Challenges of AI integration in journalism

The use of AI isn't exclusive to Microsoft; other prominent organizations like the BBC, Macworld, and The Associated Press are also integrating it into their operations. While the technology has the potential to streamline processes, it has also resulted in a significant amount of error-ridden and problematic content being distributed. This trend underscores the ongoing challenge of ensuring AI-produced content maintains the high standards of traditional journalism.

Lack of accountability in AI-led journalism

The issue with Microsoft's automated system isn't just the occasional misstep. The system has persistently featured and generated content laden with inaccuracies and offensive language. Equally concerning is the apparent lack of concern or corrective action from those overseeing the process. Without human journalists or editors to hold accountable, there is only software following its programming, raising questions about accountability in AI-led journalism.
