The Korea Times
PARIS – Algorithms are not value-neutral. Yet for over a decade now, we have allowed Big Tech to deploy them as the gatekeepers to our information ecosystem, without demanding transparency or accountability in return. The consequences have ranged from the amplification of polarizing and sensationalist content to veiled personalized advertising, the proliferation of monopolistic behavior, and forms of influence over public discourse that are antithetical to democratic deliberation.

Even though we had to learn the hard way what happens when critical information infrastructure is handed over to corporate interests without oversight, we are now repeating the same mistake with AI chatbots, and the stakes could be far greater.

Chatbots do not simply curate existing information; they generate and frame it. Facebook and Google decided which news articles to show you, whereas tools like ChatGPT, Claude, and Gemini synthesize that information into authoritative-sounding answers. This distinction matters, because the shift from curator to editor is making undue influence even less visible.