Large language models, called “AI” in marketing terms, sometimes make shit up. It’s a problem anyone recognizes after using one for even a little while, and it’s not just basic information they get wrong. Wikipedia is wrestling with one such example as it finds “hallucinations” in LLM-translated articles.

The story starts out positive and altruistic: a third-party, non-profit organization called the Open Knowledge Association (OKA) pays a stipend to people who translate Wikipedia articles into other languages. The problems come from relying on large language models like Google Gemini and ChatGPT to perform those translations without human review. According to a report from 404 Media, Wikipedia editors performing routine reviews of the translated articles found basic informational mistakes that weren’t in the original-language articles, along with citations that were missing, swapped, or pointed at book pages that had nothing to do with the subject. (LLMs are particularly bad at documentation, as many lawyers have discovered to their detriment.) Previously, the system used Grok, Elon Musk’s LLM that’s plugged into eX-Twitter and known for producing mass quantities of non-consensual sexual material, but OKA has reportedly changed its policy.

Wikipedia writers are almost entirely unpaid. A person can edit Wikipedia as part of their job, a PR agent, for example, but those edits require disclosure and are subject to additional scrutiny. Adding a commercial element to large-scale edits, even in the cause of translation, may incentivize the workers paid by the Open Knowledge Association to go fast and make mistakes. OKA pays its translators about $400 a month for full-time work. That’s not much, but it goes a lot further in the Global South, where many of the translators live. Despite Wikipedia’s broad policy against LLM-generated articles, Wikipedia editors have decided to keep using OKA’s services.
Since the English Wikipedia is still twice the size of the next-largest language edition (not counting the Cebuano Wikipedia, which appears to be almost entirely bot-generated), and languages with fewer speakers are often poorly served, translations of important articles are desperately needed. But OKA’s translators are held to much stricter editorial standards than regular users: after five documented errors, an OKA translator can be banned, and their previous translations can be wiped unless a more senior editor takes ownership of them.