Researchers poison their own data so that, if an AI steals it, the results are ruined

Poisoned knowledge graphs can make an LLM hallucinate, rendering it useless to the thieves.