In recent years, artificial intelligence (AI) systems like ChatGPT and Gemini have become part of daily life for adults and children alike. Once unimaginable, these technologies are now tools that students turn to for homework help, entertainment and even emotional support. Yet many minors use these systems without understanding how they work, and allowing unregulated AI exposure at such a vulnerable stage of development is not only irresponsible but dangerous.

Earlier this month, the Interscholastic Ethics Society (IES), a student-led organization representing over 150 students from 11 international schools in Korea, submitted a national petition urging the government to regulate the use of generative AI by children under the age of 12. The petition, posted on the National Assembly's public platform, calls for three measures: limiting AI access for children under 12 (with exceptions for supervised educational use); requiring safety features such as age verification, suicide-related dialogue blocking and usage-time limits; and providing AI-safety education.