Meta is using AI to better detect teenagers on Instagram and, in some cases, it will automatically change their account settings for safety reasons.
In 2024, Instagram introduced AI to estimate users' ages. The system looks for clues in a user's activity, such as birthday messages from friends or how they interact with posts, to determine whether someone is under 18.
Meta applies stricter rules to teen accounts: by default they are private, they can't receive messages from strangers, and the content they can see is limited. Last year, Instagram automatically enabled these safety settings for all teen accounts.
Now, Instagram says it will use AI to find accounts that claim to be adults but might actually belong to teens. If the AI suspects a user is underage, Instagram will switch the account to the stricter teen settings. The company admits the AI might make mistakes, so users can change their settings back if needed.
Meta has been adding more safety measures for teens, often in response to concerns from parents and government officials. In 2023, the European Union investigated whether Meta was doing enough to protect young users.
Reports of predators targeting kids on Instagram led to a lawsuit from a U.S. attorney general. There's also disagreement among tech companies, including Google, Meta, Snap, and X, about who should be responsible for keeping kids safe online.
In March, Google accused Meta of shifting responsibility to app stores after a new law was passed in Utah.