Meta's messaging service WhatsApp is introducing parent-linked accounts designed specifically for children under 13, marking a significant shift in how the platform handles younger users. The accounts will let pre-teens use WhatsApp's core messaging and calling functions while placing strict oversight tools in the hands of parents or guardians, who will be able to manage privacy settings, approve contacts, and supervise group participation through linked controls, often secured with a parental PIN. The feature arrives amid mounting political and public pressure on technology companies to better protect minors online and to limit their exposure to harmful content, scams, and inappropriate interactions. By limiting features and requiring parental approval for key activities, the system effectively turns the child's account into a supervised communication channel rather than an unrestricted social platform. The initiative reflects a broader industry trend: large technology companies attempting to balance access to popular digital tools with stronger parental authority and tighter safety guardrails for younger users.
Sources
- https://techcrunch.com/2026/03/11/whatsapp-is-launching-parent-linked-accounts-for-pre-teens/
- https://www.reuters.com/technology/whatsapp-launches-parent-managed-accounts-pre-teens-amid-safety-concerns-2026-03-11
- https://cybernews.com/privacy/whatsapp-parent-managed-accounts-kids-under-13
Key Takeaways
- WhatsApp is creating supervised accounts for children under 13 that give parents direct control over contacts, privacy settings, and group participation.
- The accounts restrict features primarily to messaging and voice calls, limiting the broader social media functionality that often raises safety concerns.
- The move reflects increasing political and regulatory pressure on technology companies to improve child-safety measures and age controls across social platforms.
In-Depth
WhatsApp’s decision to introduce parent-linked accounts for pre-teens represents a notable shift in the ongoing debate over children’s access to social media and messaging platforms. For years, most major digital services have maintained a nominal minimum age of 13, largely because of legal frameworks governing children’s data privacy. In practice, however, many younger children have still found their way onto these platforms by misrepresenting their age or using family devices. Meta’s latest initiative attempts to bring that reality into a more structured and controlled environment.
Under the new model, children under 13 can access WhatsApp only through an account that is directly tied to a parent or guardian. The supervising adult retains authority over key privacy and safety settings, including who can contact the child and which groups the account may join. Parents can also review requests from unknown contacts and block unsolicited interactions, effectively serving as the gatekeeper for the child's digital communications.
Importantly, the accounts are intentionally limited in functionality. Pre-teen users can primarily send messages and make calls, while the broader ecosystem of features common to social media—such as public discovery or expanded community interactions—is restricted. The design reflects a cautious approach intended to minimize exposure to the kinds of risks that have drawn increasing scrutiny from lawmakers and child-safety advocates.
The rollout also arrives at a moment when governments around the world are intensifying pressure on large technology companies to take stronger responsibility for protecting minors online. Lawmakers in several jurisdictions have proposed or enacted legislation requiring age verification, parental consent mechanisms, and stricter controls on how social platforms interact with young users. In response, companies like Meta have begun experimenting with new account structures aimed specifically at younger audiences.
Supporters of the approach argue that it offers a pragmatic solution: rather than pretending children under 13 are absent from messaging apps, the system acknowledges their presence while placing authority firmly in the hands of parents. Critics remain skeptical, warning that extending social technology to younger age groups, even in supervised form, risks deepening children's dependence on digital platforms.
What is clear is that the issue of children and technology is not going away. As messaging services continue to dominate how people communicate, companies will face increasing pressure to design systems that respect parental authority, safeguard young users, and address growing public concerns about the digital environment children are growing up in. WhatsApp’s new parent-linked accounts represent one of the latest attempts to strike that balance.