
The Federal Trade Commission (FTC) has issued a final rule amending the Children’s Online Privacy Protection Act (COPPA) Rule, with major changes set to take effect on June 23, 2025. The update marks a turning point for U.S. privacy regulation as it applies to artificial intelligence.
With AI now embedded in online services, apps, and products to the point that AI-powered and conventional offerings are hard to tell apart, the FTC’s latest action further tightens restrictions on the collection, storage, and use of children’s data, especially for purposes such as training machine learning and AI models.
AI Training on Children’s Data Now Requires Parental Consent
A notable change in the revised COPPA Rule is the express requirement of parental consent before children’s data is used to train or build AI technologies. The FTC clarified that such uses are not “core” to the operation of a website or online service, and therefore fall outside the scope of any implied parental consent.
Companies that want to use children’s data to train AI models, whether for monetization, advertising optimization, or feature enhancement, must now obtain affirmative, verifiable parental consent.
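For engineering teams, the practical effect is that AI-training pipelines need a purpose-specific consent gate. The sketch below shows one minimal way such a check might look; the `ConsentRecord` fields, the `"ai_training"` purpose label, and the record layout are illustrative assumptions, not terms defined by the rule.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # Hypothetical consent record; field names are illustrative, not from the rule text.
    user_id: str
    is_under_13: bool
    verified_parental_consent: bool       # verifiable parental consent on file
    consented_purposes: frozenset         # purposes the parent affirmatively approved

def may_use_for_ai_training(record: ConsentRecord) -> bool:
    """Exclude a child's record from an AI-training dataset unless the parent
    has given verifiable consent for that specific purpose."""
    if not record.is_under_13:
        return True  # COPPA's consent requirement covers children under 13
    return record.verified_parental_consent and "ai_training" in record.consented_purposes

def build_training_set(records: list[dict], consents: dict[str, ConsentRecord]) -> list[dict]:
    # Keep only records whose consent status permits AI-training use.
    return [r for r in records if may_use_for_ai_training(consents[r["user_id"]])]
```

A gate like this only works if consent is recorded per purpose rather than as a single yes/no flag, which is the core operational shift the new requirement implies.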
This new provision reflects growing regulatory concern over AI development and children’s privacy. As generative and predictive AI tools become more advanced and data-hungry, the change acknowledges the risks such systems pose to young people’s privacy.
Ban on Indefinite Data Retention Introduced
The revised rule also takes a firm position on data retention. Online services can no longer keep children’s personal information “forever,” as many previously did; they may retain it only as long as is “reasonably necessary to accomplish the purpose for which the information was collected.”
Once that purpose is achieved, operators must delete the information in a secure and reasonable manner, using deletion procedures that prevent unauthorized access. Operators must also “publish a written data retention policy and include it in their online privacy notice.”
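As a rough illustration of how an operator might operationalize a written retention policy, the sketch below maps each collection purpose to a retention window and purges expired records. The purposes, windows, and field names are assumptions made for illustration, not values taken from the rule.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention windows per collection purpose (assumed, not prescribed by the rule).
RETENTION_POLICY = {
    "account_management": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def is_expired(record: dict, now: datetime) -> bool:
    """A record is expired once the retention window for its stated purpose has passed.
    Assumes record["collected_at"] is a timezone-aware datetime."""
    window = RETENTION_POLICY.get(record["purpose"])
    if window is None:
        return True  # no documented purpose means no basis to keep retaining the data
    return now - record["collected_at"] > window

def purge_expired(records: list[dict], delete_fn) -> int:
    """Delete expired records via delete_fn and report how many were removed."""
    now = datetime.now(timezone.utc)
    expired = [r for r in records if is_expired(r, now)]
    for r in expired:
        delete_fn(r)  # delete_fn should perform secure, unrecoverable deletion
    return len(expired)
```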
These retention requirements are enforceable, and violations may lead to civil penalties of up to $53,088 per violation.
Broader Definition of Personal Information Includes Biometrics
Reflecting technological advances, the FTC has expanded its definition of “personal information” to include biometric identifiers: fingerprints, retina scans, DNA sequences, gait patterns, facial templates, and similar data that increasingly feed AI-driven identification, authentication, and personalization systems.
Classifying biometric data as personal information underscores the FTC’s view that such information is especially sensitive, particularly when it belongs to children.
The Commission expressed concern that storing biometric identifiers, even for short periods, creates security vulnerabilities and enables excessive tracking and advertising targeting.
Enhanced Parental Transparency and Control
Another key change expands the transparency owed to parents. Websites and apps must now inform parents not only of the general purpose of data collection but also of the specific third parties that may receive their child’s data and the specific reasons for that sharing.
Furthermore, parents are given greater control over their child’s data by being allowed to consent to its collection while opting out of its sharing. These granular consent options are a significant improvement over the previous all-or-nothing framework and offer parents more agency in managing their children’s digital footprint.
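One way to support this kind of granular choice is to track collection consent and third-party disclosure consent as separate fields rather than as a single flag. The sketch below is a hypothetical model; the field names and the per-recipient check are illustrative assumptions, not requirements spelled out in the rule.

```python
from dataclasses import dataclass, field

@dataclass
class ParentalConsent:
    # Hypothetical granular consent record: collection consent and third-party
    # disclosure consent are tracked separately, mirroring the new opt-out option.
    child_id: str
    consent_to_collection: bool = False
    consent_to_third_party_disclosure: bool = False
    approved_recipients: set[str] = field(default_factory=set)

def may_collect(consent: ParentalConsent) -> bool:
    return consent.consent_to_collection

def may_share_with(consent: ParentalConsent, recipient: str) -> bool:
    # In this sketch, sharing requires both the disclosure opt-in and the
    # specific recipient having been disclosed to the parent.
    return consent.consent_to_third_party_disclosure and recipient in consent.approved_recipients
```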
Stronger Data Security Requirements for Operators
The updated COPPA Rule also introduces rigorous data security measures, modeled after the FTC’s Safeguards Rule. All operators must implement a written information security program, designate personnel to oversee data protection, and conduct annual assessments of both internal and external security risks.
Before sharing data with third-party providers, operators must ensure these partners are capable of maintaining data confidentiality and must obtain written assurances to that effect. These measures aim to reduce the risk of data breaches and unauthorized exposure, especially given the sensitive nature of children’s online information.
Implications and Industry Takeaways
The FTC’s 2025 COPPA update serves as a wake-up call for tech companies, app developers, and educational platforms catering to children under 13. With new obligations surrounding AI consent, data minimization, and biometric protection, the regulatory bar has been raised.
The FTC’s commentary also leaves some open questions, such as how to handle historical data already used in AI models or what happens if a parent withdraws consent after AI training has occurred. Nevertheless, the central message is clear: businesses must prioritize children’s privacy in an age where personal data powers algorithms and automated systems.
A New Era for Children’s Online Privacy
As AI becomes integrated across digital platforms and connected devices, protections for their users, and for children in particular, must also adapt and improve. The FTC’s updates to COPPA and its rules on children’s data privacy are a major step toward ensuring that children’s data is adequately protected rather than harvested irresponsibly or out of sight.
Organizations should promptly evaluate their privacy practices and make the necessary changes to avoid significant monetary penalties and damaging reputational consequences. In 2025 and beyond, COPPA compliance will not just be a legal requirement; it will be a standard of ethical digital conduct.