California’s Age-Appropriate Design Code Just Survived a Federal Court Challenge: What Changes Now
A Major Legal Win for California’s Child Online Privacy Law
California’s Age-Appropriate Design Code Act, often called the CAADCA, just cleared a significant legal hurdle. A federal appeals court ruled that most of the law can move forward, lifting much of an earlier injunction that had blocked it on First Amendment grounds. This decision is a big deal, not just for California but for the entire country, when it comes to how tech companies handle children’s online privacy.
For parents, advocates, and digital rights watchers, the ruling signals that stronger protections for kids online are not just possible — they may now be inevitable.
What Is California’s Age-Appropriate Design Code?
Before diving into what changes now, it helps to understand what this law actually does. California’s Age-Appropriate Design Code is modeled after a similar law in the United Kingdom. At its core, the law requires companies that offer online products and services likely to be accessed by children under 18 to design those platforms with children’s safety and privacy in mind.
Here are some of the key requirements the law includes:
- Companies must assess and minimize risks to children before launching a product.
- Default privacy settings must be set to the highest level of protection for younger users.
- Companies cannot use dark patterns — sneaky design tricks — to get children to hand over more personal data than necessary.
- Platforms cannot collect, sell, or use the personal data of children in ways that are harmful to their health or well-being.
- Geolocation tracking of minors must be turned off by default.
The law applies to any business that operates online and is likely to be accessed by users under 18, regardless of whether the platform is specifically designed for children. That is a wide net — and intentionally so.
Why Was the Law Challenged in Court?
Shortly after the law was signed, the tech industry trade group NetChoice filed a lawsuit to block it. The group argued that the law violated the First Amendment by forcing companies to restrict or moderate online content and speech. It also argued that the law placed too heavy a burden on businesses by requiring them to guess whether a user might be a minor.
A district court initially agreed with some of those concerns and blocked the law from taking effect. On appeal, however, the Ninth Circuit Court of Appeals vacated most of that injunction. The appeals court concluded that the challengers had not shown the bulk of the law’s provisions to be unconstitutional, and it recognized the state’s legitimate interest in protecting children from online harm.
It is worth noting that the court did leave one specific provision blocked: the requirement that companies assess and report on whether their products could expose children to harmful content. The court found that particular piece raised valid free speech concerns. But the core of the law remained intact.
What Does This Ruling Mean in Plain Terms?
In simple terms, the ruling means California’s law is now on much stronger legal footing. Tech companies can no longer easily argue that the law is unconstitutional to avoid complying with it. The court made clear that protecting children’s privacy is a compelling enough reason for the government to place certain requirements on digital platforms.
This does not mean every part of the law is settled. There could still be further legal battles, and the case could potentially make its way to the Supreme Court. But for now, the law stands, and companies need to take it seriously.
How Does This Affect Tech Companies?
The ruling puts real pressure on technology companies that operate in California — which, given the size of the state’s economy and population, essentially means most major platforms operating in the United States.
Here is what tech companies will need to address under the law:
- Age Verification and Estimation: Companies will need to find ways to identify whether their users are likely minors. This does not necessarily require collecting identification documents, but it does mean using some form of age estimation or verification.
- Privacy by Default: Platforms must automatically set the highest privacy protections for child users. They cannot require kids to opt into privacy — privacy must be the starting point.
- No Manipulative Design: Any design feature that nudges children toward giving up more data, spending more time on a platform, or making purchases must be removed or reworked.
- Data Use Restrictions: Companies must limit how they collect and use data from minors, including banning targeted advertising based on personal data.
- Risk Assessments: Businesses must document how their products might affect children and take steps to reduce those risks before launching.
Failure to comply can result in civil penalties. California’s Attorney General has the power to enforce the law and seek fines against businesses that do not follow the rules.
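To make the "privacy by default" requirement concrete, here is a minimal sketch in Python. The settings names and the `default_settings` function are purely illustrative assumptions, not any platform's real API; the point is simply that a user flagged as a likely minor starts at the most protective configuration rather than having to opt into it.

```python
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    profile_public: bool
    geolocation_enabled: bool
    targeted_ads_enabled: bool
    messages_from_strangers: bool


def default_settings(is_likely_minor: bool) -> PrivacySettings:
    """Return starting settings; minors get the most protective defaults.

    Under a privacy-by-default rule, a child never has to opt in to
    protection: the strictest configuration is the starting point.
    """
    if is_likely_minor:
        return PrivacySettings(
            profile_public=False,
            geolocation_enabled=False,   # location tracking off by default
            targeted_ads_enabled=False,  # no profiling-based advertising
            messages_from_strangers=False,
        )
    # Adult accounts may start with the platform's usual defaults.
    return PrivacySettings(
        profile_public=True,
        geolocation_enabled=True,
        targeted_ads_enabled=True,
        messages_from_strangers=True,
    )


minor_defaults = default_settings(is_likely_minor=True)
print(minor_defaults.geolocation_enabled)  # False
```

The design choice worth noticing is that protection depends only on a single flag set at account creation, so no child has to find and change a setting to be covered.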
What Does This Mean for Age Verification Technology?
One of the more complicated pieces of this law is the age verification requirement. Tech companies now face the challenge of figuring out how to identify young users without creating new privacy risks in the process.
There are several approaches being considered across the industry:
- Self-declaration: Simply asking users to enter their age, though this is widely considered ineffective.
- AI-based age estimation: Using machine learning to estimate a user’s age based on behavioral data or other signals.
- Third-party age verification services: Partnering with outside providers that can confirm a user’s age without sharing detailed personal information with the platform itself.
- Device-level verification: Working with operating systems or device manufacturers to verify age at the account level before a user even opens an app.
Privacy advocates are closely watching how companies handle this, because poorly designed age verification systems could create their own problems — like building up databases of sensitive personal information about users.
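One data-minimizing pattern, sketched loosely below, is for the birthdate to be used only transiently (for example, inside a third-party verifier) while the platform itself stores nothing but a coarse over/under-18 flag. The function below is a hypothetical illustration of that idea, not a description of how any particular verification service works.

```python
from datetime import date


def is_under_18(birthdate: date, today: date) -> bool:
    """Compute an over/under-18 flag from a birthdate.

    In a data-minimizing design, only this boolean is ever retained by
    the platform; the birthdate and any ID documents are discarded.
    """
    # Subtract one year if this year's birthday has not happened yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < 18


# The platform stores only the flag, never the underlying birthdate.
flag = is_under_18(date(2010, 6, 1), today=date(2024, 9, 1))
print(flag)  # True: this user is a minor
```

Keeping only the flag means a breach of the platform's database exposes no birthdates or identity documents, which addresses exactly the risk privacy advocates raise.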
Could This Law Influence Other States?
California has a long history of leading the way on consumer protection and technology regulation. Other states have watched closely, and many have followed California’s lead on issues ranging from data privacy to emissions standards.
The fact that this law survived a major federal court challenge makes it much more likely that other states will feel confident passing similar legislation. Several states have already introduced or passed their own versions of online child safety laws, and this ruling gives those efforts a legal boost.
At the federal level, there has been ongoing discussion about updating national laws around children’s online privacy. The Children’s Online Privacy Protection Act, or COPPA, dates back to 1998, and its implementing rules were last substantially updated in 2013; many argue it has not kept up with how the internet has changed. California’s law could serve as a model for what stronger federal rules might look like.
What Should Parents Know?
For parents, this ruling is encouraging news. If companies comply with the law as written, children who use apps, social media platforms, games, and other digital services in California will automatically have stronger privacy protections in place — without parents needing to dig through confusing settings menus.
That said, parents should not assume the law handles everything. It is still important to:
- Talk to children about what information they share online.
- Review the apps and platforms your children use regularly.
- Check privacy settings on devices and accounts.
- Stay informed as companies update their policies in response to the law.
The law creates a floor of protection, not a ceiling. Good digital habits at home still matter enormously.
What Happens Next?
Now that the court has upheld most of the law, California’s Attorney General is expected to begin active enforcement. Companies have had time to prepare, but many have been waiting to see how the legal challenge would play out before making major changes to their platforms.
That waiting period is effectively over. Tech companies that have not yet brought their products into compliance with California law will need to move quickly. The law’s requirements are detailed, and making the necessary changes to product design, data practices, and internal policies takes time.
There may also be further legal challenges — either from NetChoice or other industry groups — but the Ninth Circuit’s decision makes it harder to argue the law is fundamentally unconstitutional. Companies are more likely to focus their energy on lobbying for changes to specific provisions rather than trying to block the entire law.
The Bigger Picture: A Shift in How We Think About Kids Online
What this ruling really represents is a broader shift in public and legal thinking about children’s rights in digital spaces. For a long time, the default assumption was that the internet was a neutral space and that responsibility for protecting children fell entirely on parents. That assumption is changing.
Courts, lawmakers, and increasingly the public are recognizing that tech companies have enormous power over the online experiences of young people. Recommendation algorithms, autoplay features, notification systems, and targeted advertising all shape how children spend their time and what they are exposed to. The argument that companies bear some responsibility for designing these systems more carefully is gaining ground.
California’s law, and the court ruling that upheld it, represent a concrete step toward holding companies accountable for the environments they create. Whether or not you agree with every detail of the law, the underlying message is hard to argue with: children deserve better than default settings that prioritize platform engagement over user well-being.
Final Thoughts
The survival of California’s Age-Appropriate Design Code in federal court is a landmark moment for child online safety and consumer protection in the United States. It sends a clear message to the tech industry that designing products with children in mind is not optional — it is a legal requirement that courts are willing to enforce.
For everyday people, this ruling means more pressure on companies to clean up how they treat young users. For tech companies, it means the time for action is now. And for the broader conversation about digital safety, it means California has once again set a standard that the rest of the country will be watching closely.