NetChoice's Minnesota lawsuit, in five claims
On April 29, 2026, NetChoice, the trade group for Meta, Google, YouTube, Amazon, Snap, and TikTok, filed a lawsuit against Minnesota in federal court. They are challenging Article 19, Section 13 of Minnesota House File 2, a law signed by Governor Tim Walz on June 14, 2025, that requires social media platforms to show a clear mental-health warning to users in the state. The law takes effect on July 1, 2026. NetChoice claims that forcing platforms to display this warning compels speech in violation of the First Amendment. This is the second time NetChoice has taken Minnesota to court, and the group has also sued Texas, Florida, California, Utah, Ohio, Mississippi, New York, and Colorado over similar laws about child safety, content moderation, and digital age verification. Suing states over these issues is what they do.
NetChoice submitted a 48-page complaint in the U.S. District Court for the District of Minnesota to try to stop the state’s new social media warning label law. The case is called NetChoice v. Ellison. The tech industry is represented by Aaron Van Oort from Faegre Drinker Biddle & Reath. Van Oort served as Minnesota’s Solicitor General and is now leading the case against the state he once represented. Instead of hiring a typical D.C. law firm, NetChoice chose someone who knows the local federal courts well. The complaint is detailed, carefully written, and uses all the main legal arguments against compelled-speech rules.
What the Minnesota law actually requires
The press release describes the law as a “censorship label” requirement, but the statute itself is considerably more modest.
Article 19, Section 13 of Minnesota House File 2 (2025 First Special Session, Chapter 3) created a new section of the Minnesota Statutes — § 325M.335, titled “Mental Health Warning Label.” Effective July 1, 2026, a covered “social media platform” must display a “conspicuous mental health warning label” each time a user accesses the platform. The label disappears when the user either exits the platform or “acknowledges the potential for harm and chooses to proceed to the social media platform despite the risk.”
The label must do two things. It must warn users about the “potential negative mental health impacts” of using the platform. And it must include access to mental health resources, specifically the federally administered 988 Suicide and Crisis Lifeline. The Commissioner of Health, in consultation with the Commissioner of Commerce, was charged with developing the actual content of the label by March 1, 2026, and the guidelines must be “based on current evidence regarding the negative mental health impacts of social media platforms.”
That covers the full requirement. There is no ban, age check, or content moderation rule. Users can say what they want, and platforms can publish what they choose. The law only asks for a one-click acknowledgment, similar to the “I understand” buttons people already see on cookie banners, age gates, terms of service screens, HIPAA notices, and the alcohol-cancer warning the Surgeon General called for in January 2025.
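The statute specifies behavior, not implementation, but its display logic is simple enough to sketch. The Python model below is illustrative only; the class and method names are not from the statute. It captures the three statutory rules: the label appears on each access, a one-click acknowledgment dismisses it for the session, and exiting the platform also dismisses it.

```python
from dataclasses import dataclass

@dataclass
class WarningLabelSession:
    """Illustrative model of the display logic in Minn. Stat. § 325M.335."""
    label_visible: bool = False
    acknowledged: bool = False

    def access_platform(self) -> None:
        # Label shown "each time a user accesses" the platform,
        # regardless of any acknowledgment in a prior session.
        self.label_visible = True
        self.acknowledged = False

    def acknowledge_and_proceed(self) -> None:
        # One-click acknowledgment dismisses the label for this session;
        # the user proceeds with full access.
        self.acknowledged = True
        self.label_visible = False

    def exit_platform(self) -> None:
        # Exiting is the other statutory way the label disappears.
        self.label_visible = False
        self.acknowledged = False

session = WarningLabelSession()
session.access_platform()
assert session.label_visible          # shown on access
session.acknowledge_and_proceed()
assert not session.label_visible      # one click, full access
session.access_platform()
assert session.label_visible          # reappears on the next access
```

Nothing in this flow gates content, verifies identity, or filters posts; the only state that matters is whether the user has clicked once this session.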
The five claims, examined
Claim 1: “This is compelled speech, a direct violation of the First Amendment.”
NetChoice frames any required disclosure as an automatic violation of the First Amendment. That overstates the doctrine in a specific, exploitable way.
Under Zauderer v. Office of Disciplinary Counsel, 471 U.S. 626 (1985), the Supreme Court held that the government may compel a commercial actor to disclose “purely factual and uncontroversial information about the terms under which his services will be available,” subject only to rational-basis-style review that the disclosure be reasonably related to a legitimate state interest and not “unduly burdensome.” This is the doctrine that has upheld tobacco warnings, alcohol warnings, prescription drug labeling, the Schumer Box on credit cards, FDA nutrition labels, and California’s Proposition 65 chemical warnings (National Association of Wheat Growers v. Bonta, 85 F.4th 1263 (9th Cir. 2023)). Compelled commercial disclosure of factual public health information is one of the oldest and best-established forms of First Amendment-compliant regulation in American law.
The Ninth Circuit, in the very NetChoice victory they keep citing, expressly preserved this rule. From NetChoice, LLC v. Bonta, 113 F.4th 1101, 1117 (9th Cir. 2024): “As for laws that compel the disclosure of ‘purely factual and uncontroversial’ commercial speech, such laws are subject to a form of rational basis review.” The court struck the California Age-Appropriate Design Code’s data protection impact assessment requirement only because it forced businesses to render a subjective opinion about whether their products harm children. That is affirmative subjective speech, not factual disclosure.
Minnesota’s law does the opposite. The platform displays a state-authored, evidence-based factual notice and a link to the 988 hotline. It is closer to the cigarette package than to the AADC’s compelled opinion-writing. The legal pivot is whether Zauderer applies. If it does, Minnesota wins. If a court is persuaded to apply strict scrutiny instead, Minnesota almost certainly loses. Everything else in this lawsuit is plumbing around that single doctrinal question.
Claim 2: The warning forces websites to broadcast “the government’s controversial views.”
The law requires the label to be “based on current evidence regarding the negative mental health impacts of social media platforms” (Minn. Stat. § 325M.335, subd. 2(a)). That evidence base is federal.
In May 2023, the U.S. Surgeon General released a national advisory titled “Social Media and Youth Mental Health.” The advisory concluded, “we cannot conclude that social media is sufficiently safe for children and adolescents.” It found that teens who spend more than three hours a day on social media have about twice the risk of depression and anxiety symptoms. On June 17, 2024, Surgeon General Vivek Murthy called on Congress to require a warning label on social media platforms, a policy Minnesota later adopted. The American Psychological Association’s 2023 Health Advisory on Social Media Use in Adolescence and its follow-up report, as well as the American Academy of Pediatrics, reached similar conclusions.
A factual statement supported by a federal Surgeon General’s advisory, the APA, and the AAP is not “controversial” under the Zauderer standard. Federal courts have construed “controversial” in this context to mean value judgments that are politically or ideologically disputed, not scientific questions on which public health authorities agree. See CTIA v. City of Berkeley, 928 F.3d 832 (9th Cir. 2019), which upheld a cell phone disclosure tracking the FCC’s own RF exposure guidelines.
NetChoice’s most strategic move here is to turn the Surgeon General’s candor against him. The 2023 advisory acknowledges “known evidence gaps” in the research, and NetChoice argues this amounts to an admission that the science is unsettled. It is not. Acknowledging research gaps is not the same as conceding controversy. The Surgeon General made a public health judgment after weighing all the evidence, limits included. That is how regulatory science works.
Claim 3: “Minnesota carved out TV networks and gaming platforms, yet targets places like YouTube and X because that is where free speech thrives.”
This is the easiest claim to rebut by looking at the statute itself, and it is worth showing readers directly. The press release suggests that Minnesota’s law targets certain platforms because of their political content, but the actual text shows the opposite.
Minn. Stat. § 325M.31(j) defines a “social media platform” as a service that “allows an account holder to create, share, and view user-generated content for a substantial purpose of social interaction, sharing user-generated content, or personal networking.” It then explicitly excludes:
- Internet search providers
- Internet service providers
- Email services
- Streaming services, online video games, e-commerce, and other websites where the content is not user-generated and interactive functions are incidental.
- Employer communication services for business activities
- Advertising networks
- Telecommunications carriers
- Broadband services
- Single-purpose community groups for education or public safety
- Teleconferencing or video-conferencing services
- Cloud computing services
- Technical support platforms
- Platforms designed primarily for creative professional users (portfolio platforms, creative networking)
None of these exclusions is based on political viewpoint. They track how the products work. The Minnesota legislature drew the line between services designed primarily for social interaction, which the Surgeon General and the APA identified as increasing mental health risks, and services that do not work that way. ISPs are excluded not for political reasons but because no one has argued that their product design causes adolescent depression. The same logic covers cloud storage, telecoms, and portfolio sites for creative professionals.
The Supreme Court’s decision in Williams-Yulee v. Florida Bar, 575 U.S. 433 (2015), and the Ninth Circuit’s opinion in Wheat Growers both confirm that a regulation does not have to be perfectly comprehensive to survive First Amendment scrutiny. Underinclusiveness is only a problem if it shows the stated interest is not genuine. In this case, the law draws the line exactly where the public health evidence does. That is not a pretext; it is regulatory specificity.
Claim 4: Minnesotans “will not be able to access digital content until they ‘affirm’ they understand the allegations the government is requiring them to see.”
This misrepresents the statute. Subdivision 1(a)(2) gives users two ways for the label to disappear: they can exit the platform, or they can “acknowledge the potential for harm and choose to proceed to the social media platform despite the risk.” This is a one-click acknowledgment, just like the “I understand” buttons users already see on cookie banners, age gates, terms-of-service screens, and HIPAA notices.
The law’s repeated-display requirement is a real design choice, and some courts may scrutinize the repetition. But a one-click acknowledgment, even a repeated one, is different from an access ban. It is not a paywall, an age gate, or a restriction on what users can read or post. For comparison, California’s Prop 65 carcinogen warnings, OSHA workplace hazard placards, alcohol pregnancy warnings under 27 U.S.C. § 215, and the alcohol-cancer warning the Surgeon General called for in January 2025 are not treated as access bans on the products they cover. Even NetChoice’s own framing concedes that a brief acknowledgment does not “prevent” speech; it merely accompanies it.
Claim 5: Politicians are using “public health” as a “backdoor means to control online speech,” and adults can decide for themselves and for their children.
A warning is the opposite of speech control. It adds information and leaves the choice to the user, which is precisely the libertarian principle NetChoice claims to champion. The Supreme Court has recognized for decades that disclosure requirements serve, rather than limit, the marketplace of ideas because they give consumers information instead of suppressing speech (Zauderer, 471 U.S. at 650; Milavetz v. United States, 559 U.S. 229 (2010)).
Minnesota’s statute does not restrict any user’s content. It does not ban any platform. It does not prohibit any post. It does not condition access on identity verification. Those are the features that doomed other state laws NetChoice has won against. NetChoice v. Bonta (9th Cir. 2024) struck down California Age-Appropriate Design Code provisions that required platforms to opine on harm in DPIA reports and to enforce age-estimation, mitigation, and content-moderation duties — affirmative restrictions on operations and forced subjective speech. NetChoice v. Yost (S.D. Ohio 2024) and NetChoice v. Griffin (W.D. Ark. 2024) enjoined parental-consent and age-verification mandates that conditioned minors’ access to lawful speech. NetChoice v. Reyes (D. Utah 2024) struck a curfew/age-verification regime.
None of these laws were narrow factual-disclosure regimes. All imposed access restrictions or compelled subjective speech. NetChoice cites them throughout the Minnesota complaint as if they are interchangeable precedents. They are not. A federal court applying Zauderer faithfully should evaluate Minnesota’s law on its own much narrower terms, against the framework that has upheld tobacco, alcohol, drug, and chemical warnings for decades.
The argument NetChoice doesn’t lead with
The strongest legal claim in NetChoice’s filing isn’t in their press release. Spokesperson Paul Taske said Minnesota’s law is content-discriminatory because it targets some platforms and exempts others. As shown above, the statutory text rebuts that on its face. The exclusions are about product architecture, not viewpoint.
But there is a more serious legal argument NetChoice does make in the complaint itself, and Minnesota will have to defend against it: the law applies to adults as well as minors. NetChoice will likely point to Free Speech Coalition, Inc. v. Paxton, 606 U.S. 461 (2025), where the Supreme Court considered Texas’s age-verification law for adult content. The Court applied intermediate scrutiny — not strict — and ultimately upheld the Texas statute. NetChoice’s challenge will be to argue that Minnesota’s warning is more burdensome on adult speech than Texas’s age-verification requirement, which is a hard sell because Minnesota’s law requires no identity verification at all.
This is the argument Minnesota’s lawyers will need to win. The defensive frame is straightforward: the warning informs, it does not restrict. Acknowledgment is one click. The 988 hotline link is a public health resource, not a barrier. Adults retain full access to the platform with a single click, the same model used for OSHA workplace warnings and pharmaceutical packaging that adults have lived with for 50 years. The warning’s burden on adult speech is dramatically lighter than the Texas law that already cleared intermediate scrutiny. But it is a real argument, and pretending it isn’t would be naive.
What this lawsuit is telling us
Industries don’t typically spend this much on litigation unless they expect the regulation to bite. NetChoice hired a former Minnesota Solicitor General to handle a 48-page complaint over a one-click acknowledgment with a 988 link. That is a significant use of legal resources from a trade group whose members have almost unlimited legal capacity. They are not doing this because the warning is just symbolic. They are doing it because the warning could change user behavior at the most important moment: when people return to the platform.
That is the main purpose of a warning label. The Surgeon General called for one because the current system of industry self-regulation and platform-controlled disclosures led to the worsening outcomes he is now warning about. NetChoice’s complaint lists, in paragraphs 29 through 52, the many voluntary self-regulatory steps platforms have taken, including parental controls, screen-time tools, sensitive content filters, family pairing features, and default settings for teen accounts. All of these are real, but none have been enough. That is why the Surgeon General issued the advisory and why a Los Angeles County jury awarded $6 million in March 2026, finding Meta and YouTube failed to adequately warn users of the risks.
Minnesota took the chance that other states would follow. California already has. The lawsuit is the price of being first. It may also be the cost of being right.
References
- NetChoice, "Free Speech Online in Minnesota Doesn't Need Government Censorship Labels" (press release, April 29, 2026). https://netchoice.org/free-speech-online-in-minnesota-doesnt-need-government-censorship-labels/
- NetChoice v. Ellison, Complaint for Declaratory and Injunctive Relief, U.S. District Court for the District of Minnesota (filed April 29, 2026). https://netchoice.org/wp-content/uploads/2026/04/NetChoice-v-Ellison-2026-04-29-01-Complaint-for-Declaratory-and-Injunctive-Relief.pdf
- NetChoice, "NetChoice v. Ellison – Minnesota Censorship Labels" (case page). https://netchoice.org/netchoice-v-ellison-minnesota-censorship-labels/
- NetChoice, "About: Our Members." https://netchoice.org/about/
- Minnesota House File 2, 2025 First Special Session, Chapter 3, Article 19, Section 13, signed June 14, 2025; codified at Minn. Stat. § 325M.335 ("Mental Health Warning Label"); effective July 1, 2026. https://www.revisor.mn.gov/bills/bill.php?b=House&f=HF2&ssn=1&y=2025
- Minn. Stat. § 325M.31 (definitions and exclusions). https://www.revisor.mn.gov/statutes/cite/325M.31
- Zauderer v. Office of Disciplinary Counsel, 471 U.S. 626 (1985). https://www.law.cornell.edu/supremecourt/text/471/626
- National Association of Wheat Growers v. Bonta, 85 F.4th 1263 (9th Cir. 2023). https://cdn.ca9.uscourts.gov/datastore/opinions/2023/11/07/20-16758.pdf
- NetChoice, LLC v. Bonta, 113 F.4th 1101 (9th Cir. 2024). https://cdn.ca9.uscourts.gov/datastore/opinions/2025/09/09/25-146.pdf
- CTIA v. City of Berkeley, 928 F.3d 832 (9th Cir. 2019).
- Williams-Yulee v. Florida Bar, 575 U.S. 433 (2015). https://www.law.cornell.edu/supremecourt/text/13-1499
- Milavetz, Gallop & Milavetz, P.A. v. United States, 559 U.S. 229 (2010). https://www.law.cornell.edu/supremecourt/text/08-1119
- Free Speech Coalition, Inc. v. Paxton, 606 U.S. 461 (2025).
- NetChoice v. Yost, S.D. Ohio (2024). https://netchoice.org/wp-content/uploads/2024/01/2024.01.05-NetChoice-v-Yost-Complaint-for-Declaratory-and-Injunctive-Relief-FILED.pdf
- NetChoice v. Griffin, W.D. Ark. (2024). https://netchoice.org/wp-content/uploads/2026/04/NetChoice-v.-Griffin-Arkansas-Act-900-Enjoined_Apr-20-2026.pdf
- NetChoice v. Reyes, D. Utah (2024). https://netchoice.org/wp-content/uploads/2023/12/NetChoice-v-Reyes_Official-Complaint_FINAL.pdf
- U.S. Surgeon General, Social Media and Youth Mental Health: The U.S. Surgeon General's Advisory (May 2023). https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf
- Vivek H. Murthy, "Surgeon General: Why I'm Calling for a Warning Label on Social Media Platforms," The New York Times (June 17, 2024). https://www.nytimes.com/2024/06/17/opinion/social-media-health-warning.html
- American Psychological Association, Health Advisory on Social Media Use in Adolescence (May 2023). https://www.apa.org/topics/social-media-internet/health-advisory-adolescent-social-media-use
- American Academy of Pediatrics, Center of Excellence on Social Media and Youth Mental Health. https://www.aap.org/en/patient-care/media-and-children/center-of-excellence-on-social-media-and-youth-mental-health/
- U.S. Surgeon General, Alcohol and Cancer Risk: 2025 Surgeon General's Advisory (January 2025). https://www.hhs.gov/sites/default/files/oash-alcohol-cancer-risk.pdf
- 27 U.S.C. § 215 (alcoholic beverage labeling). https://www.law.cornell.edu/uscode/text/27/215
- Dani Anguiano, "Meta and YouTube designed addictive products that harmed young people, jury finds," The Guardian (March 2026). https://www.theguardian.com/media/2026/mar/25/jury-verdict-us-first-social-media-addiction-trial-meta-youtube
- California Assembly Bill 56 (2025–2026 Regular Session, chaptered) – social media warning label legislation. https://legiscan.com/CA/text/AB56/id/3273339
Further Reading
- Reuters Connect, "EXPLAINER: Why are social media companies under pressure over 'addictive' platform design?" — Useful international context on litigation and regulatory trends targeting platform design.
- United Kingdom Government, Online Safety Act 2023 — A simple guide. https://www.gov.uk/government/publications/a-guide-to-the-online-safety-act — A comparative example of how another democracy has addressed platform-design harms to minors.
- European Commission, The Digital Services Act. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en — The EU's framework for platform transparency, risk assessment, and minor protection.
- Australian Government, "Social Media Ban in Australia: A Simple Guide." — Background on Australia's under-16 social media restrictions, often invoked in the U.S. policy debate.
- World Health Organization, WHO Report on the Global Tobacco Epidemic 2019: Offer Help to Quit Tobacco Use. https://www.who.int/publications/i/item/9789241516204 — Cited frequently in disclosure-law debates as the gold-standard public health warning model.
- Noar et al., "Understanding Why Pictorial Cigarette Pack Warnings Increase Quit Attempts," PubMed Central. https://www.ncbi.nlm.nih.gov/pmc/ — Empirical evidence that point-of-use warnings change behavior, the underlying premise of Minnesota's law.
- U.S. Food and Drug Administration, "Prescription Stimulant Medications" consumer information. https://www.fda.gov/ — Example of federally compelled disclosure for products with documented harm to adolescents.
- Federal Trade Commission, Alcohol Marketing and Advertising: A Report to Congress. https://www.ftc.gov/ — Federal precedent for industry-targeted disclosure rules grounded in public health concerns.
- Federal Trade Commission, How to Make Effective Disclosures in Digital Advertising (.com Disclosures). https://www.ftc.gov/business-guidance/resources/com-disclosures-how-make-effective-disclosures-digital-advertising — The federal standard for what makes a digital disclosure "clear and conspicuous," the same statutory phrase Minnesota used.
- Federal Trade Commission, Competition and Consumer Protection Guidance Documents. https://www.ftc.gov/business-guidance — General federal framework for consumer-protection disclosures, useful background for readers new to compelled-speech regulation.

