Nebraska among states with new laws in 2025 to protect children’s online safety

October 9, 2025


Between the ages of 8 and 18, young people spend 4.8 years of their waking lives in front of a screen.

Most state legislators didn’t grow up in this digital world, but they’re immersed in it today — including as policy leaders looking to address concerns about the online safety and well-being of today’s generation of children and teens. In 2024, legislatures in 23 U.S. states passed 48 online child safety laws, according to New York University’s Center on Tech Policy. That legislative momentum has carried into this year.

To date, the proposed and enacted legislation has mostly taken two forms:

1. Require certain online sites or social media platforms to verify users’ ages and/or obtain parental consent (see maps); or
2. Require online service providers to build age-appropriate design features into their products.

Nebraska Sen. Carolyn Bosn learned about the latter approach early in her legislative career, and says she became interested in California’s model because of what she saw as its focus on “product design” rather than “content moderation.” Passed in 2022, California’s first-in-the-nation Age-Appropriate Design Code Act soon became the subject of litigation and has since been blocked by federal courts on grounds that it violates free-speech rights.

Could Nebraska pass its own version of a design-code law, using California’s framework but making adjustments and addressing some of those constitutional concerns? Bosn set out to try, and the result was this year’s passage of LB 504. Nebraska is now one of only four U.S. states where some version of an Age-Appropriate Design Code law has been passed. (California, Vermont and Maryland are the others.)

Generally, these laws set out to safeguard young people’s data, privacy and safety, as well as their mental well-being. They do so by regulating the design features of an online product — for example, through more-protective default settings for young users or a prohibition on certain design features. But the details of these legislative proposals can vary widely from state to state, and continue to evolve, says Bailey Sanchez, who tracked and analyzed this activity as a deputy director at the Future of Privacy Forum. (She left this position in late September.)

Not only does Nebraska’s new law diverge from California’s, Bosn says; the final version of LB 504 is also considerably different from the bill she introduced early in 2025, a reflection of session-long negotiations and compromises.

“We took in a lot of feedback from a lot of people, including opponents, and we really did make an effort to say, ‘OK, what kinds of guardrails can the tech industry comply with and still be successful as an industry, but also will actually have meaningful protections for minors,’ ” Bosn says.

She expects more tweaks to the law next session, as well as legal challenges once it takes effect in January 2026; enforcement can begin in July 2026.

Details on design-code law

Regulate product design features. Empower parents. They are two common, but often distinct, considerations for legislators when looking to protect children’s online safety.

“The design-code laws typically don’t mention the name ‘parent’ at all,” Sanchez says.

Nebraska’s LB 504 is unique, she says, because it includes language on both design features and the rights of parents. Specifically, the law says online service providers “shall provide parents with tools to help [them] protect and support minors using covered design features,” and that these tools must “be enabled by default.”

This includes managing a child’s privacy and account settings, preventing financial transactions, and restricting the time spent by the child on an online site or platform.

According to Bosn, throughout debate on LB 504, the perspectives and concerns of parents were front and center. Nebraska legislators heard from a parent whose child died from a fentanyl overdose after purchasing the drug on a social media platform and from a mother who said her daughter’s addiction to social media led to a near-fatal eating disorder.

“It was really powerful testimony from a mom who thought she was really on top of things and who had lots of parental controls,” Bosn says. “But it still happens. It can’t simply be just, Oh, you’re not a good parent because you let your kid do this. No, you can have really good, attentive parents and you still see this happening.”

For Bosn, one priority with LB 504 was to ensure the availability of settings that limit access to harmful or “addictive feeds”: content served to a user based on recommendation algorithms.

“If a young girl looks up how to make a healthy smoothie, you as a parent don’t want the next video to be, here’s some good skills on being a bulimic” because of the algorithm, she says.

Nebraska’s new law requires “accessible and easy-to-use tools” that allow minors to instead choose a “chronological feed.” Those same tools must be provided in areas such as data privacy and online transactions. Additionally, minors (and their parents) must have the option of setting limits on how much time they spend on the site or online platform.

What Nebraska did not include, Bosn says, is a requirement that online service providers report on or assess the potential harms of content, a provision in the California law that proved problematic when challenged on First Amendment grounds.

Nebraska also narrowed which providers fall under the design-code law. LB 504 applies only to an online service that derives 50 percent of its revenue from the sale or sharing of consumers’ personal data and that has more than $25 million in annual gross revenue. Exempt, too, are services with “actual knowledge” (based on marketing and advertising data or self-identification by individuals) that fewer than 2 percent of their users are minors.

Nebraska legislators passed a bill on age verification and parental consent as well. Under LB 383, social media companies must use a “reasonable age verification method” when an individual wants to open an account on their platform. This can include use of a digitized ID card or another “commercially reasonable” method. If the person seeking a social media account is a minor, a parent or legal guardian must provide signature authorization. Parents also must be able to revoke consent and have a child’s social media account removed.

As of August 2025, 13 U.S. states had laws restricting children’s access to social media and/or requiring parental consent, according to the Age Verification Providers Association.

Nebraska is the only Midwestern state with such a law. An Ohio measure from 2023 has been permanently enjoined by a U.S. District Court judge, who cited several constitutional problems with the statute, including what he said was vague wording and an infringement of the free-speech rights and expression of minors.

Shield vs. sword

In a presentation this summer at The Council of State Governments’ Midwestern Legislative Conference Annual Meeting, Duke University professor Nita Farahany placed state policy responses into two categories: one the “shield,” the other the “sword.” The shield approach focuses on safeguards from the potential downsides of living in today’s digital world; the sword, she said, emphasizes building the capacity of individuals to thrive in the digital age and preserve their “cognitive liberty.”

“The sword side, reinvigorating those core capacities, may be where there is the most promise,” Farahany said, noting that some of the “shield” approaches are being struck down by the courts.

In Finland and Sweden, she said, part of the sword approach is integrating digital education into classrooms. At a young age, students in these countries are exposed to online content and social media sites, and have the chance to evaluate them and understand how they work.

“[Classes] are teaching them the skills to navigate as digital natives from day one, recognizing that … depriving them of technology in the classroom is not going to prepare them for the inevitable future that they’re facing,” Farahany said.

Overview of Recent State Actions

Age-Appropriate Design Code Bills

Recent Legislative Activity in Midwest on Age-Appropriate Design Code

State | Bill | Status as of October 2025
Illinois | SB 50 of 2025 and SB 51 of 2025 | Did not pass prior to close of 2025 regular session
Michigan | HB 5823 of 2024 | Did not pass
Minnesota | HF 2257 of 2023/SF 2810 of 2023 | Did not pass
Minnesota | HF 48 of 2025/SF 2810 of 2025 (prohibit certain social media algorithms targeting children) | Did not pass
Nebraska | LB 504 of 2025 (design code) | Signed into law
Other Legislative Activity in Midwest

Age verification and parental consent

Ohio was the first Midwestern state requiring social media companies to verify a user’s age and obtain parental consent for children younger than 16. Its 2023 law (part of HB 33) has been blocked by a federal court. This year, Nebraska lawmakers passed LB 383, which requires age verification and parental consent before a social media account is opened for ages 17 and under. These two laws place the responsibility of age verification/parental consent on individual apps and platforms. A few states outside this region have made it the responsibility of app store developers as well.

New rules for adult sites

To date, more states have adopted narrower age-verification laws that apply to adult websites with sexually explicit content. Kansas was the first Midwestern state where such a law took effect (SB 394 of 2024). These measures also are in place in Indiana, Nebraska, Ohio, North Dakota and South Dakota. In a closely watched 2025 U.S. Supreme Court case, justices upheld a Texas law requiring users to verify their age before gaining access to these sites.

Cell phones in schools

Most Midwestern states now have policies to curb or prohibit students’ use of cell phones during the school day. The trend began with the signing of SB 185 in Indiana in March 2024. According to Education Week, cell phone laws or policies also are in place in Iowa, Kansas, Minnesota, Nebraska, North Dakota, Ohio and South Dakota. These measures often include some exceptions for certain uses or users, but under a general framework that students should not be using cell phones in school.

Compensation for minors

Illinois updated its child labor law in 2023 with a first-in-the-nation measure to protect the rights of minors under age 16 who are featured in monetized video blogs or other online content.

Under the law (SB 1782), a portion of earnings must be placed in a trust for these children. Minnesota also now requires that children be compensated for monetized online content (HF 3488 of 2024). Additionally, they can request that videos featuring them be permanently deleted. Under the same law, children 13 and under are prohibited from “engaging in the work of content creation.”

Warning label and privacy protections 

Minnesota will require social media platforms to post a “conspicuous mental health warning label,” a provision included in a larger finance bill (HF 2 of 2025). Additionally, a broader privacy law in Minnesota (HF 4757 of 2024) includes special protections for youths: Businesses must obtain parental permission before selling the personal data of children under age 16 and before using this data for targeted advertising.