Patchwork of age check, online safety legislation grows across US

As the U.S. waits for the Supreme Court’s opinion in the Texas case of Free Speech Coalition v. Paxton, which is expected next week and will establish a precedent for age verification laws, various pieces of legislation are continuing to ricochet through the courts – and some are facing the monolithic obstacle of Big Tech.
Judge blocks Mississippi age assurance law for second time
Mississippi’s age assurance legislation has been blocked – again. A judge in the U.S. District Court for the Southern District of Mississippi has granted a preliminary injunction against the 2024 law, known as the “Walker Montgomery Protecting Children Online Act,” while a lawsuit challenging its constitutionality proceeds.
None other than NetChoice LLC is behind the lawsuit. The lobby group for the world’s largest tech firms has mounted multiple legal challenges to age verification legislation, and has generally positioned itself as the primary antagonist of lawmakers pursuing laws around age verification and age estimation.
According to Bloomberg, Judge Halil S. Ozerden says that, because NetChoice is likely to succeed on its initial challenge to the Mississippi law, and because “NetChoice’s member companies faced irreparable harm in the form of loss of First Amendment freedoms and monetary damages,” it makes sense to pause the law until the case is resolved.
HB 1126 requires social media sites and apps to make “reasonable efforts” to verify a user’s age, and to attain parental consent before letting minors onto their platforms.
The judge had already issued one injunction against Mississippi’s law, but the U.S. Court of Appeals for the Fifth Circuit vacated it and sent the case back for a “factual inquiry into the law’s scope and its potentially unconstitutional applications.”
Ozerden says NetChoice “met its burden of showing the law wasn’t narrowly tailored to achieve the state’s goal of protecting children from harmful online content.”
Compliance part of the cost of doing business, says Tennessee judge
In Tennessee, however, NetChoice has not been so lucky. On the same day Judge Ozerden granted the Mississippi injunction, the group was denied a preliminary injunction in a separate challenge to a similar Tennessee law.
Judge Eli Richardson of the U.S. District Court for the Middle District of Tennessee questioned the trade group’s assertion that it would suffer “irreparable harm” from the law absent an injunction.
HB 1891 requires social media platforms to implement age-verification for all social media accounts. Like Mississippi’s law, it has parental consent rules for minors, and “requires a social media company to provide a minor account holder’s parent with means for the parent to supervise the minor’s account.”
The judge says “there was no evidence in the record that NetChoice members had faced enforcement or the threat of enforcement related to HB 1891, and the state hadn’t refused to disavow its enforcement until conclusion of the litigation.”
Moreover, Richardson waves off NetChoice’s argument that compliance costs related to the act would hobble its members, and therefore necessitate an injunction. He says compliance costs are “quite ordinary and are routinely borne (even if only reluctantly) by business entities as a natural (if unwelcome) cost of doing business.”
There is, however, more than money at stake: for violators, the law comes with a felony penalty – among the strongest punitive measures to be found in U.S. age assurance laws.
California, Oregon, Vermont laws get thumbs up from EPIC
The nonprofit Electronic Privacy Information Center (EPIC) has expressed support for California’s Age-Appropriate Design Code (CAADC). The law has faced two preliminary injunctions over questions about First Amendment violations. But MLex reports on EPIC’s comment to a U.S. appeals court that affirming those rulings – which argue the law’s definition of coverage makes it entirely content-based – would risk destabilizing a broad swath of data protection law.
“Many foundational and noncontroversial data protection laws, most notably the Children’s Online Privacy Protection Act (COPPA), contain similar types of coverage definitions,” EPIC says. “Legislatures often write data protection laws to narrowly regulate specific industries and entities for common-sense reasons related to relevance and narrow tailoring, not to censor certain topics or viewpoints.”
EPIC has also published comment on two other pieces of state legislation: Oregon’s HB 2008, which amends the Oregon Data Privacy Law to ban the sale of precise geolocation data and the data of minors under 16, and the Vermont Age-Appropriate Design Code (AADC), which was recently signed into law.
Oregon’s law, it says, is good. “Banning the sale of location data and minors’ data protects Oregon residents from some of the worst privacy violations taking place today,” says EPIC Deputy Director Caitriona Fitzgerald in a blog. Oregon is the second state to enact a ban on the sale of some forms of personal data, following Maryland.
(Another amendment to the Oregon Data Privacy Law, HB 3875, addresses an emergent data protection pressure point: car manufacturers that process personal data from a consumer’s use of a vehicle.)
Vermont’s AADC is broader, but EPIC is no less enthusiastic about it. “For too long, Big Tech has exploited the presence of children on their platforms to turn a profit while ignoring the privacy and safety risks their design choices cause kids,” says Megan Iorio, senior counsel at EPIC. “The Vermont Age-Appropriate Design Code effectively mitigates these privacy and safety risks and does so in a way that Big Tech’s lawyers won’t be able to easily overturn in court.”
The code requires businesses to configure minors’ default privacy settings to the highest level of privacy, regulates how minors’ personal data is used in personalizing feeds, and requires companies to be transparent about how they use minors’ personal data. It also includes a private right of action, meaning individual Vermonters can sue for noncompliance.
EPIC says the code’s transparency provisions are focused on factual information about companies’ data and design practices, rather than requiring companies to assess whether content could harm minors. “The duty of care is narrowly scoped and explicitly does not include a duty to mitigate harm stemming from content, ensuring that the provision does not interfere with companies’ editorial discretion or users’ rights to information.”
“The law applies to businesses regardless of the content they host and does not include unjustified carveouts for special interests or industries.”
Article Topics
age verification | California | California Age Appropriate Design Code (CAADC) | children | EPIC | legislation | Mississippi | NetChoice | Oregon | Tennessee | United States | Vermont