Social media’s core design puts children’s privacy and health at risk. That’s one of the takeaways from a 2021 study in which researchers set up social media accounts and used children as their avatars.
Within a day of account creation, 14-year-old “Justin” received three solicitations containing pornography. After researchers spent five minutes a day on Instagram over two days, two of the accounts were followed by pages featuring racist and derogatory content.
“Despite registering accounts as the age of a child, all accounts were solicited with sexual content, requests from adults for contact, self-harm and suicide material, crash diets and other extreme body image content,” the study, conducted by London-based child advocacy group the 5Rights Foundation, reads.
The New Mexico Attorney General’s Office conducted a similar investigation of Meta platforms Facebook and Instagram. Last December, Attorney General Raúl Torrez filed a lawsuit against Meta for failing to remove child sexual abuse material from its platforms and for harming minors through the sites’ addictive design.
For these reasons, more than 20 local and national organizations (including the Tech Oversight Project and New Mexico’s National Education Association) hope lawmakers across the US can make a dent in the problem by regulating the design of social media platforms for underage users. This year, their eyes are on New Mexico, Maryland and Minnesota, where legislators have introduced proposals.
Greta McAnany, a California-based national advocate for these bills and CEO of youth mental health app Blue Fever, says the main focus of this legislation would be to require social media companies to start with “safety by design and privacy by default” for children.
“Right now, these platforms are able to gather all kinds of data on young people and then go ahead and use that data—and some might say weaponize it—to create more engagement and more time on the app,” McAnany tells SFR. “They manipulate that personal data collection to profit off of it.”
After a version of the measure died in committee during New Mexico’s 2023 legislative session, Sen. George K. Muñoz, D-Gallup, has reintroduced the Age Appropriate Design Code Act this year as Senate Bill 68. Also known as the New Mexico Kids Code, the bill would require tech companies that provide online products and services to operate with strict data privacy protections designed to prevent children’s personal data from being stored, sold or leaked. Companies violating the law would face fines between $2,500 and $7,500 per affected child.
SB 68 co-sponsor Rep. Pamelya Herndon, D-Albuquerque, says the revamped bill has children’s safety at its heart.
“If people have online platforms they know students are engaging in, we don’t want them to be able to sell that data to others,” Herndon tells SFR. “They might use it in a nefarious way to target students unnecessarily and cause them to be involved in activities that not only harm them, but also members of their families.”
Herndon says she believes the bill has the potential to pass this year, but lawmakers at the Jan. 30 Senate Tax, Business and Transportation Committee tabled it on a 6-1 vote.
Even if SB68 passes and Gov. Michelle Lujan-Grisham—who has publicly expressed her support for the bill—signs it into law, New Mexico could face legal challenges from tech companies.
So far, California is the only US state to adopt an Age Appropriate Design Code Act, but it’s under a court challenge. Tech industry trade organization NetChoice—which counts companies like Google, Amazon, Meta, and TikTok as members—filed an action in federal court claiming the act violates the First Amendment, which resulted in a federal judge issuing a preliminary injunction to prevent the law from going into effect.
In New Mexico, SB68 states the Legislature does not intend for anything in the bill “to infringe on the existing rights and freedoms of children,” as a way to avoid such claims.
Sen. Craig Brandt, R-Rio Rancho, didn’t think that was enough protection.
“I just think the smart thing for us to do, at this point, is to wait and see where the lawsuits go, see if this is deemed to be constitutional under the US Constitution, before we try to take this kind of action,” he said the the committee hearing. “And honestly, the young people we think we’re going to block with this, they’re smarter than us, and they can get around it. I understand there’s problems with children getting to content they shouldn’t get to, but I am always going to side on the side of freedom at this point.”
Additionally, the bill received opposition from a few organizations for using vague language to define what constitutes “the best interest of a child,” which Equality New Mexico Executive Director Marshall Martinez pointed out could be used to target LGBTQ online content. Brandt, as well as Sens. Carrie Hamblen, D-Las Cruces, and Leo Jaramillo, D-Española, also expressed these concerns.
ACLU New Mexico Director of Public Policy Nayomi Valdez also noted, “We’re concerned that by design, this bill gives a lot of discretion to the Attorney General to determine what is and what is not harmful content, as well as some broad language mentioned previously, and the incentives that might create for platforms to restrict access to content that is constitutionally protected.”
The lobbying organizations behind the bill plan long-term to create a “domino effect” where more states begin developing and adopting Kids Code bills.
“It would be too difficult for these companies to have different user experiences across states, and there’s going to have to be some sort of national guideline,” McAnany explains.
Herndon says her work on pushing the bill has involved hearing from Eldorado High School students in Albuquerque.
“We have been speaking to students who are concerned about how they might be targeted, too,” Herndon says. “We are not interfering with anybody’s First Amendment rights. What we are doing is all about safety, and keeping individuals who are under the age of 18 safe.”