


Gov. Kathy Hochul and New York lawmakers are trying to keep kids in the Empire State from being exposed to the “addictive” features of TikTok and other social sites without parental permission.
The new legislation is meant to protect kids from targeted algorithms on social media sites, which try to keep users engaged for as long as possible by offering up an endless stream of extreme and sometimes violent content, Hochul announced alongside New York Attorney General Letitia James and other pols.
The laws would require parental permission for kids to access parts of apps such as TikTok, Instagram and YouTube that are controlled by an algorithm. It would be the responsibility of the tech companies to come up with software to do this — and if they fail to comply, the Attorney General would step in with a new, dedicated “Internet” unit to issue fines and force compliance, the pols said.
“These kids are still experiencing the negative effects on their mental health driven by social media and the algorithms that bombard them incessantly that are ultimately intended to create an addiction,” Hochul said.
“They didn’t ask for this,” Hochul said, noting that studies have shown that children’s mental health takes a hit when they are excessively on social media.
“It can take a deadly toll and that’s not just hyperbole,” she said, adding that social media “preys on you.”
The leaders said they believe it would be easy for social apps to comply with the law, as the companies already collect age data they could use to filter younger users out of the predictive algorithms.
“There is a variety of methods that the social media companies can use to determine the age of a user. They have signals that they use. They can detect user browsers, and there’s other devices where they can alert individuals as to the age of the minor,” James added at the press conference Wednesday. “And all that we are asking is that they be prohibited from using this algorithmic feed (on minors).”
However, one anti-surveillance activist group called the Surveillance Technology Oversight Project — or S.T.O.P. — ripped the proposed legislation, saying age verification technology gives companies more power to track us all, while kids could easily get around the potential protections.
“Online message boards are full of advice posts on how teens can circumvent the tools already on the market,” said S.T.O.P. Executive Director Albert Fox Cahn. “And companies are increasingly incentivized to use more and more draconian means to track our identities.”
The first proposed law would require adult permission for kids under 18 to access suggested feeds on social media apps, which use “predictive algorithms” to push content, such as violent posts, on kids.
The bill would also allow parents to “opt out” of access for their kids between midnight and 6 a.m. and would ban the platforms from sending notifications to kids during this time, “without verifiable parental consent,” according to the AG’s office.
The social media sites would also have to give parents tools to cap the total hours a day their kids use the apps.
The bill would allow the AG to seek damages of $5,000 each time one of these sites violates the potential child protections and would allow kids and their parents to sue for $5,000 “per incident,” the AG said.
But the social media companies will be given the chance to “cure” any violations, James said.
A second proposed bill would stop online sites from collecting and sharing the personal data of minors unless they get informed consent.
For kids under 13, that informed consent must come from a parent.
The politicians said that a study conducted on young girls showed that the effects of excessive exposure to social media can be worse for them than drinking, sexual assault, obesity and hard drug use.
Other studies showed that rates of depression, anxiety, suicidal thoughts, poor sleep and self-harm are on the rise in children because of their increasing exposure to social sites.
Neither TikTok nor Google — the parent company of YouTube — immediately returned requests for comment Wednesday.
Meta, the parent company of Facebook and Instagram, said it is “evaluating proposed legislation” but already takes the safety of young people into account.
“We want young people to have safe, positive experiences across the internet. That’s why we’ve built safety and privacy directly into teen experiences,” the company said.