Social media platforms like TikTok and Instagram have harnessed powerful algorithms to captivate millions of users with endless streams of entertaining videos and content, transforming the way new generations scroll through their screens.
But critics, from policymakers to concerned parents, say those algorithms lead young people down rabbit holes of problematic content, from misinformation and risky social media challenges to dangerous videos about eating disorders and self-harm.
New York State officials on Wednesday unveiled a bill to protect young people from potential mental health risks by prohibiting minors from accessing algorithm-based social media feeds unless they have permission from their parents.
Gov. Kathy Hochul and Letitia James, the state attorney general, announced their support of new legislation to crack down on the often inscrutable algorithms, which they argue are used to keep young users on social media platforms for extended periods of time — sometimes to their detriment.
If the bill is passed and signed into law, anyone under 18 in New York would need parental consent to access those feeds on TikTok, Instagram, Facebook, YouTube, X and other social media platforms that use algorithms to display personalized content. While other states have sought far-reaching bans and measures on social media apps, New York is among a few seeking to target the algorithms more narrowly.
The legislation, for example, would target TikTok’s central feature, its ubiquitous “For You” feed, which displays boundless reams of short-form videos based on user interests or past interactions. But it would not affect a minor’s access to the chronological feeds that show posts published by the accounts that a user has decided to follow.
“Our children are in crisis, and it’s up to us to save them,” Ms. Hochul, a Democrat, said during a news conference in Manhattan, using stark terms to compare the addictive effects of social media to underage drinking and smoking.
“Do you understand how an algorithm works?” she asked. “It follows you. It preys on you.”
The bill would also allow parents to limit the number of hours their children can spend on a platform, block their access to social media apps overnight, from midnight until 6 a.m., and pause notifications during that time.
Meta, the parent company of Instagram and Facebook, said that algorithms allow teens to find like-minded interests and communities, and that it uses them to quickly identify and remove harmful content from its platforms. Both Meta and TikTok also pointed to a range of tools the companies have implemented to give parents increased control and limit the types of content teens can see.
“We want young people to have safe, positive experiences across the internet,” Antigone Davis, the head of global safety at Meta, said in a statement.
Tech:NYC, which represents more than 800 tech companies, raised objections to the proposed legislation, suggesting it could infringe on free speech. It also raised logistical and privacy concerns about verifying the identities and ages of users, which it said could require sharing government documents.
Policies to curb the effect of social media on youth mental health have attracted bipartisan support. President Biden, in his State of the Union address this year, called on Congress to ban targeted online advertising aimed at children. In May, the surgeon general warned of the detrimental effects that intensive social media use can have on anxiety and depression among adolescents.
Earlier this year, Montana enacted an outright ban on TikTok, prompting a lawsuit, financed by the social media giant, that has held up enforcement of the law.
In March, Utah became the first state to pass a law, set to take effect next year, that will require users under 18 to obtain the consent of a parent or guardian to create a social media account; Arkansas would have become the first state to put such a law into effect in August, but the measure was temporarily blocked by a federal judge.
And under a landmark law passed in the European Union last year, users there can easily opt out of so-called personalized content feeds.
The bill in New York, which could be considered as soon as January when the 2024 legislative session begins, is likely to confront resistance from tech industry groups. The bill’s sponsors, State Senator Andrew Gounardes and Assemblywoman Nily Rozic, said they were readying for a fight.
But Ms. Hochul’s enthusiastic support of the bill — she rarely joins lawmakers to introduce bills — is a sign that it could succeed in the State Capitol, which Democrats control.
A second bill unveiled on Wednesday is meant to protect children’s privacy by prohibiting websites from “collecting, using, sharing, or selling personal data” from anyone under 18 for the purpose of advertising, unless they receive consent, according to a news release.
“Social media platforms manipulate the content children see online to keep them on the platforms as long as possible,” Ms. James said. “They know that the more time children spend online, the more ads they will see and the more data that can be collected to sell to advertisers.”
Both bills would empower the state attorney general to go after platforms found in violation.