
The latest bill seeking to hold tech companies responsible for social media's impact on consumers was rejected by California lawmakers on Sept. 1.
Senate Bill 680 was proposed in hopes of “holding social media platforms liable for promoting harmful content about eating disorders, self-harm and drugs” but was shelved after “California’s influential tech industry, including TechNet and Netchoice, worked for months to defeat the bill,” wrote Queenie Wong in a Los Angeles Times article.
Similar efforts have failed before. A bill introduced in 2022 would have “allow[ed] local prosecutors to sue social media companies for knowingly using features that can cause children to become addicted,” but it was denied by California lawmakers after TechNet and Netchoice lobbied against it, according to Adam Beam in a Los Angeles Times article.
Dr. Mark Kim, assistant professor of computing, software, and data sciences, explained why the tech industry and groups such as TechNet and Netchoice would rather not see bills like this one pass.
“These companies are trillion-dollar monopolies because they have all this data. They have all this data and insight into areas that are generally categorized as private,” Kim said. “They want to monetize that data and take no accountability about potential misuse of it because they have no obligation to police the internet. They already have all the power and all the data so they can sell whatever they want. Plus, they currently can’t get sued and don’t want to get sued for doing so – so they’ll kill the bills.”
Tech company executives have long been questioned about the responsibility they hold in ensuring healthy and safe experiences in the digital world.
In 2021, executives from YouTube, TikTok and Snapchat were called before a Senate hearing after a Wall Street Journal article detailed how internal Facebook research showed that Instagram can make body image issues worse for some young people.
According to Kim, the people creating the algorithms are the ones who bear the responsibility.
“Algorithms scrub things all the time. For example, misinformation is scrubbed. Who writes the rule as to what gets scrubbed? That’s the great question and that’s what people want to know,” Kim said. “We want to see the rules. We want to see the keywords that are created that throw out certain things. We want that transparency with these tech companies.”
Kim noted, though, that consumers are accountable as well.
“There’s a symbiotic relationship with technology. Companies get data but we get things such as Google Search for free,” Kim said. “There are disclosures on all apps that collect data and that is the consumer’s responsibility to understand when you download something, you are accepting conditions. But there is progress being made in transparency because before we didn’t even have disclosures. Now we do.”
Shannon Turpen, senior health science major, said consumers should take responsibility for themselves, too.
“I don’t know if there’s a point to these bills because no matter what, people will find a way to search or find what they want to find online,” Turpen said. “I don’t think tech companies should be able to control what you can or can’t see at all. A more effective tool would be more resources for educating people on how to use technology and social media, especially for educating parents, because a lot of parents don’t understand what’s going on so they can’t educate their kids or help them.”
Arsema Tefferi, senior health science major, agreed with Turpen.
“At the end of the day, tech companies can’t control what people say, so it’s up to people to maintain their own responsibility. People thrive off rule-breaking so they’ll find a way to see what they want no matter what,” Tefferi said.
One bill that did pass in California was AB 587, a law requiring social media companies to publicly post their content moderation policies, according to adl.org.
Although Gov. Gavin Newsom signed the bill into law in 2022, there continues to be pushback. X, previously known as Twitter, filed a lawsuit on Sept. 8 against the state of California over the law, claiming that “it has the effect of pressuring companies to remove and demonetize constitutionally-protected speech,” wrote Clare Duffy in an article for CNN. Netchoice also opposes the law, according to its website.
While the debate between tech companies and the legislature has yet to produce a definitive answer about what the future of tech safety will look like, consumers can still protect themselves by choosing what content to download, follow, read, subscribe to and engage with.
The best way to take action is to become informed and mindful.