
Section 230 and Other Laws Affecting Social Media

Section 230 was enacted to protect websites from liability for user-generated content, and social media companies took advantage of this freedom to grow into rich and powerful corporations. However, social media users, particularly children, who are harmed by these sites pay the price of that power. If you or a family member is a victim of social media addiction or self-harm, contact Cutter Law to fight for the compensation you deserve and for accountability from social media platforms.

Enacted before the dawn of social media, Section 230 helped pave the way for social media sites to develop into the tech giants they are today. Despite existing safety laws, social media companies have exerted control and influence over their users through algorithms designed to maximize time spent on their platforms. It is becoming increasingly apparent that heavy social media use can have a devastating effect on mental health, especially for children. A California social media harm lawyer at Cutter Law can help hold social media companies accountable if you or your child has become addicted to social media or engaged in self-harm because of it.


Section 230

Section 230 is one of the main laws that affect how the internet operates.

What is Section 230?

Section 230 of the Communications Decency Act was enacted in 1996 in response to two court decisions from the early and mid-1990s. Those decisions differed over whether internet service providers could be held liable as publishers of the content generated by their users.

In 1991, the Southern District of New York held that an internet platform could not be held liable for the content on its website if it did not actively moderate that content. But in 1995, the New York Supreme Court held that internet companies were liable as publishers for content on their platform if they had mechanisms to moderate it. 

In the wake of those decisions, websites that tried to moderate their platform for safety and filter out obscene content could be sued, but sites that made no move to moderate their platform could not be. This result prompted Congress to enact Section 230.

Under Section 230, internet companies are not treated as the “publisher or speaker” of content provided by someone else, meaning they generally cannot be held liable for that third-party content or for moderating or removing it.

How Section 230 Allowed Social Media Platforms to Grow

Social media platforms could not have flourished in a pre-Section 230 world: any attempt to moderate content would have exposed them to liability, yet they would not have had the resources to review all of the content on their platforms and ensure that every piece of offensive or defamatory content was removed.

Section 230 allowed social media companies to grow into the enormous corporations they are today because they could operate without fear of liability for the content on their platforms.

Recent Controversies Around Section 230

In recent years, updates to Section 230 have been proposed to address current public concerns about moderation practices, free speech, and harmful online content.

The Supreme Court recently heard two cases implicating Section 230: Gonzalez v. Google and Twitter v. Taamneh. Both concern social media sites’ responsibility for hosting and recommending content that promotes terrorism. Decisions in these cases are expected by June 2023.

In 2021, in Lemmon v. Snap, the families of three young men sued Snapchat after the men died in a crash while driving about 100 mph and using Snapchat’s Speed Filter. The families argued that the Speed Filter tapped into the brain’s reward center, encouraging users to drive at irresponsible speeds.

The Ninth Circuit ruled that Section 230 did not bar the claim because the families were not suing over third-party content. Rather, the Speed Filter was a feature designed by Snapchat itself, and claims based on Snapchat’s own design are not protected by Section 230.

This ruling points to a potential new era for the internet, one in which Section 230 may no longer shield websites from all liability.

In addition to these cases, lawsuits over social media harm and addiction are becoming more common nationwide.

The Role of Section 230 in Social Media Liability

Litigation against social media companies got a major boost in 2021 when Meta whistleblower Frances Haugen revealed thousands of pages of internal documents she took with her when she left her job at Facebook. Those documents exposed how Facebook operates internally.

One of Haugen’s most revealing claims was that Facebook knew its content was harmful but also knew that harmful content drove engagement, attracting more active users and more advertising revenue. Facebook could have moderated that content to keep users safe but chose not to because it did not want to lose revenue.

The social media harm lawsuits will test the extent of Section 230’s protection of social media companies.

Other Federal Social Media Laws

To make the internet safer, the federal government has passed several laws affecting social media companies’ operations.

The Children's Online Privacy Protection Act (COPPA)

Congress enacted the Children’s Online Privacy Protection Act in 1998, and it took effect in April 2000. Its purpose is to protect the privacy of children under 13. It mandates that a website or online service must:

  • Get parents’ permission before collecting data about their child
  • Disclose to parents what information they collect about their child
  • Protect that information

The Computer Fraud and Abuse Act (CFAA)

The Computer Fraud and Abuse Act, a federal law enacted in 1986, criminalized accessing a computer “without authorization” or in a way that “exceeds authorized access.” Congress did not clearly define either term. Under a broad interpretation, simply violating a service’s terms of use, such as sharing a social media password or checking social media on a work computer, could be a crime.

In a 2021 case, Van Buren v. United States, the Supreme Court held that a user “exceeds authorized access” when they access parts of a computer system that are off-limits to them. The Court did not clarify the term further, nor did it define “without authorization.”

The Digital Millennium Copyright Act (DMCA)

The Digital Millennium Copyright Act was enacted in 1998. Under the law, internet providers and hosts of digital content, such as social media companies, must remove copyrighted material from their platforms once they receive notice of the infringement from the copyright holder.

The Stop Enabling Sex Traffickers Act (SESTA) and Fight Online Sex Trafficking Act (FOSTA)

The Stop Enabling Sex Traffickers Act (SESTA) and the Fight Online Sex Trafficking Act (FOSTA) were enacted in 2018. Under these laws, social media platforms can be held accountable for ads posted by third parties that promote sex trafficking.

These two federal laws aimed to make it easier to go after social media sites with content promoting sex trafficking.

Dangers of Social Media on Mental Health

The human brain operates on a reward system in which dopamine is released, producing feelings of happiness and pleasure. Social media companies have taken advantage of this mechanism: the “reward” is triggered when a person checks their social media and sees new followers or new likes on their videos or posts.

This dynamic keeps the user returning to social media, which can lead to addiction. For those who become addicted, social media’s effect is similar to the high experienced by drug users and gamblers.

Social media can have an especially detrimental effect on the mental health of children and teenagers.

In addition, social media exposes young people to:

  • Cyberbullying
  • Bigoted and offensive content
  • Content that creates or aggravates body image issues

Recent studies have also linked heavy social media use to a range of increased mental health risks for teens.

Current Social Media Harm and Addiction Multidistrict Litigation

Social media harm lawsuits have been filed across the United States. In March 2023, many of these cases were consolidated into a multidistrict litigation, In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation.

The defendants include several major social media platforms.

Parents allege that the social media companies’ algorithms caused their children’s eating disorders, suicide attempts, suicides, and other mental health issues, and they assert several legal claims against the companies.

Many school districts from various states, including California, Washington, Pennsylvania, and Michigan, have joined this social media addiction and social media harm lawsuit.

These cases are still pending.

Protecting Your Children on Social Media

As parents, it’s our job to protect our children. Keep track of how much time your child spends on social media.

Talk with your child about the risks of social media and about when they should come to you for help. Keep an open dialogue so they feel comfortable bringing concerns to you, and so you can act if you notice significant behavioral changes.

If your child has been seriously harmed by social media content, make sure they receive the physical and mental health care they need. You should also promptly contact a social media harm lawyer. You and your child have rights, and social media companies must be held accountable for knowingly causing harm to children.

FAQs

What damages can I get from social media companies?

If a social media company harmed you, you may be able to recover economic and non-economic damages. Economic damages cover quantifiable expenses, while non-economic damages cover losses that are harder to quantify. Examples of these damages include:

  • Doctor fees (including therapy)
  • Hospital fees
  • Lost wages 
  • Pain and suffering
  • Loss of enjoyment of life

Sometimes, courts may award punitive damages. The purpose of punitive damages is to punish the offending party for wrongdoing.

Every case is different, so we cannot guarantee that the damages awarded in one case will be the same in the next. Contact our California personal injury attorneys for a free consultation to discuss your legal options.

How can social media harm attorneys at Cutter Law help?

Our social media attorneys have experience dealing with social media companies. We know it is not your fault, or your child’s, that your child fell victim to the sites’ algorithms. We will see your case through to the end: we will discuss your legal options, gather evidence, negotiate with the social media companies, and, if necessary, go to trial to defend your rights.

Contact our Social Media Lawyers

Social media sites took advantage of Section 230 and grew into powerful global corporations worth billions of dollars. However, Section 230’s protection is not absolute, and there are federal laws under which social media companies can still be held accountable.

If social media has harmed your child, contact Cutter Law. We have the experience and resources to hold social media companies accountable and get you the compensation you deserve to make your family whole again.

Contact the social media attorneys at Cutter Law for a free consultation. Remember, there is no fee unless we win.

Schedule A Free Case Review

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

Our Office Locations

Sacramento Office
401 Watt Avenue Suite 100
Sacramento, CA 95864
Phone: 916-290-9400

Oakland Office
Cutter Law P.C.
1999 Harrison Street Suite 1400
Oakland, CA 94612
