The Kids Online Safety Act: Protecting LGBTQ+ Children & Adolescents Online - How changes to the Kids Online Safety Act will protect LGBTQ+ youth
FACT SHEET
Summary of the Kids Online Safety Act
As Congressional hearings, media reports, academic research, whistleblower disclosures, and heartbreaking stories from youth and families have repeatedly shown, social media platforms have exacerbated the mental health crisis among children and teens: fostering body image issues, creating addiction-like use, promoting products that are dangerous for young audiences, and fueling destructive bullying.
The Kids Online Safety Act (KOSA) provides children, adolescents, and parents with the tools, safeguards, and transparency they need to protect against threats to young people's health and wellbeing online. The design and operation of online platforms have a significant impact on these harms: recommendation systems send kids down rabbit holes of destructive content, and weak protections leave them exposed to relentless bullying.
KOSA would provide safeguards and accountability through:
Creating a duty of care for social media platforms to prevent and mitigate specific dangers to minors in the design and operation of their products, including the promotion of suicidal behaviors, eating disorders, substance use, sexual exploitation, advertisements for tobacco and alcohol, and more.
Requiring social media platforms to provide children and adolescents with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. Platforms are required to enable the strongest settings by default.
Giving parents new tools to help support their children and providing them (as well as schools) a dedicated reporting channel to raise issues (such as harassment or threats) to the platforms.
How Online Harms Impact LGBTQ+ Communities
Social media can be an important tool for self-discovery, expression, and community. However, online platforms have failed to take basic steps to protect their users from profound harm and have put profit ahead of safety. Companies have engineered their products to keep young users on their sites for as long as possible, even when the features driving that engagement are harmful. In documents provided by a whistleblower, Facebook’s own researchers described Instagram as a “perfect storm” that “exacerbates downward spirals,” even as it produces hundreds of millions of dollars in revenue annually.
Academic research and surveys have shown that this “perfect storm” weighs most heavily on LGBTQ+ children and adolescents, who are at greater risk of bullying, threats, and suicidal behaviors on social media. Some of these harms, and the protections KOSA would provide, include:
LGBTQ+ youth are more at risk of cyberbullying and harassment.
LGBTQ+ high school students consistently report higher rates of cyberbullying than their heterosexual peers, and suffer more severe forms of harassment, such as stalking, non-consensual imagery, and violent threats.
Surveys have found that 56% of LGBTQ+ students had been cyberbullied in their lifetime, compared to 32% of non-LGBTQ+ students.
One in three young LGBTQ+ people report having been sexually harassed online, four times the rate of other young people.
LGBTQ+ youth are more at risk for eating disorders and substance use.
Young LGBTQ+ people experience significantly greater rates of eating disorders and substance use than their heterosexual and cisgender peers. Transgender and nonbinary youth are at even higher risk for eating disorders, yet Black LGBTQ+ youth are diagnosed at only half the rate of their white peers.
Prolonged use of social media is linked with negative appearance comparison, which in turn increases risk for eating disorder symptoms.
Engagement-based algorithms fuel eating disorders by recommending ever more eating disorder content to vulnerable users: every click or view prompts the system to send more destructive content to that user.
For example, TikTok was found to begin recommending eating disorder content within eight minutes of a new account being created, and Instagram was found to deluge a new user with eating disorder recommendations within one day.
How KOSA Will Help:
KOSA would require platforms to give users the ability to turn off engagement-based algorithms, or options to influence the recommendations they receive. A user would be able to stop recommendation systems that are sending them toxic content.
KOSA’s duty of care requires platforms to prevent and mitigate cyberbullying. It also requires that platforms give users options to restrict messages from other users and to make their profile private.
It would require platforms to provide a point of contact for users to report harassment, and would mandate that platforms respond to these reports within a designated time frame.
LGBTQ+ youth are more at risk of suicide and suicidal behaviors.
Exposure to hateful messaging online, in tandem with self-harm material on social media, increases young people's risk of suicidal behaviors and suicide.
These risks are exacerbated when platform recommendation systems amplify hateful content and self-harm content.
For example, in one test, TikTok recommended suicide content to a newly created teen account in under three minutes.
Surveys have found that 42% of LGBTQ+ youth have seriously considered attempting suicide, including more than half of transgender and nonbinary youth.
Moreover, eating disorders, depression, bullying, substance use, and other mental health harms that fall harder on LGBTQ+ communities further increase risks of self-harm and suicide.
How KOSA Will Help:
In addition to the core safeguards and options provided to kids, such as controls and transparency over algorithmic recommendation systems, KOSA's duty of care would require platforms to consider and address the ways their recommendation systems promote suicide and suicidal behaviors. This creates incentives for platforms to provide self-help resources, uplift information about recovery, and prevent their algorithms from pushing users down rabbit holes of harmful and deadly content.
Protections for LGBTQ+ Communities
The reintroduction of the Kids Online Safety Act takes into account recommended edits from a diverse group of organizations, researchers, youth, and families.
The result of this input from experts in the field and those with lived experience is a thoughtful, tailored bill designed as a strong step toward a core set of accountability provisions that give children, adolescents, and families a safer online experience. Below is a summary comparing the previous bill text with the changes made for reintroduction.
Concerns with the Previous Draft and How the Current Draft Protects LGBTQ+ Youth
Concern: The “duty of care” is too vague, creating liability for broad and undefined harms to children and teens.
Response: The duty of care is now limited to a set of specific harms that have been shown to be exacerbated by online platforms' product designs and algorithms.
These harms are focused on serious threats to the wellbeing of young users, such as eating disorders, substance use, depression, anxiety, suicidal behaviors, physical violence, sexual exploitation, and the marketing of narcotics, tobacco, gambling, and alcohol.
The terms used to describe these harms are linked to clinical or legal definitions where there is a perceived risk of misuse. In addition, the duty of care includes a limitation to ensure it is not construed to require platforms to block access to content that a young user specifically requests, or to evidence-informed medical information and support resources.
Concern: The inclusion of “grooming” in the duty of care could be weaponized against entities providing information about gender-affirming care.
Response: “Grooming” was cut from the bill.
Sexual exploitation and abuse are now defined using existing federal criminal statutes to prevent politicization or distortion of these terms.
Concern: The duty of care to prevent and mitigate “self-harm” or “physical harm” could be weaponized against trans youth and those who provide information about gender-affirming care.
Response: The specific reference to “self-harm” has been removed from the duty of care, and “physical harm” has been changed to “physical violence” to enhance clarity.
Other covered harms related to self-harm are addressed using terminology anchored in medical definitions.
Concern: KOSA will allow non-supportive parents to surveil LGBTQ+ youth online.
Response: The legislation clarifies the tools available to protect kids and accounts for the developmental differences between children and young teens.
KOSA has always required that children and adolescents be notified if parental controls are turned on, and that kids know before parents are informed about the creation of a new account.
For teens, the bill requires platforms to give parents the ability to restrict purchases, view metrics on how much time a minor is spending on a platform, and view (but not change) account settings. It does not require the disclosure of a minor's browsing behavior, search history, messages, or other content or metadata of their communications.
Concern: KOSA will lead to privacy-invasive age verification across the internet.
Response: KOSA has never required age verification or age gating, nor has it created liability for companies if kids lie about their age.
The bill explicitly states that companies are not required to age-gate or collect additional data to determine a user's age.
Additionally, a knowledge standard is applied more consistently across the bill to clarify that companies are not liable if they have no knowledge of whether a user is a child or adolescent.
Concern: KOSA will affect access to sexual health information, schools, or nonprofit services.
Response: KOSA's requirements apply only to commercial online platforms, such as social media and games, which have been the largest source of issues for kids online.
Nonprofits, schools, and broadband services are exempt from KOSA, and a previous reference to “educational services” was removed from the bill's definition of “covered platform.”
KOSA does not apply to health sites or other information resources.