CDD

Press Releases

  • Press Release

    Advocates demand Federal Trade Commission investigate Google for continued violations of children’s privacy law

    Following news of Google’s violations of COPPA and 2019 settlement, 4 advocates ask FTC for investigation

    Contact:
    Josh Golin, Fairplay: josh@fairplayforkids.org
    Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

    BOSTON and WASHINGTON, DC – Wednesday, August 23, 2023 – The organizations that alerted the Federal Trade Commission (FTC) to Google’s violations of the Children’s Online Privacy Protection Act (COPPA) are urging the Commission to investigate whether Google and YouTube are once again violating COPPA, as well as the companies’ 2019 settlement agreement and the FTC Act. In a Request for Investigation filed today, Fairplay and the Center for Digital Democracy (CDD) detail new research from Adalytics, as well as Fairplay’s own research, indicating that Google serves personalized ads on “made for kids” YouTube videos and tracks viewers of those videos, even though neither practice is permissible under COPPA. Common Sense Media and the Electronic Privacy Information Center (EPIC) joined Fairplay and CDD in calling on the Commission to investigate and sanction Google for its violations of children’s privacy. The advocates suggest the FTC should seek penalties upwards of tens of billions of dollars.

    In 2018, Fairplay and the Center for Digital Democracy led a coalition asking the FTC to investigate YouTube for violating COPPA by collecting personal information from children on the platform without parental consent. As a result of the advocates’ complaint, Google and YouTube were required to pay a then-record $170 million fine in a 2019 settlement with the FTC and to comply with COPPA going forward. Rather than getting the required parental permission before collecting personally identifiable information from children on YouTube, Google claimed it would instead comply with COPPA by limiting data collection and eliminating personalized advertising on “made for kids” videos. But an explosive new report released by Adalytics last week called into question Google’s assertions and its compliance with federal privacy law. The report detailed how Google appeared to be surreptitiously using cookies and identifiers to track viewers of “made for kids” videos. The report also documented how YouTube and Google appear to be serving personalized ads on “made for kids” videos and transmitting data about viewers to data brokers and ad tech companies.

    In response to the report, Google told the New York Times that ads on children’s videos are based on webpage content, not targeted to user profiles. But follow-up research conducted independently by both Fairplay and ad buyers suggests the ads are, in fact, personalized and that Google is both violating COPPA and making deceptive statements about its targeting of children. Both Fairplay and the ad buyers ran test ad campaigns on YouTube in which they selected a series of user attributes and affinities for ad targeting and instructed Google to run the ads only on “made for kids” channels. In theory, these test campaigns should have resulted in zero placements, because under Google and YouTube’s stated policy, no personalized ads are supposed to run on “made for kids” videos. Yet Fairplay’s targeted $10 ad campaign resulted in over 1,400 impressions on “made for kids” channels, and the ad buyers reported similar results.
Additionally, the reporting Google provided to Fairplay and the ad buyers to demonstrate the efficacy of the ad buys would not be possible if the ads were contextual, as Google claims.

“If Google’s representations to its advertisers are accurate, it is violating COPPA,” said Josh Golin, Executive Director of Fairplay. “The FTC must launch an immediate and comprehensive investigation and use its subpoena authority to better understand Google’s black box child-directed ad targeting. If Google and YouTube are violating COPPA and flouting their settlement agreement with the Commission, the FTC should seek the maximum fine for every single violation of COPPA and injunctive relief befitting a repeat offender.”

The advocates’ letter urges the FTC to seek robust remedies for any violations, including but not limited to:

·       Civil penalties that demonstrate that continued violations of COPPA and Section 5 of the FTC Act are unacceptable. Under current law, online operators can be fined $50,120 per violation of COPPA. Given the immense popularity of many “made for kids” videos, it is likely millions of violations have occurred, suggesting the Commission should seek civil penalties upwards of tens of billions of dollars.
·       An injunction requiring relinquishment of all ill-gotten gains
·       An injunction requiring disgorgement of all algorithms trained on impermissibly collected data
·       A prohibition on the monetization of minors’ data
·       An injunction requiring YouTube to move all “made for kids” videos to YouTube Kids and remove all such videos from the main YouTube platform. Given Google’s repeated failures to comply with COPPA on the main YouTube platform – even when operating under a consent decree – these videos should be cabined to a platform that has not been found to violate existing privacy law
·       The appointment of an independent “special master” to oversee Google’s operations involving minors and provide the Commission, Congress, and the public semi-annual compliance reports for a period of at least five years

Katharina Kopp, Deputy Director of the Center for Digital Democracy, said, “The FTC must fully investigate what we believe are Google’s continuous violations of COPPA, its 2019 settlement with the FTC, and Section 5 of the FTC Act. These violations place many millions of young viewers at risk. Google and its executives must be effectively sanctioned to stop its ‘repeat offender’ behaviors—including a ban on monetizing the personal data of minors, other financial penalties, and algorithmic disgorgement. The Commission’s investigation should also review how Google enables advertisers, data brokers, and leading online publisher partners to surreptitiously surveil the online activities of young people. The FTC should set into place a series of ‘fail-safe’ safeguards to ensure that these irresponsible behaviors will never happen again.”

Caitriona Fitzgerald, Deputy Director of the Electronic Privacy Information Center (EPIC), said, "Google committed in 2019 that it would stop serving personalized ads on 'made for kids' YouTube videos, but Adalytics’ research shows that this harmful practice is still happening. The FTC should investigate this issue and Google should be prohibited from monetizing minors’ data."

Jim Steyer, President and CEO of Common Sense Media, said, "The Adalytics findings are troubling but in no way surprising given YouTube’s history of violating kids’ privacy.
Google denies doing anything wrong and the advertisers point to Google, a blame game that makes children the ultimate losers. The hard truth is, companies — whether it’s Big Tech or their advertisers — basically care only about their profits, and they will not take responsibility for acting against kids’ best interests. We strongly encourage the FTC to take action here to protect kids by hitting tech companies where it really hurts: their bottom line." ### 
  • Press Release

    Advocates call for FTC action to rein in Meta’s abusive practices targeting kids and teens

    Letter from 31 organizations in tech advocacy, children’s rights, and health supports FTC action to halt Meta’s profiting off of young users’ sensitive data

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Katharina Kopp, Center for Digital Democracy: kkopp@democraticmedia.org

    BOSTON and WASHINGTON, DC – June 13, 2023 – A coalition of leading advocacy organizations is standing up today to support the Federal Trade Commission’s recent order reining in Meta’s abusive practices aimed at kids and teens. Thirty-one groups, led by the Center for Digital Democracy, the Electronic Privacy Information Center (EPIC), Fairplay, and U.S. PIRG, sent a letter to the FTC saying, “Meta has violated the law and its consent decrees with the Commission repeatedly and flagrantly for over a decade, putting the privacy of all users at risk. In particular, we support the proposal to prohibit Meta from profiting from the data of children and teens under 18. This measure is justified by Meta’s repeated offenses involving the personal data of minors and by the unique and alarming risks its practices pose to children and teens.”

    Comments from advocates:

    Katharina Kopp, Director of Policy, Center for Digital Democracy: “The FTC is fully justified in proposing the modifications to Meta’s consent decree and in requiring it to stop profiting from the data it gathers on children and teens. There are three key reasons why. First, due to their developmental vulnerabilities, minors are uniquely harmed by Meta’s repeated failure to comply with its 2012 and 2020 settlements with the FTC, including its non-compliance with the federal children’s privacy law (COPPA). Second, because Meta has failed for many years to comply even with the procedural safeguards required by the Commission, it is now time for structural remedies that make it less likely that Meta can again disregard the terms of the consent decree. And third, the FTC must affirm its credibility and that of the rule of law, and ensure that tech giants cannot evade regulation and meaningful accountability.”

    John Davisson, Director of Litigation, Electronic Privacy Information Center (EPIC): "Meta has had two decades to clean up its privacy practices after many FTC warnings, but consistently chose not to. That's not 'tak[ing] the problem seriously,' as Meta claims—that's lawlessness. The FTC was right to take decisive action to protect Meta's most vulnerable users and ban Meta from profiting off kids and teens. It's no surprise to see Meta balk at the legal consequences of its many privacy violations, but this action is well within the Commission's power to take.”

    Haley Hinkle, Policy Counsel, Fairplay: “Meta has been under the FTC's supervision in this case for over a decade now and has had countless opportunities to put user privacy over profit. The Commission's message that you cannot monetize minors' data if you can't or won't protect them is urgent and necessary in light of these repeated failures to follow the law. Kids and teens are uniquely vulnerable to the harms that result from Meta’s failure to run an effective privacy program, and they can’t wait for change any longer.”

    R.J. Cross, Director of U.S. PIRG’s Don’t Sell My Data campaign: “The business model of social media is a recipe for unhappiness.
We’re all fed content about what we should like and how we should look, conveniently presented alongside products that will fix whatever problem with our lives the algorithm has just helped us discover. That’s a hard message to hear day in and day out, especially when you’re a teen. We’re damaging the self-confidence of some of our most impressionable citizens in the name of shopping. It’s absurd. It’s time to short-circuit the business model.” ###
  • “By clarifying what types of data constitute personal data under COPPA, the FTC ensures that COPPA keeps pace with the 21st century and the increasingly sophisticated practices of marketers,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy. “As interactive technologies evolve rapidly, COPPA must be kept up to date and reflect changes in the way children use and access these new media, including virtual and augmented realities. The metaverse typically involves a convergence of physical and digital lives, where avatars are digital extensions of our physical selves. We agree with the FTC that an avatar’s characteristics and its behavior constitute personal information. And as virtual and augmented reality interfaces allow for the collection of extensive sets of personal data, including sensitive and biometric data, this data must be considered personal information under COPPA. Without proper protections, this highly coveted data would be exploited by marketers and used to further manipulate and harm children online.”
  • Contact: Katharina Kopp, kkopp [at] democraticmedia.org

    “We welcome the FTC’s action to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and to enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy. “Children’s data is taken from them illegally and surreptitiously on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These violations in turn lead to further exploitation and manipulation of children and teens: they violate their privacy, manipulate them into being interested in harmful products, undermine their autonomy and hook them on digital media, and perpetuate discrimination and bias. As Commissioner Bedoya’s separate statement points out, with this proposed order the FTC warns companies that they cannot take data from children and teens (and others) illegitimately to develop even more sophisticated methods of taking advantage of them. Both the FTC and the Department of Justice must hold Amazon accountable.”
  • CDD urges Congress to adopt stronger online safeguards for kids and teens

    Contact: Katharina Kopp, kkopp [at] democraticmedia.org

    The Children and Teens’ Online Privacy Protection Act (COPPA 2.0), introduced by Senators Markey and Cassidy, will provide urgently needed online safeguards for children and teens. It will enact real platform accountability and limit the economic and psychological exploitation of children and teens online, and thus address the public health crisis they are experiencing.

    By banning targeted ads to young people under 16, the endless streams of data collected by online companies to profile and track them will be significantly reduced. The ability of digital marketers and platforms to manipulate, discriminate against, and exploit children and teens will be curtailed. COPPA 2.0 will also extend the original COPPA protections, which now end at age 12, to young people up to 16 years of age. The proposed law provides the ability to delete children’s and teens’ data with a click of an “eraser button.” With the creation of a new FTC “Youth Marketing and Privacy Division,” COPPA 2.0 will ensure young people’s privacy rights are enforced.
  • Reining In Meta’s Digital ‘Wild West’ as FTC protects young people’s safety, health and privacy

    Contacts:
    Jeff Chester, CDD, 202-494-7100
    David Monahan, Fairplay, 781-315-2586

    Children’s advocates Fairplay and Center for Digital Democracy respond to today’s announcement that the FTC proposes action to address Facebook’s privacy violations in practices impacting children and teens. And see important new information compiled by Fairplay and CDD, linked below.

    Josh Golin, executive director, Fairplay: The action taken by the Federal Trade Commission against Meta is long overdue. For years, Meta has flouted the law and exploited millions of children and teens in its efforts to maximize profits, with little care as to the harms faced by young users on its platforms. The FTC has rightly recognized that Meta simply cannot be trusted with young people’s sensitive data and proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable and for taking a huge step toward creating the safe online ecosystem every young American deserves.

    Jeff Chester, executive director, Center for Digital Democracy: Today’s action by the Federal Trade Commission (FTC) is a long-overdue intervention into what has become a huge national crisis for young people. Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and wellbeing of children and adolescents. The company has not done enough to address the problems caused by its unaccountable data-driven commercial platforms. Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry “whistleblowers,” Meta is unleashing even more powerful data gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards. Parents and children urgently need the government to institute protections for the “digital generation” before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens. It will require Meta/Facebook to engage in a proper “due diligence” process when launching new products targeting young people—rather than its current “release first and address problems later” approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our “digital generation.”

    NEW REPORTS:
    META HAS A LONG HISTORY OF FAILING TO PROTECT CHILDREN ONLINE (from Fairplay)
    META’S VIRTUAL REALITY-BASED MARKETING APPARATUS POSES RISKS TO TEENS AND OTHERS (from CDD)
  • Advocates Fairplay, Eating Disorders Coalition, Center for Digital Democracy, and others announce support of the newly reintroduced Kids Online Safety Act

    Contact: David Monahan, Fairplay (david@fairplayforkids.org)

    Advocates pledge support for landmark bill requiring online platforms to protect kids and teens with a “safety by design” approach

    BOSTON, MA and WASHINGTON, DC — May 2, 2023 — Today, a coalition of leading advocates for children’s rights, health, and privacy lauded the introduction of the Kids Online Safety Act (KOSA), a landmark bill that would create robust online protections for children and teens. Among the advocates pledging support for KOSA are Fairplay, the Eating Disorders Coalition, the American Academy of Pediatrics, the American Psychological Association, and Common Sense.

    KOSA, a bipartisan bill from Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), would make online platforms and digital providers abide by a “duty of care” requiring them to eliminate or mitigate the impact of harmful content on their platforms. The bill would also require platforms to default to the most protective settings for minors and enable independent researchers to access “black box” algorithms to assist in research on algorithmic harms to children and teens.

    The reintroduction of the Kids Online Safety Act coincides with a rising tide of bipartisan support for action to protect children and teens online amidst a growing youth mental health crisis. A February report from the CDC showed that teen girls and LGBTQ+ youth are facing record levels of sadness and despair, and another report, from Amnesty International, indicated that 74% of youth check social media more than they’d like.

    Fairplay Executive Director, Josh Golin: “For far too long, Big Tech has been allowed to play by its own rules in a relentless pursuit of profit, with little regard for the damage done to the children and teens left in its wake. Companies like Meta and TikTok have made billions from hooking kids on their products by any means necessary, even promoting dangerous challenges, pro-eating disorder content, violence, drugs, and bigotry to the kids on their platforms. The Kids Online Safety Act stands to change all that. Today marks an exciting step toward the internet every young person needs and deserves, where children and teens can explore, socialize and learn without being caught in Big Tech crossfire.”

    National Alliance for Eating Disorders CEO and EDC Board Member, Johanna Kandel: “The Kids Online Safety Act is an integral first step in making social media platforms a safer place for our children. We need to hold these platforms accountable for their role in exposing our kids to harmful content, which is leading to declining mental health, higher rates of suicide, and eating disorders. As both the CEO of an eating disorders nonprofit and the mom of a young child, these new laws would go a long way in safeguarding the experiences our children have online.”

    Center for Digital Democracy Deputy Director, Katharina Kopp: “The Kids Online Safety Act (KOSA), co-sponsored by Senators Blumenthal and Blackburn, will hold social media companies accountable for their role in the public health crisis that children and teens experience today. It will require platforms to make better design choices that ensure the well-being of young people.
KOSA is urgently needed to stop online companies from operating in ways that encourage self-harm, suicide, eating disorders, substance use, sexual exploitation, patterns of addiction-like behaviors, and other mental and physical threats. It also provides safeguards to address unfair digital marketing tactics. Children and teens deserve an online environment that is safe. KOSA will significantly reduce the harms that children, teens, and their families experience online every day.”

Children and Screens: Institute of Digital Media and Child Development Executive Director, Kris Perry: “We appreciate the Senators’ efforts to protect children in this increasingly complicated digital world. KOSA will allow access to critical datasets from online platforms for academic and research organizations. This data will facilitate scientific research to better understand the overarching impact social media has on child development.”

###

kosa_reintro_pr.pdf
  • Statement from Children’s Advocacy Groups on New Social Media Bill by U.S. Senators Schatz and Cotton

    Washington, D.C., April 26, 2023 – Several children’s advocacy groups expressed concern today with parts of a new bill intended to protect kids and teens from online harms. The bill, “The Protecting Kids on Social Media Act,” was introduced this morning by U.S. Sens. Brian Schatz (D-HI) and Tom Cotton (R-AR).

    The groups, including Common Sense Media, Fairplay, and the Center for Digital Democracy, play a leading role on legislation in Congress to ensure that tech companies, and social media platforms in particular, are held accountable for the serious and sometimes deadly harms related to the design and operation of these platforms. They said the new bill is well-intentioned in the face of a youth mental health crisis and has some features that should be adopted, but that other aspects of the bill take the wrong approach to a serious problem.

    The groups said they support the bill’s ban on the use of algorithmic recommendation systems on minors, which would prevent platforms from using the personal data of minors to amplify harmful content to them. However, they said they object to the fact that the bill places too many new burdens on parents, creates unrealistic bans, and institutes potentially harmful parental control over minors’ access to social media. By requiring parental consent before a teen can use a social media platform, vulnerable minors, including LGBTQ+ kids and kids who live in unsupportive households, may be cut off from access to needed resources and community. At the same time, kids and teens could pressure their parents or guardians to provide consent. Once young users make it onto a platform, they will still be exposed to addictive or unsafe design features beyond algorithmic recommendation systems, such as endless scroll and autoplay. The bill’s age verification measures also introduce troubling implications for the privacy of all users, given the requirement for covered companies to verify the age of both adult and minor users. Despite its importance, there is currently no consensus on how to implement age verification measures without compromising users’ privacy.

    The groups said that they strongly support other legislation that establishes important guardrails on platforms and other tech companies to make the internet a healthier and safer place for kids and families, for example the Kids Online Safety Act (KOSA) and COPPA 2.0, bipartisan legislation that was approved last year by the Senate Commerce Committee and is expected to be reintroduced this year.

    “We appreciate Senators Schatz and Cotton’s effort to protect kids and teens online and we look forward to working with them as we have with many Senators and House members over the past several years. But this is a life or death issue for families and we have to be very careful about how to protect kids online. The truth is, some approaches to the problem of online harms to kids risk further harming kids and families,” said James P. Steyer, founder and CEO of Common Sense Media. “Congress should place the onus on companies to make the internet safer for kids and teens and avoid placing the government in the middle of the parent-child relationship.
Congress has many good policy options already under consideration and should act on them now to make the internet healthier and safer for kids.”

“We are grateful to Senators Schatz, Cotton, Britt and Murphy for their efforts to improve the online environment for young people but are deeply concerned that their bill is not the right approach,” said Josh Golin, Executive Director of Fairplay. “Young people deserve secure online spaces where they can safely and autonomously socialize, connect with peers, learn, and explore. But the Protecting Kids on Social Media Act does not get us any closer to a safer internet for kids and teens. Instead, if this legislation passes, parents will face the same exact conundrum they face today: Do they allow their kids to use social media and be exposed to serious online harms, or do they isolate their children from their peers? We need legislative solutions that put the burden on companies to make their platforms safer, less exploitative, and less addictive, instead of putting even more on parents’ plates.”

“It’s critical that social media platforms are held accountable for the harmful impacts their practices have on children and teens. However, this bill’s approach is misguided. It places too much of a burden on parents, instead of focusing on platforms’ business practices that have produced the unprecedented public health crisis that harms our children’s physical and mental well-being. Kids and teens should not be locked out of our digital worlds, but be allowed online where they can be safe and develop in age-appropriate ways. One of the unintended consequences of this bill will likely be a two-tiered online system, where poor and otherwise disadvantaged parents and their children will be excluded from digital worlds. What we need are policies that hold social media companies truly accountable, so all young people can thrive,” said Katharina Kopp, Ph.D., Deputy Director of the Center for Digital Democracy.

schatz-cotton_bill_coalition_statement.pdf
  • Citing research that illustrates a number of serious risks to children and teens in the Metaverse, advocates say Meta must wait for more research and root out dangers before targeting youth in VR.

    BOSTON, MA, WASHINGTON, DC and LONDON, UK — Friday, April 14, 2023 — Today, a coalition of over 70 leading experts and advocates for health, privacy, and children’s rights is urging Meta to abandon plans to allow minors between the ages of 13 and 17 into Horizon Worlds, Meta’s flagship virtual reality platform. Led by Fairplay, the Center for Digital Democracy (CDD), and the Center for Countering Digital Hate (CCDH), the advocates underscored the dearth of research on the impact of time spent in the Metaverse on the health and wellbeing of youth, as well as the company’s track record of putting profits ahead of children’s safety.

    The advocates’ letter maintained that the Metaverse is already unsuitable for use by children and teens, citing March 2023 research from CCDH which revealed that minors already using Horizon Worlds were routinely exposed to harassment and abuse—including sexually explicit insults and racist, misogynistic, and homophobic harassment—and other offensive content. In addition to the existing risks present in Horizon Worlds, the advocates’ letter outlined a variety of potential risks facing underage users in the Metaverse, including magnified risks to privacy through the collection of biomarkers, risks to youth mental health and wellbeing, and the risk of discrimination, among others.

    In addition to Fairplay, CDD, and CCDH, the 36 organizations signing on include Common Sense Media, the Electronic Privacy Information Center (EPIC), Public Citizen, and the Eating Disorders Coalition. The 37 individual signatories include: Richard Gephardt of the Council for Responsible Social Media, former Member of Congress and House Majority Leader; Sherry Turkle, MIT Professor and author of Alone Together and Reclaiming Conversation; and social psychologist and author Jonathan Haidt.

    Josh Golin, Executive Director, Fairplay: “It’s beyond appalling that Mark Zuckerberg wants to save his failing Horizon Worlds platform by targeting teens. Already, children are being exposed to homophobia, racism, sexism, and other reprehensible content on Horizon Worlds. The fact that Mr. Zuckerberg is even considering such an ill-formed and dangerous idea speaks to why we need Congress to pass COPPA 2.0 and the Kids Online Safety Act.”

    Katharina Kopp, PhD, Deputy Director, Center for Digital Democracy: “Meta is demonstrating once again that it doesn’t consider the best interests of young people when it develops plans to expand its business operations. Before it considers opening its Horizon Worlds metaverse operation to teens, it should first commit to fully exploring the potential consequences. That includes engaging in an independent and research-based effort addressing the impact of virtual experiences on young people’s mental and physical well-being, privacy, safety, and potential exposure to hate and other harmful content.
It should also ensure that minors don’t face forms of discrimination in the virtual world, which tends to perpetuate and exacerbate ‘real life’ inequities.”

Mark Bertin, MD, Assistant Professor of Pediatrics at New York Medical College, former Director of Developmental Behavioral Pediatrics at the Westchester Institute for Human Development, and author of The Family ADHD Solution, Mindful Parenting for ADHD, and How Children Thrive: “This isn’t like the panic over rock and roll, where a bunch of old folks freaked out over nothing. Countless studies already describe the harmful impact of Big Tech products on young people, and it’s worsening a teen mental health crisis. We can’t afford to let profit-driven companies launch untested projects targeted at kids and teens and let families pick up the pieces after. It is crucial for the well-being of our children that we understand what is safe and healthy first.”

Imran Ahmed, CEO of the Center for Countering Digital Hate: “Meta is making the same mistake with Horizon Worlds that it made with Facebook and Instagram. They have prioritized profit over safety in their design of the product, failed to provide meaningful transparency, and refused to take responsibility for ensuring worlds are safe, especially for children. Yet again, their aim is speed to market in order to achieve monopoly status – rather than building truly sustainable, productive and enjoyable environments in which people feel empowered and safe. Whereas, to some, ‘move fast and break things’ may have appeared swashbuckling from young startup entrepreneurs, it is a brazenly irresponsible strategy coming from Meta, one of the world’s richest companies. It should have learned lessons from the harms its earlier products imposed on society, our democracies and our citizens.”

horizonletter.pdf
  • Reports indicate FTC plans to advance case against Amazon for violation of kids’ privacy after advocates’ 2019 complaint.

    BOSTON, MA and WASHINGTON, DC — Friday, March 31, 2023 — Following a groundbreaking investigation of Amazon’s Echo Dot Kids by Fairplay and the Center for Digital Democracy (CDD), the Federal Trade Commission is preparing to advance a case against Amazon to the Department of Justice over the company’s violations of children’s privacy law. According to new reporting from Politico, the case centers on Amazon’s violations of the Children’s Online Privacy Protection Act (COPPA) through its Alexa voice assistant.

    In 2019, privacy advocates Fairplay and CDD called for the FTC to take action against Amazon after an investigation of the company’s Echo Dot Kids smart home assistant, a candy-colored version of Amazon’s flagship home assistant with Alexa voice technology. The investigation revealed a number of shocking illegal privacy violations, including Amazon’s indefinite retention of kids’ sensitive data even after parents requested that it be deleted. Now, reports indicate that the FTC is acting on the advocates’ calls for investigation.

    “We’re thrilled that the Federal Trade Commission and Department of Justice are close to taking action against Amazon for its egregious violations of children’s privacy,” said Josh Golin, Executive Director of Fairplay. “We know it’s not just social media platforms and apps that misuse children’s sensitive data. This landmark case would be the first time the FTC sanctioned the maker of a voice-enabled device for flouting COPPA. Amazon and its Big Tech peers must learn that COPPA violations are not just a cost of doing business.”

    “It is time for the FTC to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy. “Children are giving away sensitive personal data on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These data practices violate children’s privacy, manipulate them into being interested in harmful products, undermine their autonomy, and perpetuate discrimination and bias. Both the FTC and the Department of Justice must hold Amazon accountable.”

    [see attached for additional comments]

    ftc_amazon_investigation_statement_fairplay_cdd.pdf
  • Walmart Deceptively Marketing to Kids on Roblox

    Consumer Advocates Urge Action

    MADISON, CONN. January 23, 2023 – A coalition of advocacy groups led by ad watchdog truthinadvertising.org (TINA.org) is urging the Children’s Advertising Review Unit (CARU) – a BBB National Program – to immediately audit the Walmart Universe of Play advergame, a recent addition to the self-regulatory group’s COPPA Safe Harbor Program and bearer of one of the Program’s certification seals. According to a letter from TINA.org, Fairplay, the Center for Digital Democracy and the National Association of Consumer Advocates, a copy of which was sent to Walmart, Roblox and the FTC, the retail giant is exposing children to deceptive marketing on Roblox, the online gaming and creation platform used by millions of kids on a daily basis.

    Walmart’s first foray into the Roblox metaverse came last September, when it premiered two experiences, Walmart Universe of Play and Walmart Land, which collectively have been visited more than 12 million times. Targeted at – and accessible to – young children on Roblox, Universe of Play features virtual products and characters from L.O.L. Surprise!, Jurassic World, Paw Patrol, and more, and is advertised as letting kids play with the “year’s best toys” and make a “wish list” of toys that can then be purchased at Walmart.

    As the consumer groups warn, Walmart completely blurs the distinction between advertising content and organic content, and simultaneously fails to provide clear or conspicuous disclosures that Universe of Play, or content within the virtual world, is advertising. In addition, as kids’ avatars walk through the game, they are manipulated into opening additional undisclosed advertisements disguised as surprise wrapped gifts.

    To make matters worse, Walmart is using the CARU COPPA Safe Harbor Program seal to convey the false message that its children’s advergame complies not only with COPPA (the Children’s Online Privacy Protection Act) but also with CARU’s Advertising Guidelines and truth-in-advertising laws, and to use the seal as a shield against enforcement action.

    “Walmart’s brazen use of stealth marketing directed at young children who are developmentally unable to recognize the promotional content is not only appalling, it’s deceptive and against truth-in-advertising laws. We urge CARU to take swift action to protect the millions of children being manipulated by Walmart on a daily basis.” Laura Smith, TINA.org Legal Director

    “Walmart’s egregious and rampant manipulation of children on Roblox – a platform visited by millions of children every day – demands immediate action. The rise of the metaverse has enabled a new category of deceptive marketing practices that are harmful to children. CARU must act now to ensure that children are not collateral damage in Walmart’s digital drive for profit.” Josh Golin, Executive Director, Fairplay

    “Walmart’s and Roblox’s practices demonstrate that self-regulation is woefully insufficient to protect children and teens online. Today, young people are targeted by a powerful set of online marketing tactics that are manipulative, unfair, and harmful to their mental and physical health. Digital advertising operates in a ‘wild west’ world where anything goes in terms of reaching and influencing the behaviors of kids and teens.
Congress and the Federal Trade Commission must enact safeguards to protect the privacy and well-being of a generation of young people.” Katharina Kopp, Director of Policy, Center for Digital Democracy

To read more about Walmart’s deceptive marketing on Roblox see: /articles/tina-org-urges-action-against-walmarts-undisclosed-advergame-on-roblox

About TINA.org (truthinadvertising.org): TINA.org is a nonprofit organization that uses investigative journalism, education, and advocacy to empower consumers to protect themselves against false advertising and deceptive marketing.

About Fairplay: Fairplay is the leading nonprofit organization committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children.

About the Center for Digital Democracy: The Center for Digital Democracy is a nonprofit organization using education, advocacy, and research into commercial data practices to ensure that digital technologies serve and strengthen democratic values, institutions, and processes.

About the National Association of Consumer Advocates: The National Association of Consumer Advocates is a nonprofit association of more than 1,500 attorneys and consumer advocates committed to representing consumers’ interests.

For press inquiries contact: Shana Mueller at 203.421.6210 or press@truthinadvertising.org.

walmart_caru_press_release_final.pdf
  • Josh Golin, executive director, Fairplay: The FTC’s landmark settlement against Epic Games is an enormous step forward towards creating a safer, less manipulative internet for children and teens. Not only is the Commission holding Epic accountable for violating COPPA by illegally collecting the data of millions of children under 13, but the settlement is also a shot across the bow against game makers who use unfair practices to drive in-game purchases by young people. The settlement rightly recognizes not only that unfair monetization practices harm young people financially, but that design choices used to drive purchases subject young people to a wide array of dangers, including cyberbullying and predation. Today’s breakthrough settlement underscores why it is so critical that Congress pass the privacy protections for children and teens currently under consideration for the Omnibus bill. These provisions give teens privacy rights for the first time, address unfair monetization by prohibiting targeted advertising, and empower regulators by creating a dedicated youth division at the FTC.

    Jeff Chester, executive director, Center for Digital Democracy: Through this settlement with Epic Games, the FTC has used its vital power to regulate unfair business practices to extend long-overdue and critically important online protections to teens. This tells online marketers that from now on, teenagers cannot be targeted using unfair and manipulative tactics designed to take advantage of their young age and other vulnerabilities. Kids should also have their data privacy rights better respected through this enforcement of the federal kids’ data privacy law (COPPA). Gaming is a “wild west” when it comes to data gathering and online marketing tactics, placing young people, who are among the half of the US population that plays video games, at especially great risk. While today’s FTC action creates new safeguards for young people, Congress has a rare opportunity to pass legislation this week ensuring all kids and teens have strong digital safeguards, regardless of what online service they use.
  • Press Statement regarding today’s FTC Notice of Proposed Rulemaking Regarding Commercial Surveillance and Data Security

    Katharina Kopp, Deputy Director, Center for Digital Democracy: Today, the Federal Trade Commission issued its long overdue advance notice of proposed rulemaking (ANPRM) regarding a trade regulation rule on commercial surveillance and data security. The ANPRM aims to address the prevalent and increasingly unavoidable harms of commercial surveillance. Civil society groups, including civil rights groups, privacy and digital rights organizations, and children’s advocates, had previously called on the commission to initiate this trade regulation rule to address the commission’s decades-long failure to rein in predatory corporate practices online. CDD had called on the commission repeatedly over the last two decades to address the out-of-control surveillance advertising apparatus that is the root cause of increasingly unfair, manipulative, and discriminatory practices harming children, teens, and adults, and which has a particularly negative impact on equal opportunity and equity.

    The Center for Digital Democracy welcomes this important initial step by the commission and looks forward to working with the FTC. CDD urges the commission to move forward expeditiously with the rulemaking and to ensure fair participation of stakeholders, particularly those who are disproportionately harmed by commercial surveillance.

    press_statement_8-11fin.pdf
  • Advocates call on FTC to investigate manipulative design abuses in popular FIFA game

    Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money”

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, CDD: jeff@democraticmedia.org; 202-494-7100

    BOSTON and WASHINGTON, DC – Thursday, June 2, 2022 – Today, advocacy groups Fairplay and the Center for Digital Democracy (CDD) led a coalition of 15 advocacy groups in calling on the Federal Trade Commission (FTC) to investigate video game company Electronic Arts (EA) for unfairly exploiting young users in EA’s massively popular game, FIFA: Ultimate Team. In a letter sent to the FTC, the advocates described how the use of loot boxes and virtual currency in FIFA: Ultimate Team exploits the many children who play the game, especially given their undeveloped financial literacy skills and poor understanding of the odds of receiving the most desirable loot box items.

    Citing the Norwegian Consumer Council’s recent report, Insert Coin: How the Gaming Industry Exploits Consumers Using Lootboxes, the advocates’ letter details how FIFA: Ultimate Team encourages gamers to engage in a constant stream of microtransactions as they play the game. Users are able to buy FIFA points, a virtual in-game currency, which can then be used to purchase loot boxes called FIFA packs containing mystery team kits, badges, and player cards for soccer players who can be added to a gamer’s team. In their letter, the advocates noted the game’s use of manipulative design abuses to promote the purchase of FIFA packs, such as “lightning round” sales of premium packs, to which children are particularly vulnerable. The advocates also cite the use of virtual currency in the game, which obscures the actual cost of FIFA packs to adult users, let alone children. Additionally, the actual probability of unlocking the best loot box prizes in FIFA: Ultimate Team is practically inscrutable to anyone who is not an expert in statistics, according to the advocates and the NCC report. In order to unlock a specific desirable player in the game, users would have to pay around $14,000 or spend three years continuously playing the game.

    “By relentlessly marketing pay-to-win loot boxes, EA is exploiting children’s desire to compete with their friends, despite the fact that most adults, let alone kids, could not determine their odds of receiving a highly coveted card or what cards cost in real money. The FTC must use its power to investigate these design abuses and determine just how many kids and teens are being fleeced by EA.” Josh Golin, Executive Director, Fairplay

    “Lootboxes, virtual currencies, and other gaming features are often designed deceptively, aiming to exploit players’ known vulnerabilities. Due to their unique developmental needs, children and teens are particularly harmed. Their time and attention are stolen from them, they are financially exploited, and they are purposely socialized to adopt gambling-like behaviors. Online gaming is a key online space where children and teens gather in the millions, and regulators must act to protect them from these harmful practices.” Katharina Kopp, Deputy Director, Center for Digital Democracy

    “As illustrated in our report, FIFA: Ultimate Team uses aggressive in-game marketing and exploits gamers’ cognitive biases – adults and children alike – to manipulate them into spending large sums of money.
Children especially are vulnerable to EA’s distortion of the real-world value of its loot boxes and the complex, misleading probabilities given to describe the odds of receiving top prizes. We join our US partners in urging the Federal Trade Commission to investigate these troubling practices.” Finn Lützow-Holm Myrstad, Digital Policy Director, Norwegian Consumer Council

“The greed of these video game companies is a key reason why we’re seeing a new epidemic of child gambling in our families. Thanks to this report, the FTC has more than enough facts to take decisive action to protect our kids from these predatory business practices.” Les Bernal, National Director of Stop Predatory Gambling and the Campaign for Gambling-Free Kids

“Exploiting consumers, especially children, by manipulating them into buying loot boxes that, in reality, rarely contain the coveted items they are seeking, is a deceptive marketing practice that causes real harm and needs to stop. TINA.org strongly urges the FTC to take action.” Laura Smith, Legal Director at TINA.org

Advocacy groups signing today’s FTC complaint include Fairplay; the Center for Digital Democracy; Campaign for Accountability; Children and Screens: Institute of Digital Media and Child Development; Common Sense Media; Consumer Federation of America; Electronic Privacy Information Center (EPIC); Florida Council on Compulsive Gambling, Inc.; Massachusetts Council on Gaming and Health; National Council on Problem Gambling; Parent Coalition for Student Privacy; Public Citizen; Stop Predatory Gambling and the Campaign for Gambling-Free Kids; TINA.org (Truth in Advertising, Inc.); and U.S. PIRG.

###

lootboxletter_pr.pdf, lootboxletterfull.pdf
  • Press Statement regarding today’s FTC Policy Statement on Education Technology and the Children’s Online Privacy Protection Act

    Jeff Chester, Executive Director, Center for Digital Democracy: Today, the Federal Trade Commission adopted a long overdue policy designed to protect children’s privacy. By shielding school children from the pervasive forces of commercial surveillance, which gather their data for ads and marketing, the FTC is expressly applying a critical safeguard from the bipartisan Children’s Online Privacy Protection Act (COPPA). Fairplay, the Center for Digital Democracy, and a coalition of privacy, children’s health, civil and consumer rights groups had previously called on the commission to enact policies that make this very EdTech safeguard possible. We look forward to working with the FTC to ensure that parents can be confident that their child’s online privacy and security is protected in or out of the classroom. However, the Commission must also ensure that adolescents receive protections from what is now an omniscient and manipulative data-driven complex that profoundly threatens their privacy and well-being.
  • 60 leading advocacy organizations say unregulated Big Tech business model is “fundamentally at odds with children’s wellbeing”

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org, 202-494-7100

    Diverse coalition of advocates urges Congress to pass legislation to protect kids and teens online

    BOSTON, MA and WASHINGTON, DC – March 22, 2022 – Congressional leaders in the House and Senate were urged today to enact much needed protections for children and teens online. In a letter to Senate Majority Leader Chuck Schumer, Senate Minority Leader Mitch McConnell, House Speaker Nancy Pelosi and House Minority Leader Kevin McCarthy, a broad coalition of health, safety, privacy and education groups said it was time to ensure that Big Tech can no longer undermine the wellbeing of America’s youth. The letter reiterated the call in President Biden’s State of the Union address for increased online protections for young people.

    In their letter, the advocates outlined how the prevailing business model of Big Tech creates a number of serious risks facing young people on the internet today, including mental health struggles, loss of privacy, manipulation, predation, and cyberbullying. The advocates underscored the dangers posed by rampant data collection on popular platforms, including algorithmic discrimination and the targeting of children at particularly vulnerable moments. The reforms called for by the advocates include:

    Protections for children and teens wherever they are online, not just on “child-directed” sites;
    Privacy protections for all minors;
    A ban on targeted advertising to young people;
    Prohibition of algorithmic discrimination against children and teens;
    Establishment of a duty of care that requires digital service providers to make the best interests of children a primary design consideration and to prevent and mitigate harms to minors;
    Requiring platforms to turn on the most protective settings for minors by default;
    Greater resources for enforcement by the Federal Trade Commission.

    United by the desire to see Big Tech’s harmful business model regulated, the advocates’ letter represents a landmark moment for the movement to increase privacy protections for children and teenagers online, especially given the wide-ranging fields and focus areas represented by the signatories. Among the 60 signatories to the advocates’ letter are: Fairplay, Center for Digital Democracy, Accountable Tech, American Academy of Pediatrics, American Association of Child and Adolescent Psychiatry, American Psychological Association, Center for Humane Technology, Common Sense, Darkness to Light, ECPAT-USA, Electronic Privacy Information Center (EPIC), National Alliance to Advance Adolescent Health, National Center on Sexual Exploitation, National Eating Disorders Association, Network for Public Education, ParentsTogether, Public Citizen, Society for Adolescent Health and Medicine, and Exposure Labs, creators of The Social Dilemma.

    Signatories on the need for legislation to protect young people online:

    “Congress last passed legislation to protect children online 24 years ago – nearly a decade before the most popular social media platforms even existed. Big Tech’s unregulated business model has led to a race to the bottom to collect data and maximize profits, no matter the harm to young people.
We agree with the president that the time is now to update COPPA, expand privacy protections to teens, and put an end to the design abuses that manipulate young people into spending too much time online and expose them to harmful content.” – Josh Golin, Executive Director, Fairplay

“It’s long past time for Congress to put a check on Big Tech’s pervasive manipulation of young people’s attention and exploitation of their personal data. We applaud President Biden’s call to ban surveillance advertising targeting young people and are heartened by the momentum to rein in Big Tech and establish critical safeguards for minors engaging with their products.” – Nicole Gill, Co-Founder and Executive Director, Accountable Tech

“Digital technology plays an outsized role in the lives of today’s children and adolescents, exacerbated by the dramatic changes to daily life experienced during the pandemic. Pediatricians see the impact of these platforms on our patients and recognize the growing alarm about the role of digital platforms, in particular social media, in contributing to the youth mental health crisis. It has become clear that, from infancy through the teen years, children’s well-being is an afterthought in developing digital technologies. Strengthening privacy, design, and safety protections for children and adolescents online is one of many needed steps to create healthier environments that are more supportive of their mental health and well-being.” – Moira Szilagyi, MD, PhD, FAAP, President, American Academy of Pediatrics

“Children and teens are at the epicenter of a pervasive data-driven marketing system that takes advantage of their inherent developmental vulnerabilities. We agree with President Biden: now is the time for Congress to act and enact safeguards that protect children and teens. It’s also long overdue for Congress to enact comprehensive legislation that protects parents and other adults from unfair, manipulative, discriminatory and privacy-invasive commercial surveillance practices.” – Katharina Kopp, Ph.D., Policy Director, Center for Digital Democracy

“President Biden’s powerful State of the Union plea to Congress to hold social media platforms accountable for the ‘national experiment’ they’re conducting on our kids and teens could not be more important. It is clear that young people are being harmed by these platforms, which continue to prioritize profits over the wellbeing of their youngest users. Children and teens’ mental health is at stake. Congress and the Administration must act now to pass legislation to protect children’s and teens’ privacy and well-being online.” – Jim Steyer, Founder and CEO, Common Sense

“Online protections for children are woefully outdated and it’s clear tech companies are more interested in profiting off of vulnerable children than taking steps to prevent them from getting hurt on their platforms. American kids are facing a mental health crisis partly fueled by social media, and parents are unable to go it alone against these billion-dollar companies. We need Congress to update COPPA, end predatory data collection on children, and regulate design practices that are contributing to social media addiction, mental health disorders, and even death.” – Justin Ruben, Co-Founder and Co-Director, ParentsTogether

“A business model built on extracting our attention at the cost of our wellbeing is bad for everyone, but especially bad for children.
    No one knows this better than young people themselves, many of whom write to us daily about the ways in which Big Social is degrading their mental health. Left unregulated, Big Social will put profits over people every time. It's time to put our kids first. We urge Congress to act swiftly and enact reforms like strengthening privacy, banning surveillance advertising, and ending algorithmic discrimination for kids so we can begin to build a digital world that supports, rather than demotes, child wellbeing.” – Julia Hoppock, Partnerships Director, The Social Dilemma, Exposure Labs.

    # # #
  • Groups urge Congress to stop Big Tech’s manipulation of young people

    BOSTON – Thursday, December 2, 2021 – Today a coalition of leading advocacy groups launched Designed With Kids in Mind, a campaign demanding a design code in the US to protect young people from online manipulation and harm. The campaign seeks to secure protections for US children and teens similar to the UK’s groundbreaking Age-Appropriate Design Code (AADC), which went into effect earlier this year. The campaign brings together leading advocates for child development, privacy, and a healthier digital media environment, including Fairplay, Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy, Center for Humane Technology, Common Sense, ParentsTogether, RAINN, and Exposure Labs, creators of The Social Dilemma. The coalition will advocate for legislation and new Federal Trade Commission rules that protect children and teens from a business model that puts young people at risk by prioritizing data collection and engagement.

    The coalition has launched a website that explains how many of the most pressing problems faced by young people online are directly linked to platforms’ design choices. It cites features that benefit platforms at the expense of young people’s wellbeing, such as:

    - Autoplay: increases time on platforms; excessive time on screens is linked to mental health challenges and physical risks like less sleep, and promotes family conflict.
    - Algorithmic recommendations: risk exposure to self-harm, racist content, pornography, and mis/disinformation.
    - Location tracking: makes it easier for strangers to track and contact children.
    - Nudges to share: lead to loss of privacy and risks of sexual predation and identity theft.

    The coalition is promoting three bills which would represent a big step forward in protecting US children and teens online: the Children and Teens’ Online Privacy Protection Act (S. 1628); the Kids Internet Design and Safety (KIDS) Act (S. 2918); and the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act (H.R. 4801). Taken together, these bills would expand privacy protections to teens for the first time and incorporate key elements of the UK’s AADC, such as requiring the best interests of children to be a primary design consideration for services likely to be accessed by young people. The legislation backed by the coalition would also protect children and teens from manipulative design features and harmful data processing.

    Members of the coalition on the urgent need for a US Design Code to protect children and teens:

    Josh Golin, Executive Director, Fairplay: We need an internet that helps children learn, connect, and play without exploiting their developmental vulnerabilities; respects their need for privacy and safety; helps young children disconnect at the appropriate time rather than manipulating them into spending even more time online; and prioritizes surfacing high-quality content instead of maximizing engagement. The UK’s Age-Appropriate Design Code took an important step towards creating that internet, and children and teens in the US deserve the same protections and opportunities.
    It’s time for Congress and regulators to insist that children come before Big Tech’s profits.

    Nicole Gill, Co-Founder and Executive Director of Accountable Tech: You would never put your child in a car seat that wasn't designed for them and didn't meet all safety standards, but that's what we do every day when our children go online, using a network of apps and websites that were never designed with them in mind. Our children should be free to learn, play, and connect online without manipulative platforms like Facebook and Google's YouTube influencing their every choice. We need an age-appropriate design code that puts kids and families first and protects young people from the exploitative practices and the perverse incentives of social media.

    Lee Savio Beers, MD, FAAP, President of the American Academy of Pediatrics: The American Academy of Pediatrics is proud to join this effort to ensure digital spaces are safe for children and supportive of their healthy development. It is in our power to create a digital ecosystem that works better for children and families; legislative change to protect children is long overdue. We must be bold in our thinking and ensure that government action on technology addresses the most concerning industry practices while preserving the positive aspects of technology for young people.

    Jeff Chester, Executive Director, Center for Digital Democracy: The “Big Tech” companies have long treated young people as just a means to generate vast profits – creating apps, videos, and games designed to hook them to an online world built to surveil and manipulate them. It’s time to stop children and teens from being victimized by the digital media industry. Congress and the Federal Trade Commission should adopt commonsense safeguards that ensure America’s youth reap all the benefits of the online world without having to constantly expose themselves to the risks.

    Randima Fernando, Executive Director, Center for Humane Technology: We need technology that respects the incredible potential – and the incredible vulnerability – of our kids' minds. And that should guide technology for adults, who can benefit from those same improvements.

    Irene Ly, Policy Counsel, Common Sense: This campaign acknowledges harmful features of online platforms and apps like autoplay, algorithms amplifying harmful content, and location tracking for what they are: intentional design choices. For too long, online platforms and apps have chosen to exploit children’s vulnerabilities through these manipulative design features. Common Sense has long supported designing online spaces with kids in mind, and strongly supports US rules that would finally require companies to put kids’ well-being first.

    Julia Hoppock, The Social Dilemma Partnerships Director, Exposure Labs: For too long, Big Social has put profits over people. It's time to put our kids first and build an online world that works for them.

    Dalia Hashad, Online Safety Director, ParentsTogether: From depression to bullying to sexual exploitation, tech companies knowingly expose children to unacceptable harms because it makes the platforms billions in profit. It's time to put kids first.

    Scott Berkowitz, President of RAINN (Rape, Abuse & Incest National Network): Child exploitation has reached crisis levels, and our reliance on technology has left children increasingly vulnerable. On our hotline, we hear from children every day who have been victimized through technology.
    An age-appropriate design code will provide overdue safeguards for children across the U.S.
  • Press Release

    Against surveillance-based advertising

    CDD joins an international coalition of more than 50 NGOs and scholars calling on the EU to ban surveillance-based advertising in its Digital Services Act and on the U.S. to enact a federal digital privacy and civil rights law

    International coalition calls for action against surveillance-based advertising

    Every day, consumers are exposed to extensive commercial surveillance online. This leads to manipulation, fraud, discrimination, and privacy violations. Information about what we like, our purchases, mental and physical health, sexual orientation, location, and political views is collected, combined, and used under the guise of targeted advertising.

    In a new report, the Norwegian Consumer Council (NCC) sheds light on the negative consequences that this commercial surveillance has on consumers and society. Together with more than 50 organizations and experts, the NCC is asking authorities on both sides of the Atlantic to consider a ban. In Europe, the upcoming Digital Services Act can lay the legal framework to do so. In the US, legislators should seize the opportunity to enact comprehensive privacy legislation that protects consumers.

    “The collection and combination of information about us not only violates our right to privacy, but renders us vulnerable to manipulation, discrimination and fraud. This harms individuals and society as a whole,” says Finn Myrstad, director of digital policy at the NCC.

    In a Norwegian population survey conducted by YouGov on behalf of the NCC, consumers clearly state that they do not want commercial surveillance. Just one in ten respondents viewed commercial actors collecting personal information about them online positively, while only one in five thought that ads based on personal information are acceptable.

    “Most of us do not want to be spied on online, or receive ads based on tracking and profiling. These results mirror similar surveys from Europe and the United States, and should be a powerful signal to policymakers looking at how to better regulate the internet,” Myrstad says.

    Policymakers and civil society organisations on both sides of the Atlantic are increasingly standing up against these invasive practices. For example, the European Parliament and the European Data Protection Supervisor (EDPS) have already called for phasing out and banning surveillance-based advertising. A coalition of consumer and civil rights organizations in the United States has called for a similar ban.

    Significant consequences

    The NCC report “Time to ban surveillance-based advertising” exposes a variety of harmful consequences that surveillance-based advertising can have on individuals and on society:

    1. Manipulation: Companies with comprehensive and intimate knowledge about us can shape their messages in attempts to reach us when we are susceptible, for example to influence elections or to advertise weight-loss products, unhealthy food, or gambling.

    2. Discrimination: The opacity and automation of surveillance-based advertising systems increase the risk of discrimination, for example by excluding consumers based on income, gender, race, ethnicity, sexual orientation, or location, or by making certain consumers pay more for products or services.

    3. Misinformation: The lack of control over where ads are shown can promote and finance false or malicious content. This also poses significant challenges to publishers and advertisers regarding revenue, reputational damage, and opaque supply chains.

    4. Undermining competition: The surveillance business model favours companies that collect and process information across different services and platforms. This makes it difficult for smaller actors to compete, and negatively impacts companies that respect consumers’ fundamental rights.
    5. Security risks: When thousands of companies collect and process enormous amounts of personal data, the risk of identity theft, fraud, and blackmail increases. NATO has described this data collection as a national security risk.

    6. Privacy violations: The collection and use of personal data happens with little or no control, both by large companies and by companies that are unknown to most consumers. Consumers have no way to know what data is collected, who the information is shared with, and how it may be used.

    “It is very difficult to justify the negative consequences of this system. A ban will contribute to a healthier marketplace that helps protect individuals and society,” Myrstad comments.

    Good alternatives

    In the report, the NCC points to alternative digital advertising models that do not depend on the surveillance of consumers, and that provide advertisers and publishers more oversight and control over where ads are displayed and which ads are being shown.

    “It is possible to sell advertising space without basing it on intimate details about consumers. Solutions already exist to show ads in relevant contexts, or where consumers self-report what ads they want to see,” Myrstad says. “A ban on surveillance-based advertising would also pave the way for a more transparent advertising marketplace, diminishing the need to share large parts of ad revenue with third parties such as data brokers. A level playing field would give advertisers and content providers more control and let them keep a larger share of the revenue.”

    The coordinated push behind the report and letter illustrates the growing determination of consumer, digital rights, human rights, and other civil society groups to end the widespread business model of spying on the public.
  • Advocates Ask FTC to Protect Youth From Manipulative “Dark Patterns” Online

    Contact: Jeff Chester, CDD, jeff@democraticmedia.org, 202-494-7100; David Monahan, CCFC, david@commercialfreechildhood.org

    BOSTON, MA and WASHINGTON, DC – May 28, 2021 – Two leading advocacy groups protecting children from predatory practices online filed comments today asking the FTC to create strong safeguards to ensure that internet “dark patterns” don’t undermine children’s well-being and privacy. Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) cited leading authorities on the impacts of internet use on child development in their comments, prepared by the Communications & Technology Law Clinic at Georgetown University Law Center. These comments follow testimony given by representatives of both groups last month at an FTC workshop spearheaded by FTC Acting Chair Rebecca Slaughter.

    CCFC and CDD say tech companies are preying upon vulnerable kids – capitalizing on their fear of missing out, desire to be popular, and inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns. Key takeaways include:

    - A range of practices, often called “dark patterns,” are pervasive in the digital marketplace, manipulate children, are deceptive and unfair, and violate Section 5 of the FTC Act. They take advantage of a young person’s psycho-social development, such as the need to engage with peers.
    - The groups explained the ways children are vulnerable to manipulation and other harms from “dark patterns,” including that they have “immature and developing executive functioning,” which leads to impulsive behaviors.
    - The FTC should prohibit the use of dark pattern practices in the children’s marketplace; issue guidance to companies to ensure they do not develop or deploy such applications; and include new protections under its Children’s Online Privacy Protection Act (COPPA) rulemaking authority to better regulate them. The commission must bring enforcement actions against developers using child-directed dark patterns.
    - The FTC should prohibit the use of micro-transactions in apps serving children, including the buying of virtual currency to participate in game playing.
    - The FTC should adopt a definition of dark patterns that includes all “nudges” designed to use a range of behavioral techniques to foster desired responses from users.

    The groups’ filing was in response to the FTC’s call for comments on the use of digital “dark patterns” – deceptive and unfair user interface designs – on websites and mobile apps.

    Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: “Dark patterns” are being used in the design of child-directed services to manipulate young people into spending more time and money on games and other applications, as well as giving up more of their data. It’s time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use.
    Comment of Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood: In their rush to monetize children, app and game developers are using dark patterns that take advantage of children’s developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward towards creating a healthy media environment for children.

    Comment of Michael Rosenbloom, Staff Attorney & Clinical Teaching Fellow, Communications and Technology Law Clinic, Georgetown University Law Center: Software and game companies are using dark patterns to pressure children into playing more and paying more. Today, many apps and games that children play use dark patterns like arbitrary virtual currencies, encouragement from in-game characters, and ticking countdown timers to get children to spend more time and money on microtransactions. These dark patterns harm children and violate Section 5 of the FTC Act, and we urge the FTC to act to stop these practices.

    ###
  • Press Release

    “Big Food” and “Big Data” Online Platforms Fueling Youth Obesity Crisis as Coronavirus Pandemic Rages

    New Report Calls for Action to Address Saturation of Social Media, Gaming Platforms, and Streaming Video with Unhealthy Food and Beverage Products

    Contact: Jeff Chester (202-494-7100)

    For Immediate Release

    Washington, DC, May 12, 2021 – A report released today calls for federal and global action to check the growth of digital marketing of food and beverage products that targets children and teens online. Tech platforms especially popular with young people – including Facebook’s Instagram, Amazon’s Twitch, ByteDance’s TikTok, and Google’s YouTube – are working with giant food and beverage companies, such as Coca-Cola, KFC, Pepsi, and McDonald’s, to promote sugar-sweetened soda, energy drinks, candy, fast food, and other unhealthy products across social media, gaming, and streaming video. The report offers fresh analysis and insight into the most recent industry practices, documenting how “Big Food” and “Big Tech” are using AI, machine learning, and other data-driven techniques to ensure that food marketing permeates all of the online cultural spaces where children and teenagers congregate. The pandemic has dramatically increased exposure to these aggressive new forms of marketing, further increasing young people’s risks of becoming obese. Black and Brown youth are particularly vulnerable to new online promotional strategies. Noting that concerns about youth obesity have recently fallen off the public radar in the U.S., the report calls for both international and domestic policies to rein in the power of the global technology and food industries. The report and an executive summary are available at the Center for Digital Democracy’s (CDD) website, along with other background material.

    “Our investigation found that there is a huge amount of marketing for unhealthy foods and beverages all throughout the youth digital media landscape, and it has been allowed to flourish with no government oversight,” explained Kathryn C. Montgomery, PhD, the report’s lead author, Professor Emerita at American University and CDD’s Senior Strategist. “We know from decades of research that marketing of these products contributes to childhood obesity and related illnesses. And we’ve witnessed how so many children, teens, and young adults suffering from these conditions have been particularly vulnerable to the coronavirus. Both the technology industry and the food and beverage industry need to be held accountable for creating an online environment that undermines young people’s health.”

    The report examines an array of Big Data strategies and AdTech tools used by the food industry, focusing on three major sectors of digital culture that attract large numbers of young people: the so-called “influencer economy,” gaming and esports platforms, and the rapidly expanding streaming and online video industry. Dozens of digital campaigns by major food and beverage companies, many of which have won prestigious ad industry awards, illustrate some of the latest trends and techniques in digital marketing:

    The use of influencers is one of the primary ways that marketers reach and engage children and teens. Campaigns are designed to weave branded material “seamlessly into the daily narratives” shared on social media. Children and teens are particularly susceptible to influencer marketing, which taps into their psycho-social development.
    Marketing researchers closely study how young people become emotionally attached to celebrities and other influencers through “parasocial” relationships. McDonald’s enlisted rapper Travis Scott to promote the “Travis Scott Meal” to young people, featuring “a medium Sprite, a quarter pounder with bacon, and fries with barbecue sauce.” The campaign was so successful that some restaurants in the chain sold out of supplies within days of its launch. This and other celebrity endorsements have helped boost McDonald’s stock price, generated a trove of valuable consumer data, and triggered enormous publicity across social media.

    Food and beverage brands have flocked to Facebook-owned Instagram, which is considered one of the best ways to reach and engage teens. According to industry research, nearly all influencer campaigns (93%) are conducted on Instagram. Cheetos’ Chester Cheetah is now an “Instagram creator,” telling his own “stories” along with millions of other users on the platform. One Facebook report, “Quenching Today’s Thirsts: How Consumers Find and Choose Drinks,” found that “64% of people who drink carbonated beverages use Instagram for drinks-related activities, such as sharing or liking posts and commenting on drinks content,” and more than a third of them report following or “liking” soft drink “brands, hashtags, or influencer posts.”

    The online gaming space generates more revenue than TV, film, or music, and attracts viewers and players – including many young people – who are “highly engaged for a considerable length of time.” Multiplayer online battle arena (MOBA) and first-person shooter games are considered among the best marketing environments, offering a wide range of techniques for “monetization,” including in-game advertising, sponsorship, product placement, use of influencers, and even “branded games” created by advertisers. Twitch, the leading gaming platform, owned by Amazon, has become an especially important venue for food and beverage marketers. Online gamers and fans are considered prime targets for snack, soft drink, and fast food brands, all products that lend themselves to uninterrupted game play and spectatorship.

    PepsiCo’s energy drink, MTN DEW Amp Game Fuel, is specifically “designed with gamers in mind.” To attract influencers, it was featured on Twitch’s “Bounty Board,” a one-stop-shopping tool for “streamers,” enabling them to accept paid sponsorships (or “bounties”) from brands that want to reach the millions of gamers and their followers. Red Bull recently partnered with Ninja, “the most popular gaming influencer in the world with over 13 million followers on Twitch, over 21 million YouTube subscribers, and another 13 million followers on Instagram.” Dr. Pepper featured the faces of players of the popular Fortnite game on its bottles, with an announcement on Twitter that this campaign resulted in “the most engaged tweet” the soft-drink company had ever experienced. Wendy’s partnered with “five of the biggest Twitch streamers,” as well as food delivery app Uber Eats, to launch its “Never Stop Gaming” menu, with the promise of “five days of non-stop gaming, delicious meal combos and exclusive prizes.” Branded meals were created for each of the five streamers, who offered their fans the opportunity to order directly through their Twitch channels and have the food delivered to their doors.

    One of the newest marketing frontiers is streaming and online video, which have experienced a boost in viewership during the pandemic.
    Young people are avid users, accessing video on their mobile devices, gaming consoles, personal computers, and online connections to their TV sets. Concerned that teens “are drinking less soda,” Coca-Cola’s Fanta brand developed a comprehensive media campaign to trigger “an ongoing conversation with teen consumers through digital platforms” by creating four videos based on the brand’s most popular flavors, and targeting youth on YouTube, Hulu, Roku, Crackle, and other online video platforms. “From a convenience store dripping with orange flavor and its own DJ cat, to an 8-bit videogame-ified pizza parlor, the digital films transport fans to parallel universes of their favorite hangout spots, made more extraordinary and fantastic once a Fanta is opened.”

    New video ad formats allow virtual brand images to be inserted into the content and tailored to specific viewers. “Where one customer sees a Coca-Cola on the table,” explained a marketing executive, “the other sees green tea. Where one customer sees a bag of chips, another sees a muesli bar… in the exact same scene.”

    The major technology platforms are facilitating and profiting from the marketing of unhealthy food and beverage products. Facebook’s internal “creative shop” has helped Coca-Cola, PepsiCo, Unilever, Nestle, and hundreds of other brands develop global marketing initiatives to promote their products across its platform. The division specializes in “building data-driven advertising campaigns, branded content, branded entertainment, content creation, brand management, social design,” and similar efforts.

    Google regularly provides a showcase for companies such as Pepsi, McDonald’s, and Mondelez to tout their joint success promoting their respective products throughout the world. For example, Pepsi explained in a “Think with Google” post that it used Google’s “Director’s Mix” personalized video advertising technology to further what it calls its ability to “understand the consumer’s DNA,” meaning their “needs, context, and location in the shopping journey.” Pepsi could leverage Google’s marketing tools to help meet its goal of combining “insights with storytelling and drive personalized experiences at scale.”

    Hershey’s has been working closely with Amazon to market its candy products via streaming video, as well as through its own ecommerce marketplace. In a case study published online, Amazon explained that “…as viewing consumption began to fragment, the brand [Hershey’s] realized it was no longer able to reach its audience with linear TV alone.” Amazon gave Hershey’s access to its storehouse of data so the candy company could market its products on Amazon’s streaming services, such as IMDbTV.
    Amazon allowed Hershey’s to use Amazon’s data to ensure the candy brands would “be positioned to essentially ‘win’ search in that category on Amazon and end up as the first result….” Hershey’s also made use of “impulse buy” strategies on the Amazon platform, including “cart intercepts,” which prompt a customer to “add in snacks as the last step in their online shopping trip, mimicking the way someone might browse for candy during the checkout at a physical store.”

    Some of the largest food and beverage corporations – including Coca-Cola, McDonald’s, and Pepsi – have, in effect, transformed themselves into Big Data businesses. Coca-Cola operates over 40 interconnected social media monitoring facilities worldwide, which use AI to follow customers, analyze their online conversations, and track their behaviors. PepsiCo has developed a “fully addressable consumer database” (called “Consumer DNA”) that enables it to “see a full 360 degree view of our consumers.” McDonald’s made a significant investment in Plexure, a “mobile engagement” company specializing in giving fast food restaurants the ability “to build rich consumer profiles” and leverage the data “to provide deeply personalized offers and content that increase average transaction value” and help generate other revenues. One of its specialties is designing personalized messaging that triggers the release of the brain chemical dopamine.

    The report raises particularly strong concerns about the impact of all these practices on youth of color, noting that food and beverage marketers “are appropriating some of the most powerful ‘multicultural’ icons of youth pop culture and enlisting these celebrities in marketing campaigns for sodas, ‘branded’ fast-food meals, and caffeine-infused energy drinks.” These promotions can “compound health risks for young Blacks and Hispanics,” subjecting them to “multiple layers of vulnerability, reinforcing existing patterns of health disparity that many of them experience.”

    “U.S. companies are infecting the world’s young people with invasive, stealth, and incessant digital marketing for junk food,” commented Lori Dorfman, DrPH, director, Berkeley Media Studies Group, one of CDD’s partners on the project. “And they are targeting Black and Brown youth because they know kids of color are cultural trendsetters,” she explained. “Big Food and Big Tech run away with the profits after trampling the health of children, youth, and families.”

    The Center for Digital Democracy and its allies are calling for a comprehensive and ambitious set of policies for limiting the marketing of unhealthy food and beverages to young people, arguing that U.S. policymakers must work with international health and youth advocacy organizations to develop a coordinated agenda for regulating these two powerful global industries. As the report explains, other governments in the UK, Europe, Canada, and Latin America have already developed policies for limiting or banning the promotion of foods that are high in fat, sugar, and salt, including on digital platforms.
    Yet, the United States has continued to rely on an outdated self-regulatory model that does not take into account the full spectrum of Big Data and AdTech practices in today’s digital marketplace, places too much responsibility on parents, and offers only minimal protections for the youngest children.

    “Industry practices have become so sophisticated, widespread, and entangled that only a comprehensive public policy approach will be able to produce a healthier digital environment for young people,” explained Katharina Kopp, PhD, CDD’s Deputy Director and Director of Research.

    The report lays out an eight-point research-based policy framework:

    - Protections for adolescents as well as young children.
    - Uniform, global, science-based nutritional criteria.
    - Restrictions on brand promotion.
    - Limits on the collection and use of data.
    - Prohibition of manipulative and unfair marketing techniques and design features.
    - Market research protections for children and teens.
    - Elimination of digital racial discrimination.
    - Transparency, accountability, and enforcement.

    ###