CDD

Press Releases

  • Contact: Jeff Chester, CDD, jeff@democraticmedia.org; 202-494-7100; David Monahan, CCFC, david@commercialfreechildhood.org

    Advocates say Google Play continues to disregard children’s privacy law and urge FTC to act

    BOSTON, MA and WASHINGTON, DC — March 31, 2021 — Today, advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to investigate Google’s promotion of apps that violate the Children’s Online Privacy Protection Act (COPPA). In December 2018, CCFC and CDD led a coalition of 22 consumer and health advocacy groups in asking the FTC to investigate these same practices. Since then, Google has made changes to the Play Store, but the advocates say these changes fail to address the core problem: Google is certifying as safe and appropriate for children apps that violate COPPA and put children at risk. Recent studies found that a significant number of apps in Google Play violate COPPA by collecting and sharing children’s personal information without getting parental consent. For instance, a JAMA Pediatrics study found that 67% of apps used by children aged 5 and under were transmitting personal identifiers to third parties.

    Comment of Angela Campbell, Chair of the Board of Directors, Campaign for a Commercial-Free Childhood, and Professor Emeritus, Communications & Technology Law Clinic, Georgetown University Law Center: “Parents reasonably expect that Google Play Store apps designated as ‘Teacher approved’ or appropriate for children under age 13 comply with the law protecting children’s privacy. But far too often, that is not the case. The FTC failed to act when this problem was brought to its attention over two years ago. Because children today are spending even more time using mobile apps, the FTC must hold Google accountable for violating children’s privacy.”

    Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: "The Federal Trade Commission must swiftly act to stop Google’s ongoing disregard of the privacy and well-being of children. For too long, the Commission has allowed Google’s app store, and the data marketing practices that are its foundation, to operate without enforcing the federal law that is designed to protect young people under 13. With children using apps more than ever as a consequence of the pandemic, the FTC should enforce the law and ensure Google engages with kids and families in a responsible manner."

    ###
  • Press Statement, Center for Digital Democracy (CDD) and Campaign for a Commercial-Free Childhood (CCFC), 12-14-20

    Today, the Federal Trade Commission announced it will use its 6(b) authority to launch a major new study into the data collection practices of nine major tech platforms and companies: ByteDance (TikTok), Amazon, Discord, Facebook, Reddit, Snap, Twitter, WhatsApp and YouTube. The Commission’s study includes a section on children and teens. In December 2019, the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD) and their attorneys at Georgetown Law’s Institute for Public Representation urged the Commission to use its 6(b) authority to better understand how tech companies collect and use data from children. Twenty-seven consumer and child advocacy organizations joined that request. Below are statements from CDD and CCFC on today’s announcement.

    Josh Golin, Executive Director, CCFC: “We are extremely pleased that the FTC will be taking a hard look at how platforms like TikTok, Snap, and YouTube collect and use young people’s data. These 6(b) studies will provide a much-needed window into the opaque data practices that have a profound impact on young people’s wellbeing. This much-needed study will not only provide critical public education, but lay the groundwork for evidence-based policies that protect young people’s privacy and vulnerabilities when they use online services to connect, learn, and play.”

    Jeff Chester, Executive Director, CDD: "The FTC is finally holding the social media and online video giants accountable, by requiring leading companies to reveal how they stealthily gather and use information that impacts our privacy and autonomy. It is especially important that the commission is concerned about also protecting teens—who are the targets of a sophisticated and pervasive marketing system designed to influence their behaviors for monetization purposes."

    For questions, please contact: jeff@democraticmedia.org

    See also: https://www.markey.senate.gov/news/press-releases/senator-markey-stateme...
  • For Immediate Release: September 24, 2020 Contact: Jeff Chester (202-494-7100), jeff@democraticmedia.org

    A Step Backwards for Consumer Privacy: Why Californians Should Vote No on Proposition 24

    Ventura, CA, and Washington, DC: The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, lower and thus more dangerous standard for privacy protection in the U.S., according to CDD’s analysis.

    “We need strong and bold privacy legislation, not weaker standards and tinkering at the margins,” declared CDD Policy Director Katharina Kopp. “Prop 24 fails to significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. This initiative allows the much more powerful companies to set unfair terms by default. It also condones pay-for-privacy schemes, where corporations would be allowed to charge a premium (or eliminate a discount) in exchange for privacy. These schemes tend to hurt the already disadvantaged the most,” she explained.

    CDD intends to work with allies from the consumer and privacy communities to inform voters about Prop 24 and how best to protect their privacy. The Center for Digital Democracy is a leading nonprofit organization focused on empowering and protecting the rights of the public in the digital era.
  • Press Release

    Advocates Call on TikTok Suitors to Clean Up Kids’ Privacy Practices

    Groups had filed a complaint with the FTC documenting how TikTok flouts children’s privacy law and tracks millions of kids without parental consent.

    Contact: Katharina Kopp, CDD (kkopp@democraticmedia.org; 202-836-4621); David Monahan, CCFC (david@commercialfreechildhood.org)

    WASHINGTON, DC and BOSTON, MA—September 3, 2020—The nation’s leading children’s privacy advocates are calling on potential buyers of TikTok “to take immediate steps to comprehensively improve its privacy and data marketing practices for young people” should they purchase the platform. In separate letters to Microsoft, Walmart, and Oracle, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) detail TikTok’s extensive history of violating the Children’s Online Privacy Protection Act (COPPA), including a recent news report that TikTok internally classified more than one-third of its 49 million US users as fourteen or under. Given the likelihood that millions of these users are also under thirteen, the advocates urged Microsoft, Walmart, and Oracle to pledge to immediately stop collecting and processing data from any account flagged as or believed to be under thirteen if they acquire TikTok’s US operations, and to restore only those accounts that can be affirmatively verified as belonging to users who are thirteen or older. COPPA requires apps and websites to obtain verifiable parental consent before collecting the personal information of anyone under 13, but TikTok has not done so for its millions of accounts held by children.

    “Whoever purchases TikTok will have access to a treasure trove of ill-gotten, sensitive children’s data,” said Josh Golin, Executive Director of CCFC. “Any new owner must demonstrate their commitment to protecting young people’s privacy by immediately deleting any data that was illegally obtained from children under thirteen. With the keys to one of the most popular platforms for young people on the planet must come a commitment to protect children’s privacy and wellbeing.”

    In February 2019, TikTok was fined $5.7 million by the Federal Trade Commission (FTC) for COPPA violations and agreed to delete children’s data and properly request parental consent before allowing children under 13 on the site and collecting more data from them. This May, CCFC, CDD, and a coalition of 20 advocacy groups filed an FTC complaint against TikTok for ignoring its promises to delete kids’ data and comply with the law. To this day, the groups say, TikTok plays by its own rules, luring millions of kids under the age of 13, illegally collecting their data, and using it to manipulatively target them with marketing. In addition, they wrote to the companies today that, “By ignoring the presence of millions of younger children on its app, TikTok is putting them at risk for sexual predation; news reports and law enforcement agencies have documented many cases of inappropriate adult-to-child contact on the app.” In August, the groups’ allegations that TikTok had actual knowledge that millions of its users were under thirteen were confirmed by the New York Times.
    According to internal documents obtained by the Times, TikTok assigns an age range to each user using a variety of methods, including “facial recognition algorithms that scrutinize profile pictures and videos,” “comparing their activity and social connections in the app against those of users whose ages have already been estimated,” and drawing “upon information about users that is bought from other sources.” Using these methods, more than one third of TikTok’s 49 million users in the US were estimated to be under fourteen. Among daily users, the proportion that TikTok has designated as under fourteen rises to 47%.

    “The new owners of TikTok in the U.S. must demonstrate they take protecting the privacy and well-being of young people seriously,” said Katharina Kopp, Policy Director of the Center for Digital Democracy. “The federal law protecting kids’ privacy must be complied with and fully enforced. In addition, the company should implement a series of safeguards that prohibits manipulative, discriminatory and harmful data and marketing practices that target children and teens. Regulators should reject any proposed sale without ensuring a robust set of safeguards for youth is in place,” she noted.

    ###
  • Press Release

    USDA Online Buying Program for SNAP Participants Threatens Their Privacy and Can Exacerbate Racial and Health Inequities, Says New Report

    Digital Rights, Civil Rights and Public Health Groups Call for Reforms from USDA, Amazon, Walmart, Safeway/Albertson’s and Other Grocery Retailers - Need for Safeguards Urgent During Covid-19 Crisis

    Contact: Jeff Chester, jeff@democraticmedia.org, 202-494-7100; Katharina Kopp, kkopp@democraticmedia.org; https://www.democraticmedia.org/

    Washington, DC, July 16, 2020—A pilot program designed to enable the tens of millions of Americans who participate in the USDA’s Supplemental Nutrition Assistance Program (SNAP) to buy groceries online is exposing them to a loss of their privacy through “increased data collection and surveillance,” as well as risks involving “intrusive and manipulative online marketing techniques,” according to a report from the Center for Digital Democracy (CDD). The report reveals how online grocers and retailers use an orchestrated array of digital techniques—including granular data profiling, predictive analytics, geolocation tracking, personalized online coupons, AI and machine learning—to promote unhealthy products, trigger impulsive purchases, and increase overall spending at check-out. While these practices affect all consumers engaged in online shopping, the report explains, “they pose greater threats to individuals and families already facing hardship.” E-commerce data practices “are likely to have a disproportionate impact on SNAP participants, which include low-income communities, communities of color, the disabled, and families living in rural areas. The increased reliance on these services for daily food and other household purchases could expose these consumers to extensive data collection, as well as unfair and predatory techniques, exacerbating existing disparities in racial and health equity.”

    The report was funded by the Robert Wood Johnson Foundation, as part of a collaboration among four civil rights, digital rights, and health organizations: Color of Change, UnidosUS, Center for Digital Democracy, and Berkeley Media Studies Group. The groups issued a letter today to Secretary of Agriculture Sonny Perdue, urging the USDA to take immediate action to strengthen online protections for SNAP participants.

    USDA launched its e-commerce pilot last year in a handful of states, with an initial set of eight retailers approved for participation: Amazon, Dash’s Market, FreshDirect, Hy-Vee, Safeway, ShopRite, Walmart and Wright’s Market. The program has rapidly expanded to a majority of states, in part as a result of the current Covid-19 health crisis, in order to enable SNAP participants to shop more safely from home by following “shelter-in-place” rules. Through an analysis of the digital marketing and grocery e-commerce practices of the eight companies, as well as an assessment of their privacy policies, CDD found that SNAP participants and other online shoppers confront an often manipulative and nontransparent online grocery marketplace, which is structured to leverage the tremendous amounts of data gathered on consumers via their mobile devices, loyalty cards, and shopping transactions.
E-commerce grocers deliberately foreground the brands and products that partner with them (which include some of the most heavily advertised, processed foods and beverages), making them highly visible on store home pages and on “digital shelves,” as well as through online coupons and well-placed reminders at the point of sale. Grocers working with the SNAP pilot have developed an arsenal of “adtech” (advertising technology) techniques, including those that use machine learning and behavioral science to foster “frictionless shopping” and impulsive purchasing of specific foods and beverages. The AI and Big Data operations documented in the report may also lead to unfair and discriminatory data practices, such as targeting low-income communities and people of color with aggressive promotions for unhealthy food. Data collected and profiles created during online shopping may be applied in other contexts as well, leading to increased exposure to additional forms of predatory marketing, or to denial of opportunities in housing, education, employment, and financial services. “The SNAP program is one of our nation’s greatest success stories because it puts food on the table of hungry families and money in the communities where they live,” explained Dr. Lori Dorfman, Director of the Berkeley Media Studies Group. “Shopping for groceries should not put these families in danger of being hounded by marketers intent on selling products that harm health. Especially in the time of coronavirus when everyone has to stay home to keep themselves and their communities safe, the USDA should put digital safeguards in place so SNAP recipients can grocery shop without being manipulated by unfair marketing practices.” CDD’s research also found that the USDA relied on the flawed and misleading privacy policies of the participating companies, which fail to provide sufficient data protections. According to the pilot’s requirement for participating retailers, privacy policies should clearly explain how a consumer’s data is gathered and used, and provide “optimal” protections. A review of these long, densely worded documents, however, reveals the failure of the companies to identify the extent and impact of their actual data operations, or the risks to consumers. The pilot’s requirements also do not adequately limit the use of SNAP participant’s data for marketing. In addition, CDD tested the companies’ data practices for tracking customers’ behavior online, and compared them to the USDA’s requirements. The research found widespread use of so-called “third party” tracking software (such as “cookies”), which can expose an individual’s personal data to others. “In the absence of strong baseline privacy and ecommerce regulations in the US, the USDA’s weak safeguards are placing SNAP recipients at substantial risk,” explained Dr. Katharina Kopp, one of the report’s authors. “The kinds of e-commerce and Big Data practices we have identified through our research could pose even greater threats to communities of color, including increased commercial surveillance and further discrimination.” “Being on SNAP, or any other assistance program, should not give corporations free rein to use intrusive and manipulative online marketing techniques on Black communities,” said Jade Magnus Ogunnaike, Senior Campaign Director at Color of Change. 
“Especially in the era of COVID, where online grocery shopping is a necessity, Black people should not be further exposed to a corporate surveillance system with unfair and predatory practices that exacerbate disparities in racial and health equity just because they use SNAP. The USDA should act aggressively to protect SNAP users from unfair, predatory, and discriminatory data practices.” “The SNAP program helps millions of Latinos keep food on the table when times are tough and our nation’s public health and economic crises have highlighted that critical role,” said Steven Lopez, Director of Health Policy at UnidosUS. “Providing enhanced access to healthy and nutritious foods at the expense of the privacy and health of communities of color is too high of a price. Predatory marketing practices have been linked to increased health disparities for communities of color. The USDA must not ignore that fact and should take strong and meaningful steps to treat all participants fairly, without discriminatory practices based on the color of their skin.” The report calls on the USDA to “take an aggressive role in developing meaningful and effective safeguards” before moving the SNAP online purchasing system beyond its initial trial. The agency needs to ensure that contemporary e-commerce, retail and digital marketing applications treat SNAP participants fairly, with strong privacy protections and safeguards against manipulative and discriminatory practices. The USDA should work with SNAP participants, civil rights, consumer and privacy groups, as well as retailers like Amazon and Walmart, to restructure its program to ensure the safety and well-being of the millions of people enrolled in the program. ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Statement from Campaign for a Commercial-Free Childhood and Center for Digital Democracy on Comments filed with FTC regarding Endorsement Guides

    WASHINGTON, DC and BOSTON, MA—June 23, 2020—Advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) filed comments on Monday in response to the FTC’s request for public comment on its Endorsement Guides.

    Jeff Chester, Executive Director, Center for Digital Democracy: "Influencer marketing should be declared an unfair and deceptive practice when it comes to children. The FTC is enabling so-called ‘kidfluencers,’ ‘brand ambassadors,’ and other ‘celebrity’ marketers to stealthily pitch kids junk food, toys and other products, despite the known risks to their privacy, personal health and security. Kids and teens are being targeted by a ‘wild west’ influencer marketing industry wherever they go online, including when they watch videos, play games, or use social media. It's time for the FTC to place the interests of America's youth before the manipulative commercial activities of influencers."

    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood: “The FTC’s failure to act has helped create an entire ecosystem of unfair and deceptive influencer marketing aimed at children. It’s past time for the Commission to send a strong message to everyone – platforms, brands, ad agencies and the influencers themselves – that children should not be targets for this insidious and manipulative marketing.”

    Angela J. Campbell, Director Emeritus of the Institute for Public Representation’s Communications and Technology Clinic at Georgetown Law, currently chair of CCFC’s Board, and counsel to CCFC and CDD: "Influencer videos full of hidden promotions and sometimes blatant marketing have largely displaced actual programs for children. The FTC must act now to stop these deceptive and unfair practices."

    ###
  • Supporting the Call for Racial Justice

    The Center for Digital Democracy supports the call for racial justice and the fight against police violence, against the systemic injustices that exist in all parts of our society – inferior educational opportunities; lack of affordable, equitable health care; an unjust justice system; housing and employment discrimination; and discriminatory marketing practices. We grieve for the lives lost and the opportunities denied! We grieve for the everyday injustices people of color have to endure and have had to endure for centuries. We grieve for an America that could be so much more! Our grieving is not enough! CDD will continue its fight for data justice in support of racial and social justice.

    June 5, 2020
  • Press Release

    Groups Tell FTC to Investigate TikTok’s Failure to Protect Children’s Privacy

    TikTok gathers data from children despite promise made to commission

    Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Say TikTok In Contempt of Court Order

    More kids than ever use the site due to COVID-19 quarantine, but TikTok flouts settlement agreement with the FTC

    WASHINGTON, DC and BOSTON, MA—May 14, 2020—Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA). In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to and consent of parents.

    Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and a total of 20 organizations demonstrated in their FTC filing that TikTok continues to violate COPPA by: failing to delete personal information related to children under 13 it obtained prior to the 2019 settlement order; failing to give direct notice to parents and to obtain parents’ consent before collecting kids’ personal information; and failing to give parents the right to review or delete their children’s personal information collected by TikTok.

    TikTok makes it easy for children to avoid obtaining parental consent. When a child under 13 tries to register using their actual birthdate, they will be signed up for a “younger users account” with limited functions, and no ability to share their videos. If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting them at risk for both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with limited “younger users accounts.” For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.

    The advocates, represented by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown Law, asked the FTC to identify and hold responsible those individuals who made or ratified decisions to violate the settlement agreement. They also asked the FTC to prevent TikTok from registering any new accounts for persons in the US until it adopts a reliable method of determining the ages of its users and comes into full compliance with the children’s privacy rules. In light of TikTok’s vast financial resources, the number and severity of the violations, and the large number of US children that use TikTok, they asked the FTC to seek the maximum monetary penalties allowed by law.

    Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said, “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus, and putting children at risk of sexual predation.
Now, even after being caught red-handed by the FTC, TikTok continues to flout the law. We urge the Commission to take swift action and sanction TikTok again – this time with a fine and injunctive relief commensurate with the seriousness of TikTok’s serial violations.” Jeff Chester, Executive Director of the Center for Digital Democracy, said “Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law. The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason why there are questions whether the agency can be trusted to effectively oversee the kids’ data law.” Michael Rosenbloom, Staff Attorney and Teaching Fellow at the Institute for Public Representation, Georgetown Law, said “The FTC ordered TikTok to delete all personal information of children under 13 years old from its servers, but TikTok has clearly failed to do so. We easily found that many accounts featuring children were still present on TikTok. Many of these accounts have tens of thousands to millions of followers, and have been around since before the order. We urge the FTC to hold TikTok to account for continuing to violate both COPPA and its consent decree.” Katie McInnis, Policy Counsel at Consumer Reports, said "During the pandemic, families and children are turning to digital tools like TikTok to share videos with loved ones. Now more than ever, effective protection of children's personal information requires robust enforcement in order to incentivize companies, including TikTok, to comply with COPPA and any relevant consent decrees. We urge the FTC to investigate the matters raised in this complaint" Groups signing on to the complaint to the FTC are: Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Badass Teachers Association, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Action, Consumer Federation of America, Consumer Reports, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, Obligation, Inc., Parent Coalition for Student Privacy, Parents Across America, ParentsTogether Foundation, Privacy Rights Clearinghouse, Public Citizen, The Story of Stuff, United Church of Christ, and USPIRG. ###
  • Press Release

    Groups Say White House Must Show Efficacy, Protect Privacy, and Ensure Equity When Deploying Technology to Fight Virus

    Fifteen leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. There must be consensus among all relevant stakeholders on the most efficacious solution before relying on a technological fix to respond to the pandemic.

    FOR IMMEDIATE RELEASE May 5, 2020 Contacts: Susan Grant, CFA, 202-939-1003; Katharina Kopp, CDD, 202-836-4621

    White House Must Act to Protect Privacy and Ensure Equity in Responding to COVID-19 Pandemic

    Groups Tell Pence to Set Standards to Guide Government and Public-Private Partnership Data Practices and Technology Use

    Washington, D.C. – Today, 15 leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. In a letter to Vice President Michael R. Pence, who leads the Coronavirus Task Force, the groups said that the proper use of technology and data have the potential to provide important public health benefits, but must incorporate privacy and security, as well as safeguards against discrimination and violations of civil and other rights. Developing a process to assess how effective technology and other tools will be to achieve the desired public health objectives is also vitally important, the groups said.

    The letter was signed by the Campaign for a Commercial Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center (EPIC), Media Alliance, MediaJustice, Oakland Privacy, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Public Citizen, Public Knowledge, and Rights x Tech.

    “A headlong rush into technological solutions without carefully considering how well they work and whether they could undermine fundamental American values such as privacy, equity, and fairness would be a mistake,” said Susan Grant, Director of Consumer Protection and Privacy at the Consumer Federation of America. “Fostering public trust and confidence in the programs that are implemented to combat COVID-19 is crucial to their overall success.”

    “Measures to contain the deadly spread of COVID-19 must be effective and protect those most exposed. History has taught us that the deployment of technologies is often driven by forces that tend to risk privacy, undermine fairness and equity, and place our civil rights in peril. The White House Task Force must work with privacy, consumer and civil rights groups, and other experts, to ensure that the efforts to limit the spread of the virus truly protect our interests,” said Katharina Kopp, Director of Policy, Center for Digital Democracy.

    In addition to concerns about government plans that are being developed to address the pandemic, such as using technology for contact tracing, the groups noted the need to ensure that private-sector partnerships incorporate comprehensive privacy and security standards. The letter outlines 11 principles that should form the basis for standards that government agencies and the private sector can follow:

    - Set science-based, public health objectives to address the pandemic. Then design the programs and consider what tools, including technology, might be most efficacious and helpful to meet those objectives.
    - Assess how technology and other tools meet key criteria. This should be done before deployment when possible and consistent with public health demands, and on an ongoing basis. Questions should include: Can they be shown to be effective for their intended purposes? Can they be used without infringing on privacy? Can they be used without unfairly disadvantaging individuals or communities? Are there other alternatives that would help meet the objectives well without potentially negative consequences? Use of technologies and tools that are ineffective or raise privacy or other societal concerns should be discontinued promptly.
    - Protect against bias and address inequities in technology access. In many cases, communities already disproportionately impacted by COVID-19 may lack access to technology, or not be fairly represented in data sets. Any use of digital tools must ensure that nobody is left behind.
    - Set clear guidelines for how technology and other tools will be used. These should be aimed at ensuring that they will serve the public health objective while safeguarding privacy and other societal values. Public and private partners should be required to adhere to those guidelines, and the guidelines should be readily available to the public.
    - Ensure that programs such as technology-assisted contact tracing are voluntary. Individual participation should be based on informed, affirmative consent, not coercion.
    - Only collect individuals’ personal information needed for the public health objective. No other personal information should be collected in testing, contact tracing, and public information portals.
    - Do not use or share individuals’ personal information for any other purposes. It is important to avoid “mission creep” and to prevent use for purposes unrelated to the pandemic such as for advertising, law enforcement, or for reputation management in non-public health settings.
    - Secure individuals’ personal information from unauthorized access and use. Information collected from testing, contact tracing and information portals may be very revealing, even if it is not “health” information, and security breaches would severely damage public trust.
    - Retain individuals’ personal information only for as long as it is needed. When it is no longer required for the public health objective, the information should be safely disposed of.
    - Be transparent about data collection and use. Before their personal information is collected, individuals should be informed about what data is needed, the specific purposes for which the data will be used, and what rights they have over what’s been collected about them.
    - Provide accountability. There must be systems in place to ensure that these principles are followed and to hold responsible parties accountable. In addition, individuals should have clear means to ask questions, make complaints, and seek recourse in connection with the handling of their personal information.

    The groups asked Vice President Pence for a meeting to discuss their concerns and suggested that the Coronavirus Task Force immediately create an interdisciplinary advisory committee comprised of experts from public health, data security, privacy, social science, and civil society to help develop effective standards.

    The Consumer Federation of America is a nonprofit association of more than 250 consumer groups that was founded in 1968 to advance the consumer interest through research, advocacy, and education. The Center for Digital Democracy (CDD) is recognized as one of the leading NGOs promoting privacy and consumer protection, fairness and data justice in the digital age. Since its founding in 2001 (and prior to that through its predecessor organization, the Center for Media Education), CDD has been at the forefront of research, public education, and advocacy.
  • Press Release

    Children’s privacy advocates call on FTC to require Google, Disney, AT&T and other leading companies to disclose how they gather and use data to target kids and families

    Threats to young people from digital marketing and data collection are heightened by home schooling and increased video and mobile streaming in response to COVID-19

    Contact: Jeffrey Chester, CDD, jeff@democraticmedia.org, 202-494-7100; Josh Golin, CCFC, josh@commercialfreechildhood.org, 339-970-4240

    WASHINGTON, DC and BOSTON, MA – March 26, 2020 – With children and families even more dependent on digital media during the COVID-19 crisis, the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to require leading digital media companies to turn over information on how they target kids, including the data they collect. In a letter to the FTC, the advocates proposed a series of questions to shed light on the array of opaque data collection and digital marketing practices which the tech companies employ to target kids. The letter includes a proposed list of numerous digital media and marketing companies and edtech companies that should be the targets of the FTC’s investigation—among them are Google, Zoom, Disney, Comcast, AT&T, Viacom, and edtech companies Edmodo and Prodigy.

    The letter—sent by the Institute for Public Representation at Georgetown Law, attorneys for the advocates—is in response to the FTC’s early review of the rules protecting children under the Children’s Online Privacy Protection Act (COPPA). The groups said “children’s privacy is under siege more than ever,” and urged the FTC “not to take steps that could undermine strong protections for children’s privacy without full information about a complex data collection ecosystem.” The groups ask the Commission to request vital information from two key sectors that greatly impact the privacy of children: the edtech industry, which provides information and technology applications in the K-12 school setting, and the commercial digital data and marketing industry that provides the majority of online content and communications for children, including apps, video streaming, and gaming. The letter suggests numerous questions for the FTC to get to the core of how digital companies conduct business today, including contemporary Big Data practices that capture, analyze, track, and target children across platforms.

    “With schools closed across the country, American families are more dependent than ever on digital media to educate and occupy their children,” said CCFC’s Executive Director, Josh Golin. “It’s now urgent that the FTC use its full authority to shed light on the business models of the edtech and children’s digital media industries so we can understand what Big Tech knows about our children and what they are doing with that information. The stakes have never been higher.”

    “Although children’s privacy is supposed to be protected by federal law and the FTC, young people remain at the epicenter of a powerful data-gathering and commercial online advertising system," said Dr. Katharina Kopp, Deputy Director of the Center for Digital Democracy. “We call on the FTC to investigate how companies use data about children, how these data practices work against children’s interests, and also how they impact low-income families and families of color.
Before it proposes any changes to the COPPA rules, the FTC needs to obtain detailed insights into how contemporary digital data practices pose challenges to protecting children. Given the outsize intrusion of commercial surveillance into children’s and families’ lives via digital services for education, entertainment, and communication, the FTC must demonstrate it is placing the welfare of kids as its highest priority.” In December, CCFC and CDD led a coalition of 31 groups—including the American Academy of Pediatrics, Center for Science in the Public Interest, Common Sense Media, Consumer Reports, Electronic Privacy Information Center, and Public Citizen—in calling on the FTC to use its subpoena authority. The groups said the Commission must better assess the impacts on children from today’s digital data-driven advertising system, and features such as cross-device tracking, artificial intelligence, machine learning, virtual reality, and real-time measurement. “Childhood is more digital than ever before, and the various ways that children's data is collected, analyzed, and used have never been more complex or opaque,” said Lindsey Barrett, Staff Attorney and Teaching Fellow at IPR’s Communications and Technology Law Clinic at Georgetown Law. “The Federal Trade Commission should shed light on how children's privacy is being invaded at home, at school, and throughout their lives by investigating the companies that profit from collecting their data, and cannot undertake an informed and fact-based revision of the COPPA rules without doing so.” "Children today, more than ever, have an incredible opportunity to learn, play, and socialize online,” said Celia Calano, student attorney at the Institute for Public Representation. “But these modern playgrounds and classrooms come with new safety concerns, including highly technical and obscure industry practices. The first step to improving the COPPA Rule and protecting children online is understanding the current landscape—something the FTC can achieve with a 6(b) investigation." ###
  • Press Release

    Popular Dating, Health Apps Violate Privacy

    Leading Consumer and Privacy Groups Urge Congress, the FTC, State AGs in California, Texas, Oregon to Investigate

    For Immediate Release: Jan. 14, 2020 Contact: David Rosen, drosen@citizen.org, (202) 588-7742; Angela Bradbery, abradbery@citizen.org, (202) 588-7741

    WASHINGTON, D.C. – Nine consumer groups today asked the Federal Trade Commission (FTC), congressional lawmakers and the state attorneys general of California, Texas and Oregon to investigate several popular apps available in the Google Play Store. A report released today by the Norwegian Consumer Council (NCC) alleges that the apps are systematically violating users’ privacy. The report found that 10 well-known apps – Grindr, Tinder, OkCupid, Happn, Clue, MyDays, Perfect365, Qibla Finder, My Talking Tom 2 and Wave Keyboard – are sharing information they collect on users with third-party advertisers without users’ knowledge or consent. The European Union’s General Data Protection Regulation forbids sharing information with third parties without users’ knowledge or consent.

    When it comes to drafting a new federal privacy law, American lawmakers cannot trust input from companies who do not respect user privacy, the groups maintain. Congress should use the findings of the report as a roadmap for a new law that ensures that such flagrant violations of privacy found in the EU are not acceptable in the U.S. The new report alleges that these apps (and likely a great many others) are allowing commercial third parties to collect, use and share sensitive consumer data in a way that is hidden from the user and involves parties that the consumer neither knows about nor would be familiar with. Although consumers can limit some tracking on desktop computers through browser settings and extensions, the same cannot be said for smartphones and tablets. As consumers use their smartphones throughout the day, the devices are recording information about sensitive topics such as our health, behavior, religion, interests and sexuality.

    “Consumers cannot avoid being tracked by these apps and their advertising partners because they are not provided with the necessary information to make informed choices when launching the apps for the first time. In addition, consumers are unable to make an informed choice because the extent of tracking, data sharing, and the overall complexity of the adtech ecosystem is hidden and incomprehensible to average consumers,” the letters sent to lawmakers and regulators warn.

    The nine groups are the American Civil Liberties Union of California, Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Consumer Action, Consumer Federation of America, Consumer Reports, the Electronic Privacy Information Center (EPIC), Public Citizen and U.S. PIRG. In addition to calling for an investigation, the groups are calling for a strong federal digital privacy law that includes a new data protection agency, a private right of action and strong enforcement mechanisms.

    Below are quotes from groups that signed the letters:

    “Every day, millions of Americans share their most intimate personal details on these apps, upload personal photos, track their periods and reveal their sexual and religious identities. But these apps and online services spy on people, collect vast amounts of personal data and share it with third parties without people’s knowledge. Industry calls it adtech. We call it surveillance.
We need to regulate it now, before it’s too late.” Burcu Kilic, digital rights program director, Public Citizen “The NCC’s report makes clear that any state or federal privacy law must provide sufficient resources for enforcement in order for the law to effectively protect consumers and their privacy. We applaud the NCC’s groundbreaking research on the adtech ecosystem underlying popular apps and urge lawmakers to prioritize enforcement in their privacy proposals.” Katie McInnis, policy counsel, Consumer Reports “U.S. PIRG is not surprised that U.S. firms are not complying with laws giving European consumers and citizens privacy rights. After all, the phalanx of industry lobbyists besieging Washington, D.C., has been very clear that its goal is simply to perpetuate a 24/7/365 surveillance capitalism business model, while denying states the right to protect their citizens better and denying consumers any real rights at all.” Ed Mierzwinski, senior director for consumer programs, U.S. PIRG “This report reveals how the failure of the U.S. to enact effective privacy safeguards has unleashed an out-of-control and unaccountable monster that swallows up personal information in the EU and elsewhere. The long unregulated business practices of digital media companies have shred the rights of people and communities to use the internet without fear of surveillance and manipulation. U.S. policymakers have been given a much-needed wake-up call by Norway that it’s overdue for the enactment of laws that bring meaningful change to the now lawless digital marketplace.” Jeff Chester, executive director, Center for Digital Democracy “For those of us in the U.S., this research by our colleagues at the Norwegian Consumer Council completely debunks the argument that we can protect consumers’ privacy in the 21st century with the old notice-and-opt-out approach, which some companies appear to be clinging to in violation of European law. Business practices have to change, and the first step to accomplish that is to enact strong privacy rights that government and individuals can enforce.” Susan Grant, director of consumer protection and privacy, Consumer Federation of America “The illuminating report by our EU ally the Norwegian Consumer Council highlights just how impossible it is for consumers to have any meaningful control over how apps and advertising technology players track and profile them. That’s why Consumer Action is pressing for comprehensive U.S. federal privacy legislation and subsequent strong enforcement efforts. Enough is enough already! Congress must protect us from ever-encroaching privacy intrusions.” Linda Sherry, director of national priorities, Consumer Action “For families who wonder what they’re trading off for the convenience of apps like these, this report makes the answer clear. These companies are exploiting us – surreptitiously collecting sensitive information and using it to target us with marketing. It’s urgent that Congress pass comprehensive legislation which puts the privacy interests of families ahead of the profits of businesses. Thanks to our friends at the Norwegian Consumer Council for this eye-opening research.” David Monahan, campaign manager, Campaign for a Commercial-Free Childhood “This report highlights the pervasiveness of corporate surveillance and the failures of the FTC notice-and-choice model for privacy protection. Congress should pass comprehensive data protection legislation and establish a U.S. 
Data Protection Agency to protect consumers from the privacy violations of the adtech industry.” Christine Bannan, consumer protection counsel, EPIC
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397)

    Groups Praise Sen. Markey and Google for Ensuring Children on YouTube Receive Key Safeguards

    BOSTON, MA & WASHINGTON, DC—December 18, 2019—The organizations that spurred the landmark FTC settlement with Google over COPPA violations applauded the announcement of additional advertising safeguards for children on YouTube today. The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) commended Google for announcing it would apply most of its robust marketing protections on YouTube Kids, including no advertising of food or beverages or harmful products, to all child-directed content on its main YouTube platform. The groups also lauded Senator Markey for securing a public commitment from Google to implement these long-overdue safeguards. The advocates expressed disappointment, however, that Google did not agree to prohibit paid influencer marketing and product placement to children on YouTube as it does on YouTube Kids.

    “Sen. Ed Markey has long been and remains the champion for kids,” said Jeff Chester, CDD’s executive director. “Through the intervention of Sen. Markey, Google has finally committed to protecting children whether they are on the main YouTube platform or using the YouTube Kids app. Google has acted responsibly in announcing that its advertising policies now prohibit any food and beverage marketing on YouTube Kids, as well as ads involving ‘sexually suggestive, violent or dangerous content.’ However, we remain concerned that Google may try to weaken these important child- and family-friendly policies in the near future. Thus we call on Google to commit to keeping these rules in place, and to implement other needed safeguards that children deserve,” added Chester.

    Josh Golin, Executive Director of CCFC, said, “We are so grateful to Senator Markey for his leadership on one of the most crucial issues faced by children and families today. And we commend Google for implementing a robust set of advertising safeguards on the most popular online destination for children. We urge Google to take another critical step and prohibit child-directed influencer content on YouTube; if this manipulative marketing isn’t allowed on children’s TV or YouTube Kids, it shouldn’t be targeted to children on the main YouTube platform either.”

    ###
  • Washington, December 11, 2019 In comments filed today in response to the Federal Trade Commission’s review of COPPA, the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, the American Academy of Pediatrics, and a total of 19 advocacy groups faulted the FTC for failing to engage in sufficient enforcement and oversight of the children’s privacy law. The groups suggested how COPPA can better protect children’s privacy, and urged the Commission not to weaken the law to satisfy industry’s thirst for more data about kids. The advocates also urged the FTC first to investigate the children’s “kid tech” market before it proposes any changes in how to implement its rules. The following can be attributed to Jeff Chester, Executive Director, Center for Digital Democracy: “Children are at greater risk today of losing their digital privacy because the FTC has failed to enforce COPPA. For years, the Commission has allowed Google and many others to ignore the landmark bipartisan law designed to protect children under 13. It’s time for the FTC to stand up to the big data companies and put the interests of young people and families first.” The following can be attributed to Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood: “This is a critical moment for the future of children’s online privacy. The ink is barely dry on the FTC’s first major COPPA enforcement, and already industry is mobilizing to weaken the rules. The FTC should not make any changes to COPPA until it uses its authority to learn exactly how Big Tech is collecting and monetizing our children’s data.” The following can be attributed to Kyle Yasuda, MD, FAAP, President, American Academy of Pediatrics: “Keeping children safe and healthy where they learn and grow is core to what pediatricians do every day, and today more than ever before that extends to the digital spaces that children inhabit. The Children’s Online Privacy Protection Act is a foundational law that helps hold companies accountable to basic standards of safety when it comes to children’s digital privacy, but it’s only as effective as its enforcement by the Federal Trade Commission. Before any major changes are made to COPPA, we must ensure that the FTC is doing its part to keep children safe wherever they engage online.” The following can be attributed to Laura Moy, Associate Professor of Law, Director of the Communications and Technology Law Clinic, Institute for Public Representation at Georgetown University Law Center: “A recent survey showed that the majority of Americans feel that ‘the threat to personal privacy online is a crisis.’ We are at a critical point in our nation’s history right now—when we are deciding whether or not to allow companies to track, profile, and target us to an extent that compromises our ability to be and make decisions for ourselves. At the forefront of that discussion are children. We must protect the next generation from inappropriate tracking so that they can grow up with privacy and dignity. To make good on that, the FTC must thoroughly investigate how companies are collecting and using children’s data, and must enforce and strengthen COPPA.”
  • Press Release

    Leading child advocacy, health, and privacy groups call on FTC to Investigate Children’s Digital Media Marketplace Before Proposing any Changes to Privacy Protections for Children

    Threats to young people from digital marketing and data collection must be analyzed to ensure meaningful safeguards under the Children’s Online Privacy Protection Act (COPPA).

    EMBARGOED UNTIL DECEMBER 5, 2019 AT 12:01 AM

    Contact: Jeffrey Chester, CDD, jeff@democraticmedia.org, (202) 494-7100; Josh Golin, CCFC, josh@commercialfreechildhood.org, 617-896-9369

    WASHINGTON, DC and BOSTON, MA – December 5, 2019 – A coalition of 31 advocacy groups is urging the Federal Trade Commission to use its subpoena authority to obtain information from leading digital media companies that target children online. In comments filed today by the Institute for Public Representation at Georgetown and organized by the Center for Digital Democracy (CDD) and the Campaign for a Commercial-Free Childhood (CCFC), the coalition explained the opaque data and digital marketing practices targeting kids. The comments are filed with the FTC as part of its early review of the rules protecting children under the Children’s Online Privacy Protection Act (COPPA). The advocates’ call was supported by Sesame Workshop, the leading producer of children’s educational programming, in a separate filing.

    To better assess the impacts on children from today’s digital data-driven advertising system, and features such as cross-device tracking, artificial intelligence, machine learning, virtual reality, and real-time measurement, the advocates urge the commission to gather and analyze data from leading companies that target children. Any proposed changes to COPPA must be based on empirical data, which is consistent with calls by Commissioners Wilson, Phillips, and Simons that rulemaking must be evidence-based. In their comments, the organizations ask the FTC to use its authority under Section 6(b) to:

    - Examine today’s methods of advertising to children and their impact, including their discriminatory effects
    - Examine practices concerning data collection and retention
    - Illuminate children’s presence on “general audience” platforms and those platforms’ awareness of children’s presence
    - Identify how the data of children is being used by contemporary data platforms, including “marketing clouds,” “identity management” systems, in-house data management platforms, and data brokers
    - Illuminate the efficacy—or lack thereof—of safe harbors

    Groups that have signed the comments are Campaign for a Commercial-Free Childhood; Center for Digital Democracy; American Academy of Pediatrics; Badass Teachers Association; Berkeley Media Studies Group; Center for Science in the Public Interest; Children and Screens; Color of Change; Common Sense Media; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Frontier Foundation; Electronic Privacy Information Center; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; P.E.A.C.E. (Peace Educators Allied For Children Everywhere); Privacy Rights Clearinghouse; Public Citizen; Public Knowledge; The Story of Stuff; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); UnidosUS; United Church of Christ; and U.S. Public Interest Research Group (U.S. PIRG). ….
The following can be attributed to Kyle Yasuda, MD, FAAP, President, American Academy of Pediatrics: “As children become more digitally connected, it becomes even more important for parents, pediatricians and others who care for young children to understand how digital media impacts their health and development. Since digital technology evolves rapidly, so must our understanding of how data companies are engaging with children’s information online. As we pursue the promise of digital media for children’s development, we must design robust protections to keep them safe based on an up-to-date understanding of the digital spaces they navigate.” The following can be attributed to Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood: “As kids are spending more time than ever on digital devices, we need the full power of the law to protect them from predatory data collection -- but we can't protect children from Big Tech business models if we don't know how those models truly work. The FTC must use its full authority to investigate opaque data and marketing practices before making any changes to COPPA. We need to know what Big Tech knows about our kids.” The following can be attributed to Katharina Kopp, Director of Policy, Center for Digital Democracy (CDD): “Children are being subjected to a purposefully opaque ‘Big Data’ digital marketing system that continually gathers their information when they are online. The FTC must use its authority to understand how new and evolving advertising practices targeting kids really work, and whether these data practices are having a discriminatory or other harmful impact on their lives.” The following can be attributed to James P. Steyer, CEO and Founder of Common Sense: “Kids and families have to be the priority in any changes to COPPA, and in order to do that, we must fully understand what the industry is and isn’t doing when it comes to tracking and targeting kids. Tech companies are never going to be transparent about their business practices, which is why it is critical that the FTC use its authority to look behind the curtain and shed light on what they are doing when it comes to kids so that if any new rules are needed, they can be smart and well-informed.” The following can be attributed to Katie McInnis, Policy Counsel, Consumer Reports: “We’re glad the FTC is asking for comments on the implementation of COPPA through the 2013 COPPA rule. But the Commission should have the fullest possible picture of how children's personal information is being collected and used before it considers any changes. It’s well-documented that compliance with COPPA is uneven among apps, connected toys, and online services. The FTC must fully understand how kids’ personal information is treated before the 2013 rule can be modified, in order to ensure that children and their data are protected.” The following can be attributed to Marc Rotenberg, President, Electronic Privacy Information Center (EPIC): “The FTC should complete its homework before it proposes changes to the regulations that safeguard children’s privacy. Without a clear understanding of current industry practices, the agency’s proposal will be ill-informed and counterproductive.” The following can be attributed to Lindsey Barrett, Staff Attorney and Teaching Fellow, Institute for Public Representation, Georgetown Law: “The FTC should conduct 6(b) studies to shed light on the complex and evolving profiling practices that violate children’s privacy. 
Children are being monitored, quantified, and analyzed more than ever before, and the Commission cannot make informed decisions about the rules that protect them online based on limited or skewed information about the online ecosystem.” The following can be attributed to Robert Weissman, President, Public Citizen: “The online corporate predators are miles ahead of the FTC, employing surveillance and targeting tactics against children that flout the protections enshrined in COPPA. The first thing the FTC should do is invoke its investigative powers to get a firm grasp on how Big Tech is systematically invading children’s privacy.” The following can be attributed to Cheryl A. Leanza, Policy Advisor, UCC OC Inc.: “In the modern era, our data are our lives, and our children’s lives are monitored and tracked in more detail than in any previous generation, to unknown effect. Parents seek to pass on their own values and priorities to their children, but feel subverted at every turn by unknown algorithms and marketing efforts directed to their children. At a minimum, the FTC must collect basic facts and trends about children and their data privacy.” The following can be attributed to Eric Rodriguez, Senior Vice President, UnidosUS: “All children should have the right to privacy and live free from discrimination, including in digital spaces. Latino children are targeted by digital marketing efforts, with real consequences for their health and wellbeing. UnidosUS urges the Commission to use its authority and study how children of color operate in the digital space, what happens to their personal data, and how well they are protected by COPPA. Only then can the Commission take effective and objective action to strengthen COPPA to protect an increasingly diverse youth population.”
  • SUBJECT: CCFC and CDD statement on today’s YouTube inquiry by Senator Markey Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, whose complaint led to the FTC settlement (link is external) which requires YouTube to change its practices to comply with federal children’s privacy law, applaud Senator Ed Markey for writing to Google (link is external) to inquire about YouTube’s child-directed advertising practices. “To its credit, Google has developed a robust set of safeguards and policies on YouTube Kids to protect children from advertising for harmful products and exploitative influencer marketing. Now that Google has been forced to abandon the fiction that the main YouTube platform is exclusively for ages 13 and up, it should apply the same protections on all child-directed content, regardless of which YouTube platform kids are using.” Josh Golin, Campaign for a Commercial-Free Childhood “Google should treat all children fairly on YouTube and apply the same set of advertising and content safeguards it has especially developed for YouTube Kids. When young people view child-directed programming on YouTube, they should also be protected from harmful and unfair practices such as ‘influencer’ marketing, exposure to ‘dangerous’ material, violent content, and exposure to food and beverage marketing.” Jeff Chester, Center for Digital Democracy
  • Press Release

    Grading Digital Privacy Proposals in Congress

    Which digital privacy proposals in Congress make the grade?

    Subject: Which digital privacy proposals in Congress make the grade? Nov. 21, 2019 Contact: David Rosen, drosen@citizen.org (link sends e-mail), (202) 588-7742 Susan Grant, sgrant@consumerfed.org (link sends e-mail), (202) 387-6121 Caitriona Fitzgerald, fitzgerald@epic.org (link sends e-mail), (617) 945-8409 Katharina Kopp, kkopp@democraticmedia.org (link sends e-mail), (202) 836-4621 Campaign for a Commercial-Free Childhood · Center for Digital Democracy · Color of Change · Consumer Federation of America · Consumer Action · Electronic Privacy Information Center · Parent Coalition for Student Privacy · Privacy Rights Clearinghouse · Public Citizen · U.S. PIRG NOTE TO REPORTERS Grading Digital Privacy Proposals in Congress When it comes to digital privacy, we’re facing an unprecedented crisis. Tech giants are spying on our families and selling the most intimate details about our lives for profit. Bad actors, both foreign and domestic, are targeting personal data gathered by U.S. companies – including our bank details, email messages and Social Security numbers. Algorithms used to determine eligibility for jobs, housing, credit, insurance and other life necessities are having disparate, discriminatory impacts on disadvantaged groups. We need a new approach. Consumer, privacy and civil rights groups are encouraged by some of the bills that recently have been introduced in Congress, many of which follow recommendations in the groups’ Framework for Comprehensive Privacy Protection and Digital Rights in the United States (link is external). The framework calls for baseline federal privacy legislation that: - Has a clear and comprehensive definition of personal data; - Establishes an independent data protection agency; - Establishes a private right of action allowing individuals to enforce their rights; - Establishes individual rights to access, control and delete data; - Puts meaningful privacy obligations on companies that collect personal data; - Requires the establishment of algorithmic governance to advance fair and just data practices; - Requires companies to minimize privacy risks and minimize data collection; - Prohibits take-it-or-leave-it or pay-for-privacy terms; - Limits government access to personal data; and - Does not preempt stronger state laws. Three bills attained the highest marks in the recent Privacy Legislation Scorecard (link is external) compiled by the Electronic Privacy Information Center (EPIC): - The Online Privacy Act (H.R. 4978 (link is external)), introduced by U.S. Reps. Anna Eshoo (D-Calif.) and Zoe Lofgren (D-Calif.), takes a comprehensive approach and is the only bill that calls for a U.S. Data Protection Agency. The bill establishes meaningful rights for individuals and clear obligations for companies. It does not preempt state law, but it lacks explicit anti-preemption language, which would make it more effective. - The Mind Your Own Business Act (S. 2637 (link is external)), introduced by U.S. Sen. Ron Wyden (D-Ore.), requires companies to assess the impact of the automated systems they use to make decisions about consumers and how well their data protection mechanisms are working. It has explicit anti-preemption language and holds companies accountable when they fail to protect privacy. The private right of action should be broader, and the bill needs clear limits on data uses. - The Privacy Rights for All Act (S. 1214 (link is external)), introduced by U.S. Sen. 
Ed Markey (D-Mass.), has important provisions minimizing data collection and delinking user identities from collected data, and prohibits bias and discrimination in automated decision-making. It also includes a strong private right of action and bans forced arbitration for violations. It does not preempt state law, but it lacks explicit anti-preemption language, which would make it more effective. Two bills are plainly anti-privacy. The Information Transparency & Personal Data Control Act (H.R. 2013 (link is external)), introduced by U.S. Rep. Suzan DelBene (D-Wash.), falls woefully short. It provides few protections for individuals, contains overly broad exemptions and preempts stronger state laws. The Balancing the Rights Of Web Surfers Equally and Responsibly (BROWSER) Act (S. 1116 (link is external)), introduced by U.S. Sen. Marsha Blackburn (R-Tenn.), is based on the old, ineffective take-it-or-leave-it terms of use model, does not allow agency rulemaking, is weak on enforcement and preempts state laws. Both are bad, anti-privacy bills. Future federal privacy bills must make the grade. Additional privacy bills are expected to be introduced by U.S. Sen. Maria Cantwell (D-Wash.) and U.S. Rep. Jan Schakowsky (D-Ill.). Separately, U.S. Sens. Richard Blumenthal (D-Conn.), Roger Wicker (R-Miss.) and Josh Hawley (R-Mo.) may release their own bills. These leaders should strive to meet the standards that the framework lays out. Baseline privacy legislation must not preempt stronger state protections and laws – such as the California Consumer Privacy Act (link is external) that takes effect in 2020, biometric data protection laws such as those in Illinois (link is external) and Texas (link is external), and data breach notification laws (link is external) that exist in every state. States must be allowed to continue serving as “laboratories of democracy,” pioneering innovative new protections to keep up with rapidly changing technologies. In addition, federal privacy legislation must include a strong private right of action – a crucial tool consumers need to enforce their rights and change the behavior of powerful corporations – and establish safeguards against data practices that lead to unjust, unfair, manipulative and discriminatory outcomes. For more information, see these fact sheets (link is external). Please contact any of the individuals listed above to speak with an expert. ###
  • Press Release

    Will the FTC Weaken Children’s Privacy Rules?

    Invited Advocates Raise Concerns About Upcoming COPPA Workshop, Plans to Undermine Federal Protections for Kids; October 7 D.C. lineup dominated by tech industry supporters

    Contact: David Monahan, CCFC (david@commercialfreechildhood.org (link sends e-mail); 617-896-9397) Jeff Chester, CDD (jeff@democraticmedia.org (link sends e-mail); 202-494-7100) Will the FTC Weaken Children’s Privacy Rules? Invited Advocates Raise Concerns About Upcoming COPPA Workshop, Plans to Undermine Federal Protections for Kids October 7 D.C. lineup dominated by tech industry supporters WHAT: The Future of the Children’s Online Privacy Protection Act Rule (COPPA): An FTC Workshop (link is external) WHEN: October 7, 2019, 9:00 am ET WHERE: Constitution Center, 400 7th St SW, Washington, DC WORKSHOP PRESENTERS FOR CAMPAIGN FOR A COMMERCIAL-FREE CHILDHOOD (CCFC) AND CENTER FOR DIGITAL DEMOCRACY (CDD): THE CHALLENGE: In 2012, the FTC approved new safeguards to protect children’s privacy in the digital era, heeding the advice of child advocates, consumer groups, privacy experts and health professionals. But now the Commission has called for comments (link is external) on COPPA three years before a new review is mandated by statute. The questions posed by the Commission, as well as public comments made by FTC staff, make privacy advocates wary that the FTC’s goal is to roll back COPPA safeguards rather than strengthen protections for children. Concerns about the FTC creating new loopholes or supporting industry calls to weaken the rules are heightened by the FTC’s speaker list for this workshop, replete with tech and marketing companies and their lawyers and lobbyists, with just a few privacy and children’s advocates at the table. The advocates are also concerned that the FTC is contemplating this action just weeks after its most significant COPPA enforcement action to date—requiring major changes to Google’s data collection practices on YouTube—a move that could result in rules being changed before those new practices have even been implemented. Children and families need increased COPPA enforcement, not weaker rules. The key problems, the advocates note, are the lack of enforcement of the law by the FTC; the failure of the agency to protect children from unfair marketing practices, such as influencers; and the need to maintain the strongest possible safeguards—whether in the home, at school, or on mobile devices. Speakers at the workshop include: Josh Golin, Executive Director, CCFC Will participate in a panel entitled Scope of the COPPA Rule. Katharina Kopp, Ph.D., Deputy Director, Director of Policy, CDD Will participate in a panel entitled Uses and Misuses of Persistent Identifiers. Laura M. Moy, Associate Professor of Law, Director, Communications & Technology Law Clinic, Georgetown University Law Center Will participate in a panel entitled State of the World in Children’s Privacy. Josh, Katharina, and Laura are available for questions in advance of the workshop, and will also be available to speak with press on site. See video of the Future of COPPA Workshop here: https://www.ftc.gov/news-events/audio-video/video/future-coppa-rule-ftc-... (link is external) ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org (link sends e-mail); 202-494-7100) David Monahan, CCFC (david@commercialfreechildhood.org (link sends e-mail); 617-896-9397) Advocates Who Filed the Privacy Complaint Against Google/YouTube Laud Improvements, But Say FTC Settlement Falls Far Short BOSTON, MA & WASHINGTON, DC—September 4, 2019—The advocates who triggered the Federal Trade Commission’s (FTC) investigation into YouTube’s violations of the Children’s Online Privacy Protection Act (COPPA) say the FTC’s settlement with Google will likely significantly reduce behavioral marketing to children on YouTube, but doesn’t do nearly enough to ensure children will be protected or to hold Google accountable. In April, 2018, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD), through their attorneys at Georgetown Law’s Institute for Public Representation (IPR), filed an FTC complaint (link is external) detailing YouTube’s COPPA violations. Twenty-one other privacy and consumer groups signed on to CCFC and CDD’s complaint, which detailed how Google profits by collecting personal information from kids on YouTube, without first providing direct notice to parents and obtaining their consent as required by law. Google uses this information to target advertisements to children across the internet and across devices, in clear violation of COPPA. Today, the FTC and the New York Attorney General announced a settlement with Google, fining the company $170 million. The settlement also “requires Google and YouTube to develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA.” Content creators will be asked to disclose if they consider their videos to be child-directed; if they do, no behavioral advertising will be served to viewers of those videos. “We are pleased that our advocacy has compelled the FTC to finally address YouTube’s longstanding COPPA violations and that there will be considerably less behavioral advertising targeted to children on the number one kids’ site in the world,” said CCFC’s Executive Director Josh Golin. “But it’s extremely disappointing that the FTC isn’t requiring more substantive changes or doing more to hold Google accountable for harming children through years of illegal data collection. A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue.” In a July 3, 2019 (link is external) letter to the FTC, the advocates specifically warned that shifting the burden of COPPA compliance from Google and YouTube to content creators would be ineffective. The letter noted many children’s channels were unlikely to become COPPA compliant by turning off behavioral advertising, since Google warns that turning off these ads “may significantly reduce your channel’s revenue.” The letter also detailed Google’s terrible track record of ensuring COPPA compliance on its platforms; a 2018 study found that 57% of apps in the Google Play Store’s Designed for Families program were violating COPPA despite Google’s policy that apps in the program must be COPPA compliant. And as Commissioner Rebecca Slaughter wrote in her dissent, many children’s content creators are not U.S.-based and therefore are unlikely to be concerned about FTC enforcement. 
“We are gratified that the FTC has finally forced Google to confront its longstanding lie that it wasn’t targeting children on YouTube,” said CDD’s executive director Jeff Chester, who helped spearhead the campaign that led to the 1998 passage of COPPA. “However, we are very disappointed that the Commission failed to penalize Google sufficiently for its ongoing violations of COPPA and failed to hold Google executives personally responsible for the roles they played. A paltry financial penalty of $170 million—from a company that earned nearly $137 billion in 2018 alone—sends a signal that if you are a politically powerful corporation, you do not have to fear any serious financial consequences when you break the law. Google made billions off the backs of children, developing a host of intrusive and manipulative marketing practices that take advantage of their developmental vulnerabilities. More fundamental changes will be required to ensure that YouTube is a safe and fair platform for young people.” Echoing Commissioner Rohit Chopra’s dissent, the advocates noted that unlike smaller companies sanctioned by the FTC, Google was not forced to pay a penalty larger than its “ill-gotten gains.” In fact, with YouTube earning a reported $750 million annually from children’s content alone, the $170 million fine amounts to less than three months of advertising revenue from kids’ videos. With a maximum fine of $41,484 per violation, the FTC easily could have sought a fine in the tens of billions of dollars. “I am pleased that the FTC has made clear that companies may no longer avoid complying with COPPA by claiming their online services are not intended for use by children when they know that many children in fact use their services,” said Angela Campbell, Director Emeritus of IPR’s Communications and Technology Clinic at Georgetown Law, which researched and drafted the complaint. Campbell, currently chair of CCFC’s Board, served as lead counsel to CCFC and CDD on the YouTube and other complaints alleging COPPA violations. She, along with Chester, was responsible for filing an FTC complaint in 1996 against a child-directed website that led to Congress’s passage of COPPA in 1998 (link is external). COPPA gave the FTC expanded authority to implement and enforce the law, for example, by including civil penalties. About the proposed settlement, Campbell noted: “It’s disappointing that the FTC has not fully used its existing authority to hold Google and YouTube executives personally liable for adopting and continuing to utilize a business model premised on ignoring children’s privacy protection, to adopt a civil penalty substantial enough to deter future wrongdoing, or to require Google to take responsibility for ensuring that children’s content on YouTube platforms complies with COPPA.” On the heels of a sweetheart settlement with Facebook, the advocates said the deal with Google was further proof the FTC wasn’t up to the task of protecting consumers’ privacy. Said Campbell, “I support Commissioner Slaughter’s call to state attorneys general to step up and hold Google accountable.” 
Added Chester, “The commission’s inability to stop Google’s cynically calculated defiance of COPPA underscores why Congress must create a new consumer watchdog that will truly protect Americans’ privacy.” Organizations which signed on to the CCFC/CDD 2018 FTC complaint were Berkeley Media Studies Group; Center for Media Justice; Common Sense; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumers Union, the advocacy division of Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Privacy Information Center (“EPIC”); New Dream; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; Privacy Rights Clearinghouse; Public Citizen; The Story of Stuff Project; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); and USPIRG. ###
  • Press Statement Google YouTube FTC COPPA Settlement Statement of Katharina Kopp, Ph.D. Deputy Director Center for Digital Democracy August 30, 2019 It has been reported that Google has agreed to pay between $150 million and $200 million to resolve an FTC investigation into YouTube over alleged violations of a children's privacy law. A settlement amount of $150-200 million would be woefully low, considering the egregious nature of the violation, how much Google profited from violating the law, and Google’s size and revenue. Google’s unprecedented violation requires an unprecedented FTC response. A small amount like this would effectively reward Google for engaging in massive and illegal data collection without any regard to children’s safety. In addition to assessing substantial civil penalties, the FTC must enjoin Google from committing further violations of COPPA and impose effective means for monitoring compliance; the FTC must impose a 20-year consent decree to ensure Alphabet Inc. acts responsibly when it comes to serving children and parents. ------ In April, 2018, the Center for Digital Democracy (CDD) and the Campaign for a Commercial-Free Childhood (CCFC), through their attorneys at Georgetown Law’s Institute for Public Representation (IPR), filed an FTC complaint (link is external) detailing YouTube’s COPPA violations. Twenty-one other privacy and consumer groups signed on to CCFC and CDD’s complaint, which detailed how Google profits by collecting personal information from kids on YouTube, without first providing direct notice to parents and obtaining their consent as required by law. Google uses this information to target advertisements to children across the internet and across devices, in clear violation of COPPA.
  • Press Release

    FTC Fails to Protect Privacy in Facebook decision

    Instead of serious structural and behavioral change, 3-2 deal is a huge giveaway. By dismissing all other claims, Simons' FTC does disservice to public

    Statement of Jeff Chester, executive director, Center for Digital Democracy--CDD helped bring the 2009 FTC complaint that is the subject of today's decision on the Consent Order Once again, the Federal Trade Commission has shown itself incapable of protecting the privacy of the public and of preventing ongoing consumer harms. Today's announcement of a fine and--yet again!--an improved system of internal compliance and other auditing controls doesn't address the fundamental problems. First, the FTC should have required Facebook to divest both its Instagram and WhatsApp platforms. By doing so, the commission would have prevented what will be a tremendous expansion of Facebook's data-gathering activities. By failing to require this corporate break-up, the FTC has set the stage for what will be "Groundhog Day" violations of privacy for years to come. The FTC should have insisted that an independent panel of experts--consumer groups, data scientists, civil rights groups, etc.--be empaneled to review all the company's data-related products, to decide which ones are to be modified, eliminated, or allowed to continue (such as lookalike modeling, the role of influencers, cross-device tracking, etc.). This group should have been given the authority to review all new products proposed by the company for a period of at least five years. What was needed here was a serious change in the corporate culture, along with serious structural remedies, if the FTC really wanted to ensure that Facebook would act more responsibly in the future. The dissents by Commissioners Chopra and Slaughter illustrate that the FTC majority could have taken another path, instead of supporting a decision that will ultimately enable the problems to continue. Today's decision also dismisses all other complaints and requests for investigation related to Facebook's consent decree failures--a huge giveaway. The FTC should be replaced by a new data protection agency to protect privacy. The commission has repeatedly demonstrated that--regardless of who is in charge--it is incapable of confronting the most powerful forces that undermine our privacy--and digital rights.