
CDD Filings: Digital Consumer

  • Consumer financial safeguards for online payments needed, says U.S. PIRG & CDD

Big Tech Payment Platforms
Supplemental Comments of USPIRG and the Center for Digital Democracy
CFPB-2021-0017
December 7, 2022

United States Public Interest Research Group (USPIRG) and the Center for Digital Democracy (CDD) submit these additional comments to further inform the Bureau’s inquiry. They amplify the comments USPIRG and CDD submitted last year.[1] We believe that, since we filed our original comment, “Big Tech”-operated digital payment platforms have evolved significantly, underscoring the need for the Bureau to institute much-needed consumer protection safeguards. We described how online platform-based payment services seamlessly incorporate the key elements of “commerce” today, including content, promotion, marketing, sales and payment. We explained how these elements are part of the data-driven “surveillance” and personalized marketing system that operates as the central nervous system for nearly all U.S. online operations. We raised the growing role that “social media commerce” plays in contemporary payment platforms, supporting the Bureau’s examination of Big Tech platforms and consumer financial payment services. For example, U.S. retail social media commerce will generate $53 billion in sales in 2022, rising to $107 billion by 2025, according to a recent report by Insider Intelligence/eMarketer.
Younger Americans, so-called “Generation Z,” are helping drive this new market—an indicator of how changing consumer financial behaviors are being shaped by the business model and affordances of the Big Tech platforms, including TikTok, Meta and Google.[2]

In order to meaningfully respond to the additional questions raised by the Bureau in its re-opening of the comment period, in particular regarding how the payment platforms handle “complaints, disputes and errors” and whether they are “sufficiently staffed…to address consumer protection and provide responsible customer service,” USPIRG and CDD offer further analysis of the structural problems of contemporary platform payment systems below.[3]

First, payment services such as those operated by Google, Meta, TikTok and others have inherent conflicts of interest. They are, as the Bureau knows, primarily advertising systems designed to capture the “engagement” of individuals and groups using a largely stealth array of online marketing applications (including, for example, extensive testing to identify ways to engage in subconscious “implicit” persuasion).[4] Our prior comment, and those of other consumer groups, have already documented the extensive use of data profiling, machine learning, cross-platform predictive analysis and “identity” capture, which are just a few of the current platform monetization tactics. The continually evolving set of tools available for digital platforms to target consumers has no limits—and raises critical questions when it comes to the financial security of U.S. consumers. The build-out of Big Tech payment platforms, leveraging their unique capabilities to seamlessly combine social media, entertainment and commerce with sophisticated data-driven contemporary surveillance, has transformed traditional financial services concepts. Today’s social media giants are also global consumer financial banking and retail institutions. For example, J.P.
Morgan has “built a real-time payments infrastructure” for TikTok’s parent company ByteDance “that can be connected to local clearing systems. This allows users, content producers, and influencers to be paid instantaneously and directly into their bank accounts at any day or time. ByteDance has enabled this capability in the U.S. and Europe, meaning it covers approximately one-fifth of TikTok’s 1 billion active users worldwide.”[5] J.P. Morgan also helped ByteDance update its “host-to-host connectivity with banks, replacing it with application programming interfaces (API) connectivity that allows real-time exchange of data” between ByteDance and Morgan. This allows ByteDance to “track and trace the end-to-end status through the SWIFT network, see and monitor payments, and allow users to check for payments via their TikTok or other ByteDance apps in real time.” Morgan also has “elevated and further future-proofed ByteDance’s cash management through a centralized account structure covering all 15 businesses” via a “virtual account management and liquidity tool.”[6]

Google’s Pay operations also illustrate how distinct digital payment platforms are from previous forms of financial services. Google explains to merchants that by integrating “with Google Wallet [they can] engage with users through location-based notifications, real-time updates” and offers, including encouraging consumers to “add offers from your webpage or app directly to Google wallet.” Google also promotes the use of “geofenced notifications to drive engagement” with its Pay and Wallet services. Google’s ability to leverage its geolocation and other granular tracking, and to make that information available through a package of surveillance and engagement tools that merchants use to drive financial transactions in real time, is beyond the ability of a consumer to effectively address.
A further issue is the growing use of “personalization” technologies to make financial services offerings even more compelling. Google has already launched its “Spot” service to deliver “payment enabled experiences” for users, including “fully customized experiences,” in Google Pay. Although currently available only in India and Singapore, Google’s Spot platform, which allows consumers with “a few simple taps…to search, review, choose and pay” for a product, is an example of how payment services online are continually advanced—and require independent review by consumer financial regulators. It also reflects another problem for protecting the financial well-being of U.S. consumers: what are the impacts on financial security when there is no distance—no time to reflect—when seamless, machine- and socially-driven marketing and payment operations are at work?[7]

A good example of the lack of meaningful protections for online financial consumers is Google Pay’s use of what’s known as “discovery,” a popular digital marketing concept meaning to give enhanced prominence to a product or service. Here’s how Google describes how that concept works in its Spot-enabled Pay application: “We understand that discovery is where it starts, but building deep connections is what matters the most - a connection that doesn’t just end with a payment, but extends to effective post sale engagement. The Spot Platform helps merchants own this relationship by providing a conversational framework, so that order updates, offers, and recommendations can easily be surfaced to the customer. This is powered by our Order API which is specialised to surface updates and relevant actions for users' purchases, and the Messaging API which can surface relevant messages post checkout to the user.”[8]

Meta (Facebook), along with ad giant WPP, also relies on the growing use of “discovery” applications to promote sales.
In a recent report, they explain that “digital loyalty is driven by seamless shopping experiences, convenience, easy discovery, consistent availability, positive community endorsement and personal connections.”[9]

Since Google and other payment platforms have relationships with dozens of financial institutions, and also maintain an array of different requirements for vendors and developers, USPIRG and CDD are concerned that consumers are placed at a serious disadvantage when it comes to protecting their interests and seeking redress for complaints. The chain of digital payment services relationships, including with partners that operate their own powerful data-driven marketing systems, requires Bureau review. For example, PayPal is a partner with Google Pay, while the PayPal Commerce Platform has Salesforce as one of many partners.[10] See also PIRG’s recent comments to the FTC for an extensive discussion of retail media networks and data clean rooms:[11] “Clean rooms are data platforms that allow companies to share first party data with one another without giving the other party full access to the underlying, user-level data. This ability to set controls on who has access to granular information about consumers is the primary reason that data clean rooms are able to subvert current privacy regulations.”

Another important issue for the Bureau is the ability of the Big Tech payment platforms to collect and analyze data in ways that allow them to identify unique means of influencing consumer spending behaviors.
In a recent report, Chinese ecommerce platform Alibaba explained how such a system operates: “The strength of Alibaba’s platforms allows a birds-eye view of consumer preferences, which is combined with an ecosystem of tactical solutions, to enable merchants to engage directly and co-create with consumers and source suppliers to test, adapt, develop, and launch cutting-edge products…helps merchants identify new channels and strategies to tap into the Chinese market by using precise market analysis, real-time consumer insights, and product concept testing.”[12]

Such financial insights are part of what digital payment and platform services provide. PayPal, for example, gathers data on consumers as part of their “shopping journey.” In one case study for travel, PayPal explained that its campaign for Expedia involved pulling “together data-driven destination insights, creative messaging and strategic placements throughout the travel shoppers’ journey.” This included a “social media integration that drove users to a campaign landing page” powered by “data to win.” This reflects the growing use of what’s euphemistically called “first-party data” from consumers, for which permission to target an individual has allegedly been granted. Few consumers will ever review—or have the ability to influence—the PayPal engine that is designed for merchants to “shape [their] customer journey from acquisition to retention.” This includes applications that add “flexible payment options…right on product pages or through emails,” present a “relevant Pay Later offer to customers with dynamic messaging,” “increase average order value” through “proprietary payment methods,” or “propose rewards as a payment option to help inspire loyalty.”[13]

The impact of data-driven social commerce on promoting the use of consumer payments should also be assessed.
For example, Shopify’s “in-app shopping experience on TikTok” claims that the placement of its “shopping tabs” by vendors on posts, profiles and product catalogs unleashes “organic discovery,” creating “a mini-storefront that links directly to their online store for check out.” A TikTok executive explains how today’s digital payment services are distinct—“rooted in discovery, connection, and entertainment, creating unparalleled opportunities for brands to capture consumers’ attention…that drives [them] directly to the digital point of purchase.”[14] TikTok also has partnered with Stripe, helping it “become much more integrated with the world of payments and fintech.”[15] TikTok’s Square integration, meanwhile, enables “sellers to send fans directly from TikTok videos, ads, and shopping tabs on their profiles to products available in their existing Square Online store, providing a streamlined shopping experience that retains the look and feel of their personal brand.”[16] The Square/TikTok payment alliance illustrates the role that data-driven commercial surveillance marketing plays in payment operations, such as the use of the “TikTok pixel” and “advanced matching.”[17] In China, ByteDance’s payment services reflect its growing ability to leverage its mass customer data capture for social media-driven marketing and financial services.[18]

We urge the Bureau to examine TikTok’s data and marketing practices as it transfers U.S.
user information to servers in the U.S. (the so-called “Project Texas”), to identify how “sensitive” data may be part of its financial services offerings.[19]

Apple’s payment services deserve further scrutiny as it reintroduces its role as a digital advertising network, leveraging its dominant position in the mobile and app markets.[20] PayPal recently announced that it will be “working with Apple to enhance offerings for PayPal and Venmo merchants and consumers.” Apple is also making its payment service available through additional vendors, including stores of the giant Kroger grocery chain in California.[21] Amazon announced in October 2022 that Venmo was now an official payment method: during checkout, users can choose “Select a payment method” and then “Add a Venmo account,” which redirects them to the Venmo app, where they can complete the authentication. Users can also choose Venmo as their default payment method for Amazon purchases on that screen.[22] Amazon’s AWS partners with fintech provider Plaid, another example of the far-reaching partnerships restructuring the consumer financial services market.[23]

Conclusion

USPIRG and CDD hope that both our original comments and these additional comments help the Bureau understand the impact of rapid changes in Big Tech’s payments network relationships and partnerships. We believe urgent CFPB action is needed to protect consumers from the threat of Big Tech’s continued efforts to breach the important wall separating banking and commerce, and to ensure that all players in the financial marketplace follow all the rules. Please contact us with additional questions.

Sincerely yours,

Jeff Chester, Executive Director, Center for Digital Democracy
Edmund Mierzwinski, Senior Director, Federal Consumer Program, U.S.
PIRG

[1] /comment/CFPB-2021-0017-0079
[2] /what-s-behind-social-commerce-surge-5-charts
[3] We also believe that the Bureau’s request for comments concerning potential abuse of terms of service and use of penalties merits discussion. We look forward to additional comments from others.
[4] /business/en-US/blog/mediascience-study-brands-memorable-tiktok; see Google, Meta, TikTok as well: https://www.neuronsinc.com/cases
[5] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[6] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[7] /about/business/checkout/; /pay/spot; /about/business/passes-and-rewards/
[8] /pay/spot
[9] /news/meta-publishes-new-report-on-the-importance-of-building-brand-loyalty-in-on/625603/
[10] See, for example, the numerous bank partners of Google in the US alone: /wallet/answer/12168634?hl=en. Also: /payments/apis-secure/u/0/get_legal_document?ldo=0&ldt=buyertos&ldr=us; /wallet/retail; /wallet/retail/offers/resources/terms-of-service; /us/webapps/mpp/google-pay-paypal; /products/commerce-cloud/overview/?cc=dwdcmain
[11] /wp-content/uploads/2022/11/PIRG-FTC-data-comment-no-petitions-Nov-2022.pdf
[12] /article/how-merchants-can-use-consumer-insights-from-alibaba-to-power-product-development/482374
[13] /us/brc/article/enterprise-solutions-expedia-case-study; /us/brc/article/enterprise-solutions-acquire-and-retain-customers
[14] /scaling-social-commerce-shopify-introduces-new-in-app-shopping-experiences-on-tiktok#
[15] /financial-services-finserv/tiktok-partners-fintech-firm-stripe-tips-payments
[16] /us/en/press/square-x-tiktok
[17] /help/us/en/article/7653-connect-square-online-with-tiktok; /help/article/data-sharing-tiktok-pixel-partners
[18] /video/douyin-chinas-version-tiktok-charge-093000931.html; /2021/01/19/tiktok-owner-bytedance-launches-mobile-payments-in-china-.html
[19] /a/202211/16/WS6374c81ea31049175432a1d8.html
[20] /news/newsletters/2022-08-14/apple-aapl-set-to-expand-advertising-bringing-ads-to-maps-tv-and-books-apps-l6tdqqmg?sref=QDmhoVl8
[21] /231198771/files/doc_financials/2022/q3/PYPL-Q3-22-Earnings-Release.pdf; /2022/11/08/ralphs-begins-accepting-apple-pay/
[22] /2022/10/25/amazon-now-allows-customers-to-make-payments-through-venmo/
[23] /blogs/apn/how-to-build-a-fintech-app-on-aws-using-the-plaid-api/

pirg_cdd_cfpb_comments_7dec2022.pdf
    Jeff Chester
  • Coalition of child advocacy, health, safety, privacy and consumer organizations documents how data-driven marketing undermines privacy and welfare of young people

Children and teenagers experience widespread commercial surveillance practices that collect data used to target them with marketing. Targeted and personalized advertising remains the dominant business model for digital media, with the marketing and advertising industry identifying children and teens as a prime target. Minors are relentlessly pursued while, simultaneously, they are spending more time online than ever before. Children’s lives are filled with surveillance, involving the collection of vast amounts of personal data about online users. This surveillance, informed by behavioral science and maximized by evolving technologies, allows platforms and marketers to profile and manipulate children.

The prevalence of surveillance advertising and targeted marketing aimed at minors is unfair in violation of Section 5. Specifically, data-driven marketing and targeted advertising causes substantial harm to children and teens by: violating their privacy; manipulating them into being interested in harmful products; undermining their autonomy; and perpetuating discrimination and bias. Additionally, the design choices tech companies use to optimize engagement and data collection in order to target marketing to minors further harm children and teens, including by undermining their physical and mental wellbeing and increasing the risk of problematic internet use. These harms cannot reasonably be avoided by minors or their families, and there are no countervailing benefits to consumers or competition that outweigh them.

Surveillance advertising is also deceptive to children, as defined by the Federal Trade Commission. The representations made about surveillance advertising by adtech companies, social media companies, apps, and games are likely to mislead minors and their parents and guardians.
These misrepresentations and omissions are material. Many companies also mislead minors and their guardians by omission because they fail to disclose important information about their practices. These practices affect the choices of minors and their families every day as they use websites, apps, and services without an understanding of the complex system of data collection, retention, and sharing that is used to influence them online. We therefore urge the Commission to promulgate a rule that prohibits targeted marketing to children and teenagers.

Groups filing the comment included: the Center for Digital Democracy, Fairplay, #HalfTheStory, American Academy of Pediatrics, Becca Schmill Foundation, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Federation of America, Consumer Federation of California, CUNY Urban Food Policy Institute, Eating Disorders Coalition for Research, Policy & Action, Enough is Enough, LookUp.live, Lynn’s Warriors, National Eating Disorders Association, Parents Television and Media Council, ParentsTogether, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Public Citizen, and UConn Rudd Center for Food Policy & Health.

Fairplay’s executive director Josh Golin said: "Big Tech's commercial surveillance business model undermines young people's wellbeing and development. It causes kids and teens to spend excessive time online, and exposes them to harmful content and advertising targeted to their vulnerabilities. The FTC must adopt a series of safeguards to allow vulnerable youth to play, learn, and socialize online without being manipulated or harmed.
Most importantly, the Commission should prohibit data-driven advertising and marketing to children and teens, and make clear that Silicon Valley profits cannot come at the expense of young people's wellbeing.”

CDD’s Jeff Chester underscored this, saying: "Children and teens are key commercial targets of today’s data-driven surveillance complex. Their lives are tethered to a far-reaching system that is specifically designed to influence how they spend their time and money online, and uses artificial intelligence, virtual reality, geo-tracking, neuromarketing and more to do so. In addition to the loss of privacy, surveillance marketing threatens their well-being, health and safety. It’s time for the Federal Trade Commission to enact safeguards that protect young people."

[full filing attached] cdd_fairplay_anprmcomments.pdf
  • Contact: Jeff Chester, CDD, jeff@democraticmedia.org, 202-494-7100; David Monahan, CCFC, david@commercialfreechildhood.org

Advocates say Google Play continues to disregard children’s privacy law and urge FTC to act

BOSTON, MA and WASHINGTON, DC — March 31, 2021 — Today, advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to investigate Google’s promotion of apps which violate the Children’s Online Privacy Protection Act (COPPA). In December 2018, CCFC and CDD led a coalition of 22 consumer and health advocacy groups in asking the FTC to investigate these same practices. Since then, Google has made changes to the Play Store, but the advocates say these changes fail to address the core problem: Google is certifying as safe and appropriate for children apps that violate COPPA and put children at risk. Recent studies found that a significant number of apps in Google Play violate COPPA by collecting and sharing children’s personal information without getting parental consent. For instance, a JAMA Pediatrics study found that 67% of apps used by children aged 5 and under were transmitting personal identifiers to third parties.

Comment of Angela Campbell, Chair of the Board of Directors, Campaign for a Commercial-Free Childhood, and Professor Emeritus, Communications & Technology Law Clinic, Georgetown University Law Center: “Parents reasonably expect that Google Play Store apps designated as ‘Teacher approved’ or appropriate for children under age 13 comply with the law protecting children’s privacy. But far too often, that is not the case. The FTC failed to act when this problem was brought to its attention over two years ago.
Because children today are spending even more time using mobile apps, the FTC must hold Google accountable for violating children’s privacy.”

Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: "The Federal Trade Commission must swiftly act to stop Google’s ongoing disregard of the privacy and well-being of children. For too long, the Commission has allowed Google’s app store, and the data marketing practices that are its foundation, to operate without enforcing the federal law that is designed to protect young people under 13. With children using apps more than ever as a consequence of the pandemic, the FTC should enforce the law and ensure Google engages with kids and families in a responsible manner."

###
  • September 25, 2018
Contact: Jeff Chester, 202-494-7100; David Monahan, 617-896-9397
For Immediate Release

Child Advocacy and Consumer Groups Tell FCC to Keep Key TV Safeguards for Children
Overturning Children’s TV Act rules will harm kids and be a huge giveaway of public airwaves to broadcast and cable companies

Three leading nonprofit groups working to advance the interests of children in the digital era told the Federal Communications Commission (FCC) that its plan to dismantle long-standing safeguards designed to ensure all children have access to quality TV programming will harm American kids. The proposal to jettison guidelines that require broadcast TV stations to air a minimum of three hours a week of educational programming on their primary channel, and additional programming on multicast channels, would significantly reduce the availability of higher-quality shows, they explained in a filing today.

“The FCC seeks to strip away one of the only federal rules that helps both children and parents,” explained Jeff Chester, executive director of the Center for Digital Democracy. Chester helped lead the campaign back in the 1990s that led to the current CTA rules. “It is also one of the only concrete public-interest requirements that Congress mandated in exchange for free use of the public airwaves, which allow television stations to earn vast revenues from both advertising and fees paid by cable companies. Just as the GOP FCC majority did when it killed network neutrality, the commission only seems interested in protecting the interests of the big broadcast and cable companies,” Chester said.

“The Commission’s proposal would effectively eliminate children’s programming on broadcast television, where at least there are some limits on commercialism,” said Campaign for a Commercial-Free Childhood executive director Josh Golin.
"Internet and mobile platforms for children are rife with many types of unfair and deceptive marketing that aren’t allowed on kids’ TV. Rather than facilitating a race to the bottom, the FCC should work with lawmakers and the FTC to develop cross-platform rules to ensure all children have access to quality, commercial-free media regardless of the platforms and devices their families own.”

Without citing any evidence about the quality, cost and availability of children’s educational programs delivered by other means, the FCC claims that because children can watch children’s educational programs on cable, YouTube, Netflix, Amazon and Hulu, commercial television stations should not be required to air children’s educational programming. But in comments drafted by the Georgetown Law Communications and Technology Clinic, the advocates note, “To use non-broadcast services, households must have access to cable or broadband service, and be able to afford subscription fees and equipment. Children who live in rural areas, or whose families are low-income, and cannot access or afford alternative program options, will be hurt the most” if the FCC proposal is adopted.

The three groups—Center for Digital Democracy, Campaign for a Commercial-Free Childhood, and the Benton Foundation—pledged to educate the public, including parents, educators and concerned citizens, so they can raise concerns with the FCC and other policy makers.

--30--
  • CDD today joined the Electronic Privacy Information Center (EPIC) and six other consumer groups in calling on the Federal Trade Commission to investigate the misleading and manipulative tactics of Google and Facebook in steering users to “consent” to privacy-invasive default settings. In a letter to the FTC, the eight groups complained that the technology companies deceptively nudge users to choose less privacy-friendly options. The complaint was based on the findings in a report, “Deceived by Design,” published today by the Norwegian Consumer Council. It found that Google and Facebook steer consumers into sharing vast amounts of information about themselves through cunning design, privacy-invasive defaults, and “take it or leave it” choices, according to an analysis of the companies’ privacy updates. A report by Consumer Reports investigating Facebook settings for U.S. users found “that the design and language used in Facebook's privacy controls nudge people toward sharing the maximum amount of data with the company.”

Read the Norwegian report, “Deceived by Design,” here: https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design
Read the letter the eight groups sent to the FTC today here: http://thepublicvoice.org/wp-content/uploads/2018/06/FTC-letter-Deceived-by-Design.pdf
Read the Consumer Reports report here: https://www.consumerreports.org/privacy/cr-researchers-find-facebook-privacy-settings-maximize-data-collection
  • WASHINGTON, DC – October 18, 2017 – A number of brands of “smartwatches” intended to help parents monitor and protect young children have major security and privacy flaws which could endanger the children wearing them. A coalition of leading U.S. child advocacy, consumer, and privacy groups sent a letter to the Federal Trade Commission (FTC) today, asking the agency to investigate the threat these watches pose to children.

Smartwatches for children essentially work as a wearable smartphone. Parents can communicate with their child through the mobile phone function and track the child’s location via an app. Some product listings recommend them for children as young as three years old. Groups sending the letter to the FTC are the Electronic Privacy Information Center (EPIC), the Center for Digital Democracy (CDD), the Campaign for a Commercial-Free Childhood (CCFC), the Consumer Federation of America, Consumers Union, Public Citizen, and U.S. PIRG. The advocacy groups are working with the Norwegian Consumer Council (NCC), which conducted research showing that watches sold in the U.S. under the brands Caref and SeTracker have significant security flaws, unreliable safety features, and policies which lack consumer privacy protections. In the EU, groups are filing complaints in Belgium, Denmark, the Netherlands, Sweden, Germany, the UK, and with other European regulators.

“By preying upon parents’ desire to keep children safe, these smartwatches are actually putting kids in danger,” said CCFC’s Executive Director Josh Golin. “Once again, we see Internet of Things products for kids being rushed to market with no regard for how they will protect children’s sensitive information.
Parents should avoid these watches and all internet-connected devices designed for kids.”

The NCC’s research showed that with two of the watches, a stranger can take control of the watch with a few simple steps, allowing them to eavesdrop on conversations the child is having with others, track and communicate with the child, and access stored data about the child’s location. The data is transmitted and stored without encryption. The watches are also unreliable: a geo-fencing feature meant to notify parents when a child leaves a specified area, as well as an “SOS” function alerting parents when a child is in distress, simply do not work. The manufacturers’ data practices also put children at risk. Some devices have no privacy policies at all, and the policies that do exist lack basic consumer protections, including seeking consent for data collection, notifying users of changes in terms, and allowing users to delete stored data.

"The Trump Administration and the Congress must bring America’s consumer product safety rules into the 21st century,” said Jeff Chester of the Center for Digital Democracy. “In the rush to make money off of kids’ connected digital devices, manufacturers and retailers are failing to ensure these products are truly safe. In today’s connected world that means protecting the privacy and security of the consumer—especially of children. Both the FTC and the Consumer Product Safety Commission must be given the power to regulate the rapidly growing Internet of Things marketplace.”

The Caref (branded Gator in Europe) and SeTracker smartwatches are available online through Amazon. The groups have asked the FTC to act quickly to investigate these products, and they advise parents to refrain from buying the products because of the danger they could pose to children. The NCC, which conducted the testing of the watches, advises consumers who have already purchased the watches to stop using them and uninstall the app.
“The Federal Trade Commission must be proactive in protecting consumers—especially vulnerable young children—from harmful products that abuse technology for the sake of profit,” said Kristen Strader, Campaign Coordinator for Public Citizen. “Smartwatches and similar devices must be absolutely safe and secure before they are released to the public for sale.”

Ed Mierzwinski, Consumer Program Director at U.S. PIRG, said, "Companies making any internet-connected devices, but especially for children, need to ensure that privacy and security are more than breakable — or worse, hackable — promises."

Katie McInnis, technology policy counsel for Consumers Union, said, “When a company sells a smartwatch aimed at children, it must ensure the product is safe and secure. The FTC should launch an investigation into the privacy and security concerns surrounding these products to make sure families are safe.”

The same trans-Atlantic coalition persuaded government authorities and retailers last December that the internet-connected dolls Cayla and i-Que Robot were spying on children and threatening their welfare, and retailers removed the toys from store shelves. The FBI subsequently issued a warning to consumers that internet-connected toys could put the privacy and safety of children at risk.

---

For more information, please see the following:
Letter to FTC by coalition of leading U.S. child advocacy, consumer, and privacy groups (link below)
Press release by the U.S. coalition of leading child advocacy, consumer and privacy groups (link below)
#WatchOut report by the Norwegian Consumer Council (link below)
Press release by the Norwegian Consumer Council (link below)
#WatchOut English - YouTube (http://bit.ly/2ghNoD1)
#WatchOut - longer video explainer on security flaws, 4:30 mins - YouTube (http://bit.ly/2xLYSVv)
    Jeff Chester
  • January 27, 2017 - The Center for Digital Democracy and 18 media justice, consumer protection, civil liberties, and privacy groups strongly urge congressional leaders to oppose the use of the Congressional Review Act (CRA) to adopt a Resolution of Disapproval overturning the FCC’s broadband privacy rules. --- Click the link below for the full PDF of the letter.
  • Re: Exploring Special Purpose National Bank Charters for Fintech Companies Dear Comptroller Curry: The Center for Digital Democracy and U.S. Public Interest Research Group (U.S. PIRG) agree with the consumer, civil rights, and community groups and their separately filed group letter in which they expressed strong opposition to the proposed new federal nonbank lending charters. U.S. PIRG also signed and concurs with the detailed comment from National Consumer Law Center et al. The Office of the Comptroller of the Currency (OCC) must not undermine state rate caps; must not weaken states’ ability to oversee lenders and to act to prevent harmful lending practices; and must not undermine efforts to provide fair and inclusive lending practices, particularly for people of color and low- and moderate-income consumers, in the areas where lenders operate. Further, the OCC must not allow nonbank lenders to engage in practices that violate privacy rights, or to engage in unfair data and marketing practices. State laws often operate as the primary line of defense for consumers and small businesses. The OCC’s charter proposal inadequately protects consumers from these harmful practices, and it should not take state law enforcers off the beat of preventing them. The Center for Digital Democracy and U.S. PIRG file this supplemental comment to focus on the digital rights and consumer privacy concerns raised by the opaque Big Data algorithms used by Fintech firms. These practices increasingly threaten consumer privacy, and the OCC must take them into account when considering nonbank special purpose charters. An ongoing and increasingly challenging issue confronting citizens and consumers is the new set of threats to their privacy and to their ability to control how personal and non-personal data about their online and offline behavior are collected and used by online financial services companies. 
The use of personal data by Fintech companies is pervasive and touches every aspect of their business operations, including marketing, customer loyalty management, pricing, fraud prevention, and underwriting. Fintech companies use many new online and offline data sources, either directly collecting data from consumers or relying on third parties for Big Data analytics to classify consumers and to make predictions about them. Assigning individuals to socially constructed classifications and then making inferences about them based on group profiles is likely to have consequences that are not well understood and may further increase social inequities. Consumers’ privacy is increasingly undermined, and no adequate protections are in place. The OCC must not allow an expansion of these practices via a federal charter that does not provide for adequate privacy safeguards. The OCC must proactively investigate unfair marketing practices and must not grant national licenses without affirmative protections. Fintech companies are using Facebook, Instagram, and other digital behavioral data that combine data and interactive experiences to influence consumers and their social networks. Sophisticated data-processing capabilities allow for more precise micro-targeting, the creation of comprehensive profiles, and the ability to act instantly on the insights gained from consumer behaviors. Targeted and highly personalized marketing offers can be intrusive and can foster consumer behaviors that are not in the best interest of the individual. Behavioral science shows that consumers are susceptible to ‘nudges,’ which raises concerns about the risk of financial institutions taking advantage of consumers’ behavioral biases and limitations. The increasing personalization that Big Data makes possible could also reduce the comparability of products, making it harder for consumers to compare one offer with another, which in turn could affect market competition. 
Similarly, a lack of transparency around data processing and automated algorithms may increase information asymmetries between the financial institution and the individual, leaving consumers with less awareness of, and less control over, important financial decisions. These practices happen behind the scenes and can only be addressed by a vigilant regulator. The OCC should not allow Fintech companies to operate under a national license without properly addressing these data practices. The OCC must also not allow nonbank lenders or partner depository institutions to engage in unfair and discriminatory lending practices. The use of ‘alternative data’ sources can introduce bias or contain errors and may lead to consumer harm or unfairness. While alternative credit scoring can be a boon for the underbanked, there need to be standards and safeguards to ensure that any new data are not biased and that their use does not lead to unintended consequences. While industry has argued that increased automation will help expand access to credit and lower costs overall, credit models that are more “accurate” may lead to a more stratified society, as they may leave those at the bottom excluded from credit indefinitely. Models that judge individuals against group profiles based on past data inevitably incorporate elements of past inequality and discrimination. Communities of color are thus most vulnerable. Unless additional policies are put in place to address these consequences, inequality is likely to become more entrenched the more we rely on models for risk evaluations. Fintech platforms must comply fully with the requirements of the Fair Credit Reporting Act and the Equal Credit Opportunity Act. In conclusion, the OCC must not grant new federal nonbank lending charters that would give firms free rein to use unfair data and marketing practices. 
Instead, the OCC must proactively mitigate risks from unfair data, marketing, and lending practices that threaten to undermine privacy, consumer rights, and economic inclusion. Sincerely, Jeff Chester and Katharina Kopp, Center for Digital Democracy; Edmund Mierzwinski, U.S. PIRG. Recommended further reading: BIG DATA MEANS BIG OPPORTUNITIES AND BIG CHALLENGES: Promoting Financial Inclusion and Consumer Protection in the “Big Data” Financial Era, U.S. PIRG Education Fund and Center for Digital Democracy, 27 March 2014. Available at http://www.uspirg.org/reports/usf/big-data-means-big-opportunities-and-b...
  • Center for Digital Democracy, Center for Democracy & Technology, Consumer Action, Consumer Federation of America, Consumer Federation of California, Consumers Union, Electronic Privacy Information Center, National Association of Consumer Advocates, National Consumers League, Benton Foundation, Common Sense Kids Action, and Privacy Rights Clearinghouse file this brief to highlight the potential far-reaching ramifications of this case as well as the degree to which the panel decision breaks from century-long precedent, thereby creating a sharp split among the courts of appeals. First, the panel opinion raises issues of exceptional importance. If allowed to stand, the ruling could immunize from FTC oversight a vast swath of companies that engage to some degree in a common carrier activity. This result is unprecedented, deeply disruptive to the market, and at odds with Congress’s intent. Many of the world’s largest companies offer broadband Internet or other common carriage service. These highly diverse companies could harm consumers by committing acts that are deceptive or unfair, breach privacy commitments, fail to provide reasonable security for sensitive personal data, violate any of the seventy consumer protection statutes Congress has directed the FTC to enforce, or, as in the AT&T case, deliberately omit critical information about the services a company provides – and nonetheless escape FTC enforcement. No other federal agency has authority to fill this void. Second, the panel’s decision creates a deep Circuit split by breaking from the 100-year-long understanding that the term “common carrier” is defined by activities, not status. Departing from established norms of statutory construction, the panel failed to heed settled interpretive rules requiring that exemptions from antitrust laws be construed narrowly; that remedial statutes be read broadly to effectuate their purposes; and that an agency’s interpretation of its organic statute be accorded deference. 
The panel’s inversion of decades of precedent creates a substantial regulatory gap and puts the Ninth Circuit directly in conflict with the D.C. and Second Circuits. --- For the full argument, see the attached PDF.
  • Cross-Device Privacy Must be Protected by FCC Proposed Rule on Broadband ISPs

    Geolocation and Cross-Platform and Application Data Are Sensitive Information. AT&T Expands Cross-Device Targeting

    ... ISPs are engaged in cross-device tracking of their subscribers and customers, which allows them to target advertising at the individual and household level. As an example representative of ISP practices generally, we highlight AT&T’s efforts in this area. AT&T is expanding its cross-device tracking in order to target individuals on their mobile devices after collecting and analyzing their data using the company's internal data and analytics capabilities. In a recent interview, AT&T AdWorks President Rick Welday explained that by the end of this year AT&T will allow marketers to “advertise in 14 million addressable households, 30 million mobile devices and millions of streams within the DirecTV app.” While AT&T may claim that its cross-device tracking is done “anonymously,” that is merely a euphemism to obscure the invasion of privacy that underlies such practices. Mr. Welday explains that AT&T’s data-driven monitoring of its customers enables it to develop dossiers that reveal whether a user is a new homeowner, a new parent, or in the market for an automobile. In its cross-device targeting trials, AT&T worked with leading Fortune 100 brands, as well as promoting its own “AT&T Mobility Wireless” service. The Fortune 100 companies that AT&T worked with likely provided their own so-called first-party data to be used for such cross-device targeting. This illustrates the operational realities of consumer profiling data today: data are no longer shared with advertisers; rather, advertisers provide such data to ad-delivery platforms (such as AT&T's) for increasingly granular targeting.[3] Linking devices (and the application history and geolocation of those devices) to a particular consumer via a unique identifier should be prohibited unless the ISP has obtained affirmative, express consent (opt-in). 
The rule’s definition of ‘sensitive information’ must therefore reflect industry practices and include any data elements that allow for this kind of cross-device tracking. The final rule must give ISP customers control over their data, and companies must obtain opt-in consent from their customers before proceeding with targeted advertising. We are particularly concerned that without such safeguards, the rules would allow the requirements of the Children’s Online Privacy Protection Act to be bypassed, by using insights gained via cross-device tracking to target children without parental consent. Finally, we urge the Commission to affirm in its final rule the need for safeguards against any unauthorized attempt to re-link devices (and their app usage history and geolocation information) to a single user. CDD respectfully urges the FCC to enact its proposed safeguards as soon as possible to help address the further erosion of Americans’ privacy by ISPs.