CDD


  • Blog

    Marketers Put First-Party Data First

    First-party data usage to rise due to value

    Despite struggles, marketers remain focused on improving big data, and those putting money toward such efforts are reaping the benefits. In April 2015 research by the Direct Marketing Association and Winterberry Group, 43% of US marketing professionals said they expected their data-driven marketing (DDM) spending to be higher in Q2 2015 than in Q1, and 60.2% of respondents expected DDM revenues to increase during the same period. However, June 2015 research by Econsultancy in association with Signal found that senior-level marketers in North America weren’t jumping for joy over returns from data-related marketing investments: just over one-third said these had a strong positive impact. More promising, though, 47% said they had some positive impact. Econsultancy suggested that first-party data could help the group reporting so-so results—along with the laggards, of course. When marketers were asked to compare different levels of data and their effect on desired outcomes, first-party data ranked highest across the board. It was most popular for gaining insight into customers, cited by 74% of respondents. More than six in 10 respondents also said it was the easiest to justify using, drove the highest increase in customer value, and delivered the highest campaign lift among data sources—all by a long shot. Read more at http://bit.ly/1IHPcJZ
  • Blog

    Can Mondelez, Facebook Sell More Cookies Online?

    Snacks Giant and Social Network Reach Deal To Boost Video, E-commerce

    Mondelez International has struck a new deal with Facebook that the Oreo maker says will help it crack the code for selling more cookies online. The pact renews a one-year global strategic partnership announced in March of 2014 that covered 52 countries and gave the snacks and candy marketer access to Facebook's beta testing programs. The new arrangement also covers 52 countries and will "focus on creating and delivering creative video content and driving impulse snack purchasing online," according to a statement issued on Tuesday. Mondelez declined to release terms of the new deal, but a spokeswoman said it is "significantly larger" than last year's. The arrangement was brokered by Dentsu Aegis Media. Mondelez will get access to a full-time Facebook creative strategist who will work with the company and its agencies to develop "scalable video content natively for the platform to optimize social engagement," according to the announcement. Facebook will also give Mondelez access to beta-testing programs on Facebook and Instagram. Additionally, the two parties will work together to create e-commerce solutions to "drive impulse purchases" in markets including the U.S., Australia, India and U.K. Full article available at http://bit.ly/1QPZEJd
  • Blog

    Facebook has been looking at an alternative to the 'Like' button

    Reading your facial expressions and sending your friends an appropriate cartoon face.

    Facebook has given us a bit more insight into some of the more experimental new products and apps its product team has been working on. Speaking at the Cannes Lions International Festival of Creativity, Facebook's chief product officer Chris Cox showed off what could one day work as a more expressive version of the Like button. Instead of simply pressing Like, Cox said, users could use their smartphones to take a selfie. But rather than just send that, the function could read the user's facial expression and transform it into an appropriate smiley/sad/frowning/indifferent face. Cox made it clear: "This is not on our roadmap, we don't know how to build this. It actually seems really hard, but it's the kind of thing unlocked by the power of all the different sensors on the phone." Read more at http://read.bi/1Ni1xJ5
  • Delving deeper into ecommerce, Twitter is testing some new ways to help users discover products within its network. First up are dedicated pages, which will feature images and videos about products, alongside information such as a description, price and an option to buy, book, or visit a brand’s Web site for more information. Within users’ timelines, they can now expect to see pages and collections of pages that are shared by influencers and brands. In addition, Twitter is also beginning to test new ways for people and brands to create and share Twitter collections of products and places. Users can now browse collections from various influencers, and get more information about featured products and places. Already, there’s Nike’s LeBron Elite collection; Reese Witherspoon’s Draper James Summer Picks; The Ellen Show’s Best of The Ellen Shop and HBO's #GoT Fan Favorites. “This is just the beginning,” Amaryllis Fox, a product manager at Twitter, promises in a new blog post. “In the coming months we’ll be testing more new experiences we hope give you the most personalized and relevant information about the places and things you want to explore.” While Twitter struggles with its direction, ecommerce appears to be part of its broader ambitions. Among other efforts, the company has been inviting advertisers to create credit-card-connected promotions, and share them with users directly in their timelines. With their credit cards, users can redeem the new “Twitter Offers” in stores without the need for a coupon or numerical code. Full article available at http://bit.ly/1GuVHOT
  • Blog

    The First 5 Seconds: Creating YouTube Ads That Break Through in a Skippable World

    Google's Art, Copy & Code team saw a similar result with its first Unskippable Labs experiment. The team created and tested three YouTube ads for Mountain Dew® Kickstart™, and the one with a lighter brand touch in the first five seconds was skipped less often on mobile.

    Five, four, three, two, one. What keeps people watching after the first five seconds? What can science tell us about the art of video advertising? We took a peek behind the data curtain to see which creative choices capture audiences' attention. Online video ad formats like YouTube TrueView ads have created a paradox for marketers. They remove traditional 30-second time constraints, giving brands more time to tell their stories. But introducing a "skip" button after five short seconds also means that advertisers have to create more engaging stories that not only grab their audience's attention, but hold it, too. Is it time to start creating ads with the "skip" button in mind? Today, all ads are skippable—whether it's a function of the format or not. People have been honing their skipping skills for a while. Think about it: Viewers experimented with fast-forwarding on their VCRs, improved their skills with DVRs, and now are mastering ad choice on the web. Even if there's no option to fast-forward or skip, consumers can always pick up a smartphone, switch tabs, or find other ways to hit a metaphoric skip button. Thousands of ads run on YouTube every day. So, when we look at that data in aggregate, what patterns emerge? What can we learn from existing video ads about creative that works in the first five seconds? To find the answer, we looked into thousands of TrueView ads across 16 countries and 11 verticals, categorizing them according to 170 creative attributes, including brand name mentions and featured celebrities. We used aggregated analytics from AdWords to see how long people watched without hitting the skip button. To measure brand awareness and ad recall, we took advantage of Google's Brand Lift.
There are no "rules" for making ads people choose, but we did find that certain creative choices are associated with how long viewers watch or how well they remember ads on YouTube. Turns out, there is a certain science to the art of engaging video advertising. Here's what we've learned. Full article available at http://bit.ly/1FpaPg5
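The aggregate analysis described above (tagging each ad with creative attributes, then comparing average watch time before a skip across ads with and without an attribute) can be sketched in a few lines. The attribute names and numbers below are invented for illustration; they are not Google's actual attribute schema or data.

```python
from collections import defaultdict

# Hypothetical records: (creative attributes of an ad, average seconds
# watched before the skip). Values are made up for illustration only.
ads = [
    ({"brand_in_first_5s": True,  "celebrity": False}, 5.2),
    ({"brand_in_first_5s": True,  "celebrity": True},  9.1),
    ({"brand_in_first_5s": False, "celebrity": False}, 14.8),
    ({"brand_in_first_5s": False, "celebrity": True},  18.3),
]

def mean_watch_time_by(attribute, records):
    """Average watch time for ads grouped by one creative attribute."""
    totals = defaultdict(lambda: [0.0, 0])  # attr value -> [sum, count]
    for attrs, seconds in records:
        key = attrs[attribute]
        totals[key][0] += seconds
        totals[key][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

print(mean_watch_time_by("brand_in_first_5s", ads))
```

Run over thousands of real ads, comparisons like this are what surface patterns such as "a lighter brand touch in the first five seconds is skipped less."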
  • We explore the task of recognizing peoples' identities in photo albums in an unconstrained setting. To facilitate this, we introduce the new People In Photo Albums (PIPA) dataset, consisting of over 60,000 instances of over 2,000 individuals collected from public Flickr photo albums. With only about half of the person images containing a frontal face, the recognition task is very challenging due to the large variations in pose, clothing, camera viewpoint, image resolution and illumination. We propose the Pose Invariant PErson Recognition (PIPER) method, which accumulates the cues of poselet-level person recognizers trained by deep convolutional networks to discount for the pose variations, combined with a face recognizer and a global recognizer. Experiments on three different settings confirm that in our unconstrained setup PIPER significantly improves on the performance of DeepFace, which is one of the best face recognizers as measured on the LFW dataset. More available at http://bit.ly/1N0RfwN
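The accumulation step the abstract describes (combining per-poselet identity scores with face and global recognizer scores) can be sketched as a weighted score fusion. The weights, scores, and shapes below are illustrative assumptions, not the paper's actual learned parameters.

```python
# Illustrative PIPER-style score fusion: evidence from poselet-level
# recognizers is accumulated with face and global recognizer scores.
# All numbers here are invented for illustration.
def fuse_scores(poselet_scores, poselet_weights, face_score, global_score):
    """poselet_scores: one per-identity score list per poselet that fired.
    Returns a fused per-identity score list."""
    num_ids = len(face_score)
    fused = [0.0] * num_ids
    for w, scores in zip(poselet_weights, poselet_scores):
        for i in range(num_ids):
            fused[i] += w * scores[i]      # weighted poselet evidence
    for i in range(num_ids):
        fused[i] += face_score[i] + global_score[i]
    return fused

poselets = [[0.2, 0.5, 0.3], [0.1, 0.7, 0.2]]  # 2 poselets x 3 identities
weights = [0.6, 0.4]                           # per-poselet reliability
face = [0.1, 0.8, 0.1]                         # face recognizer scores
glob = [0.3, 0.4, 0.3]                         # global recognizer scores
fused = fuse_scores(poselets, weights, face, glob)
print(fused.index(max(fused)))  # predicted identity index -> 1
```

The point of fusing this way is robustness: when no frontal face is visible, the face term contributes little and the poselet and global terms still carry the prediction.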
  • News

    Privacy and Consumer Advocates Leave Administration’s “Multistakeholder” Negotiations on Facial Recognition

    Cite Industry Refusal to Support a Consumer’s Right to Control their own Facial Data

    Today, the groups participating in the Obama Administration’s so-called multi-stakeholder negotiations to develop a self-regulatory “code of conduct” on facial recognition and privacy sent a letter (attached) to the Commerce Department explaining why they would no longer participate in the process. As CDD has said from the start, the approach the Administration embraced to protect consumers’ rights to their personal information was flawed. It relied on the data collection and digital marketing industry to support significant new policies that would empower individuals to make decisions about how their information can be collected and used. Right now, of course, it’s individual companies and industry-wide data gathering practices that have left Americans with barely any privacy. For the industry—including Google, Facebook, and Microsoft—what’s foremost in their political agenda is preserving their right to use all our personal information without constraint. It never made sense to expect industry to turn away from business practices that reap billions of dollars. What was needed at the outset was an independent agency such as the Federal Trade Commission proposing tough new rules—and an administration willing to fight for the interests of the public. The multi-stakeholder approach to Internet governance cannot work when it involves challenging the economic (or political) interests of the digital industry and its partners. Our facial data is sensitive, personal information. Before companies can gather it—let alone use it—a person must have, at a minimum, full knowledge of how it will be used and give meaningful prior consent. None of the companies or industry trade associations participating in the Commerce Department-led initiative is willing to support opt-in for facial recognition.
That’s because they are increasingly using facial recognition technologies to track and target people in commercial settings, adding our face and other biometric data to the vast amounts of information they now routinely gather. The withdrawal by the consumer and privacy groups should wake up the Obama Administration—it must embrace a new ethics-based approach to how it develops consumer privacy safeguards. Relying on the digital foxes (the data industry) to develop rules on data gathering and use will actually lead to the further erosion of our privacy and consumer protections. This failure by the White House on privacy underscores why the EU must oppose U.S. attempts to weaken its own civil rights-based approach to data protection, especially through the TTIP trade deal.
  • EMBARGOED FOR USE AFTER THE REPORT IS LAID IN PARLIAMENT BY THE PRIME MINISTER ON THURSDAY 11 JUNE 2015

Today the Prime Minister published the Report of the Investigatory Powers Review, entitled ‘A Question of Trust’. It was submitted to him by David Anderson Q.C., Independent Reviewer of Terrorism Legislation. David Anderson said: “Modern communications networks can be used by the unscrupulous for purposes ranging from cyber-attack, terrorism and espionage to fraud, kidnap and child sexual exploitation. A successful response to these threats depends on entrusting public bodies with the powers they need to identify and follow suspects in a borderless online world. But trust requires verification. Each intrusive power must be shown to be necessary, clearly spelled out in law, limited in accordance with international human rights standards and subject to demanding and visible safeguards. The current law is fragmented, obscure, under constant challenge and variable in the protections that it affords the innocent. It is time for a clean slate. This Report aims to help Parliament achieve a world-class framework for the regulation of these strong and vital powers.”

The Report

The Review was conducted by a small independent team under the leadership of David Anderson Q.C. It received almost 70 written submissions. Further evidence was taken from public authorities (at the highest level of security clearance) and from a wide range of organisations and individuals in the UK, California, Washington DC, Ottawa, Berlin and Brussels. Parts I-III of the Report (Chapters 1-12) inform the debate by summarising the importance of privacy, the threat picture, the relevant technology, external legal constraints, existing law and practice, and comparisons with other types of surveillance, other countries and private-sector activity. They also summarise the views expressed to the Review by law enforcement, intelligence, service providers and civil society.
Part IV of the Report (Chapters 13-15) sets out five underlying principles and 124 separate recommendations. Taken together, they form the blueprint for a new law to replace the Regulation of Investigatory Powers Act 2000 [RIPA] and the dozens of other statutes authorising the collection of communications data. The key recommendations are summarised in paras 10-34 of the Executive Summary at the start of the Report. They include, in particular:

- a new law that should be both comprehensive in its scope and comprehensible to people across the world (Executive Summary, paras 10-11);
- maintaining, subject to legal constraints, existing capabilities relating to compulsory data retention as provided for by DRIPA 2014 and formerly under an EU Directive (ES, para 12);
- the enhancement of those capabilities (e.g. by requiring the retention of “weblogs” as proposed in the draft Communications Data Bill 2012, the so-called “snoopers’ charter”) only to the extent that a detailed operational case can be made out and a rigorous assessment has been conducted of the lawfulness, likely effectiveness, intrusiveness and cost (ES, para 13);
- the retention, subject to legal constraints, of bulk collection capabilities (the utility of which is briefly explained by reference to six case studies from GCHQ: Annex 9), but subject to additional safeguards and to the addition of a new and lesser power to collect only communications data in bulk (ES, paras 14-15);
- a new requirement of judicial authorisation (by Judicial Commissioners) of all warrants for interception, the role of the Secretary of State being limited to certifying that certain warrants are required in the interests of national security relating to the defence or foreign policy of the UK (ES, paras 16-17);
- measures to reinforce the independence of those authorising requests for communications data, particularly within the security and intelligence agencies (ES, para 21);
- a new requirement of judicial authorisation of novel and contentious requests for communications data, and of requests for privileged and confidential communications involving e.g. journalists and lawyers (ES, paras 25-27);
- the streamlining of procedures in relation to warrants and the authorisation of requests for communications data by local authorities and other minor users (ES, paras 19, 23-24);
- improved supervision of the use of communications data, including in conjunction with other datasets and open-source intelligence (ES, para 29);
- maintaining the extraterritorial effect in DRIPA 2014 s4, pending a longer-term solution which should include measures to improve the cooperation of overseas (especially US) service providers and the development of a new international framework for data-sharing among like-minded democratic nations (ES, para 20);
- the replacement of three existing Commissioners’ offices by the Independent Surveillance and Intelligence Commission (ISIC): a new, powerful, public-facing and inter-disciplinary intelligence and surveillance auditor and regulator whose judicial commissioners would take over responsibility for issuing warrants, for authorising novel, contentious and sensitive requests for communications data and for issuing guidance (ES, paras 28-32);
- expanded jurisdiction for the Investigatory Powers Tribunal, and a right to apply for permission to appeal its rulings (ES, para 33); and
- the maximum possible transparency on the part of ISIC, the IPT and public authorities (ES, para 44).

Other Reports

The Report endorses some of the recommendations of the Intelligence and Security Committee of Parliament (“Privacy and Security”, March 2015). But the Report is broader in its scope, covering the activities of all 600 bodies with powers in this field and not just the security and intelligence agencies.
It also departs from the ISC in recommending (a) that a new law should apply across the board (Report, 13.35-13.44), and (b) that interception warrants should be judicially authorised (Report, 14.47-14.57). A further Independent Surveillance Review, to be conducted under the auspices of the Royal United Services Institute (RUSI), was commissioned in March 2014 by the Deputy Prime Minister. It has not yet issued a report.

Encryption

There has been some recent media speculation on the subject of encryption, which it may be useful to correct. The position communicated by the security and intelligence agencies to the Review is summarised (Report, 10.20) as follows: “The Agencies do not look to legislation to give themselves a permanent trump card: neither they nor anyone else has made a case to me for encryption to be placed under effective Government control, as in practice it was before the advent of public key encryption in the 1990s. There has been no attempt to revive the argument that led to the Clipper Chip proposal from the NSA in the 1990s, when public key cryptography first became widely available. But the Agencies do look for cooperation, enforced by law if needed, from companies abroad as well as in the UK, which are able to provide readable interception product.” The Report recommends that in the digital world as in the real world, “no-go areas” for intelligence and law enforcement should be minimised (13.7-13.14). But as concluded at 13.12: “Few now contend for a master key to all communications held by the state, for a requirement to hold data locally in unencrypted form, or for a guaranteed facility to insert back doors into any telecommunications system. Such tools threaten the integrity of our communications and of the internet itself.
Far preferable, on any view, is a law-based system in which encryption keys are handed over (by service providers or by the users themselves) only after properly authorised requests.”

Notes for editors:

Section 7 of the Data Retention and Investigatory Powers Act 2014 (http://www.legislation.gov.uk/ukpga/2014/27/section/7/enacted) required the Independent Reviewer of Terrorism Legislation to examine: the threats to the United Kingdom; the capabilities required to combat those threats; the safeguards to protect privacy; the challenges of changing technologies; and issues relating to transparency and oversight; and to report to the Prime Minister on the effectiveness of existing legislation relating to investigatory powers, and to examine the case for a new or amending law. This Report is a result of his work on those issues.

David Anderson Q.C. is a barrister practising from Brick Court Chambers in London, a Visiting Professor at King’s College London, a Judge of the Courts of Appeal of Guernsey and Jersey and a Bencher of the Middle Temple. He is an experienced advocate in the European Court of Human Rights and in the Court of Justice of the EU: http://www.brickcourt.co.uk/people/profile/david-anderson-qc. He has served on a part-time basis since 2011 as the Independent Reviewer of Terrorism Legislation, reporting in that capacity to the Home Secretary, to the Treasury and to Parliament on the operation of the UK’s anti-terrorism laws.

Contact: For more information about the Independent Reviewer of Terrorism Legislation and for a full copy of the Report, please go to https://terrorismlegislationreviewer.independent.gov.uk or contact his clerk, kate.trott@brickcourt.co.uk. You can also follow David on Twitter: @terrorwatchdog
  • AppNexus has launched a new programme which it claims allows advertisers to say “goodbye to the black box” and use their data more effectively. The ad tech company has today announced the launch of AppNexus Programmable Bidder (APB), enabling buyers to “bring their own algorithms” and plug them directly into the firm’s infrastructure. Available at first to only a handful of clients, the product will eventually be opened up to all clients with their own data sets. Advertisers will set the parameters for the campaign, while AppNexus will manage bidding, reporting and engineering. According to AppNexus co-founder Brian O’Kelley, this kind of “real-time algorithmic bidding” will “revolutionise” marketing by allowing advertisers to “refine and adapt”. Speaking to M&M Global, Catherine Williams, chief data scientist at AppNexus, said the scheme offers a “middle way” for brands with data science processes in place, but without the funds to build their own bidding infrastructure. “[Advertisers] have either had to take their data to a third party and plug it into a black box, and the third party promise to use the data to get the best results, or build their own infrastructure, which is an enormous engineering and maintenance project,” said Williams. Full article available at http://bit.ly/1Fd3wI9
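The "bring your own algorithm" idea can be sketched as a simple separation of concerns: the advertiser supplies a valuation function over impression features, and the platform turns that value into a capped bid and handles the mechanics. Function names, field names, and numbers here are hypothetical illustrations, not AppNexus's actual API.

```python
# Hypothetical sketch: the platform wraps an advertiser-supplied
# valuation function, handling caps and bidding mechanics itself.
def make_bidder(valuation_fn, max_bid_cpm):
    def bid(impression):
        value = valuation_fn(impression)       # advertiser's own model
        return min(max(value, 0.0), max_bid_cpm)  # clamp to [0, cap]
    return bid

# Example advertiser logic built on first-party data: pay more for
# known high-value segments, nothing for everyone else.
def my_valuation(impression):
    segment_values = {"loyal_customer": 4.50, "cart_abandoner": 2.75}
    return segment_values.get(impression.get("segment"), 0.0)

bidder = make_bidder(my_valuation, max_bid_cpm=3.00)
print(bidder({"segment": "cart_abandoner"}))   # 2.75
print(bidder({"segment": "loyal_customer"}))   # capped at 3.0
```

This is the "middle way" Williams describes: the data science stays with the brand, while the bidding infrastructure does not have to be built in-house.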
  • There is a growing and much needed debate on the role that algorithms and machine-driven decision making play in our lives. The use of “Black Box” assessments of individuals to determine what kind of financial product to offer is raising legitimate concerns about discrimination and unfair practices. Why are some people targeted for a high-cost payday loan or a higher-interest credit card, for example, while others receive better terms and conditions? The answer to such questions can be found inside the “Black Box,” in which there are very clear objectives from businesses designed to effectively use all the new power they now have due to the merging of “Big Data” technologies with our always-connected (and data-generating) online way of life. Corporations have armed themselves with the latest tools to harvest and analyze the ever-growing flow of information available. Through “Data Management Platforms,” alliances with giant data brokers, and effective use of the powerful digital marketing apparatus created by Google, Facebook and many others, the financial, retail, grocery, health, education and nearly every other sector can make decisions about an individual at lightning speed. Companies can now reach us with offers in milliseconds, regardless of whether we are using a mobile phone, personal computer or other connected device. Over the next few months, we will explore the business models driving the “Black Box.” But today we will examine one crucial element that enables companies to so easily take advantage of our information to assess and influence our behavior. So-called programmatic advertising is a data-driven system that allows companies to “buy and sell” individuals in real time when they are online or using their mobile devices.
In today’s digital marketplace, our “profiles”—the information gleaned about who we are and what we do—are traded as a commodity. Pioneered in the U.S. by Yahoo, Google, Rubicon Project, AppNexus and others, this little-known system is quickly coming to dominate how online marketing and advertising operate throughout the world. Programmatic advertising relies on superfast computers that keep tabs on where we are online, so we can either be sold to the highest or special-interest bidder or labeled as someone not worth targeting at all (in ad terms, these people are classified as “waste”). Regulators in the U.S. and EU have not done a good job addressing the privacy and consumer-protection concerns raised by programmatic marketing. It’s a key area in which CDD has played a role advocating for a more responsible regulatory approach. But for now, we want to highlight some of the features of programmatic marketing by excerpting from the new “Adgorithms” IPO filing (the company just went public in the UK). We think the excerpts provide an opportunity for the public to peer inside an important part of the “Black Box” machine that is increasingly dominating our lives. Take a look, and we will return soon with a discussion. For more information on programmatic advertising, see adexchanger.com and exchangewire.com.

Excerpt: Business Overview

The Company's software, Albert, is a proprietary artificial intelligence based programmatic platform, which plans, identifies, prices and delivers relevant advertisements in multiple fields of online advertising. Using complex algorithms, historical data and artificial intelligence, Albert seeks to predict user intent and deliver advertisements that are likely to engage that particular user and result in higher engagement for the brand.
It analyses the available advertising opportunities on the advertising exchanges, decides which one of them is most relevant and ultimately determines the right price to pay for a specific impression. The advert is then displayed on the screen of the user. This whole process occurs in under a second.

Self-learning

The accuracy of Albert improves with every online advertisement it delivers, as it incorporates new data whilst continuing to learn from previous data. This ability enables Albert to adapt to changes in the marketplace in order to capitalize on opportunities and to minimize purchasing of non-effective inventory, including fraudulent advertising activity.

Understanding of consumer behavior patterns

Albert has powerful and actionable insights into consumer behavioral patterns and web properties that it can leverage, such that the Directors believe it is able to target the most relevant audience for a particular advertising campaign more effectively, and achieve KPIs set by brands more quickly and cost-effectively, than its peers. Adgorithms offers clients targeted online advertising via demographic, geographic location, time of day and behavioral characteristics.

Direct

Adgorithms works directly with companies who wish to advertise their goods and services, and also with media agencies working on their behalf to optimize advertising campaigns. At the outset of an engagement, the Company is supplied with creative materials, such as a banner or video advertisement, a pre-defined KPI to launch an advertising campaign and an advertising budget. Examples of KPIs include a set number of user click-throughs on a banner or a user watching a video advertisement for a specified period of time. The creative materials and KPIs are inputted into and processed by Albert, following which it bids for impressions in real time based on observed or predicted user intent. Albert will then optimize the performance of the campaign until the KPI is reached.
Albert does this by using its own data and also proprietary data that its clients provide (including data in relation to which users have the highest value to the client), contributing to the feedback loop. By using Albert to determine the right price to pay for a particular impression, the Company has a proven ability to maximize ROI from a client's advertising budget and reduce customer CPA. The automation of campaign management by Albert also minimises the need for human intervention, creating efficiencies and reducing labour costs for the advertiser, and particularly for media agencies (which often manage many campaigns concurrently).

ALBERT

The Company's technology solution enables online advertisers to efficiently and effectively engage and convert customers. Its solution comprises the Adgorithms software, called Albert, data assets, the feedback loop and access to display, video, mobile and social advertising inventory through the online advertising exchanges.

Overview

On a daily basis, the Company is presented with billions of opportunities to deliver an advertisement to users when advertising impressions become available through the various advertising exchanges. For each impression that becomes available, the Company has real-time software systems that recommend an advertiser's specific creatives (e.g. banners or videos) based on a prediction of the likelihood of a user engaging with an advertisement. Albert is designed to determine the most appropriate advertisement to show to the user and determines what price to pay for the advertising impression. The core of the Company's solution involves:

- determining a user's engagement with display advertisements, which is a relatively rare event that requires a large sample size of relevant data to accurately predict;
- obtaining a large sample size of relevant data, which is difficult, in particular where the most relevant data points are also the most sparse, e.g. very recent data on specific product interest; and
- building powerful, scalable and flexible systems that operate both accurately and quickly, between the time a user navigates to a page and an advertisement is delivered.

Albert is designed to continuously download data from advertising exchanges, analysing and storing it in its internal database. It then re-calibrates its prediction models so that the prediction and bidding are constantly up to date with new media sources available through the exchanges and with the ever-changing pricing and quality landscape of existing media. As those internal predictions are updated in Albert, they are propagated to the various exchanges so that customers' campaigns running in the exchanges can bid more aggressively for opportunities that are considered positive for that advertiser, and less aggressively or not at all for opportunities the software now considers less favourable. To achieve those performance goals, Albert acquires hundreds of gigabytes of data daily, containing information on impressions, engagements and conversions, and performs tens of thousands of updates every hour. It collects and analyses information on millions of online websites and mobile applications that are available for it to advertise in, evaluating the performance of each of the campaigns it manages on those websites, and based on that generates predictions for the future performance of advertising campaigns in relevant media spots. Using this feedback loop, Albert is able to choose, from the tens of billions of available opportunities daily, the hundreds of millions of impressions it predicts would be optimal for its customers.
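The feedback loop the excerpt describes (record outcomes per media source, re-calibrate the engagement prediction, bid more or less aggressively accordingly) can be sketched in miniature. The class, the smoothing scheme, and every number below are assumptions for illustration; this is not Adgorithms' actual algorithm.

```python
# Minimal sketch of a prediction-and-bidding feedback loop: engagement
# estimates per media source are updated from observed outcomes, and the
# bid for new impressions scales with the predicted engagement rate.
class FeedbackBidder:
    def __init__(self, value_per_engagement, prior=0.01, smoothing=100):
        self.value = value_per_engagement   # what one engagement is worth
        self.prior = prior                  # assumed baseline engagement rate
        self.smoothing = smoothing          # pseudo-impressions behind the prior
        self.stats = {}                     # source -> [impressions, engagements]

    def record(self, source, engaged):
        imp, eng = self.stats.get(source, [0, 0])
        self.stats[source] = [imp + 1, eng + (1 if engaged else 0)]

    def predicted_rate(self, source):
        imp, eng = self.stats.get(source, [0, 0])
        # Smoothed estimate: falls back toward the prior with little data.
        return (eng + self.prior * self.smoothing) / (imp + self.smoothing)

    def bid(self, source):
        return self.value * self.predicted_rate(source)

bidder = FeedbackBidder(value_per_engagement=2.0)
for engaged in [True, False, False, True, False]:
    bidder.record("news-site", engaged)
print(round(bidder.bid("news-site"), 4))
```

Sources that keep engaging earn higher estimates and higher bids; sources that never engage drift toward the prior and get bid down "or not at all," mirroring the aggressive/less-aggressive bidding the excerpt describes.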
    Jeff Chester