The following is adapted from remarks delivered by Matt Brittin, President, Google EMEA, at UBA Trends Day in Brussels, on data, ethics, and privacy by design.
I first accessed the internet in 1989 — the same year Tim Berners-Lee invented the World Wide Web.
Bright text on dark backgrounds listing links to other pages of text listing more links. Thirty years later, it’s something many of us take for granted. Half of humanity is online, using tools that we could never have dreamed of. It’s open, affordable and would have seemed magical to me as a student.
But we’re running it all on a rule book that’s twenty years out of date; delight with the magic is tempered by concerns about how our data is used, and by fears of technology being used for ill rather than for good.
A century of advertising
It’s often helpful to make sense of the future by understanding the past. Throughout history, advertising has helped make all kinds of media content affordable and accessible.
About a century ago, as the global middle class was growing, modern businesses could reach potential customers at an undreamed-of scale. But reaching all those people — without knowing how many of them might be interested in your product — was expensive and inefficient.
Modern newspapers came up with ‘the bundle’ — ad space sold in specific sections like ‘Auto’ or ‘Fashion’, so that car companies could communicate directly to readers interested in cars, and coat sellers to readers interested in fashion.
Mass-market magazines found ways to target diverse interests — creating titles or sections specifically for gardeners, those interested in the natural world, or science fiction fans.
And broadcasters developed increasingly differentiated ‘genre entertainment’ — a novel form that helped advertisers segment and reach viewers based on assumptions about who was watching.
All of these inventions benefited our everyday lives — bringing us our favorite magazines, TV shows or newspapers. And with measurement and data, advertisers were reassured that they were getting value from the exercise too. Ads have long funded our favorite content, and they’ve always been targeted.
Preparing for the future
That’s what made Google Search possible. It’s free to use because it’s funded by advertising that doesn’t depend on knowing anything about you, just the fact that you’re searching for cycling shoes in Brussels right now. That gives you advertising that’s relevant and useful — and privacy-safe.
The web has brought an explosion of content and choice. And the chance to show a different ad to people reading the same article, or watching the same show.
But the question for the web in 2022 is whether this model of advertising is good enough. With more people managing more of their lives online than ever before, the web is going through a fundamental shift. Citizens want more online privacy and control — and for the services they use to earn, and be worthy of, their trust.
That means preparing for a future without third-party cookies — by working with the industry to build and test new solutions in the Privacy Sandbox, like our latest proposal, the Topics API: proposals that make advertising on the web more private and transparent, without needing to compromise on quality or content.
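To make that concrete, here is a minimal sketch of how a page might query the Topics API in a browser that supports the Privacy Sandbox trials. The browsingTopics() call and the field names below reflect Chrome's implementation at the time of writing and may change as the proposal evolves; the fallback behaviour shown is an illustrative assumption, not a recommendation.

```typescript
// A minimal sketch of querying the Topics API from a page or ad script.
// Field names follow Chrome's current trial and may change as the proposal evolves.

interface BrowsingTopic {
  topic: number;          // ID in the public Topics taxonomy (e.g. a coarse interest like "Fitness")
  version: string;        // combined version string
  configVersion: string;
  modelVersion: string;
  taxonomyVersion: string;
}

async function getCoarseInterests(): Promise<BrowsingTopic[]> {
  // Feature-detect: browsers without the Topics API simply have no browsingTopics().
  const doc = document as Document & {
    browsingTopics?: () => Promise<BrowsingTopic[]>;
  };
  if (!doc.browsingTopics) {
    return []; // fall back to purely contextual targeting
  }
  // Returns at most a handful of coarse topics observed on sites the user
  // visited in recent weeks — never raw browsing history or a cross-site ID.
  return doc.browsingTopics();
}

getCoarseInterests().then((topics) => {
  console.log("Topics available for ad selection:", topics.map((t) => t.topic));
});
```

The shape of the return value is the point of the design: a few coarse taxonomy topics rather than an individual identifier or a record of the sites someone has visited.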
The importance of distributed computing
Now, reform also means regulation — clear tools and rules. We’re grateful to be getting a steer from regulators on a full range of issues, from cookies to online ads — and for the concern it shows for user privacy.
Of course, with increased regulation comes intense engagement. Today, some are questioning whether services like Google Analytics can be properly used in Europe under the GDPR. The concern is that because it’s run by a US-based company, Google Analytics can’t totally remove the possibility that the US government could demand access to user data.
This is a strictly hypothetical situation — because over the past 15 years, Google Analytics has never received a request of the kind speculated about in this case. While legal cases on this have only covered a few specific websites and their unique circumstances, there are others who are concerned that the same logic could be applied to any US-based provider or website — and indeed any EU-US data transfers.
Talk to anyone in the technical or security communities, and they will tell you that scaled cloud computing of the kind supporting these services makes data more secure, not less. Scale makes it easier to fight hackers, scammers and thieves — by expanding the signals needed to detect them. It’s how platforms can offer customers the greatest possible security and redundancy.
Today, Project Shield is a great example of that. It’s an advanced security technology that helps keep organizations safe from cyberattacks — particularly distributed denial-of-service attacks designed to overwhelm small organizations with a flood of fake traffic.
We use Project Shield to protect at-risk organizations across the world, like news sites, human rights organizations and election monitors. That includes Ukraine, where Project Shield is currently keeping over 150 government and news websites safe and online, and surrounding countries affected by the war, so that they can continue to provide valuable information and services to people on the ground.
Here’s the kicker: like Google Analytics, the infrastructure that enables Project Shield relies on transatlantic data flows. We’re able to absorb massive attacks against individual websites by diffusing the traffic across a global network.
The very processes that enable Project Shield — a service that is protecting news and human rights organizations across Europe — are themselves considered suspect, on the grounds that they don’t adequately protect European users’ data from the United States.
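For readers who want a feel for the mechanism, here is a deliberately simplified sketch — not Project Shield's actual implementation — of how a front line of proxies can absorb a flood of requests before it reaches a protected origin. The hostname, limits and caching policy are hypothetical placeholders, and a Node.js 18+ runtime is assumed for the built-in fetch.

```typescript
// Toy illustration of absorbing a traffic flood at the edge: abusive clients are
// throttled, repeated reads are served from cache, and only a trickle of
// legitimate traffic reaches the protected origin. Values are placeholders.

import * as http from "node:http";

const PROTECTED_ORIGIN = "http://localhost:8080"; // hypothetical origin being shielded
const MAX_REQUESTS_PER_MINUTE = 60;               // illustrative per-client budget

const requestCounts = new Map<string, { count: number; windowStart: number }>();
const cache = new Map<string, { body: string; expires: number }>();

const proxy = http.createServer(async (req, res) => {
  const client = req.socket.remoteAddress ?? "unknown";
  const now = Date.now();

  // One-minute window per client: a flood from a single source is dropped here,
  // long before it reaches the origin.
  const entry = requestCounts.get(client);
  if (!entry || now - entry.windowStart > 60_000) {
    requestCounts.set(client, { count: 1, windowStart: now });
  } else if (++entry.count > MAX_REQUESTS_PER_MINUTE) {
    res.writeHead(429).end("Too many requests");
    return;
  }

  // Serve repeated reads from cache, so identical requests cost the origin nothing.
  const key = req.url ?? "/";
  const cached = cache.get(key);
  if (cached && cached.expires > now) {
    res.writeHead(200).end(cached.body);
    return;
  }

  // Otherwise fetch once from the origin and cache the result briefly.
  try {
    const upstream = await fetch(PROTECTED_ORIGIN + key);
    const body = await upstream.text();
    cache.set(key, { body, expires: now + 30_000 });
    res.writeHead(upstream.status).end(body);
  } catch {
    res.writeHead(502).end("Origin unreachable");
  }
});

proxy.listen(3000); // in practice, many such proxies in many regions front one origin
```

Run many such proxies in different regions in front of a single site and the flood is diffused in the way the speech describes: most of the fake traffic never touches the protected server at all.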
Towards a more responsible foundation
Of course, we understand that there are concerns about U.S. surveillance overreach — and we share them. Google has spent many years lobbying for U.S. government transparency, lawful processes, and surveillance reform — and it continues to fight for protections for digital citizens outside the U.S.
We’ve done so while continuing in our belief that it is possible to advance international cooperation towards shared goals and against shared threats — and to build a future based on interests and values shared by democracies on both sides of the Atlantic.
For users, advertisers and tech, this shift towards a privacy-first internet will be a good thing.
Our studies have found that when users know that their privacy is respected, they respond with increased trust and interest. Users who feel they have control over their data are two times more likely to find content relevant, and three times more likely to react positively to advertising.
For online advertising, and the internet as a whole, this is a page-turning moment. We’re getting tools and rules. Legal clarity. Codes of practice. And a regulatory dialogue. A new future of advertising is coming: one that puts privacy front and center.
Source: Matt Brittin on data, ethics, and privacy by design