‘Pro-privacy’ third-party cookie replacement not actually great for privacy • The Register


With the arrival of Google Chrome v89 on Tuesday, Google is preparing to test a technology called Federated Learning of Cohorts, or FLoC, that it hopes will replace increasingly shunned, privacy-denying third-party cookies.

Bennett Cyphers, staff technologist at the Electronic Frontier Foundation, argues FLoC is “a terrible idea,” and urges Google to refocus its efforts on building a web that serves the people who use it.

“Instead of re-inventing the tracking wheel, we should imagine a better world without the myriad problems of targeted ads,” he said in a blog post.

Third-party cookies – files slipped into web surfers’ browsers containing identifiers and other data that marketers use to track people across different websites for the sake of ad targeting and analytics – turn out to be pretty bad for privacy. But because they’re lucrative for the ad business, Google and other companies have been trying to come up with alternative tracking tech that passes muster with privacy regulators.

FLoC is one component of Google’s so-called Privacy Sandbox, a set of technical proposals that Google and other ad tech firms have been working on to replace third-party cookies, which Google last year said it would stop supporting after competing browser makers began blocking them by default.

“A browser with FLoC enabled would collect information about its user’s browsing habits, then use that information to assign its user to a ‘cohort’ or group,” explains Cyphers. “Each user’s browser will share a cohort ID, indicating which group they belong to, with websites and advertisers.”
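The real FLoC algorithm used a SimHash-based clustering so that similar browsing histories landed in the same cohort; the toy sketch below only shows the shape of the idea Cyphers describes – the browser reduces a browsing history to a single cohort ID, and only that ID is shared with sites. The cohort-space size here is an assumption for illustration:

```python
import hashlib

NUM_COHORTS = 2 ** 16  # hypothetical cohort-space size, chosen for illustration

def assign_cohort(visited_domains: list[str]) -> int:
    """Toy stand-in for FLoC's clustering: hash the deduplicated, sorted
    list of visited domains into one of NUM_COHORTS buckets. FLoC's actual
    scheme (SimHash) maps *similar* histories to the same cohort; plain
    hashing does not, but the interface is the same: history in, ID out."""
    history_key = "|".join(sorted(set(visited_domains)))
    digest = hashlib.sha256(history_key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_COHORTS

# The browser exposes only the cohort ID to websites, never the history itself.
cohort_id = assign_cohort(["news.example", "shoes.example", "travel.example"])
```

The key property, as the article notes, is that advertisers see the group label rather than the individual history – the privacy argument then turns on how much the label itself reveals.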

Selling by the group

FLoC cohorts are intended to represent large, general interest groups; more specific behavioral ad targeting – narrowing FLoC groups down further – will be handled by related schemes such as Turtledove and other proposals, all with bird-themed names, that are also due for testing soon.

Google’s own engineers acknowledge that FLoC’s privacy story remains half-baked, noting for example that FLoC currently “creates a new fingerprinting surface in that it provides the same value across sites.”
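The fingerprinting concern is arithmetic: a cohort ID that is identical on every site hands each site the same extra identifying bits, shrinking the set of users a tracker must distinguish between. A back-of-the-envelope sketch, with invented numbers for the population and cohort-space size:

```python
import math

# Hypothetical figures for illustration only.
population = 3_000_000_000   # assumed number of browsers on the web
num_cohorts = 2 ** 16        # assumed cohort-space size

# A value that is stable across sites contributes log2(num_cohorts) bits
# to any fingerprint a tracker is building.
bits_from_cohort = math.log2(num_cohorts)

# Users indistinguishable by cohort ID alone:
anonymity_set = population / num_cohorts
```

With these assumed numbers, the cohort ID alone narrows three billion users to a crowd of tens of thousands; combined with other stable signals (user agent, screen size, fonts), those extra bits can be the difference between a crowd and a unique individual.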

Even so, Google’s self-assessment of FLoC concluded that its implementation adequately addresses security and privacy concerns, at least to the extent that an Origin Trial can begin.

Or nearly: FLoC’s Origin Trial – the mechanism Google uses to expose experimental web features for testing – was slated to begin on March 2, but remains gated behind a flag due to an unresolved bug.

Nonetheless, other technically oriented types still have unanswered questions and concerns about FLoC. For example, two years ago, Steven Englehardt, a privacy engineer at Mozilla, opened a GitHub issue challenging Google’s privacy claims.

While FLoC’s use of groups as a form of anonymity (k-anonymity) may make it more difficult to identify an individual user from a FLoC cohort designation, that doesn’t necessarily prevent the exposure of personal information.

“How personal a piece of information is does not depend on the number of people that share that attribute,” wrote Englehardt. “K-anonymity does nothing to help provide this property.”

As an example, Englehardt cited how more than 30m Americans have diabetes. “While we’re very unlikely to re-identify any of those users based solely on the knowledge that they have diabetes, I suspect we can agree that nearly every individual in this group would not want this information used by advertisers,” he said.
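Englehardt’s point can be made concrete: a cohort can be perfectly k-anonymous while still revealing a sensitive attribute shared by every one of its members. A toy sketch, with invented users and an invented cohort size:

```python
# Invented data: a large cohort whose members all share one sensitive attribute.
cohort = [{"user": f"user-{i}", "condition": "diabetes"} for i in range(50_000)]

# k-anonymity holds: each member is hidden among k - 1 others.
k = len(cohort)

# But the attribute is uniform across the cohort...
shared_attributes = {member["condition"] for member in cohort}

# ...so anyone who learns what the cohort label *means* learns the
# sensitive attribute of all k members. Identity is protected;
# the group-level fact is not.
```

This is exactly the gap Englehardt describes: k-anonymity bounds re-identification risk, but says nothing about whether the attribute defining the group is one its members would want advertisers to use.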

The eternal question

That issue remains unresolved. As Cyphers points out, Google acknowledges this risk in its FLoC explainer on GitHub and concludes that it is at least better than the current situation.

“Sites that know a person’s PII (e.g., when people sign in using their email address) could record and reveal their cohort,” the repo’s README file states. “This means that information about an individual’s interests may eventually become public. This is not ideal, but still better than today’s situation in which PII can be joined to exact browsing history obtained via third-party cookies.”

Cyphers argues that even if Google can address these risks, targeted advertising itself is fundamentally at odds with civil liberties.

“The power to target is the power to discriminate,” he writes. “By definition, targeted ads allow advertisers to reach some kinds of people while excluding others. A targeting system may be used to decide who gets to see job postings or loan offers just as easily as it is to advertise shoes.”

Google, he says, believes it can monitor FLoC to keep sensitive categories of data out – which doesn’t address the possibility that innocuous data categories can be used as proxies for sensitive ones. Zip codes, for example, have been used to target racial groups.
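The proxy problem is mechanical rather than malicious: filtering on an innocuous attribute that correlates with a sensitive one reproduces the sensitive targeting without ever naming it. A sketch with invented zip codes and demographic shares:

```python
# Invented, illustrative data: share of a protected group per zip code.
group_share_by_zip = {"90001": 0.85, "90210": 0.05, "10001": 0.40}

# The advertiser's filter never mentions the protected group at all;
# it only selects zip codes.
targeted_zips = [z for z, share in group_share_by_zip.items() if share > 0.5]

# Yet the audience reached is overwhelmingly that group: the innocuous
# attribute (zip code) has acted as a proxy for the sensitive one.
```

Screening cohorts for explicitly sensitive categories, as Google proposes, would not catch this: the cohort’s defining signal looks harmless in isolation.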

Cyphers also expresses alarm at the prospect of Google policing FLoC groups by auditing them for the use of sensitive data such as race, gender, religion, age, health, and financial status, noting that the company has already been accused of allowing advertisers to discriminate against people.

“That is not the world we want, nor the one users deserve,” Cyphers concludes. “Google needs to learn the correct lessons from the era of third-party tracking and design its browser to work for users, not for advertisers.” ®
