Privacy pundits ask nanny to leash online tracking big brothers
15 November, 2012 15:19
Opting out of targeted behavioural advertising is a dud choice, say two leading privacy researchers, who are urging lawmakers in Europe to enforce Do Not Track (DNT) browser preferences.
As of last week’s Chrome 23 release, all major browsers -- Internet Explorer, Chrome, and Firefox -- ship with a DNT feature, meaning the browser can send an HTTP header, “DNT: 1”, to websites to signal a user’s preference not to be tracked.
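The header itself is trivially simple to act on. As a minimal sketch (not from the article, and using an illustrative helper name), a server that wanted to honour the preference could check incoming request headers like so:

```python
def tracking_allowed(headers):
    """Return False when the client sent the 'DNT: 1' header.

    `headers` is assumed to be a dict-like mapping of HTTP header
    names to values, as most web frameworks expose.
    """
    return headers.get("DNT", "").strip() != "1"

# Hypothetical request headers, for illustration only.
print(tracking_allowed({"DNT": "1"}))  # user opted out of tracking
print(tracking_allowed({}))            # no preference expressed
```

The catch, as the researchers note, is that nothing compels a site to run such a check -- the header is purely advisory unless regulators enforce it.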
The technology-enabled policy tool promises a simple way for consumers to tell any internet service that they do not want to be tracked, meaning Facebook, Twitter and Google users should be able to tell the respective networks that they do not want visits to be tracked on pages containing Google+, Like and Tweet buttons.
As it stands, however, users need only visit a page containing these buttons in order to be tracked -- which may be harmless on many sites, but is far less so on, say, health-related ones.
Without enforcement, DNT plays second fiddle to the lesser choice of opting out of the self-regulated online behavioural advertising (OBA) industry, Stanford University privacy researcher Arvind Narayanan and Claude Castelluccia from France’s INRIA argue in a pitch to European policy makers on behalf of the European Network and Information Security Agency (ENISA).
“Much of the debate today focuses on OBA instead of tracking. For example, some advertising companies interpret DNT as an opt-out of targeted behavioural ads, linking DNT to the industry self-regulatory programme,” the pair write.
“Tracking is the problem – not behavioural advertising. DNT should be interpreted as a request for not being tracked by third parties, either directly or with the help of first parties.”
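The distinction the pair draw can be made concrete. In a hedged sketch (the function and field names are illustrative, not from any real ad-serving system), the industry's narrow reading of DNT still collects data, while the researchers' reading stops collection altogether:

```python
def handle_request(dnt_header, interpretation):
    """Contrast two readings of 'DNT: 1' (illustrative only)."""
    if dnt_header != "1":
        # No preference expressed: business as usual.
        return {"collect_data": True, "targeted_ads": True}
    if interpretation == "industry":
        # Industry self-regulatory reading: stop *targeting* ads,
        # but keep collecting behavioural data.
        return {"collect_data": True, "targeted_ads": False}
    # Researchers' reading: stop third-party tracking entirely.
    return {"collect_data": False, "targeted_ads": False}
```

Under the industry reading the user stops seeing targeted ads but is still tracked; under the researchers' reading the data collection itself stops, which is the outcome they argue DNT should guarantee.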
Narayanan was one of the first researchers to tackle a common claim by online companies that collecting and sharing anonymised data about users posed little risk to them. He proved that wrong by re-attaching identities to anonymised data that Netflix had released in a competition that crowdsourced improvements to its movie recommendation algorithm.
User education could help, they argue, but it is weakened by users’ expectation of “free” services and muddied by the likes of Google, which continues to “obfuscate the distinction between online tracking and behavioural advertising”.
The pair argue that policy makers should demand more meaningful privacy policies and work towards control tools that help users understand how their data is being used.
Government could also employ fingerprinting tools to monitor violations of user preferences, and address the offshoring problem by preventing local ‘first parties’ from doing business with non-compliant offshore ‘third parties’.