Don Marti

Sat 24 May 2014 05:16:15 AM PDT

Conventional wisdom on privacy

Just wondering if there are some alternate explanations for the things that "everyone knows" about privacy.

Users are unwilling to pay for privacy technology, therefore users don't want privacy.

Users weren't willing to pay for the anti-telemarketer products offered before Do Not Call, either, and that program got 72% of US residents to sign up.

Norms are not necessarily expressed in terms of willingness to pay. If I ask you to turn your loud music down, rather than offering to pay you to do it, it's because I believe the costs of coming into compliance with a norm should fall on the transgressor. (This is a place where Homo economicus and Homo sapiens are different.)

Users always pick convenience or entertainment features over privacy features.

Home users didn't express interest in stable desktop OSs before Windows XP and Mac OS X came along, offering fewer crashes while letting them keep their existing applications. Today, users complain about privacy problems in the same way that they used to complain about "blue screens" (MS-Windows) or "bombs" (Mac OS). They're inevitable. What the machine does is annoying, but that's just what machines do, right?

Today, it would be difficult to sell a computer as unstable as the ones sold to the home market in the 1990s. User expectations have changed.

Most advertising people are not Evil, therefore users don't need privacy from advertising.

Most companies that use email don't send spam, but that doesn't mean a user doesn't need a spam filter.

If you count heads, most advertisers are non-Evil. But more user interactions with new advertisers involve the Evil ones, because they burn through more lists as users catch on to them. (It's similar to the problem of hiring the top 1% of job applicants: the best candidates get hired and stop applying, while the worst keep sending out applications, so they dominate the applicant pool.)

Einbinder Flypaper, from the Bob and Ray Show, had the motto "The brand you've gradually grown to trust over the course of three generations." There are some trusted companies with which a user will choose to share information, but most contacts will be new and unknown.

Today's surveillance marketing is creepy because it's in an "uncanny valley." Making it creepier will make users finally like it.

The term uncanny valley was coined in robotics, but the effect is well mapped in the visual arts. We know that a photo-realistic face is not uncanny, because we can watch people's reactions to existing photographs. We know that cartoonish renderings are not uncanny, because we can watch people's reactions to existing cartoons.

But we don't have an example of extremely targeted advertising that people are comfortable with. What we do have are accidental cases where an ad appears to be highly targeted, and it creeps people the hell out.

Users can't beat the NSA or PLA, therefore privacy is pointless.

Surveillance marketing has negative externalities, mostly in the form of identity theft, stalking, and other crime risks that the industry imposes on users.

Mass-market advertising has positive externalities, in the form of valuable content that is later usable outside the ad-supported context. (I'm still reading Kurt Vonnegut stories that some Collier's advertiser paid for in the 1950s.)

So any privacy technology that tends to move ad spend from creepy to mass-market is a win for the users, even if it's not effective against extremely competent adversaries.