Technology Review, July/August 2012

The Value of Privacy
not always. Some disclosures seem like
bombshells to you (“I’m getting a divorce”)
but produce only virtual cricket chirps from
your social network. And yet seemingly
insignificant communications (“Does my
butt look big in these jeans?”) can produce
a torrent of responses. Behavioral scientists
have a name for this dynamic: “intermittent
reinforcement.” It’s one of the most power-
ful behavioral training techniques we know
about. Give a lab rat a lever that produces a
food pellet on demand and he'll only press
it when he's hungry. Give him a lever that
produces food pellets at random intervals,
and he'll keep pressing it forever.
How does society get better at preserving
privacy online? As Lawrence Lessig pointed
out in his book Code and Other Laws of
Cyberspace, there are four possible mecha-
nisms: norms, law, code, and markets.
So far, we’ve been pretty terrible on all
counts. Take norms: our primary normative
mechanism for improving privacy decisions
is a kind of pious finger-wagging, especially
directed at kids. “You spend too much time
on those Interwebs!” And yet schools and
libraries and parents use network spyware
to trap every click, status update, and IM
from kids, in the name of protecting them
from other adults. In other words: your privacy is infinitely valuable, unless we're violating it. (Oh, and if you do anything to get around our network surveillance, you're in trouble.)
What about laws? In the United States,
there’s a legal vogue for something called
“Do Not Track”: users can instruct their
browsers to transmit a tag that says, “Don’t
collect information about my use.” But there’s
no built-in compliance mechanism—
we can’t be sure it works unless auditors
descend on IT giants’ data centers to ensure
they aren’t cheating. In the EU, they like the
idea that you own your data, which means
that you have a property interest in the
facts of your life and the right to demand
that this “property” not be misused. But
this approach is flawed, too. If there’s one
thing the last 15 years of Internet policy
fights have taught us, it’s that nothing is
ever solved by ascribing property-like rights
to easily copied information.
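Mechanically, the “Do Not Track” tag is nothing more than one extra HTTP header on each request, which is why compliance is so hard to verify: nothing in the protocol stops a server from reading the header and tracking you anyway. A minimal sketch in Python (the URL is a placeholder):

```python
import urllib.request

# "Do Not Track" is just one extra header; honoring it is entirely
# up to the server on the other end.
req = urllib.request.Request(
    "https://example.com/",
    headers={"DNT": "1"},  # 1 = "please do not track me"
)

# urllib normalizes header names to Capitalized form internally.
print(req.get_header("Dnt"))  # -> 1
```

The asymmetry is visible right there: the client's entire side of the bargain is four bytes, and everything else depends on the server's good faith.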
There’s still room for improvement—and
profit—in code. A great deal of Internet-
data harvesting is the result of permissive
defaults on how our browsers handle cook-
ies, those bits of code used to track us. Right
now, there are two ways to browse the Web:
turn cookies off altogether and live with the
fact that many sites won’t work; or turn on
all cookies and accept the wholesale extrac-
tion of your Internet use habits.
Browser vendors could take a stab at the
problem. For a precedent, recall what hap-
pened to pop-up ads. When the Web was
young, pop-ups were everywhere. They’d
appear in tiny windows that re-spawned
when you closed them. They ran away
from your cursor and auto-played music.
Because pop-ups were the only way to com-
mand a decent rate from advertisers, the
conventional wisdom was that no browser
vendor could afford to block pop-ups by
default, even though users hated them.
The deadlock was broken by Mozilla,
a nonprofit foundation that cared mostly
about serving users, not site owners or
advertisers. When Mozilla’s Firefox turned
on pop-up blocking by default, it began to
be wildly successful. The other browser ven-
dors had no choice but to follow suit. Today,
pop-ups are all but gone.
Cookie managers should come next.
Imagine if your browser loaded only cook-
ies that it thought were useful to you, rather
than dozens from ad networks you never
intended to interact with. Advertisers and
media buyers will say the idea can’t work.
But the truth is that dialing down Internet
tracking won’t be the end of advertising.
Ultimately, it could be a welcome change
for those in the analytics and advertising
business. Now it seems as if everyone gets to
slurp data out of your computer, regardless
of whether the service is superior. Once the
privacy bargain takes place without coer-
cion, good companies will be able to build
services that get more data from their users
than bad companies.
For mobile devices, we’d need more
sophisticated tools. Today, mobile-app
marketplaces present you with take-it-or-
leave-it offers. If you want to download that
Connect the Dots app to entertain your kids
on a long car ride, you must give the app
access to your phone number and location.
What if mobile OSes were designed to let
their users instruct them to lie to apps?
“Whenever the Connect the Dots app wants
to know where I am, make something up.
When it wants my phone number, give it
a random one.” An experimental module
for CyanogenMod (a free/open version of
the Android OS) already does this. It works
moderately well but would be better if it
were officially supported by Google.
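The idea itself is easy to sketch. Everything below is illustrative: the class and method names are invented for this example and are not the CyanogenMod module's actual API.

```python
import random

# Pretend values the OS would normally hand to an app.
REAL_LOCATION = (51.5074, -0.1278)   # a fixed GPS coordinate
REAL_NUMBER = "+442071234567"        # a fixed phone number

class PrivacyShim:
    """Sits between an app and the OS; lies on the user's behalf."""

    def __init__(self, lie=True):
        self.lie = lie

    def get_location(self):
        if self.lie:
            # Plausible but meaningless coordinates, fresh each call.
            return (random.uniform(-90.0, 90.0),
                    random.uniform(-180.0, 180.0))
        return REAL_LOCATION

    def get_phone_number(self):
        if self.lie:
            return "+1" + "".join(random.choice("0123456789")
                                  for _ in range(10))
        return REAL_NUMBER

shim = PrivacyShim(lie=True)
lat, lon = shim.get_location()     # random, not the real fix
number = shim.get_phone_number()   # random, not the SIM's number
```

The app still gets well-formed answers and keeps working; it just can no longer tell whether the answers are true. That is the whole trick.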
Far from destroying business, letting
users control disclosure would create
value. Design an app that I willingly give
my location to (as I do with the Hailo app
for ordering black cabs in London) and
you’d be one of the few and proud firms
with my permission to access and sell that
information. Right now, the users and the
analytics people are in a shooting war, but
only the analytics people are armed. There’s
a business opportunity for a company that
wants to supply arms to the rebels instead
of the empire.
Cory Doctorow is a science fiction author, activist,
journalist, and co-editor of Boing Boing.