Highlights from Gerd Gigerenzer’s interview with Russ Roberts.

On the public’s concern about online privacy:

And, as you hinted before, there’s the so-called Privacy Paradox, which is that, in many countries, people say that their greatest concern about their digital life is that they don’t know where the data is going and what’s done with it.

If that’s the greatest concern, then you would expect that they would be willing to pay something. That’s the economic view. […]

[…] Germany is a good case. Because in Germany, we had the East German Stasi. We had another history before that—the Nazis, who would have enjoyed such a surveillance system.

And, so Germans would be a good candidate for a people who are worried about their privacy and would be willing to pay. […]

I have done three surveys since 2018, the last one this year, with a representative sample of all Germans over 18. And I asked them the question: ‘How much would you be willing to pay for all social media if you could keep your data?’

We are talking about the data about whether you are depressed, whether you’re pregnant, and all those things that they really don’t need.

So: ‘How much are you willing to pay to get your privacy back?’

75% of Germans said nothing. Not a single Euro. […]

So, if you have that situation where people say, ‘My greatest worry is about my data’; at the same time, ‘No, I’m not paying anything for that,’ then that’s called the Privacy Paradox.

The public’s concern about surveillance is similar to the concern about the environment: the public understands the problem, but doesn’t really care.

I believe most people fake their concerns about surveillance and environmental decay because that’s what they are expected to do in polite company. The public shows its true colors once it has to expend resources on solving the problem instead of merely virtue signaling.

Gerd Gigerenzer made another great point about surveillance; we get our citizens started early these days:

I think there’s already surveillance in a child’s life. Remember Mattel’s Barbie? The first Barbie was modeled after a cartoon in the German tabloid Bild-Zeitung, and it had totally unrealistic long legs and a tailored figure. The result was that quite a few little girls found their bodies not right. In 1998, the second version of Barbie could briefly talk, uttering sentences like, ‘Math is hard. Let’s go shopping.’

The little girls got a second message: they’re not up to math; they are consumers. And the 2015 generation, called Hello Barbie, which got the Big Brother Award, can actually hold a conversation with the little girl. But the little girl doesn’t know that all the hopes and fears and anxieties she entrusts to the Barbie doll are recorded and sent off to third parties, analyzed by algorithms for advertising purposes.

And also, the parents can buy the recordings on a daily or weekly basis to spy on their child.

Now, two things may happen, Russ. One is the obvious: maybe when the little girl is a bit older, she will find out, and her trust in her beloved Barbie doll, and maybe in her parents, is gone.

But, what I think is the even deeper consequence is: the little girl may not lose trust. The little girl may think that being surveilled, even secretly, that’s how life is.

And so, here is another dimension: the potential of algorithms for surveillance changes our own values. We are no longer so concerned about privacy. We still say we are concerned, but not really. And then we’ll get a new generation of people.

There are already plenty of scary stories, like this one: Google falsely told the police that a father was molesting his son.

Despite all this, I’m still running most of my digital life on Google’s infrastructure. I must make a move.