
Meet The Woman Who Did Everything In Her Power To Hide Her Pregnancy From Big Data

The server room at Facebook’s data center in Prineville, Ore. CREDIT: AP PHOTO/FACEBOOK, ALAN BRANDT

Janet Vertesi, assistant professor of sociology at Princeton University, had an idea: would it be possible to hide her pregnancy from big data? Thinking about technology — the way we use it and the way it uses us — is her professional life’s work. Pregnant women, she knew, are a marketing gold mine; a pregnant woman’s marketing data is worth 15 times as much as the average person’s. Could Vertesi, a self-declared “conscientious objector” to Google ever since 2012, when the company told users it would read every email and chat, navigate all the human and consumer interactions having a baby requires and keep big data from ever finding out?

Here’s what she found: hiding from big data is so inconvenient and expensive that even Vertesi doesn’t recommend it as a lifestyle choice. (She presented her findings at the Theorizing the Web conference in New York last week.) So what does that mean for companies that say users can just “opt out” if they aren’t happy with (so-called) privacy policies? Can you be a person on the internet without sacrificing all your data to the Google Powers That Be? I talked to Vertesi about her experiment, its implications, and why hiding from big data can make you look like a criminal.

What’s the origin story of this experiment? How did you decide to try to hide your pregnancy from big data?

It was really with that [New York Times] story from last year about Target finding out that girl was pregnant before her father did. It was this captivating story. I knew there was this growing online collection of trackers and widgets and cookies. But what I didn’t know was how much that was happening in stores as well. So it really became an experiment in big data and how that hits the ground on a personal level. There is this rapid rise of a technologized industry to track, collect, analyze and identify. And I just wanted to see: what would it take to not be detected? Could I do it?
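
The “trackers and widgets and cookies” Vertesi mentions are mostly invisible, but they are easy to surface. The rough Python sketch below is my illustration, not part of her experiment: it fetches a single page and lists the third-party domains its HTML pulls resources from, each of which gets a chance to set cookies and log the visit. The URL is just a placeholder; swap in any site you want to inspect.

```python
# Rough sketch: list the third-party domains one page pulls resources from.
# Purely illustrative; the page used here is only an example.
import re
import urllib.request
from urllib.parse import urlparse

PAGE = "https://www.example.com/"  # swap in any site you want to inspect
first_party = urlparse(PAGE).hostname

with urllib.request.urlopen(PAGE) as response:
    html = response.read().decode("utf-8", errors="replace")

# Find every src/href in the HTML that points at an absolute URL.
resource_urls = re.findall(r'(?:src|href)=["\'](https?://[^"\']+)', html)

third_parties = sorted(
    {urlparse(url).hostname for url in resource_urls} - {first_party, None}
)

print(f"{PAGE} loads resources from {len(third_parties)} other domains:")
for domain in third_parties:
    print("  ", domain)
```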

How involved were you in social media before this experiment began? Were you an “Instagram all the brunches” kind of user, or a “just wish someone happy birthday on Facebook” type?

I’ve been on Facebook for about ten years now, [since] it first opened [only] to the Ivies. I got off Google two years ago, when they changed their privacy policies. So I don’t use any Google products. I was already a conscientious objector to Google… But I’m very active online… I’m not fearful of that technology; it was really an experimental thing. What would it look like to try to do it? And what would it look like to do it without walking away from something like Facebook or Twitter? It would be maybe too easy to just shut down my Facebook account. I don’t think I would have learned what I learned.


And most people, for purposes of jobs or a social life, don’t have the luxury of opting out. I have a friend who says that, at a certain point, opting out of all of these technologies is like opting out of life.

For a lot of us, things like LinkedIn and Facebook and Instagram and Twitter, they’re part of our professional network. It’s fine and good to say “If you don’t like it, get off LinkedIn.” But if you’re a contract worker, LinkedIn is how you get your employment. For me, so much of my colleagues’ social interaction happens on Facebook that participating in that virtual interaction is a very important part of the social ties for my professional community. So that made it not really preferable to get off… I do think, to a certain extent, you can make decisions about which services you want to be involved in.

Talk me through the process of your experiment. How do you go about hiding your pregnancy from big data?

[My husband and I] decided, first of all, that we’d have to be careful about what we said on social media. [We] also asked our friends and family to be careful. It’s not just about what you say; it’s what your friends say and whether or not they tag you. So we called everyone to say we’re really excited, we have this news, but please don’t put it online. We explained the experiment and said, please don’t put it on Facebook. Because Facebook is the most immediate offender for data collection.

It’s kind of like the old days of the internet, when there wasn’t the whole layer of trackers and sites that could tell who you were and what you’re doing. Tor made me feel safe.

We ordered everything baby-related on Tor. I’ve used a lot of browser plugins and software in my career. A lot of people just asked if I downloaded an ad blocker. But I wasn’t worried about the ads; I was worried about the data collection that fuels the advertising. If I had an ad blocker, I wouldn’t be able to see what the internet knew about me. So we used a traceless browser for baby things. Everything else, I did on my normal browser. We got everything in cash that we could. We’d do research online, using Tor, and then go out and buy things in cash in person. Some purchases we did make online, through Amazon, so we set up an Amazon account from a private email account and had everything delivered to a local locker in Manhattan, so it wasn’t associated with our address. We stocked it with Amazon gift cards that we bought with cash. So we did those kinds of things to draw a distinction between our online lives and our offline lives.
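
Vertesi’s “traceless browser” was the Tor Browser, which does its routing automatically. As a rough illustration of what Tor actually changes, here is a minimal Python sketch (not part of her setup) that sends a request through a locally running Tor client’s SOCKS proxy and compares the address a site sees with and without it. It assumes Tor is already running on its default port 9050 and that the requests library is installed with SOCKS support (pip install "requests[socks]").

```python
# Minimal sketch: route a request through Tor's local SOCKS proxy so the
# destination sees an exit-relay address instead of your own.
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # "socks5h" resolves DNS through Tor too

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# Compare the address a website sees with and without Tor in the path.
direct_ip = requests.get("https://api.ipify.org").text
tor_ip = session.get("https://api.ipify.org").text

print("Address seen without Tor:", direct_ip)
print("Address seen through Tor:", tor_ip)
```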

It’s so funny to hear someone talk about Tor to do something as cute-sounding as “buying baby stuff.” Usually when I hear Tor, I think of drug dealers and people making illegal purchases in Bitcoin.

Tor is fantastic. It’s kind of like the old days of the internet, when there wasn’t the whole layer of trackers and sites that could tell who you were and what you’re doing. And with respect to privacy, that was really important to me. Tor made me feel safe. I think that’s really important. It’s a funny thing to say, because people always associate Tor with the dark web, but that’s not all it is. Tor can be used for a lot of different activities that are not illicit.

Typical dark internet user, probably. CREDIT: Shutterstock/Sergey Mironov

How difficult was it to hide your pregnancy from big data, compared to other people you know who have been pregnant and not attempted this kind of experiment?

It was so much work. I didn’t expect it to be as hard as it was. It was extremely impractical and very inconvenient, which revealed to me how convenient everyone has made the process of tracking. The notion becomes: it’s so inconvenient not to be tracked, so why would you even try?


It was expensive. If you’re avoiding things like loyalty cards in stores, you’re missing discounts. Buying things in cash in Manhattan, there’s the Manhattan markup to think about. It was inconvenient and not cost-effective. And it was incredibly discomfiting socially, because it was difficult to maintain some regular interactions on social media, on Facebook, without being nervous about being outed. Just wishing someone “happy birthday” on their wall could mean they’d say “congrats on the pregnancy” back on mine. And to that end, even though I was on social media, I found myself heavily censoring the stuff I did and said, because of that concern.

It was so much work. I didn’t expect it to be as hard as it was. It was extremely impractical and very inconvenient.

And finally, it was disconcerting, because the kinds of things we were doing, taken in the aggregate, look like we were up to no good. Who else is on Tor every day and pulling out cash all over the city and taking out enormous gift cards to buy a stroller? It’s the kind of thing, taken in the aggregate, that flags you in law enforcement systems. Fortunately, we never had the FBI show up at our door. But you start noticing the lengths, the extremes you have to go to to try to not be tracked. It puts you in a very, very discomfiting position. So I wouldn’t recommend it.

What were the reactions of your friends and family to the experiment?

I’m a big tech-head, so I don’t think anyone thought it was strange that I would try such a thing… I’m always trying to think of ways to help people think critically about technology, to unpack it… What are the values built into this system? What kind of things does this system assume about us as users, and what kind of things is it getting wrong? The technique is called infrastructural inversion: taking an infrastructure that pervades your everyday life and trying to make it visible… There are lots of infrastructures in our lives: water, gas, the internet. A lot of us assume it all just kind of happens magically and we don’t even think about it. The point of this technique is to get you to think about it.

So I don’t think any of my family or friends were upset. What some of them were was a little confused about what we meant… I said, don’t put it on Facebook, but I still got Facebook messages and chats from friends and family members… What was strange for me was them thinking that that didn’t qualify as “on Facebook.” And I had to email them immediately and say: please don’t put this on Facebook. And they’d say, “That’s why we didn’t put it on your wall.” And don’t you realize, every interaction you have on this platform is being tracked, is being watched, is being analyzed to better serve advertisers? How could you assume that just because this message happened in a private chat window, the servers aren’t recording it?

For a lot of people, there was kind of an “aha!” moment.

You said you quit all Google products two years ago. What was the breaking point for you?

When Google knew I was engaged before anybody else did, that did it for me.

Wait, what?!? How did that happen?

Google reads your email, reads your chats. It knows what you’re searching for. It sees you when you’re sleeping and knows when you’re awake. And the server is economically incentivized to remember. The way to make money on the internet these days is to get people to exchange personal information for free, and you get them to do that by making them think they’re just interacting with the service: sending an email or searching or chatting with a friend. But there’s this underlying architecture there.

Google reads your email, reads your chats. It knows what you’re searching for.

Google just updated their privacy policy to explicitly state that they read your emails and your chats to better serve you targeted advertising. The final straw with Google was when they changed the privacy policy in 2012 so that they could aggregate information about you across all these platforms. And I know that people behave differently in different contexts. That is a fact of social life. And to prohibit people from doing that is deeply problematic for social relations. So for me, it was an act of protest, and now it’s that life is better this way.


[This experiment] was one of the first times that I thought about what it would take to opt out of collection. Because you hear all the time: if people don’t like it, they’ll stop using the service. But people don’t stop using the service. And I know a lot of people really don’t like it, and it’s not just that they’re upset because Facebook made some change to its layout. I think the deep, underlying reason people are uncomfortable is how these interactions are being tracked. They don’t like being stalked by a pair of shoes they looked at once on the internet two years ago.

It’s interesting to see the contrast between how most people react to the idea that a corporation like Google or Facebook is reading all their chats and emails versus the reaction to the recent revelations about the NSA’s violations of privacy. It’s just this amazing branding: people want to like Google, because of the Google doodles and the ping-pong at lunchtime and the “Don’t Be Evil,” and people want to hate the NSA because it’s this government agency that’s stalking us. Why do you think people are more accepting of this invasiveness from the Googles of the world even though we get outraged over the same issues when the source is a place like the NSA?

I don’t think those companies are evil. I have many friends who work there — at Google, at Facebook, Intel, Yahoo — because I work in human-computer interaction, so I interact with these groups all the time. But they have done a brilliant PR job both internally and externally, [with slogans like] “don’t be evil,” or the idea that you should be sharing all the time with all the people in your lives. These are very powerful expressions, and they’re very affective, too, in the sense of an emotional kind of appeal. Who wants to be seen as someone who doesn’t share? [We all know] it’s important to not be evil.

But I think when we have those large, ideological statements, they invite critical thinking. They invite thoughtfulness… Part of being a sociologist of technology… is also about thinking about the ideologies that underlie the technologies, the kinds of myths that we tell ourselves about technology in our lives.

How aware do you think people are of the fact that these companies are mining all their data? That every search, every email, every chat: all of it is being read? Do you think most people just don’t realize the extent of big data’s reach?

I think that’s a huge part of it. The success of these companies has been built on the fact that you use their services and all you focus on is the service you’re currently using and not how that’s being captured. You forget about the server involved. The economic incentive of the server is to remember every interaction you have on it. That’s its job, because that’s how the company makes its money. And then to do data aggregation and analysis, to figure out who you are and what you might like and what category you fit in. That’s the job of the service: to get you to input that information. But all you see is, I’m writing an email to my friend. I’m talking to someone on Facebook. All you see is that personal interaction with your friends, and you forget about this underlying architecture that’s incentivized to remember.

That’s part of the problem: it becomes less and less obvious to people. To a certain extent, they know it’s happening, but they’re already a part of a system where they’re interacting with friends and family and they’re expected to continue those interactions.
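
Her point about servers that are incentivized to remember can be made concrete with a small, entirely hypothetical sketch: a toy service that logs every interaction and later aggregates the log into marketing categories. None of the field names, categories, or keywords below come from any real company; they are invented for illustration.

```python
# Hypothetical sketch of the "server that remembers": every interaction is
# appended to a log, and the log is later aggregated into ad-targeting
# categories. Real services are far more complex; this is only illustrative.
from collections import Counter, defaultdict
from datetime import datetime, timezone

interaction_log = []  # in a real system this would be durable storage

def record_interaction(user_id: str, action: str, subject: str) -> None:
    """Store what the user did: the side of the exchange users rarely see."""
    interaction_log.append({
        "user": user_id,
        "action": action,    # e.g. "search", "message", "purchase"
        "subject": subject,  # e.g. "stroller", "crib", "happy birthday"
        "at": datetime.now(timezone.utc).isoformat(),
    })

def categorize_users(keywords: dict[str, str]) -> dict[str, Counter]:
    """Aggregate the log into per-user interest categories for ad targeting."""
    profiles: dict[str, Counter] = defaultdict(Counter)
    for event in interaction_log:
        for keyword, category in keywords.items():
            if keyword in event["subject"].lower():
                profiles[event["user"]][category] += 1
    return profiles

# The user experiences this as "searching" and "chatting"...
record_interaction("user123", "search", "best stroller 2014")
record_interaction("user123", "message", "we're so excited about the baby!")

# ...while the server experiences it as profile-building.
print(categorize_users({"stroller": "expecting-parent", "baby": "expecting-parent"}))
```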

Can a reasonable person live life hiding from big data, like you did for this experiment, but forever?

Experience has shown that it is possible, but it’s really not easy, and it comes with a lot of sacrifices. And it requires some technical skill. So, to that end, that’s my concern about the opt-out idea. I don’t actually think it’s feasible for everyone to do this. I don’t think that’s the answer. I don’t think that’s the simple answer to the big data problem: that you can just turn this stuff off, that you can just not do the things that you clearly need to do for your daily life. But I really want to emphasize, I did this as an experiment to see what it would take, to see what these systems were demanding of us that we’d forgotten about, and how it is that they worked. And so I don’t expect people to do this. In fact, I wouldn’t recommend it.

I don’t actually think it’s feasible for everyone to do this. I don’t think that’s the answer.

But I do always recommend that people take a minute to think seriously and thoughtfully about what services and products they do want to engage with. My job is not to say: everyone should be a Luddite and reject all technology all the time. I think that would be a ridiculous argument. My job is to say: you have the power and the authority to think about which of these services you’d like to use. And for the ones you think you can’t resist [or] you need to use for your job, you have the power and authority to think about how it is you want to use them and what information you want to give them.

So I don’t think everyone should get off Google tomorrow. But I think our public discourse about technology would be richer if people started to have these conversations. And I think the possibilities of our technology would be so much richer.

What do you envision as the future of this “big data collection as status quo” phenomenon? A bunch of angry citizens storming the Googleplex? A dark, Hunger Games-y dystopia where corporations and the government know everything about every citizen?

I hope it’s not going to be as black and white as either of those scenarios. I have an optimistic future in mind. I think we’re at a turning point where enough people are demanding [change] that it may incentivize some new technologies, and some investments in new models, for how to sustainably make money online. Things like Snapchat (even though you don’t know how long THEY keep the pictures you take) and some of these newer privacy apps are exploring a user base that is interested in non-permanence and in not being tracked. I think they offer a really interesting opportunity, way more than we realize.

And we need to think seriously about alternate economic models. The rapid, extraordinarily profitable success of companies that have gone the data-collection route makes it seem like that’s the only way to make money online. I think we need to open up the possibilities. That’s the hope.