You might have heard that, during FDR’s presidency, the media had a sort of gentleman’s agreement with the office of the president not to detail Roosevelt’s condition – particularly with regard to his wheelchair and the effects of the polio he suffered. It turns out that’s mostly a myth – even in 1932, the New York Times Magazine described then-Governor Roosevelt’s “wheel[ing] around in his chair.”
Did this get suppressed, or did we simply forget about it in favor of the more convenient narrative? Convenience, in this case, being the uplifting idea that the media of the day were somehow more respectful because they didn’t pry into the personal life of America’s leader – that they knew his public image was important enough to keep the focus off the man himself.
Fast-forward to 1975, when an attempt was made on then-President Ford’s life, and a former Marine, Oliver Sipple, foiled it. Sipple made an agreement with the media not to focus on his sexuality… But Harvey Milk, of all people, decided it was better that the world know that gay people could be heroes too. Milk didn’t consider the outcome of outing Sipple – that the Marine would, effectively, lose his entire life. Milk focused, like many people in that kind of situation, on the greater good that would be served by the story. Here the media isn’t the bad guy; a gay rights activist is… Or is he?
Fast-forward again, to 2012, when Gawker journalist Adrian Chen publicly outed a Reddit troll and cost the man his job – and the health insurance that kept his disabled wife cared for. All because Chen felt it was a good story. The troll, Brutsch, is not the good guy. But neither is Chen, who destroyed someone’s life for a story on a news website by violating his online privacy and tracking him down. Chen’s reasoning? From the Wired article describing both of the above stories as parallels:
In identifying Brutsch and shining a spotlight on his insidious practices, Chen’s article condemns Brutsch’s choice of using the mask of pseudonymity to hide behind actions that have societal consequences. Public shaming is one way in which social norms are regulated. Another is censorship, as evidenced by the Reddit community’s response to Gawker.
Privacy is a troubling concept; the idea that anything about you – things you wish to keep personal and secure – could be revealed to the wrong audience at any time keeps a lot of us up at night. And for good reason! No one wants to be strung up by the court of public opinion, for any reason, ever. Even people who choose lives of public visibility (actors, media personalities, etc.) – people who don’t need to be outed in detail for their bad acts to have context – shrink from the idea that something they’ve done, or an offhand comment they’ve made, could run afoul of The Public’s ideals.
The “Out By Proxy” Issue is Not New.
Much of the discussion around online privacy frames this outing of people – the publication of, and focus on, their private information – as if it were a result of new tools recently made available on the web. That’s simply not the case – as Milk and Sipple proved 40 years ago. Out-by-proxy was an issue even when I first started in online communities 20 years ago.
In the mid- to late-90s, I spent a lot of time in chat rooms: text systems built on Perl scripts, designed to log and display serialized entries from those signed on at a near-realtime pace. We used them for roleplay – collaborative fiction writing – and I was part of a community that had about five years’ history on one chat alone by the time everyone grew out of it. In the 90s, five years was an eternity in web time.
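For anyone curious what “near-realtime” meant in practice, the mechanics were roughly an append-only log plus a constant refresh. Here’s a minimal sketch of that model – in Python rather than the original Perl, with the file name and handles invented purely for illustration:

```python
# A rough, modern sketch of the append-and-poll chat model described above –
# Python stands in for the original Perl; names here are illustrative only.

import time
from pathlib import Path

LOG = Path("chat.log")  # the shared, append-only log every client reads


def post(handle: str, message: str) -> None:
    """Append one serialized entry – timestamp, handle, text – to the log."""
    with LOG.open("a", encoding="utf-8") as f:
        f.write(f"{int(time.time())}\t{handle}\t{message}\n")


def read_since(last_seen: int) -> list[str]:
    """Return every entry newer than last_seen – the client's 'refresh'."""
    if not LOG.exists():
        return []
    entries = []
    for line in LOG.read_text(encoding="utf-8").splitlines():
        ts, handle, text = line.split("\t", 2)
        if int(ts) > last_seen:
            entries.append(f"<{handle}> {text}")
    return entries


if __name__ == "__main__":
    post("NightHawk", "The tavern door creaks open...")
    post("Seraphine", "She glances up from her book.")
    for entry in read_since(0):
        print(entry)
```

Every client simply re-read the log on a timer; there was no push, no account beyond a handle, and nothing stopping you from claiming any name you liked.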
No one played under their own real name. Handles were par for the course. Even if you got to know someone fairly well and met them “in real life” – which I did; one of them was my first real relationship, and people looked at me like I was asking to be murdered by internet weirdos – there was still the chance that everyone else you were writing with wasn’t what they said they were.
During my time on these chats, I can easily recall at least two instances of people being “found out” by a number of means, and having their real details publicized within our small circle. Names, addresses, photos, schools they attended, whether they had been married or had kids – the names of their pets.
Keep in mind, this was the late 90s – nearly twenty years ago – before Google was properly a thing, much less reverse image search. In both instances, the person who’d been found out disappeared – one without a trace; the other simply stopped playing, but kept in touch, through non-chat means, with the people who didn’t care about the revelation.
This process of discovering, revealing, and using the data around an otherwise anonymous person on the web is known as doxxing – and it’s becoming rather common. In an Economist article on doxxing, the comments section in particular reveals attitudes about the effects of the practice. How this intentional destruction of online privacy works, and the complete lack of accountability attached to it, are the subject of deep discussion in a lot of places.
Earlier this month, TechCrunch reported that “The Online Privacy Lie is Unravelling” – and made a case for why that’s important… But they’re a bit late to the party, clearly. When Milk doxxed Sipple forty years ago, the idea of privacy as we think of it was already under fire – and on the web, it never existed at all. Any belief otherwise is delusion. At that time, being outed as gay was damning on its face. Now, however, being outed can involve much more than your sexuality – any behaviour you exhibit, anything you allow to be recorded and documented (especially if it goes onto the web) becomes something you can be outed for.
Spent too much time drinking in your teens? It’s probably documented. Kicked a dog by accident? Might be a viral video without context before you know it. Participating in a chat, or on Twitter, behind a mask of anonymity? Be ready to be doxxed, apparently, because that’s the big red button hanging over us all.
We do have some control over our own data, if we look for it.
If you’ve put that information into electronic form yourself, it might be less a breach of privacy and more an intentional disclosure. That kind of disclosure has its upsides, however – one of the reasons I’ve written for the last six years on a domain carrying my name is a concern over identity fraud. I don’t get enough visibility to qualify as a public figure, which is fine, but there’s definitely value in owning your namespace and controlling the context around it.
We rely on mass disinterest for protection, but that’s not privacy.
The reason TechCrunch’s article rings false, at least for me, is that it’s not Google or Microsoft who’s doxxing people. Most of the privacy debate is now focused on marketers, businesses, advertisers and so on. Almost none of it comes back to what more often than not puts people in danger: the laser focus of targeted, specific public attention.
When you see a video of someone committing animal cruelty, and hear about how the person was found in record time by a crack team of investigators, it’s other stakeholders – independent actors and, occasionally, smaller groups – who create that publicity for the bad actor. It’s this process that brings context and focus onto an individual, in a way no advertising targeting or aggregate data point ever could.
TC is right, though, that “Start-ups should absolutely see the debunking of the myth that consumers are happy to trade privacy for free services as a fresh opportunity for disruption.” But the answer is not necessarily that transparency around data and algorithms is a must. Education about what online privacy is, and what it isn’t, will become very important in the near future. Their closing note – “Services that stand upon a future proofed foundation where operational transparency inculcates user trust — setting these businesses up for bona fide data exchanges, rather than shadowy tradeoffs” – is particularly apt. However, that should have been a general commentary on the need for education, not an admonishment of businesses doing business things.
Our reliance is on being part of the crowd. There’s definitely safety for your person as part of an aggregate – without identifying data and backing context, the most a single data point can do is throw off the averages a bit. And even then, there are very likely bigger outliers in some other region that make your data look uninteresting. This means that until there’s a very particular reason for people to care about who you are – one that offers them some gain – there’s no value in the effort it takes to back you out of the crowd and doxx you.
The safety of the herd. It’s effective for most, but not very attractive because we know precisely what it takes for this system to fail us individually.
We don’t have a good answer for online privacy yet, no matter what anyone says.
Even the best software and hardware barriers can’t prevent breaches in data security. We see this all the time in stories about government actors attempting to breach national firewalls – proof that security on its own is not enough. We also saw it with last year’s iCloud hack, which released hundreds of celebrities’ nude selfies and caused a massive kerfuffle. Apple has since gone on a total “This Is Your Data” bent, and pooh-poohed Google, America’s NSA, and other bodies for trying to destroy privacy with their data collection techniques.
Backing all of your data out of the aggregate would destroy the value of that aggregate – as well as limit your own potential to benefit from it. Many services are tied to data aggregates, whether we believe they’re related to online privacy or not. Social networks improve their features by way of aggregate data analysis, as do online games. However, while most people are now concerned with the amount of personally identifying information bodies such as Facebook have attached to their accounts, very few ever worry about how a multiplayer online video game could negatively affect them (in which case, see above regarding doxxing).
Think of the census – your country probably conducts one. It’s not just used for making sure everyone’s in the right place; it also feeds economic projections, spending plans, background information for polls, and many other statistics that help keep the country running. This real-world, physical aggregate is no more or less valuable than its online counterpart – though it’s never referenced in the same bad light.
The other side of this is that the argument “If you have nothing to hide, you have nothing to fear” is utter and demonstrable bullshit. Everyone has issues about their person that they’d simply rather not have brought forcibly to light. It doesn’t matter if it’s a particular kink, a lifestyle choice, what sort of small-clothes they wear, or a self-esteem issue because the right side of their nose is slightly more rounded than the left… No one likes their very-personal weirdness exposed without consent or control.
Jeff Jarvis suggested, years ago, that no one really cares about privacy – they just hate being surprised. That, I think, is a major key to solving for privacy in the future: creating situations where, when we intentionally deliver or withhold information, we understand the potential outcomes of that action.
That’s not privacy as an outside regulatory force, though; it’s self-censorship. And that’s a lot harder to sell.
Back to FDR’s wheelchair. My point here, on the whole, is that privacy is a result of context and framing. We believe that things are private not because no one has access to them, but because they’re framed in a private context – these bits and bytes of information about us are obscured, or undisclosed, and as such exist within a framework of privacy. When that’s breached, we’re right to be upset. However, just like that “unreported” wheelchair, even information that’s broadly public knowledge can be respected and treated with dignity, such that eventually we believe it was never public at all (even though we all know it).
The conversation about online privacy has to stay a cultural one – not the subject of a business or regulatory study.