Stanford researchers propose “obscurity” as an alternative to “privacy” in perceiving data protection

Woodrow Hartzog and Evan Selinger, both researchers affiliated with the Center for Internet and Society at Stanford Law School, propose “obscurity” as an alternative to “privacy” for handling the protection of personal data.

Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power.

Facebook’s announcement of its new Graph Search tool on Tuesday set off yet another round of rapid-fire analysis about whether Facebook is properly handling its users’ privacy. Unfortunately, most of the rapid-fire analysts haven’t framed the story properly. Yes, Zuckerberg appears to be respecting our current privacy settings. And, yes, there just might be more stalking ahead. Neither framing device, however, is adequate. If we rely too much on them, we’ll miss the core problem: the more accessible our Facebook information becomes, the less obscurity protects our interests.

While many debates over technology and privacy concern obscurity, the term rarely gets used. This is unfortunate, as “privacy” is an over-extended concept. It grabs our attention easily, but is hard to pin down. Sometimes, people talk about privacy when they are worried about confidentiality. Other times they invoke privacy to discuss issues associated with corporate access to personal information. Fortunately, obscurity has a narrower purview.

Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn’t mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.

Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too. Since few online disclosures are truly confidential or highly publicized, the lion’s share of communication on the social web falls along the expansive continuum of obscurity: a range that runs from completely hidden to totally obvious.
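One of the factors above, invisibility to search engines, can at least be requested directly. As a rough illustrative sketch (using the standard Robots Exclusion Protocol conventions, not anything specific to the article), a site owner can publish a `robots.txt` file at the site root:

```
# robots.txt at the site root: asks well-behaved crawlers
# not to index any page on this site
User-agent: *
Disallow: /
```

Note that this is purely voluntary: compliant search engines will skip the site, but a determined data miner's crawler can simply ignore the file — which mirrors the article's point that obscurity deters the less committed rather than making information inaccessible.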

Read their whole piece on

3 responses to “Stanford researchers propose “obscurity” as an alternative to “privacy” in perceiving data protection”

  1. Wow. People at Stanford used to be smarter. Obscurity is no protection. No one fears the casual browser so much as the systematic data miner who knows what he/she is doing.


    • Very good point. Didn’t think of it this way, especially after I read the ENISA report on the right to be forgotten, in which they concluded that the only way to make this right effective is to work with search engines which will supposedly leave out certain possible results from the results list. I am wondering, though, whether that is technically possible and whether data miners would be actually stopped by this practice…


  2. The only problem with obscurity is the simple fact that corporations don’t set their products to use it. Ever notice their own “in-house” software and networks often take advantage of it?
    Just try and find info on an MS website, for example. Even most of the language used is nothing more than doublespeak to their consumers.

