Put privacy and math together and you get differential privacy, a new approach to safeguarding data



In the data protection and privacy field you read something mind-blowing so often that it is no surprise the field attracts more and more enthusiasts. Take, for example, a piece published on December 31st, 2012, on the Scientific American website:

“A mathematical technique called “differential privacy” gives researchers access to vast repositories of personal data while meeting a high standard for privacy protection”.

So math can be used not only to build profiles from personal data, but also to safeguard it. Ha!

The article talks about a body of work a decade in the making, which is now starting to offer a genuine solution.

“Differential privacy,” as the approach is called, allows for the release of data while meeting a high standard for privacy protection. A differentially private data release algorithm allows researchers to ask practically any question about a database of sensitive information and provides answers that have been “blurred” so that they reveal virtually nothing about any individual’s data — not even whether the individual was in the database in the first place.
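The "blurring" described above is typically done by adding calibrated random noise to each answer. A minimal sketch of the classic Laplace mechanism for a counting query (a standard differentially private technique, though the article itself does not go into mechanism details; the toy database and function names here are hypothetical):

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Answer 'how many records satisfy predicate?' with epsilon-differential privacy.

    A counting query changes by at most 1 when one person's record is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)

# Hypothetical toy database of ages; the true count of ages >= 40 is 3,
# but the released answer is blurred around it.
ages = [34, 29, 51, 44, 23, 67, 38]
noisy_answer = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; repeated queries consume a "privacy budget," which is why the guarantee degrades gracefully rather than breaking outright.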

“The idea is that if you allow your data to be used, you incur no additional risk,” said Cynthia Dwork of Microsoft Research Silicon Valley. Dwork introduced the concept of differential privacy in 2005, along with Frank McSherry, Kobbi Nissim of Israel’s Ben-Gurion University and Adam Smith of Pennsylvania State University.
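Dwork's "no additional risk" promise has a precise formal statement. A randomized algorithm $M$ is $\varepsilon$-differentially private if, for any two databases $D$ and $D'$ that differ in one person's record, and any set $S$ of possible outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

In words: whether or not your record is in the database, the odds of any particular answer being released change by at most a factor of $e^{\varepsilon}$, so the output reveals almost nothing about you individually.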

Read the whole story HERE.
