Differential Privacy
Paper · Paper · Slides
Abstract from Dwork, 2009:
Over the past five years a new approach to privacy-preserving data analysis has born fruit. This approach differs from much (but not all!) of the related literature in the statistics, databases, theory, and cryptography communities, in that a formal and ad omnia privacy guarantee is defined, and the data analysis techniques presented are rigorously proved to satisfy the guarantee. The key privacy guarantee that has emerged is differential privacy. Roughly speaking, this ensures that (almost, and quantifiably) no risk is incurred by joining a statistical database. In this survey, we recall the definition of differential privacy and two basic techniques for achieving it. We then show some interesting applications of these techniques, presenting algorithms for three specific tasks and three general results on differentially private learning.
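One of the basic techniques the survey refers to is noise addition calibrated to a query's sensitivity, commonly via the Laplace mechanism. Below is a minimal, illustrative sketch of that idea for a counting query (sensitivity 1); the function names and parameters are my own, not taken from either paper.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(data, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one record changes it
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for record in data if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: count records below 50 in a toy dataset, with epsilon = 1.
data = list(range(100))
noisy = dp_count(data, lambda x: x < 50, epsilon=1.0)
```

Smaller `epsilon` means a stronger privacy guarantee but larger noise: the reported count drifts further from the true value of 50 on average.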
Abstract from Ruggles, 2019:
The Census Bureau has announced new methods for disclosure control in public use data products. The new approach, known as differential privacy, represents a radical departure from current practice. In its pure form, differential privacy techniques may make the release of useful microdata impossible and limit the utility of tabular small-area data. Adoption of differential privacy will have far-reaching consequences for research. It is likely that scientists, planners, and the public will lose the free access we have enjoyed for six decades to reliable public Census Bureau data describing US social and economic change.