Transparent privacy is principled privacy

Published in Harvard Data Science Review, 2022

Link (HDSR)

Differential privacy revolutionizes the way we think about statistical disclosure limitation. Among the benefits it brings, one is particularly profound: under this formal approach to privacy, the mechanism with which data are privatized can be spelled out in full transparency without sacrificing the privacy guarantee. Curators of open-source demographic and scientific data are thus in a position to offer privacy without obscurity. This paper supplies a technical treatment of the pitfalls of obscure privacy and establishes transparent privacy as a prerequisite for drawing correct statistical inferences. It advocates conceiving of transparent privacy as a dynamic component that can improve data quality from the total survey error perspective, and it discusses the limited statistical usability of merely procedural transparency, which may arise when dealing with mandated invariants. Transparent privacy is the only viable path toward principled inference from privatized data releases. Its arrival marks great progress toward improved reproducibility, accountability, and public trust.
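
To illustrate the core point that the privatization mechanism can be fully public, here is a minimal sketch (not taken from the paper) of a standard Laplace mechanism. The function name and the example parameter values are hypothetical; the guarantee rests only on the randomness of the noise draw, so publishing the code, the noise distribution, and its scale does not weaken it.

```python
import numpy as np

def laplace_mechanism(true_count: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Release a count with epsilon-differential privacy via Laplace noise.

    Every detail here -- the noise distribution, its scale, and the code
    itself -- can be published without compromising the privacy guarantee.
    """
    scale = sensitivity / epsilon  # publicly known noise scale
    return true_count + rng.laplace(loc=0.0, scale=scale)

# Hypothetical usage: a population count with sensitivity 1 and epsilon = 0.5.
rng = np.random.default_rng()
noisy_count = laplace_mechanism(true_count=12345, sensitivity=1.0,
                                epsilon=0.5, rng=rng)
```

Because the mechanism is transparent, an analyst can account for the known noise distribution when drawing inferences from the released value, which is exactly the statistical usability the paper argues for.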