Presented by:


Celia Zhang


Hello! I work at Google on a team enabling differentially private SQL queries.

No video of the event yet, sorry!

Differential privacy (DP) provides formal guarantees that the output of a database query does not reveal too much information about any individual present in the database. While many differentially private algorithms have been proposed in the scientific literature, there are only a few end-to-end implementations of differentially private query engines. Crucially, existing systems assume that each individual is associated with at most one database record, which is unrealistic in practice.
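To make the guarantee concrete, here is a minimal textbook sketch (not the system described in this talk) of a differentially private count using the Laplace mechanism: adding or removing one record changes the true count by at most 1, so Laplace noise scaled to 1/epsilon hides any individual's presence.

```python
import math
import random

def dp_count(records, epsilon):
    """Differentially private count via the Laplace mechanism.

    Adding or removing a single record changes the true count by at
    most 1 (sensitivity 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy for this query.
    """
    true_count = len(records)
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) via inverse-CDF from a uniform draw.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

With a large epsilon the noise is small and the answer stays close to the true count; with a small epsilon the noise grows and individual records are better protected.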

In our paper “Differentially Private SQL with Bounded User Contribution”, we proposed a generic and scalable method to perform differentially private aggregations on databases, even when individuals can each be associated with arbitrarily many rows. We expressed this method as an operator in relational algebra, and implemented it in an internal SQL engine. To validate this system, we tested the utility of typical queries on industry benchmarks, and verified its correctness with a stochastic test framework we developed. We highlighted the promises and pitfalls learned when deploying such a system in practice, and we published its core components as open-source software.
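The core idea of bounded user contribution can be illustrated with a short sketch (an assumption-laden simplification, not the paper's operator): cap how many rows each user may contribute and clamp each value, so one user's total influence on the aggregate is bounded, which fixes the sensitivity used to calibrate the noise.

```python
import math
import random
from collections import defaultdict

def bounded_user_sum(rows, max_rows_per_user, value_cap, epsilon):
    """Sketch of a DP sum with per-user contribution bounding.

    rows: iterable of (user_id, value) pairs. Each user contributes at
    most `max_rows_per_user` rows, each clamped to [0, value_cap], so a
    single user can shift the sum by at most
    max_rows_per_user * value_cap -- the sensitivity for the noise.
    """
    per_user = defaultdict(list)
    for user_id, value in rows:
        per_user[user_id].append(min(max(value, 0.0), value_cap))
    total = 0.0
    for values in per_user.values():
        # Keep at most max_rows_per_user contributions from this user.
        total += sum(values[:max_rows_per_user])
    sensitivity = max_rows_per_user * value_cap
    scale = sensitivity / epsilon
    # Laplace noise calibrated to the bounded sensitivity.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return total + noise
```

Without the bounding step, a single user with many rows could dominate the aggregate, and no finite amount of noise calibrated to a one-row sensitivity would protect them.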

It would be an exciting step for privacy to enable differentially private queries in PostgreSQL, one of the leading open-source relational database management systems. Our open-source release includes a Postgres extension that adds anonymized statistical aggregation functions. On their own, these aggregation functions do not provide full differential privacy guarantees: the Postgres syntax trees underlying DP queries must additionally be rewritten. We would like to explore the possibility of making this a Postgres community project.

20 min
Postgres Conference 2020
Regulated Industry