Probabilistic Programming for Security and Privacy

Manually writing programs that conform to security and privacy policies is challenging. In this project, we develop novel techniques that leverage recent advances in probabilistic programming and program synthesis to automatically enforce probabilistic policies, which can protect sensitive user data against highly nontrivial probabilistic inference attacks. Examples include policies that impose probabilistic bounds on the information leaked to an attacker, as well as differential privacy policies.
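To make the notion of a probabilistic bound on leakage concrete, here is a minimal sketch of the kind of assertion such a policy might express. The income-bracket scenario, all names, and all numbers are illustrative assumptions of ours, not taken from the paper: the secret is a user's income bracket, a deterministic program releases a coarse "low"/"high" label, and the policy bounds the attacker's posterior belief in any single bracket after observing the output.

```python
from fractions import Fraction

# Hypothetical illustration (names and numbers are ours, not SPIRE's):
# the secret is an income bracket, the program releases a coarse label.
prior = {
    "b1": Fraction(1, 2),  # attacker's prior belief over the secret
    "b2": Fraction(1, 6),
    "b3": Fraction(1, 6),
    "b4": Fraction(1, 6),
}
program = {"b1": "low", "b2": "low", "b3": "high", "b4": "high"}

def posterior(prior, program, observed):
    """Attacker's belief after seeing `observed`, via Bayes' rule."""
    joint = {s: p for s, p in prior.items() if program[s] == observed}
    total = sum(joint.values())
    return {s: p / total for s, p in joint.items()}

def satisfies_policy(prior, program, bound):
    """Check the assertion P(secret = s | output) <= bound for all outputs."""
    return all(
        p <= bound
        for out in set(program.values())
        for p in posterior(prior, program, out).values()
    )

print(satisfies_policy(prior, program, Fraction(3, 4)))  # True:  max posterior is 3/4
print(satisfies_policy(prior, program, Fraction(2, 3)))  # False: 3/4 > 2/3
```

Here the most revealing observation is "low", after which the attacker's belief in bracket "b1" rises from 1/2 to 3/4; the policy holds or fails depending on whether the bound accommodates that posterior.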

SPIRE: Synthesis of Probabilistic Privacy Enforcement

SPIRE is a novel synthesis approach that automatically transforms a program into one that complies with a given policy. The policy is defined as a set of probabilistic assertions over the distribution of the program's inputs, capturing the attacker's belief about the secret. The SPIRE approach consists of two ingredients. First, we phrase the problem of determining how much information the program's outputs leak about the secret as a probabilistic analysis, and we check whether the leakage stays within the bounds specified by the policy; the safe bounds are depicted in green in the figure above. Second, to enforce the policy, the key idea is to synthesize an enforcement that transforms the program by adding uncertainty to its outputs, thereby reducing the amount of leaked information. We present two such synthesis procedures: an optimal one based on SMT solving and a greedy one with quadratic running time. Below, you can find the paper and the source code of the SPIRE system.
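The enforcement idea of adding uncertainty to outputs can be sketched with a toy greedy procedure. This is our own simplified illustration, not SPIRE's actual algorithm: the program is a secret-to-output map, and "adding uncertainty" here means merging two outputs so the attacker can no longer distinguish them, repeated until every posterior respects the bound.

```python
from fractions import Fraction

def posteriors(prior, program):
    """Map each output to the attacker's posterior over secrets."""
    result = {}
    for secret, out in program.items():
        result.setdefault(out, {})[secret] = prior[secret]
    for out, joint in result.items():
        total = sum(joint.values())
        result[out] = {s: p / total for s, p in joint.items()}
    return result

def greedy_enforce(prior, program, bound):
    """Toy greedy sketch (not SPIRE's implementation): merge outputs
    until P(secret | output) <= bound holds for every output."""
    program = dict(program)
    while True:
        post = posteriors(prior, program)
        violating = [o for o, d in post.items() if max(d.values()) > bound]
        if not violating:
            return program
        outs = sorted(set(program.values()))
        if len(outs) < 2:
            raise ValueError("policy cannot be enforced by merging alone")
        bad = violating[0]
        # prefer merging two violating outputs together
        other = violating[1] if len(violating) > 1 else next(o for o in outs if o != bad)
        merged = bad + "|" + other
        program = {s: merged if o in (bad, other) else o
                   for s, o in program.items()}

# A fully revealing program over four equally likely secrets:
prior = {s: Fraction(1, 4) for s in ("s1", "s2", "s3", "s4")}
program = {"s1": "a", "s2": "b", "s3": "c", "s4": "d"}
enforced = greedy_enforce(prior, program, Fraction(1, 2))
print(enforced)  # each remaining output now covers two secrets
```

On this input the sketch merges the outputs pairwise, so each observation leaves the attacker with a posterior of at most 1/2 while the program still distinguishes the two merged groups; a real enforcement would additionally weigh the utility lost by each merge, which the toy version ignores.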

The SPIRE system is available on GitHub. To build and run it, follow the instructions in the README included in the repository.


Synthesis of Probabilistic Privacy Enforcement
Martin Kučera, Petar Tsankov, Timon Gehr, Marco Guarnieri, Martin Vechev
ACM CCS 2017