

Contributions to Proceedings:

M. Maffei, R. Munz, F. Eigner, P. Francis, D. Garg:
"UniTraX: Protecting Data Privacy with Discoverable Biases";
in: "Principles of Security and Trust", LNCS 10804; Springer, Lecture Notes in Computer Science, Schwitzerland, 2018, ISBN: 978-3-319-89721-9, 278 - 299.



English abstract:
An ongoing challenge for differentially private database systems is maximizing system utility while staying within a given privacy budget. One approach is to maintain per-user budgets instead of a single global budget and to silently drop users whose budget is depleted. This, however, can lead to highly misleading analyses, because the system cannot provide the analyst with any information about which users have been dropped.

This paper presents UniTraX, the first differentially private system that allows per-user budgets while providing the analyst with information about the budget state. The key insight behind UniTraX is that it tracks budget not only for actual records in the system but for all points in the domain of the database, including points that could exist but do not. UniTraX can safely report the budget state because the analyst does not know whether the state refers to actual records or not. We prove that UniTraX is differentially private. UniTraX is compatible with existing differentially private analyses, and our implementation on top of PINQ shows only moderate runtime overhead on a realistic workload.
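The per-point budget tracking described in the abstract can be pictured with a small sketch (hypothetical illustration, not the authors' UniTraX or PINQ code; the region partitioning and class names are assumptions): budgets are attached to regions of the data domain rather than to individual records, so a query charges every region it touches whether or not any real record falls inside, and the remaining budget can therefore be reported without revealing which records exist.

# Hypothetical sketch of per-region budget tracking over the data domain.
# Not the UniTraX implementation; names and the 1-D interval partition are illustrative.

class RegionBudgets:
    def __init__(self, regions, total_budget):
        # regions: list of (low, high) intervals partitioning a 1-D domain
        self.budget = {r: total_budget for r in regions}

    def charge(self, touched_regions, epsilon):
        # Refuse the query if any touched region lacks budget; otherwise deduct epsilon.
        # The charge depends only on which regions the query touches, not on the data.
        if any(self.budget[r] < epsilon for r in touched_regions):
            raise ValueError("insufficient budget in queried region")
        for r in touched_regions:
            self.budget[r] -= epsilon

    def remaining(self, region):
        # Safe to report: the value is determined by past queries alone,
        # so the analyst learns nothing about which records actually exist.
        return self.budget[region]


if __name__ == "__main__":
    tracker = RegionBudgets(regions=[(0, 50), (50, 100)], total_budget=1.0)
    tracker.charge(touched_regions=[(0, 50)], epsilon=0.3)  # query restricted to the first region
    print(tracker.remaining((0, 50)))    # 0.7
    print(tracker.remaining((50, 100)))  # 1.0 (untouched)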

Keywords:
UniTraX, data privacy, differential privacy, per-user budgets


"Official" electronic version of the publication (accessed through its Digital Object Identifier - DOI)
http://dx.doi.org/10.1007/978-3-319-89722-6

Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_270456.pdf


Created from the Publication Database of the Vienna University of Technology.