Sept 20, 2013 (Friday) at Columbia CS Conference Room
10:00 – 10:30   Introduction/Coffee
10:30 – 11:50   Joint talk by Hugo Krawczyk (IBM) and Vlad Kolesnikov (Bell Labs)
                Scalable Private Database Querying: Theory and Practice
12:00 – 12:50   Elaine Shi (UMD)
                Practical Oblivious Computation: From Theory to Hardware to Compiler
1:00 – 3:00     Lunch (not provided)
3:00 – 3:50     Kobbi Nissim (BGU)
                Private Approximations and Big Data
4:00 – 4:50     Adam Groce (UMD)
                Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in Statistical Database Privacy
Organized by Tal Rabin (IBM) and Sanjam Garg (IBM),
with the help and support of Allison Lewko and Tal Malkin (Columbia).
- Scalable Private Database Querying: Theory and Practice
Hugo Krawczyk (IBM) and Vlad Kolesnikov (Bell Labs)
In this talk, we address the problem of performing private DB search at scale, including its motivation and performance requirements. We will discuss the technical difficulties, which may point to lower bounds, and the corresponding practical solutions. We will outline two approaches to the problem and discuss interesting aspects of each. This is joint work by the IBM/UCI and Columbia/Bell Labs teams, partially supported by IARPA.
- Practical Oblivious Computation: From Theory to Hardware to Compiler
Elaine Shi (UMD)
I will describe a new binary-tree-based paradigm for constructing Oblivious RAM, leading to extremely simple constructions. Within this framework, I will describe Path ORAM. Under reasonable assumptions about the block size, Path ORAM achieves O(log n) bandwidth overhead with just a little more than O(log n) trusted cache, which is nearly optimal in light of Goldreich and Ostrovsky's lower bound.

Based on Path ORAM, we implemented the first real-life ORAM-capable secure processor prototype, called Phantom. We run real-world programs such as sqlite on top of Phantom and demonstrate reasonable practical performance.

I will then describe programming-language techniques that compile a program into its memory-trace-oblivious equivalent, achieving order-of-magnitude speedups over the naive approach of placing all variables in a single, large ORAM.

Finally, I will describe a vision of building a cloud computing platform secure against physical attacks.
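To make the tree-based paradigm concrete, here is a toy, non-recursive sketch of Path ORAM's data layout and eviction logic (my own illustration, not code from the talk). Every block is mapped to a uniformly random leaf and the invariant is that a block resides somewhere on the path from the root to its assigned leaf (or in a small client-side stash); each access reads one full path, remaps the block, and greedily writes the path back. The position map is kept in the clear and there is no encryption or dummy traffic, so this shows only the mechanics, not a secure implementation.

```python
import random

Z = 4  # bucket capacity: blocks stored per tree node

class PathORAM:
    def __init__(self, num_blocks, height):
        self.L = height                                   # 2**L leaves
        self.tree = [[] for _ in range(2 ** (height + 1) - 1)]
        # Position map: block id -> random leaf (here stored in the clear;
        # the real construction recurses or keeps it in trusted cache).
        self.pos = {a: random.randrange(2 ** height) for a in range(num_blocks)}
        self.stash = {}                                   # block id -> data

    def _path(self, leaf):
        """Node indices from root to the given leaf (heap layout)."""
        node = leaf + 2 ** self.L - 1                     # leaf's array index
        path = [node]
        while node != 0:
            node = (node - 1) // 2
            path.append(node)
        return path[::-1]

    def access(self, op, a, data=None):
        leaf = self.pos[a]
        self.pos[a] = random.randrange(2 ** self.L)       # remap immediately
        path = self._path(leaf)
        # Read every bucket on the path into the stash.
        for node in path:
            for bid, bdata in self.tree[node]:
                self.stash[bid] = bdata
            self.tree[node] = []
        result = self.stash.get(a)
        if op == "write":
            self.stash[a] = data
        # Evict: write the path back greedily, deepest node first, placing
        # each stashed block only on a node that lies on its own path.
        for node in reversed(path):
            fits = [bid for bid in self.stash
                    if node in self._path(self.pos[bid])]
            self.tree[node] = [(bid, self.stash.pop(bid)) for bid in fits[:Z]]
        return result
```

Every access touches exactly one root-to-leaf path of O(log n) buckets, which is where the O(log n) bandwidth overhead in the abstract comes from.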
- Private Approximations and Big Data
Kobbi Nissim (BGU)
One of the main issues of big data, scalability, can be addressed by trading accuracy for approximation. Approximation is hence a compelling strategy also for achieving efficiency in computations performed over sensitive data, as in secure multiparty computation protocols. Care must be taken, however, that trading an exact computation for an approximation does not result in leaking more than what is intended.

In 2003, Feigenbaum et al. introduced the notion of private approximation: the result of a private approximation algorithm should reveal no additional information beyond what follows from the output of the function being approximated. Subsequent work yielded mixed results. On one hand, there exist families of problems for which private approximations enable efficient protocols. On the other hand, the approach can be shown to fail for other problems in a very strong sense: for these problems, every efficient private approximation algorithm must leak information beyond the exact computation. These negative results motivated looking into notions of approximation that circumvent impossibility by permitting small (quantified and well-defined) leakage.

In the talk, we will introduce the notion of private approximation, motivate it, and examine it (and some variants) as a strategy for secure computation over large datasets. We will review the main positive and negative results and discuss some research directions.
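The definition can be illustrated with a toy example (my own, not from the talk): two equally accurate estimates of the Hamming weight of a bit vector. The first depends on the input only through the exact answer, so it is simulatable from that answer and reveals nothing beyond it; the second is just as accurate but its value also depends on an individual input bit, which is exactly the kind of extra leakage the definition forbids.

```python
import random

def exact_weight(x):
    """Exact Hamming weight of a bit vector."""
    return sum(x)

def functionally_private_approx(x):
    # The noise is a function of the exact output alone, so this estimate
    # can be simulated given exact_weight(x): it reveals nothing beyond it.
    return exact_weight(x) + random.randint(-1, 1)

def leaky_approx(x):
    # Equally accurate (within +/-1 of the truth), but the answer depends
    # on x beyond its weight: it reveals the first bit, which the exact
    # output need not.  This violates private approximation.
    return exact_weight(x) + x[0]
```

Two inputs of equal weight, such as `[1, 0, 1, 0]` and `[0, 1, 1, 0]`, get identically distributed answers from the first estimator but distinguishable answers from the second.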
- Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in Statistical Database Privacy
Adam Groce (UMD)
Differential privacy guarantees that a given database query protects individuals' information even when the adversary possesses perfect information about everyone else in the database. This guarantee is convenient, but we believe it is stronger than necessary in many circumstances. We introduce a new framework, coupled-worlds privacy, that generalizes differential privacy. The framework allows explicit bounds to be placed on how much the adversary knows about the database, and this uncertainty can be used to reduce or eliminate the noise needed to make a given query private.

In this talk I will motivate the definition, which compares the information an adversary could gain in the real world with what could be gained from a "scrubbed" version of the database in which an individual's information has been removed. I will also discuss various properties of the definition and compare it to other definitions with similar motivations. Finally, I will discuss several mechanisms that we have proven to be private under this definition.
Joint work with Raef Bassily, Jonathan Katz, and Adam Smith.
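For context, a minimal sketch of the baseline that coupled-worlds privacy aims to improve on: the standard Laplace mechanism of differential privacy applied to a counting query. The function name and parameters are illustrative, not from the talk; the coupled-worlds mechanisms themselves, which can use adversarial uncertainty to shrink or drop this noise, are not sketched here.

```python
import random

def dp_count(db, predicate, epsilon):
    """Counting query released under epsilon-differential privacy via the
    Laplace mechanism.  A count has sensitivity 1 (adding or removing one
    person changes it by at most 1), so noise of scale 1/epsilon suffices."""
    true_count = sum(1 for row in db if predicate(row))
    scale = 1.0 / epsilon
    # A Laplace(scale) sample is the difference of two Exp(1/scale) samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# Example: a private count of records with age >= 40.
ages = [34, 51, 29, 62, 45]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means stronger privacy and more noise; the talk's framework asks when the adversary's own uncertainty about the database lets this noise be reduced or eliminated.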