| dc.description.abstract | This thesis focuses on the question of what degree of privacy is achievable in the real world for long-running applications. We explore this question in two main settings: private advertising and anonymous communication. In doing so, we consider the constraints each application may face in practice and the adversarial model that is realistic for the context in which the application will be deployed. For real-world applications, achieving perfect privacy, especially against a worst-case adversary, can be impossible. That is, perfect privacy, while achievable in theory, may in practice require assumptions that conflict with usability, deployability, or utility requirements. This presents a challenge, as privacy-preserving technologies necessarily provide privacy only for the people who use them. Because of this, designing around user experience is critical, even if doing so requires compromises in the theoretical degree of privacy a system can provide or in the strength of the adversaries considered in its threat model. In the space of private advertising, we first propose a novel protocol, AdVeil, that eliminates leakage of user data beyond what is revealed by the input/output of the ads ecosystem as a whole. We then provide a minimal model of the functionality of digital advertising, which we use to prove that, even for systems like AdVeil with minimal leakage, the advertising metrics released at the end of the protocol suffice to leak information about end users to advertisers when combined with the advertisers' audience-targeting criteria. In the space of anonymous communication, we propose ShorTor, a new routing protocol for Tor that applies techniques popular with content distribution networks (CDNs) to reduce latency while maintaining Tor's existing anonymity guarantees. We evaluate this protocol using a dataset of over 400,000 latency measurements we collected between the 1,000 most popular Tor relays. | |