Moral-Crypto-Review

Name: Samuel Tay
UW Net ID: tays
Paper: The Moral Character of Cryptographic Work
Author: Phillip Rogaway

Summary

Shortly after the Snowden disclosures, Phillip Rogaway wrote this paper to highlight issues plaguing the cryptography community (and, in some cases, the sciences more broadly) that must be addressed for a fair and just democratic society. In part 1, Phillip provides historical context for how science, politics, and ethics have intersected since WWII and argues that, ultimately, the linking of ethics to science was only a temporary phase (with the exception of physics, though the evidence for this exception is purely anecdotal).

In part 2, Phillip makes the case that cryptography is inherently political and that cryptographers therefore have an obligation to expend their efforts in morally defensible ways. The parts I found most interesting were the demonstrations of how certain topics are presented and the ramifications of that presentation, e.g.:

  1. the centralized trust in the IBE model being implicit and largely glossed over
  2. differential privacy operating under the assumption that aggregate data collection is just fine, further reinforcing the anti-privacy status quo
  3. intelligence agencies leaning on unrealistic (at the time) privacy technologies like FHE and iO for political cover.

Perhaps most importantly, this section ends with the argument that the purely academic side of cryptography has sucked much of the oxygen from the field, leading to stagnation in research useful for practical security and privacy. This is most damningly demonstrated by the NSA quote summarizing the Eurocrypt 1992 conference.

Part 3 provides an overview of mass surveillance. Phillip shows law enforcement's misleading framing of privacy and security as mutually exclusive, then presents a more accurate model, including conjectures for the future if mass surveillance is left unchecked; the conjecture is a bleak one, with failed democracy and no social progress. He closes the section by saying that such rational thought and modeling aren't even necessary in this case; we know instinctually that a world under mass surveillance is "not a place where man belongs".

In part 4, Phillip closes by offering various actionable pieces of advice for his fellow cryptographers:

  1. Work on problems in secure messaging and research avenues that have practical privacy implications.
  2. Stop letting the military-industrial complex guide the direction of cryptography.
  3. Form a broad, systems-level view of a given application of cryptography. Focus on the APIs between different components of the system.
  4. Seek out and improve privacy tools.
  5. Avoid simplistic cartoon clip-art; reflect the reality and severity of various adversaries.
  6. Be intentional with language. The words "privacy" and "anonymity" are vague and carry poor connotations, whereas "anti-surveillance" is explicit.

In brief, the message Phillip wants his fellow cryptographers to take away is this: we are the ones who can find cryptographic solutions to thwart mass surveillance, and we have a moral obligation to do so.

Reflection

I very much appreciated Phillip's perspective and found it quite motivating. Although he writes specifically to the scientific community, supporting his argument about their moral obligations with the fact that their work is subsidized by the public, much of his writing extends to people working in the private sector as well.

In my own career, I've tried to work at companies that align with my values. In practice, this mostly means choosing jobs where my work has some social benefit. For example, I worked at SimSpace, which provides network simulation software for cybersecurity training, typically used by the government and large banks to train their cyber defense teams. I then worked at Phylum on a tool that helps secure one's software supply chain (essentially offering a CI check with a risk threshold whenever a lockfile changes).
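To make the lockfile-gating idea concrete, here is a minimal sketch of such a CI check. The package names, risk scores, and threshold below are invented for illustration; this is not Phylum's actual API or scoring model.

```python
def added_packages(old_lock, new_lock):
    """Return packages that are new or version-changed in the new lockfile."""
    return {pkg: ver for pkg, ver in new_lock.items() if old_lock.get(pkg) != ver}

def ci_gate(old_lock, new_lock, risk_scores, threshold=0.7):
    """Return (ok, risky): ok is False if any changed package exceeds the threshold."""
    risky = [pkg for pkg in added_packages(old_lock, new_lock)
             if risk_scores.get(pkg, 1.0) > threshold]  # unknown packages treated as max risk
    return (len(risky) == 0, risky)

# Hypothetical lockfile diff: "leftpad2" was just added with a high risk score.
old = {"requests": "2.31.0"}
new = {"requests": "2.31.0", "leftpad2": "0.0.1"}
scores = {"requests": 0.1, "leftpad2": 0.95}
ok, risky = ci_gate(old, new, scores)
print(ok, risky)  # False ['leftpad2']
```

In a real pipeline, a failing gate would block the merge until the risky dependency is reviewed or removed.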

Most recently, I've started a new job at Sunscreen. We offer an FHE compiler (built on top of Microsoft SEAL) that makes FHE more accessible to engineers, without their having to gain expertise in FHE cryptography, schemes, parameter selection, etc. While today's schemes are still too inefficient for things like state-of-the-art machine learning, there have been massive improvements since Phillip's remarks, which seemed to treat FHE as pie-in-the-sky technology. I'll concede that those remarks may have been warranted at the time, and even now it's unclear how much more performant these schemes can get. Furthermore, if history is any guide, people will tend to prioritize performant computation over private computation. Still, I'm happy to work on a project that helps bring FHE into the mainstream and provides tools for others to build privacy-focused products. I think Phillip would approve.
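To illustrate the core idea behind homomorphic encryption, namely computing on data while it stays encrypted, here is a toy Paillier cryptosystem. Paillier is only additively homomorphic (not FHE, and nothing like what SEAL or our compiler does internally), and the fixed demo primes below are wildly insecure; this is just a sketch of the "operate on ciphertexts" principle.

```python
import math, random

def keygen(p=61, q=53):
    """Paillier keygen with tiny, insecure demo primes (for illustration only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # modular inverse of lambda mod n
    return n, lam, mu

def encrypt(n, m):
    """Encrypt m under public key n, using generator g = n + 1."""
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # With g = n + 1, g^m mod n^2 simplifies to 1 + n*m.
    return (1 + n * m) * pow(r, n, n2) % n2

def decrypt(n, lam, mu, c):
    """Recover m via the standard L function: L(x) = (x - 1) / n."""
    n2 = n * n
    l = (pow(c, lam, n2) - 1) // n
    return l * mu % n

n, lam, mu = keygen()
c1 = encrypt(n, 17)
c2 = encrypt(n, 25)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts (mod n).
c_sum = c1 * c2 % (n * n)
print(decrypt(n, lam, mu, c_sum))  # 42
```

Full FHE extends this idea to both addition and multiplication over encrypted data, which is what makes arbitrary computation (and hence a compiler targeting it) possible.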