How is someone losing their key a "technical problem"? Is it that hard to own up to it and put the actual reason in the summary? It's not like they have stockholders to placate.
we will adopt a 2-out-of-3 threshold mechanism for the management of private keys [1]
The trustee responsible has resigned, so why weaken security going forward?
I would have thought cryptography experts losing keys would be pretty rare, like a fire at a Sea Parks.
To me, the entire matter is mostly amusing; the negative impact on IACR is pretty low. I now have to spend 10-15 minutes voting again. No big deal.
It saddens me that Moti Yung is stepping down from his position as an election trustee; in my opinion, this is unwarranted. We have been using Helios voting for some time; this was bound to happen at some point.
Don't forget that the IACR is not a large political body with a decent amount of staff; it's all overworked academics (in academia or industry) administering IACR in their spare time. Many of them are likely having to review more Eurocrypt submissions than any human could reasonably manage right now. There are structural issues in cryptography, and this event might be a symptom of the structural pressure to work far more than any human should, which is pervasive not just in cryptography but in all of science.
From what I heard on the grapevine, this scenario was discussed when Helios was adopted; people wanted threshold schemes to avoid this exact scenario from the start, but from the sources I can find, Helios does not support this, or at least it does not make threshold encryption easy. The book Real-World Electronic Voting (2016)[^0] mentions threshold encryption under "Helios Variants and Related Systems", and the original Helios paper (2008)[^1] mentions it as a future direction.
You don't have to tell these academics that usable security is important. Usable security is a vital and accepted aspect of academic cryptography, and pretty much everyone agrees that a system is only as secure as it is usable. The hard part is finding the resources—both financial and personnel-wise—to put this lesson into practice. Studying the security of cryptographic systems and building them are two vastly different skills. Building them is harder, and there are even fewer people doing this.
[^0]: Pereira, Olivier. "Internet voting with Helios." Real-World Electronic Voting. Auerbach Publications, 2016. 293-324, https://www.realworldevoting.com/files/Chapter11.pdf
[^1]: Adida, Ben. "Helios: Web-based Open-Audit Voting." USENIX security symposium. Vol. 17. 2008, https://www.usenix.org/legacy/event/sec08/tech/full_papers/a...
It'd be more robust, in my opinion, to have 4 mostly trustworthy people and a 3-of-4 secret share. That seems at least as good as 3 fully trusted people.
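To make the trade-off concrete, here is a minimal Shamir secret-sharing sketch (illustrative only; this is not Helios's actual key-management code, and a real deployment would use a vetted library and a cryptographic RNG). It shows that a 3-of-4 split survives the loss of any single share, whereas a 3-of-3 split tolerates no loss at all:

```python
# Toy Shamir secret sharing over a prime field (demo only, not production).
import random

P = 2**127 - 1  # a Mersenne prime large enough for a demo secret

def split(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

secret = 123456789
shares = split(secret, t=3, n=4)
# Any 3 of the 4 shares suffice, so one trustee can lose theirs:
assert reconstruct(shares[:3]) == secret
assert reconstruct(shares[1:]) == secret
```

With `t=3, n=3` the same code works, but dropping even one share makes `reconstruct` return garbage, which is exactly the failure mode under discussion.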
- Availability is a security requirement. "Availability" of critical assets is just as important as "Confidentiality". While this seems like a truism, it is not uncommon to come across system designs, or even NSA/NIST specifications and points of view, that contradict this principle.
- Security is more than cryptography. Most secure systems fail or get compromised not due to cryptanalytic attacks, but due to implementation and OPSEC issues.
Lastly, I am disappointed that IACR is publicly framing the root cause as an "unfortunate human mistake", thereby throwing a distinguished member of the community under the bus. This is a system design issue; no critical system should have a 3-of-3 quorum requirement. Devices die. Backups fail. People quit. People forget. People die. Anyone who has worked with computers or people knows that this is what they do sometimes.
IACR's system design should have accounted for this. I wish IACR had taken accountability for the system design failure. I am glad that IACR is addressing this "human mistake" by making a "system design change" to a 2-of-3 quorum.
Break your systems, identify the issues, fix them.
I want this to happen because I want mathematically secure elections.
That said… holy shit, you didn't think one of three groups could possibly lose a key due to human error!?