PrivaTegrity: The end of the Crypto War?

The following article presents a project for creating a privacy service with a distributed-consensus backdoor for targeting actors who are broadly agreed to be bad. It’s meant to be a solution that can satisfy both privacy activists and those who want security.

Do you think this could be the right way to go?

That’s pretty close to being acceptable. If the system could somehow be made such that trustless public logs remain of all attempts to use the backdoor (whether successful or not), I would probably be willing to use the system myself. That would solve my main issue with the system, namely its susceptibility to corruption. With trustless public logs, I would no longer have to just guess but would actually have information I could use to judge for myself whether the operators are trustworthy.
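For concreteness, here’s a minimal sketch (my own illustration, not anything from the PrivaTegrity proposal) of what such a trustless public log could look like: an append-only hash chain over backdoor requests, whose latest head is published widely so that any retroactive tampering becomes detectable.

```python
import hashlib
import json
import time

def append_entry(log: list, request: dict) -> str:
    """Append a backdoor-request record to an append-only hash chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "timestamp": time.time(),
        "request": request,      # who asked, which target, whether it was granted, ...
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry["hash"]         # publish this head so observers can audit the chain

log = []
head = append_entry(log, {"requester": "server-3", "target": "user-X", "granted": False})
```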

That would be an acceptable level of oversight over the backdoor usage.


That’s a good point. Use of the backdoor (or actually “front door” as it has been called in a Facebook conversation) should be made public!

I’ve also read about an idea for a modified version of this scheme in which every user individually selects a number of trusted parties to hold the decryption keys. This would keep the 9 big servers of the original proposal from being obvious targets for attack – which seems to be one of its main weaknesses. I’m not sure whether that modification would actually work out, but it’s a really interesting idea.

I haven’t finished reading the draft of their paper detailing how this mixnet is supposed to work, but I realized that there’s practically nothing the servers can do to prevent users from sending encrypted messages that only the recipient can open. That would leave the backdoor useful only for figuring out who sent messages to whom; the content itself is very easy to encrypt with a key that only the actual recipient has.
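For illustration, here’s a minimal sketch of that layering using PyNaCl. The function names and the idea of handing the ciphertext to the mixnet as an opaque payload are my assumptions, not anything described in the paper.

```python
# End-to-end layer on top of the mixnet: encrypt with the recipient's public key
# before handing the payload over. PyNaCl (libsodium bindings) is used here;
# how the payload actually enters the mixnet is left out.
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()          # recipient does this once

def send_private(message: bytes, recipient_public_key) -> bytes:
    # Only the holder of the matching private key can open this;
    # the mix servers (and thus the backdoor) see ciphertext only.
    return SealedBox(recipient_public_key).encrypt(message)

def receive_private(ciphertext: bytes, recipient_private_key) -> bytes:
    return SealedBox(recipient_private_key).decrypt(ciphertext)

payload = send_private(b"meet at noon", recipient_key.public_key)
assert receive_private(payload, recipient_key) == b"meet at noon"
```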

While that can probably be made to work, it sounds like it would seriously weaken the privacy properties of the system. Even having completely separate, alternate sets of 9 servers would weaken them, but in that case the anonymity sets are at least divided cleanly. If users can mix and match servers as they please, that might make for an order-of-magnitude weaker system, because the choice of servers itself identifies users to some degree. Unless you use the exact same set as a great many others, you aren’t anonymised much.
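As a rough back-of-the-envelope illustration (my assumptions, not the proposal’s: users pick their server set uniformly at random, and only users with the identical set are indistinguishable from one another):

```python
from math import comb  # Python 3.8+

def expected_anonymity_set(total_users: int, n_servers: int, k_chosen: int) -> float:
    # Users who picked the exact same k-server set are indistinguishable;
    # with uniform random choices they spread across comb(n, k) possible sets.
    return total_users / comb(n_servers, k_chosen)

# One fixed cascade of 9 servers: all users share a single anonymity set.
print(expected_anonymity_set(1_000_000, 9, 9))    # 1000000.0
# Freely picking 9 out of, say, 30 candidate servers: ~14.3 million possible sets,
# so on average almost nobody shares your exact choice.
print(expected_anonymity_set(1_000_000, 30, 9))   # ~0.07
```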


Good points. These observations emphasize that anonymity is really the main point of this system. Kinda like in Tor – see also: