When it originally launched, it marketed itself as a directory where you could link your social accounts using cryptographic proofs, so that anyone wondering whether "@lutoma" on Twitter and "lutoma" on Hacker News are the same person could easily check. I.e. pretty much what Keyoxide now seems to aim to do. Simple enough and reasonably useful.
But then at some point they tacked on some sort of Dropbox-y encrypted file system that also kind of but not really includes web hosting if you set your files to public.
And if you visit their website now all their landing page talks about is their Slack clone with end to end encryption without mentioning any of the other stuff.
And I just checked out their docs, and there's a section on wallets for the Stellar cryptocurrency, so apparently they also baked that in recently.
idk it just seems like a company with absolutely zero direction that just builds whatever the product managers find interesting at any given time and I gave up trying to understand what it is they do.
But since they sold to Zoom, it seems to have been financially successful so fair play to them.
That's because the service existed to capture users and then monetize them, along with the people visiting the site to verify their IDs. Ads, other services, etc.
Every tech startup does the same thing. Capture users with something useful for free, get them hooked, go public, and then bilk them just enough that, well, just enough of them stick around...while monetizing the data they give you, even if it's just their IP address at any given moment.
In addition to what all the other comments have said about the history of keybase, the present-day state of it is a sorry one. There was a drastic cut to development activity, suggesting it has gone into "maintenance mode" ever since the Zoom acquisition.
I haven't set it up correctly. Just laziness (DNS) on my part. Tho technically it means that the site doesn't belong to me (except it does since the root domain is verified).
Besides the fact that signify/minisign use a raw key instead of padding it with identity information, in what way are they actually better?
Similarly, minisign makes no claims at identity at all. You get a random string, and the user is responsible for knowing which key is for what user. The minisign public key contains nothing but the key.
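For illustration, a minisign public key file is nothing more than a comment line and the base64-encoded key (the key and key ID below are made up):

```
untrusted comment: minisign public key C90FA9E83AC58F09
RWQJj8U66KkPyUO5v9s0sAbmO8HtVRuRlm4d0+6bLcbQ4pzGFuPN1sJG
```

Nothing in that file ties the key to a person, and the comment line isn't covered by any signature.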
To me, that is a horrible user experience.
A PGP public key contains many bits of information besides just the key, and that is how this is even possible.
PGP tools are installed nearly everywhere (except Windows) by default, while minisign is an extra install.
Nah, troll baiting is saying "PGP tools are installed nearly everywhere (except windows) by default" when Windows still has almost 80% market share on PCs. Most of the rest are on macOS. If you want something that relies on network effects to be useful, then they and mobile users are the ones you need to accommodate. Linux-on-the-desktop users are a rounding error.
PGP is bloated and complex, trying to be the Swiss Army Chainsaw of encryption and verification. It lacks modern features of encryption tools such as channel binding (which opens whole categories of exploits) and perfect forward secrecy (instead preferring to use long-lived keys, which are a nightmare of their own).
PGP also allows a host of insecure options without mandating secure ones: while it does offer good algos and distribution mechanisms (e.g. WKD), it also offers poor/deprecated algos and disasters like keyserver pools.
As for Signify: Minisign supports trusted and untrusted comments in signatures to supply metadata.
Some distributions like OpenBSD and Void Linux have finally upgraded their package signing from PGP to Signify; Debian is in the process of migrating from PGP to Ed25519 sigs (https://wiki.debian.org/Teams/Apt/Spec/AptSign). If we manage to switch enough distros off PGP, maybe we can remove the need to have PGP installed by default.
> PGP is bloated and complex, trying to be the Swiss Army Chainsaw of encryption and verification. It lacks modern features of encryption tools such as channel binding (which opens whole categories of exploits) and perfect forward secrecy (instead preferring to use long-lived keys, which are a nightmare of their own).
Strong identity and long-lived keys are a requirement. Perfect forward secrecy does not make sense in a world where I want to prove that all things signed by me are in fact signed by me. If I generated a new key, how would I distribute that key to someone else in a way they can trust? And if you have that secure, trustworthy channel working, why do you even need to sign anything?
Even in the case of debian apt signing changes, all the key signing happens in the public view, there is nothing secret about it.
Perfect forward secrecy is only for encryption and does not make sense in the case of signatures, but even in the case of encryption (age) you have a similar key distribution problem. If you are constantly making ephemeral keys that's great, but it also means you need the receiver to make a new key on every file the sender wants to send. This means people will still have long lived keys, and perfect forward secrecy does not apply.
Lastly, in most async communication, like email, PFS is very difficult because you don't have a live channel in which to easily negotiate ephemeral keys. There is Autocrypt, which gets pretty far, but it has its own troubles. For any real-time communication, TLS and Olm are the way to go. But at this point we are very, very far away from minisign.
> PGP also allows a host of insecure options without mandating secure ones: while it does offer good algos and distribution mechanisms (e.g. WKD), it also offers poor/deprecated algos and disasters like keyserver pools.
PGP has been around for a long, long time. None of the options were insecure when they were implemented. Minisign and age have decided that their tools will have only one algo, the correct and most secure one. However, that can change in just a few years, at which point they will either become more like PGP or we will have to migrate everyone to the new best tool. This is madness.
A PGP implementation that was more aggressive about deprecating weak methods would be far better than these new pointy tools.
> As for Signify: Minisign supports trusted and untrusted comments in signatures to supply metadata.
As you say this data is untrusted and unsigned. Having it proves nothing about the ownership of the private key.
> Some distributions like OpenBSD and Void Linux have finally upgraded their package signing from PGP to Signify; Debian is in the process of migrating from PGP to Ed25519 sigs (https://wiki.debian.org/Teams/Apt/Spec/AptSign). If we manage to switch enough distros off PGP, maybe we can remove the need to have PGP installed by default.
I have read their reasoning on this several times, and I firmly believe this is a huge mistake, or at the very least a lot of work that provides no actual value. The keys here are distributed by packages, so there is already a TOFU step when you install the system. In the end there is no actual end-user impact here, besides making it much more difficult to manually check whether the signature of a package is valid.
> Strong identity and long-lived keys are a requirement. Perfect forward secrecy does not make sense in a world where I want to prove that all things signed by me are in fact signed by me. If I generated a new key, how would I distribute that key to someone else in a way they can trust? And if you have that secure, trustworthy channel working, why do you even need to sign anything?
A communication platform should handle this for you. Every modern E2EE platform offers out-of-band verification. The problem with PGP is that it's used as a "go-to" encryption and verification tool in situations where PFS is necessary, due to its network effect ("it's already installed"). Minisign/Signify, in comparison, are used just for verification; they are a better fit for this because they don't try to be anything else.
> but even in the case of encryption (age) you have a similar key distribution problem. If you are constantly making ephemeral keys that's great, but it also means you need the receiver to make a new key on every file the sender wants to send. This means people will still have long lived keys, and perfect forward secrecy does not apply.
Age is for file encryption. An E2EE communication protocol/platform should have provisions to automate this process.
> PGP has been around for a long, long time. None of the options were insecure when they were implemented. Minisign and age have decided that their tools will have only one algo, the correct and most secure one. However, that can change in just a few years, at which point they will either become more like PGP or we will have to migrate everyone to the new best tool. This is madness.
It is better to have new versions of a standard than to extend a standard to include all possibilities. The equivalent would be making all TLS and SSL versions just extensions of the original SSL standard; a client talking to a server would negotiate between a plethora of secure and insecure cipher suites.
Age uses X25519, ChaCha20-Poly1305, and HKDF; Signify uses Ed25519 signatures. If these are found to be insecure in the coming decades, the solution would be to release a spec for Age 2 or Signify 2 instead of just adding an option to use a different combination of algos/curves/KDFs.
> As you say this data is untrusted and unsigned. Having it proves nothing about the ownership of the private key.
No, I didn't say that. I said that it supports both trusted and untrusted comments. Trusted comments are signed.
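For illustration, a `.minisig` signature file looks roughly like this (the base64 blobs are abbreviated and made up). The trusted comment line is covered by the second, global signature; the untrusted comment is not:

```
untrusted comment: signature from minisign secret key
RWQJj8U66KkPyQ...base64 signature over the file...
trusted comment: timestamp:1634045550	file:source-1.2.3.tar.gz
ksA3mOHtVRuRlm...base64 global signature over the signature and trusted comment...
```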
> In the end there is no actual end-user impact here, besides making it much more difficult to manually check whether the signature of a package is valid.
Given a source, sig, and pubkey, verification can be done with the following command:
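Assuming the tarball, its detached signature (source.tar.gz.minisig, picked up automatically), and the public key file are all in the current directory (these filenames are placeholders):

```sh
minisign -Vm source.tar.gz -p minisign.pub
```

Alternatively, `-P` takes the public key itself as a command-line argument, so no key file needs to be distributed at all.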
This isn't any harder than using GPG unless you're using the "it's already installed" argument, which kinda proves my point about the network effects exacerbated by having a "do-it-all" tool instead of a "do-one-thing" tool. Minisign dogfoods this to sign its releases, and you can see Fedora's RPM spec use it quite easily to verify its tarball:
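An RPM spec can do this with a couple of extra sources and a single line in %prep. The fragment below is an illustrative sketch (source numbers and filenames are hypothetical), not a copy of Fedora's actual spec:

```
Source0: minisign-%{version}.tar.gz
Source1: minisign-%{version}.tar.gz.minisig
Source2: minisign.pub

%prep
minisign -Vm %{SOURCE0} -x %{SOURCE1} -p %{SOURCE2}
```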
> It is better to have new versions of a standard than to extend a standard to include all possibilities. The equivalent would be making all TLS and SSL versions just extensions of the original SSL standard; a client talking to a server would negotiate between a plethora of secure and insecure cipher suites.
> Age uses X25519, ChaCha20-Poly1305, and HKDF; Signify uses Ed25519 signatures. If these are found to be insecure in the coming decades, the solution would be to release a spec for Age 2 or Signify 2 instead of just adding an option to use a different combination of algos/curves/KDFs.
I hope you realize this doesn't make sense. Would minisign 2 be able to read minisign 1 signatures? End users would need to know even more out-of-band information about the signature they want to verify.
The extremely poor ergonomics of minisign are my problem with it. It doesn't have solutions for many problems, and its users flat-out hand-wave future problems.
> I hope you realize this doesn't make sense. Would minisign 2 be able to read minisign 1 signatures? End users would need to know even more out-of-band information about the signature they want to verify.
"Signify 2" would be a specification, not a program. A future Age or Signify implementation would be able to handle multiple versions of their specs without mixing their components together just as easily as:
- A TLS/SSL program (OpenSSL, LibreSSL, bssl, etc) can work with TLS 1.2 and 1.3.
- libcurl can handle HTTP/1.0, HTTP/1.1, and HTTP/2.
Say Age 2 uses, I don't know, "AES-512" (I pulled that out of my ass for illustrative purposes, don't tell anyone) and Argon2id instead of ChaCha20-Poly1305 and HKDF. Your choices would be Age 1 or Age 2; you wouldn't be able to mix AES-512 with HKDF, or ChaCha20-Poly1305 with Argon2id. There would be fewer combinations to keep track of, making it easier for other implementations to drop support as insecure standards get phased out (cf. TLS 1.0 and 1.1). Insecure standards will still be around and be relatively simple to implement for anyone interested (especially for Signify, whose spec is so basic).
This is how versioned specs are supposed to work, with iteration rather than extension. The command I demonstrated wouldn't have to change.
If Age followed the PGP approach, a future version would support choosing either ChaCha20-Poly1305 or AES-512, and either HKDF or Argon2id. You'd mix and match, and write all the metadata into the file. It'd be much harder to move away from bad standards. Instead, right now, Age just writes the version of the Age spec being used; that implies everything else.
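A minimal sketch of that version-pinning idea (the lookup table, the function names, and the "v2" suite are all made up for illustration; they're not taken from the actual age spec):

```python
# Hypothetical sketch of how a versioned spec avoids per-algorithm negotiation:
# one version marker pins the complete suite of primitives, so there is
# nothing to mix and match.

SUITES = {
    "age-encryption.org/v1": {"aead": "ChaCha20-Poly1305", "kdf": "HKDF-SHA256"},
    # A hypothetical v2 would pin an entirely new suite; no mixing across versions.
    "age-encryption.org/v2": {"aead": "AES-512", "kdf": "Argon2id"},
}

def primitives_for(header_line: str) -> dict:
    """Map a file's version line to its full, fixed set of primitives."""
    suite = SUITES.get(header_line.strip())
    if suite is None:
        raise ValueError(f"unsupported format version: {header_line!r}")
    return suite

print(primitives_for("age-encryption.org/v1")["aead"])  # → ChaCha20-Poly1305
```

An old version can be dropped by deleting one table entry, rather than by auditing every reachable combination of options.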
Not in any relative sense. The commonly used Signal protocol for example is much more complex and is only applicable to one narrow category of application. In the Keyoxide case we are only doing signing so that is the only part of the OpenPGP standard that would be applicable.
>...channel binding...
Could you expand on how that might relate to PGP applications?
>...perfect forward secrecy...
... is not relevant for the protection of things like files. Even for messaging it is rarely of any value as users like to keep their old messages around.
OpenBSD needed something compatible with the license of the base distribution. The mistake made was that signify was not made to use a preexisting format. So it is an attempt to create a whole new standard in a way that provides no benefit to anyone.
> In the Keyoxide case we are only doing signing so that is the only part of the OpenPGP standard that would be applicable.
The problem is that PGP does a lot of stuff besides signing too; this is bad design, and is one in a long list of issues with PGP. Its network effect extends to areas where issues like the lack of channel binding and PFS are relevant (e.g. communication platforms): people use the "it's already installed (for something else)" argument far too often to justify using it for something it isn't suited for (most other things).
> OpenBSD needed something compatible with the license of the base distribution. The mistake made was that signify was not made to use a preexisting format. So it is an attempt to create a whole new standard in a way that provides no benefit to anyone.
Signify's standard is dead simple, and already has several implementations. The friction involved in adopting it is low enough for this to be a minor concern. The benefit is that it helps us move away from complex dependencies like GPG and towards simple ones like minisign/signify that only do what's necessary.
Excuse me, but does your alternative provide toolchains and user interfaces for every major platform in existence today, including Mac, *nix, iOS, Android, Windows, a library for every major language in existence, and 25 years of attempts to break it?
If not, I don't see how you can claim it is superior in every way, because here are at least two ways in which PGP/GPG are by far superior.
This might represent a common misconception about PGP. The algorithm preference information is embedded in the PGP identity and signed by the certification/signing key, so a downgrade attack would involve breaking the cryptography used for that key. CAST5 is not used for certification or signing and has been nowhere near the start of the preference list for a really long time. Having CAST5 as some sort of last-ditch backward-compatibility option is no more of a weakness than the fact that the computer I am using to write this has an MD5 command.
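For example, GnuPG lets you inspect a key's signed preference list interactively (the key ID and output below are illustrative):

```sh
gpg --edit-key alice@example.org
gpg> showpref
[ultimate] (1). Alice <alice@example.org>
     Cipher: AES256, AES192, AES, 3DES
     Digest: SHA512, SHA384, SHA256, SHA224, SHA1
     Compression: ZLIB, BZIP2, ZIP, Uncompressed
```

CAST5, where present at all, sits at the tail end of a list like this, behind the modern algorithms.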
Having said that, what is wrong with CAST5 for PGP applications? And what does it have to do with a system that is all about signing things?
PGP/GPG has much broader adoption, and is not fundamentally broken as a standard. It's also what people are used to. I don't know what to do with a minisign key.
PGP sucks in general, but it seems to be the only thing in its own space. When will there be a viable alternative to it?
age doesn't support signing, keys are meant to be ephemeral. minisign/signify are not a suitable equivalent and are more suited to signing+verifying software releases.
I really like the general idea of decentralized identity. Personally I'd prefer to keep my identities on different apps/platforms mostly (99%) separate. It seems to me that giving an adversary a map (especially usernames and email identities) of your online presence is a bad idea especially if they get access to one account and get some private details they may be able to use to socially engineer their way into other accounts.
I'm not sure what the best implementation of decentralized identity is (although proof-of-personhood systems like BrightID seem interesting[0]), but ideally the different platforms would cryptographically sign statements for you like "This user has a positive reputation on our platform" which you can disclose to other platforms without them being able to learn your username on the original platform.
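A rough sketch of what such a statement could look like, using an HMAC as a stand-in for a real digital signature (all names and the protocol itself are hypothetical; this is not BrightID or any existing system):

```python
import hashlib
import hmac
import os

# Hypothetical sketch: a platform attests to a user's standing without the
# statement itself revealing the username. The HMAC is a stand-in for a real
# digital signature; everything here is illustrative, not an existing API.

PLATFORM_KEY = os.urandom(32)  # the platform's secret attestation key

def blind_identity(username: str, salt: bytes) -> str:
    # The user picks a random salt, so the commitment alone does not
    # expose the username to the platform receiving the attestation.
    return hashlib.sha256(salt + username.encode()).hexdigest()

def issue_attestation(commitment: str, claim: str) -> str:
    # The platform "signs" the pair (commitment, claim).
    msg = f"{commitment}|{claim}".encode()
    return hmac.new(PLATFORM_KEY, msg, hashlib.sha256).hexdigest()

def verify_attestation(commitment: str, claim: str, tag: str) -> bool:
    return hmac.compare_digest(issue_attestation(commitment, claim), tag)

salt = os.urandom(16)
commitment = blind_identity("lutoma", salt)
tag = issue_attestation(commitment, "positive reputation")
assert verify_attestation(commitment, "positive reputation", tag)
assert not verify_attestation(commitment, "negative reputation", tag)
```

With a real signature scheme, any third party could check the attestation against the platform's public key; with the HMAC stand-in shown here, only the issuing platform can verify it.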
I think it's not really viable to expect people to install a browser extension that interferes with their login sessions to sensitive websites, but if this technology was built into browsers it could be very interesting.
The UX might be a little awkward, but you could have a "Notarised documents" location under "Save Page As" which stores the current page with an accompanying file containing the notary details. Then you'd be able to upload these documents via a web interface to a smart contract.
There would have to be some way to cryptographically blind the connection between the distributed ID and these documents, though, as the documents would be plaintext and contain actual usernames.
I'm guessing most people try to obtain the same username on every service, so generally it's not hard to trace people's other identities. And some people, including myself, are happy to have all their identities out there.
It's also possible to correlate the shape of the social graph across services to deanonymize users across social networks.
So I don't think declining to prove ownership of multiple identities buys much anonymity. For deanonymization, a reasonable probability is sufficient. But a proof adds utility for the person proving their identity, and you can still have hidden throwaway accounts when needed.
There's also https://keys.pub/ (from someone ex-Keybase, if I remember correctly). I haven't looked at either closely. Can anyone compare and contrast?
Keyoxide doesn't implement any cryptography itself; everything is handled by identities in PGP. I doubt it needs any auditing.
The only reasonable attack vector I can see is hijacking the website (or proxy server) to return different keys or show something is verified when it's actually not.
Decentralized identity is a pie in the sky to me. It sounds great but when you really start to think about what identity is, it’s formed by your relationships and connections.
Tools like this may be useful in some instances but auth will always tend towards centralization