> I certainly want to get rid of gpg from my life if I can
I see this sentiment a lot, but you later hint at the problem. Any "replacement" needs to solve for secure key distribution. Signing isn't hard, you can use a lot of different things other than gpg to sign something with a key securely. If that part of gpg is broken, it's a bug, it can/should be fixed.
The real challenge is distributing the key so someone else can verify the signature, and almost every way to do that is fundamentally flawed, introduces a risk of operational errors or is annoying (web of trust, trust on first use, central authority, in-person, etc). I'm not convinced the right answer here is "invent a new one and the ecosystem around it".
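Just to illustrate the "signing isn't hard" part: here's a minimal Go sketch using the standard crypto/ed25519 package (the message and names are made up). The Verify call at the end only tells you that this particular public key signed these bytes; whether that key actually belongs to the person you think it does is the distribution problem above, and no code here answers it.

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

func main() {
	// Generating a keypair and signing is the mechanically easy part.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	msg := []byte("sha256 digest of release-1.2.3.tar.gz") // hypothetical payload
	sig := ed25519.Sign(priv, msg)

	// Verification only proves "this public key produced this signature".
	// Binding pub to a real person is the key-distribution problem.
	fmt.Println("signature valid:", ed25519.Verify(pub, msg, sig))
}
```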
It's not like GPG solves for secure key distribution. GPG keyservers are a mess, and you can't trust their contents anyways unless you have an out of band way to validate the public key. Basically nobody is using web-of-trust for this in the way that GPG envisioned.
This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website.
Yes, I'm not saying that the web of trust ever worked. "Pre-established channels" are the other mechanisms I mentioned, like a central authority (https) or TOFU (just trust the first key you get). All of these have issues that any alternative must also solve for.
So if we need a pre-established channel anyways, why would people recommending a replacement for GPG workflows need to solve for secure key distribution?
This is a bit like looking at electric cars and saying ~"well you can't claim to be a viable replacement for gas cars until you can solve flight"
A lot of people are using PGP for things that don’t require any kind of key distribution. If you’re just using it to encrypt files (even between pointwise parties), you can probably just switch to age.
(We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.)
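For that pointwise file-encryption case, the Go module behind age (filippo.io/age) is pretty small to drive directly. A rough sketch, with the caveat that in real use the recipient's public key ("age1...") arrives over whatever channel you already trust rather than being generated in the same process:

```go
package main

import (
	"bytes"
	"fmt"
	"io"

	"filippo.io/age"
)

func main() {
	// In practice the recipient generates this and hands you only the
	// public half; generating it here just keeps the example self-contained.
	identity, err := age.GenerateX25519Identity()
	if err != nil {
		panic(err)
	}

	// Encrypt to the recipient's public key.
	var ciphertext bytes.Buffer
	w, err := age.Encrypt(&ciphertext, identity.Recipient())
	if err != nil {
		panic(err)
	}
	if _, err := io.WriteString(w, "contents of some file"); err != nil {
		panic(err)
	}
	if err := w.Close(); err != nil {
		panic(err)
	}

	// The recipient decrypts with their private key.
	r, err := age.Decrypt(&ciphertext, identity)
	if err != nil {
		panic(err)
	}
	plaintext, err := io.ReadAll(r)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(plaintext))
}
```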
> As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people.
GPG is terrible at that.
0. Alice's GPG trusts Alice's key tautologically.
1. Alice's GPG can trust Bob's key because it can see Alice's signature.
2. Alice's GPG can trust Carol's key because Alice has Bob's key, and Carol's key is signed by Bob.
After that, things break. GPG has no tools for finding longer paths like Alice -> Bob -> ??? -> signature on some .tar.gz.
I'm in the "strong set", I can find a path to damn near anything, but only with a lot of effort.
The good way used to be using the path finder, some random website maintained by some random guy that disappeared years ago. The bad way is downloading a .tar.gz, checking the signature, fetching the key, then fetching every key that signed it, in the hope that somebody you know signed one of those, and so on.
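To make the shape of that manual process concrete, here's a toy Go sketch of the breadth-first path search a keyring tool could do for you. The signature graph and key names are entirely invented; GPG doesn't expose anything like this, which is the complaint.

```go
package main

import "fmt"

// sigGraph maps a key to the keys that have signed it. A real tool would
// build this from the keyring; the data here is made up for illustration.
var sigGraph = map[string][]string{
	"tarball-key": {"maintainer"},
	"maintainer":  {"dave"},
	"dave":        {"carol"},
	"carol":       {"bob"},
	"bob":         {"alice"},
}

// trustPath breadth-first searches from an unknown key back toward a key we
// already trust, returning the chain of signatures if one exists.
func trustPath(from, to string) []string {
	queue := [][]string{{from}}
	seen := map[string]bool{from: true}
	for len(queue) > 0 {
		path := queue[0]
		queue = queue[1:]
		last := path[len(path)-1]
		if last == to {
			return path
		}
		for _, signer := range sigGraph[last] {
			if !seen[signer] {
				seen[signer] = true
				next := append(append([]string{}, path...), signer)
				queue = append(queue, next)
			}
		}
	}
	return nil // no trust path found
}

func main() {
	fmt.Println(trustPath("tarball-key", "alice"))
}
```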
And GPG is terrible at dealing with that, it hates having tens of thousands of keys in your keyring from such experiments.
GPG never grew into the modern era. It was made for people who mostly know each other directly. Verifying the keys of random free software developers isn't something it ever did well.
What's funny about this is that the whole idea of the "web of trust" was (and, as you demonstrate, is) literally PGP punting on this problem. That's how they talked about it at the time, in the 90s, when the concept was introduced! But now the precise mechanics of that punt have become a critically important PGP feature.
I don't think it punted so much as it never had that as an intended use case.
I vaguely recall the PGP manuals talking about scenarios like a woman secretly communicating with her lover, or Bob introducing Carol to Alice, and people reading fingerprints over the phone. I don't think long trust chains and the use case of finding a trust path to some random software maintainer on the other side of the planet were part of the intended design.
I think to the extent the Web of Trust was supposed to work, it was assumed you'd have some familiarity with everyone along the chain, and work through it step by step. Alice would know Bob, who'd introduce his friend Carol, who'd introduce her friend Dave.
In a signature context, you probably want someone else to know that "you" signed it (I can think of other cases, but that's the usual one). The way to do that requires them to know that the key which signed the data belongs to you. My only point is that this is actually the hard part, which any "replacement" crypto system needs to solve for, and none of the methods for doing so are particularly good.
> The way to do that requires them to know that the key which signed the data belongs to you.
This is something S/MIME does, and I'd say it does it reasonably well. You can start from mailbox validation, and that already beats everything PGP has to offer in terms of ownership validation. If you do identity validation, or it's a national PKI issuing the certificate (like in some countries), it's a very strong guarantee of ownership. Coughing baby (PGP) vs hydrogen bomb level of difference.
It sounds to me much more like an excuse to use PGP when it doesn't even remotely offer what you want from a replacement.
The StarMax series (and the 4400) seemed to be about as close to CHRP as we got. My off-brand StarMax clone (PowerCity) had a PS/2 port and an ISA slot. Ran BeOS well, and had a quirk where I could hear a tight loop on the speaker.
AFAIK most StarMax systems that were released (a prototype exists of a CHRP StarMax model) are based on the Tanzania / LPX-40 design, which is mostly a traditional PCI PowerMac[1], albeit with oddities like support for PC style floppy drives. PS/2 is handled by the CudaLite microcontroller which presents it to the OS as ADB devices for example. I've not heard of a version with ISA slots, although I assume you could just have a PCI to ISA bridge chip, even if MacOS presumably wouldn't do anything with it.
Right, I think those were the closest we got to the CHRP standard, as they moved the platform toward PC-style floppies, PS/2, ATX PSU and even more generic "platform" stuff than most clones. I'm fairly sure I had an ISA slot, I do remember trying to get a bargain bin NE2K card working in mine under linux (it didn't work). Definitely did nothing under OS 8/9.
The PowerCity models were interesting, because they came out after Apple revoked Motorola's clone license. A German company, ComJet, bought up the boards and sold unlicensed clones cheap. The case was slightly different, but otherwise they corresponded to StarMax models (fairly certain they were identical, but they may have been last-revision boards).
Kinda sorta. The systems that the "MacOS on CHRP" thing ran on had a very strange looking device tree, with some bizarre combination of PC and Mac peripherals.
Refer to the "Macintosh Technology in the
Common Hardware Reference Platform" book for more information, if you're curious about the Mac IO pieces.
The Motorola Yellowknife board seems remarkably similar to this system, as does the IBM Long Trail system (albeit with Long Trail using a VLSI Golden Gate instead of an MPC106 memory controller). Both of them use W83C553 southbridges and PC87307 Super I/O controllers.
The architecture is kind of weird, but the schematics on NXP's website can probably elucidate a bit more on the system's design.
I really cannot say Uber's use of Go is particularly idiomatic to me, having started writing Go more than a decade ago now. It just strikes me as overwrought, and I've worked on big services.
UEFI itself is way too complex, has way too much attack surface (I'm surprised this didn't abuse some poorly written SMI handler), and provides too little value to exist. Secure boot then goes on to treat that place as a root of trust, which is a security architecture mistake, but it works OK in this case. This all could be a lot better.
Electrolytics are usually nothing too fancy, but the exact mix is proprietary. Water and electrolytes, hence the name. PCBs are in the big transformers and in what used to be called bathtub caps, which looked like this https://i.ebayimg.com/images/g/VjwAAOSwfGJjYtHx/s-l400.jpg (think 1950s electronics stuff)
I don't see how enabling secure boot helps here, since UEFI is responsible for enforcing that and is compromised. I'm sure some might recommend more roots of trust and signing down and verification that starts at the chipset, but I'd recommend an alternative with less attack surface and better user control: a jumper.
The article specifically says this is self-signed so won’t work with SecureBoot enabled.
This is technically a bootloader, so it has to find a way to get loaded by the UEFI. The article doesn't say it's able to do that on its own; the user has to manually trust the signing certificate or disable Secure Boot.
It's heartwarming to see that the spirit behind Azureus is still alive. SWT might not be what the Duke himself wants in a Java GUI framework, but it's practical, and I remember the "chunks bar" in the Azureus GUI fondly. I'll enjoy firing up BiglyBT after all these years. Using a largely memory safe language makes a lot of sense for P2P software.
Potentially worth pointing out that Go is memory safe only when single threaded (races can corrupt memory), and this kind of application is very likely to use multiple threads.
But I do also generally expect it to be safer than C++. The race detector prevents a lot of badness quite effectively because it's so widely used.
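For anyone who hasn't seen it in action, a trivial sketch of the kind of bug the detector flags; run it with `go run -race` and it reports the unsynchronized access, while a plain `go run` just silently produces a wrong count:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	counter := 0 // shared between goroutines with no synchronization

	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 1000; j++ {
				counter++ // data race: unsynchronized read-modify-write
			}
		}()
	}
	wg.Wait()

	// Without -race this "works" but may print fewer than 2000; with
	// -race the detector pinpoints the racy increment above.
	fmt.Println(counter)
}
```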
Go is safe from the perspective of RCEs due to buffer overflow, which is what matters here. Happy to be enlightened otherwise, but "I broke your (poorly implemented, non-idiomatic, please use locks or channels ffs) state machine" is a lot better than "I am the return instruction pointer now"
Voting for new legislators, personally. I wish they'd do something about PG&E or housing instead of criminalizing software development of chatbots. Truly useless, and I wish we had more choice of non-insane candidates.