Yep, let's accept the monstrous industries which lock down culture for money.
I for one support their efforts. The same way we store seeds in vaults deep underground, we should do this for digital content too, without retaliation from any particular industry.
What would be better is to solve the root problem. These (illegal, yet somewhat legitimate) hoarding sites are most valuable for research literature, which, given its publicly funded nature, should not be gated to begin with.
The consequence of addressing only the symptoms is that illegitimate use piggybacks on it. Artistic works that legitimately deserve protection get hoarded as well.
Hard-working authors of clearly copyrightable works, typically novels and manuals, see their work accessed free of royalties, all for the sake of freely distributing scientific literature.
It becomes impossible to draw the distinction when the legitimate utility operates in a dark domain.
Yes, we should archive everything. And we should perhaps reform IP more broadly and re-think how we treat our culture. And none of this should draw retaliation.
But retaliation will happen, and I worry that it's going to pull down one of the most incredible archives along with it.
And that's why helping to torrent and seed the content of AA is vital: they can take down a domain name, but they can't block everyone who seeds.
I've said this before, but if you've got some spare GB/TB on a computer/server, consider "donating" it for culture-preservation purposes: https://annas-archive.se/torrents
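If you'd rather run a dedicated always-on seedbox than a desktop client, here's a minimal sketch of what that can look like with libtorrent's C++ API. The torrent filename and save path are placeholders I made up, this assumes the libtorrent 2.x API, and error handling is omitted:

    // Minimal always-on seeder sketch (libtorrent 2.x assumed).
    #include <libtorrent/session.hpp>
    #include <libtorrent/add_torrent_params.hpp>
    #include <libtorrent/torrent_info.hpp>
    #include <chrono>
    #include <memory>
    #include <thread>

    int main() {
        lt::session ses;  // starts listening and joins the DHT by default

        lt::add_torrent_params p;
        // "aa_subset.torrent" is a placeholder for one of the torrent
        // files listed on annas-archive.se/torrents.
        p.ti = std::make_shared<lt::torrent_info>("aa_subset.torrent");
        p.save_path = "/srv/aa";  // wherever the spare TBs live
        ses.add_torrent(p);

        // Once the payload is downloaded and verified, the session keeps
        // seeding for as long as the process runs.
        for (;;) std::this_thread::sleep_for(std::chrono::seconds(60));
    }

Any of the usual clients (qBittorrent, Transmission, etc.) on a machine that stays online does the same job.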
I've been looking at mutable torrents recently and their ability to circumvent censorship via the DHT.
I wonder if it could be revived.
The only problem is that the mutable torrents standard (BEP 46) is still a draft and not widely adopted. I think I saw someone proposing to use the DHT in a way that lets them host entire websites. If that's feasible, it becomes very difficult to take down.
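For context on what "mutable" means at the DHT layer: per BEP 44 (which the mutable-torrents draft, BEP 46, builds on), an item is addressed by an ed25519 public key and re-published with an incrementing sequence number, so the publisher can update the payload without the address changing. A minimal sketch of just the signing step, assuming libsodium; the value and sequence number are toy examples, and the actual DHT put/get is out of scope:

    // Signing a BEP 44 mutable DHT item (sketch; libsodium assumed).
    #include <sodium.h>
    #include <cstdio>
    #include <string>

    int main() {
        if (sodium_init() < 0) return 1;

        unsigned char pk[crypto_sign_PUBLICKEYBYTES];
        unsigned char sk[crypto_sign_SECRETKEYBYTES];
        crypto_sign_keypair(pk, sk);  // pk is the item's stable address

        // The value must itself be bencoded: the string "Hello World!"
        // bencodes to "12:Hello World!".
        std::string value = "12:Hello World!";
        long long seq = 1;  // bump on every update

        // Per BEP 44, the signature covers "3:seqi<seq>e1:v<value>"
        // (plus an optional salt prefix, omitted here).
        std::string msg = "3:seqi" + std::to_string(seq) + "e1:v" + value;

        unsigned char sig[crypto_sign_BYTES];
        crypto_sign_detached(sig, nullptr,
            reinterpret_cast<const unsigned char*>(msg.data()),
            msg.size(), sk);

        // A DHT put then publishes {k: pk, seq, v, sig}; nodes verify the
        // signature and keep only the highest seq they have seen, which
        // is what lets the publisher "mutate" the item.
        std::printf("signed %zu-byte payload\n", msg.size());
        return 0;
    }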
Looking down on the US is just masking the enormous dysfunction in the EU and many European countries.
I have been to some countries outside of Europe. Some were much worse, some were much better, but I do not particularly care. So many cities in Europe are awful places to live.
If I were running a volunteer project, I would be dumping thousands a month into top-tier hosting across multiple datacenters around the world with global failover.
the _if_ is doing a lot of heavy lifting there. You're free to complain about it, but F-Droid has been running fine for years, and I'd rather have a volunteer manage the servers than some big corporation
They quite notably haven't been running fine for years: https://news.ycombinator.com/item?id=44884709
Their recent public embarrassment resulting from such an outdated build server is likely what triggered them to finally start the process of obtaining a replacement for their 12-year-old server (which was apparently already 7 years old when they started using it?).
In what world is it embarrassing to not buy hardware you don't need? The servers worked fine for years. When there was an actual reason to spend money, they bought something new. Sounds like good stewardship of the donations they receive.
I finally just upgraded my 9-year-old computer with an i5-6600k to a Ryzen 9 5950X because I wanted to be able to edit home videos. I already rarely even used 1 core on the old CPU, the new one is 7x more powerful, and it's an eBay part from 5 years ago. I don't foresee needing to upgrade again for another decade. I probably would've been good for another 15-20 years if I had upgraded to a DDR5 platform, but RAM prices had already spiked, so I just swapped the motherboard and CPU.
Nah, if you actually read into what's available there, it's clear that the compilers have never implemented features to make this broadly usable. You only get runtime instruction selection if you've manually tagged each individual function that uses SIMD to be compiled with function multi-versioning, so that's only really useful for known hot spots that are intended to use autovectorization. If you just want to enable the latest SIMD across the whole program, GCC and clang can't automatically generate fallback versions of every function they end up deciding could use AVX or whatever.
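For anyone who hasn't run into it, the manual tagging looks roughly like this with GCC/clang's target_clones attribute; the function and the ISA list are just an illustration I picked:

    #include <cstddef>

    // The compiler emits one clone of this function per listed target,
    // plus an ifunc resolver that picks the best clone at load time.
    // Only functions tagged like this get runtime instruction selection.
    __attribute__((target_clones("default", "avx2", "avx512f")))
    void scale(float* x, std::size_t n, float k) {
        // Plain loop the compiler may autovectorize differently per clone.
        for (std::size_t i = 0; i < n; ++i)
            x[i] *= k;
    }

Fine for a known hot loop, but you have to annotate every such function yourself.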
The alternative is to make big changes to your build system and packaging to compile N different versions of the executable/library. There's no easy way to just add a compiler flag that means "use AVX512 and generate SSE2 fallbacks where necessary".
The people that want to keep running new third-party binaries on 12+ year old CPUs might want to work with the compiler teams to make it feasible for those third parties to automatically generate the necessary fallback code paths. Otherwise, there will just be more and more instances of companies like Google deciding to start using the hardware features they've been deploying for 15+ years.
But you already know all that, since we discussed it four months ago. So why are you pretending like what you're asking for is easy when you know the tools that exist today aren't up to the task?