Or: Put all of Windows inside of a VM, within a host that uses disk encryption -- and let it run amok inside of its sandbox.
I did this myself for about 8 years, from 2016-2024. During that time my desktop system at home was running Linux with ZFS and libvirt, with Windows in a VM. That Windows VM was my usual day-to-day interface for the entire system. It was rocky at first, but things did get substantially better as time moved on. I'll do it again if I have a compelling reason to.
With a VM running on an encrypted filesystem, whatever a warrant for a BitLocker key might normally yield is hidden behind an additional layer of encryption that Microsoft does not hold the keys to.
(Determining whether that is useful or not is an exercise for the person who believes that they have something to hide.)
Sure, the plan you outline does sound very simple. And in an ideal world, that'd be perfectly fine.
Except we don't live in an ideal world.
See, for example, the fuckery alluded to above.
Therein: Linking a Microsoft account to a Windows login appears to happen automatically under some circumstances, and then BitLocker keys are also automatically leaked to the mothership...
The machine is quite clearly designed with the intent that it behaves as a trap. Do you trust it?
If you believe Windows to be so actively malicious that it would go behind your back and enable key backups after you've explicitly disabled them, you should probably assume that it will steal your encrypted information in other ways too.
This continued use of the word "you," as if directed specifically at me: At first I thought it was a mistake, but now I'm pretty sure it's a very deliberate word choice on your part.
Therefore, based on that...
Since this is about me, then: I'd like to ask that you please stop fucking with me.
We can discuss whatever concepts you'd like to discuss, in generalities, but I, myself, am not on the menu for discussion.
It's not just Teams. You need to be constantly vigilant not to make any change that would let them link your Microsoft account to Windows. And they make it more and more difficult not only to install but also to use Windows without a Microsoft account. I think they'll eventually enforce it on everybody.
You need to just stop using Windows, and that's it.
The only Windows machine I use is the one my company makes me use, and I don't do anything personal on it. My personal computer sits next to it in my office, running Linux.
No, it is a poor pixel density when compared with a printed book, which should be the standard for judging any kind of display used for text.
At the sizes of 27" or 32", which are comfortable for working with a computer, 5k is the minimum resolution that is not too bad when compared with a book or with the acuity of typical human vision.
For a bigger monitor, a 4k resolution is perfectly fine for watching movies or for playing games, but it is not acceptable for working with text.
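For concreteness, the comparison can be put in numbers. A quick sketch, using the usual diagonal-pixels-per-inch definition of PPI; the ~300 DPI figure for laser-printed text is a common rule of thumb, not something stated above:

```javascript
// PPI = pixel count along the diagonal, divided by the diagonal size in inches.
function ppi(hPx, vPx, inches) {
  return Math.hypot(hPx, vPx) / inches;
}

console.log(Math.round(ppi(5120, 2880, 27))); // 5K at 27" -> 218
console.log(Math.round(ppi(3840, 2160, 32))); // 4K at 32" -> 138
// Both fall short of the ~300 DPI commonly cited for print,
// but 5K/27" gets much closer than 4K/32".
```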
I have a 42" LG OLED C3 as a monitor. I may be able to distinguish separate pixels when looking at a '.' or something like that (a stuck pixel appeared for a few weeks, which I could notice on a white background).
But the density is definitely enough for text at the viewing distance required for such a screen size. At least when using grayscale AA, because of the OLED subpixel...
AMD doesn't have it. I just confirmed by grepping through dmesg and journalctl -b; the only place it appears is in UPS driver notifications (unrelated).
I’ve recently compared WebP and AVIF with the reference encoders (and rav1e for lossy AVIF), and for similar quality, WebP is almost instant while AVIF takes more than 20 seconds (1MP image).
JXL is not yet widely supported, so I cannot really use it (videogame maps), but I hope its performance is similar to WebP with better quality, for the future.
You have to adjust the cpu-used (speed) parameter, not just quality, for AVIF. It can indeed be slow, but it should not be that slow, especially for a 1 MP image. The defaults use a more CPU-intensive setting than necessary, for some reason. I have modest infrastructure that generates 2 MP AVIFs in a hundred milliseconds or so.
I tested both WebP and AVIF with maximum CPU usage/effort. I have not tried the faster settings because I wanted the highest quality for small size, but for similar quality WebP blew AVIF out of the water.
I also have both compiled with -O3 and -march=znver2 in GCC (same for rav1e's RUSTFLAGS) through my Gentoo profile.
Maximum CPU effort between those two libraries is not really comparable, though. Quality is subjective, and it sounds like WebP worked best for you! Just saying, there is little benefit in using the max-effort settings for AVIF. That's like comparing the maximum-compression settings on zip vs. xz!
I know, that's why I used max CPU settings. But when processing map tiles with a final total compressed size of half a terabyte, where each tile is about 200 kB, taking 20 s per tile is prohibitively expensive.
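Back-of-envelope, using the figures in that comment (0.5 TB total, ~200 kB per tile, 20 s per tile) -- the arithmetic here is mine, not the poster's:

```javascript
const tiles = 0.5e12 / 200e3;     // total bytes / bytes per tile -> 2.5 million tiles
const encodeSeconds = tiles * 20; // at 20 s per tile
const days = encodeSeconds / 86400;

console.log(tiles);            // 2500000
console.log(Math.round(days)); // 579 -- well over a year of single-threaded encoding
```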
Yeah, that's understood to be your opinion; blandly repeating it adds little to the discussion.
It's simple. In its simplicity it left many features on the floor. I just can't connect with the idea that someone would need to be constantly on MDN in order to work with it. It's not so horrible that it defies logic.
It’s not simple, though. Simple would be something like an object wrapping YYYY-MM-DD, like a COBOL programmer in the 1950s would’ve used. Instead, people have written thousands of variations of the same bugs around the complexity that even basic usage forces you to internalize: the month number is zero-based and the year is 1900-based, while the day of the month is 1-based, following standard usage.
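Those offsets are real JavaScript Date behavior (the zero-based month and 1900-based year are inherited from C's struct tm); a quick illustration:

```javascript
const d = new Date(2024, 0, 15); // month 0 means January

console.log(d.getMonth());    // 0    (months run 0-11)
console.log(d.getDate());     // 15   (day of month is 1-based)
console.log(d.getFullYear()); // 2024
console.log(d.getYear());     // 124  (deprecated: years since 1900)
```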