> 4. Ends up in test environment for, what, a month.. nothing using getaddrinfo from glibc is being used to test this environment or anyone noticed that it was broken
"Testing environment" sounds to me like a real network real user devices are used with (like the network used inside CloudFlare offices). That's what I would do if I was developing a DNS server anyway, other than unit tests (which obviously wouldn't catch this unless they were explicitly written for this case) and maybe integration/end-to-end tests, which might be running in Alpine Linux containers and as such using musl. If that's indeed the case, I can easily imagine how noone noticed anything was broken. First look at this line:
> Most DNS clients don’t have this issue. For example, systemd-resolved first parses the records into an ordered set:
Now think about what real end-user devices run: Windows/macOS/iOS obviously don't use glibc, and Android has its own C library even though it's Linux-based, so they all presumably fall under "Most DNS clients don't have this issue."
That leaves GNU/Linux, where we could reasonably expect most software to use glibc for resolving queries, so presumably anyone running Linux on their laptop would catch this, right? Except most distributions have switched to systemd-resolved (the most notable exception is Debian, which not many people run on desktops/laptops). systemd-resolved is a local caching DNS resolver, so it acts as a middleman between glibc software and the DNS server configured on the network: it resolves the 1.1.1.1 queries correctly itself and then returns results from its cache, ordered by its own ordering algorithm.
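To make that concrete, here's a minimal sketch (assuming a typical systemd-resolved desktop, where /etc/resolv.conf points at the 127.0.0.53 stub) of how you'd see that what a glibc application gets back from getaddrinfo has already passed through the local resolver rather than coming straight from the upstream server:

    # Minimal sketch, assuming a typical systemd-resolved desktop setup.
    import socket

    # What getaddrinfo() returns (via glibc on a CPython/Linux box): addresses
    # that have already been filtered, cached and ordered by the local resolver.
    for *_, sockaddr in socket.getaddrinfo("example.com", 443, proto=socket.IPPROTO_TCP):
        print(sockaddr[0])

    # Which nameserver the C library actually talks to; with systemd-resolved
    # this is the local stub, not the DNS server configured on the network.
    with open("/etc/resolv.conf") as f:
        for line in f:
            if line.startswith("nameserver"):
                print(line.strip())  # typically "nameserver 127.0.0.53"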
For the output of Cloudflare’s DNS server, which serves a huge chunk of the Internet, they absolutely should have a comprehensive byte-by-byte test suite, especially for one of the most common query/result patterns.
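Something like a golden-file test would do: capture a known-good raw response once and diff every subsequent response against it byte for byte. A rough sketch, assuming a hypothetical local test instance of the resolver on 127.0.0.1:5353 and a hypothetical stored reference response:

    import socket
    import struct

    def build_query(qname: str, qtype: int = 1, txid: int = 0x1234) -> bytes:
        """Build a minimal DNS query: header (RD set) plus one IN-class question."""
        header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
        question = b"".join(
            bytes([len(label)]) + label.encode() for label in qname.split(".")
        ) + b"\x00" + struct.pack(">HH", qtype, 1)
        return header + question

    def test_a_query_matches_golden():
        query = build_query("example.com")
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(2.0)
            s.sendto(query, ("127.0.0.1", 5353))  # hypothetical test instance
            response, _ = s.recvfrom(4096)
        with open("tests/golden/a_query.bin", "rb") as f:  # hypothetical golden file
            golden = f.read()
        # Compare byte for byte, ignoring only the echoed transaction ID.
        assert response[2:] == golden[2:]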
I had been using pyenv for a decade before uv, and it wasn't a "major pain" either. But compared to uv it was infinitely more complex, because uv manages Python versions seamlessly.
If the Python version changes in a uv-managed project, you don't have to do any extra step; just run "uv sync" as you normally would to install updated dependencies. uv automatically detects that it needs a new Python, downloads it, re-creates the virtual environment with it, and installs the dependencies, all in one command.
And since that's the command everyone runs whenever a dependency update is needed, no dev is going to panic about why the app isn't working after merging new code that requires a newer Python because they missed the Python-update memo.
This picture does show differently in Chrome and Safari, but if I analyze it using the methods you did, I arrive at a different result: I don't see an iHDR chunk there; instead I see a gAMA chunk, and if I remove it with pngcrush the image shows normally in Chrome.
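For anyone who wants to check for themselves, this is roughly how I'd walk the chunk list (a quick sketch; "image.png" is just a placeholder filename):

    import struct

    def list_png_chunks(path: str) -> None:
        """Print each chunk's type and data length, walking the PNG structure."""
        with open(path, "rb") as f:
            assert f.read(8) == b"\x89PNG\r\n\x1a\n", "not a PNG file"
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                length, ctype = struct.unpack(">I4s", header)
                print(ctype.decode("ascii"), length)
                f.seek(length + 4, 1)  # skip chunk data plus CRC
                if ctype == b"IEND":
                    break

    list_png_chunks("image.png")  # placeholder path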
There have been ads in the App Store for a long time. The upcoming change is that they will also appear further down in search results; right now they only show at the top.
There are GPUs from three different generations in that list... the Quadro 6000 is an old Fermi card from 2010, the Quadro RTX 6000 is Turing from 2018, and the RTX 6000 Ada is Ada Lovelace from 2022...
Oh, and there's also the RTX PRO 6000 Blackwell, which is Blackwell from 2025...
I gave up understanding GPU names a long time ago. Now I just hope the efficient market hypothesis is at least moderately effective and as long as I buy from a reputable retailer the price is at least mostly reflective of performance.
They've hyperoptimized all these marketing buzzwords to the point that I'm basically forced into the moral equivalent of buying GPUs by the pound, because I have no idea what these marketers are trying to tell me anymore. The only stat I really pay attention to is VRAM size.
(If you are one of those marketers, this really ought to give you something to think about. Unless obfuscation is the goal, which I definitely cannot exclude based on your actions.)
Ubuntu has alphabetical order too, but that's only useful if you want to know whether "noble" is newer than "jammy", and useless if you know you have 24.04 but have no idea what its codename is.
Android also sucks for developers because there are the public-facing version numbers and then the API levels, which are different and don't always scale linearly (sometimes there's something like "Android 8.1" or "Android 12L" with its own newer API level). As a developer you always deal with the API levels (you specify a minimum API level in your code, not a minimum "OS version"), and then have to map that back to the version numbers users and managers know when you're raising the minimum requirements...
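A few well-known entries of that mapping, just to illustrate the mismatch (an illustrative subset, not the full table, which lives in the Android documentation):

    # Illustrative subset of the API-level <-> marketing-version mapping.
    API_LEVEL_TO_VERSION = {
        26: "Android 8.0",
        27: "Android 8.1",  # a point release, yet its own API level
        31: "Android 12",
        32: "Android 12L",  # the "L" release, again its own API level
        33: "Android 13",
        34: "Android 14",
    }

    def human_readable(min_sdk: int) -> str:
        """Map a minSdk/API level back to the version name users and managers know."""
        return API_LEVEL_TO_VERSION.get(min_sdk, f"API level {min_sdk}")

    print(human_readable(27))  # -> Android 8.1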
> Ubuntu has alphabetical order too, but that's only useful if you want to know if "noble" is newer than "jammy"
Well, it was until they looped.
Xenial Xerus is older than Questing Quokka. As someone out of the Ubuntu loop for a very long time, I wouldn't know what either of those mean anyway and would have guessed the age wrong.
> Development started in the first months of 1995, and the game was released in North America and Australia on December 9, 1995.
This feels absolutely insane by today's standards. And not just in the gaming world. Somehow, with all the advancements in libraries, frameworks, coding tools, and even AI these days, development speeds seem so much slower, and it seems like too much time is spent on eye candy, monetization, and dark patterns, and too little on the things people actually like to see - that's what made us buy games and software in the old days.
(But also in the gaming world, especially over the past few years, when almost no game studio develops its own engine: assets don't look more detailed than what was used 3 years ago, stories seem hastily written, and it feels like 80% of developers' time is spent on making cosmetic items for purchase which often cost more than the base game.)
We also somehow spend lots of time researching UX and developing tutorials (remember when software had the "?" button next to the close button and no software "tutorials" were needed?), and yet all the games and software are harder to learn than what we had in the 90s and 00s.
What you are looking at is corporate environments; the studios of the past (e.g. Westwood and Blizzard) had small headcounts, and people were direct decision makers.
> StarCraft was originally envisioned as a game with modest goals that could fit into a one-year development cycle so that it could be released for Christmas, 1996
> Warcraft II had only six core programmers and two support programmers; that was too few for the larger scope of StarCraft,
No boardrooms of PMs, directors, VPs, and execs chiming in on every decision, which led to fast turnarounds.
Not at all crazy. You could very easily get a game with the same art style, features, and number of missions done now in a month, but people want much more: QOL features, multiple platforms, high-quality graphics. $50 (the average game price back then) is $105 now - you can't sell any game for that price nowadays, and a game at WC2's level of features wouldn't be accepted by customers for more than $5. A full-price $59.99 game now needs a billion different side quests, character customisation, full VA, multiplayer servers, an orchestral score, etc., or people just won't buy it.
> is $105 now - you can't sell any game for that price nowadays
But you don't need to. Just sell it on Steam for $39.99 or whatever and make many, many more sales than in '95. And as a bonus, you would still receive some sales years later.
Sure, you won't get into the Top 100 and won't earn bazillions...
Crazy how much bigger modern games are … I wonder how many total pixels were shipped in the art assets of Warcraft 2 vs. StarCraft 2? My guess is at least 4 orders of magnitude higher for SC2
> it seems like too much time is spent on eye candy, monetization and dark patterns and too few times on things people actually like to see
Not necessarily, but you need to look at the indie (PC) game space instead of AAA and mobile.
Top-level game development in the 90's is comparable to indie games today, although granted, in the 90's they made huge technological leaps and the developers needed a lot more in-depth knowledge. But I can guarantee that someone could build Warcraft 2 today within a year. Hell, I'm sure you could get the basics set up in a weekend.
That said, even indie games suffer a bit from scope creep, and few developers actually limit themselves by saying "we release within a year and that's it". If a game is successful, continued development is beneficial. And with Kickstarter they can get money upfront (like what a publisher would pay initially), and with early access they can start making revenue to fund continued development. Which is a self-reinforcing cycle - as long as they publish updates and new features, people will keep playing and buying the game. Some games (like Factorio) end up in early access and continuous development for 10 years.
Yeah, 9 patches for the original game, then the Battle.net Edition in 1999 (which added support for TCP/IP networking and Battle.net matchmaking), and at least one downloadable patch for that.
Even smaller games now have ludicrously long development cycles as developers have learned they can exploit mentally challenged gamers by selling them "early access" (unfinished games).
Early access sometimes means unfinished, but in other cases it's fine - Factorio is an example: it had a fully fleshed-out game in early access, and then they spent another 5+ years adding features and fixes and the like. During that time, a lively modding community sprang up and added loads of playable content to the game.
> HBO Max is coming to EU in Jan and UK sometime next year finally
This is a very misleading sentence. HBO Max has been available in 14 of 27 EU countries since 2022, and by now it's available in 22 of 27. Four of the remaining ones are covered by Sky, with which they signed an exclusive distribution agreement (valid until 2025) back in 2019 - even before HBO Max launched in the USA.
I would expect most datacenters to use their own local recursive caching DNS servers instead of relying on 1.1.1.1 to minimize latency.