I also got the same feeling from that; in fact, I would go as far as to say that nixpkgs' and the nix commands' integration with git works quite well and is not an issue.
So when the article says "Package managers keep falling for this. And it keeps not working out", I feel that's untrue.
The biggest issue I have with this is really the "flakes" integration, where the whole recipe folder is copied into the store (which doesn't happen with the non-flakes commands), but that's a tooling problem, not an intrinsic problem of using git.
"bug" can refer to many categories of problems, including logic errors. I've certainly seen uninitialized variables be a source of bugs. Stackoverflow for example is full of discussions about debugging problems caused by uninitialized variables, and the word "bug" is very often used in those contexts.
I think what they mean, and what I also think, is that the bug does not come from the *existence* of uninitialized variables. It comes from the *use* of uninitialized variables. Initializing the variables does not make the bug go away; at most it silences it. Making the program invalid instead (which is what UB fundamentally is) is far more helpful for making programs have fewer bugs. That the compiler still emits a program is a defect, although an unfixable one.
To my knowledge, C (and derivatives like C++) is the only common language where the question "Is this a program?" has false positives. It is certainly an interesting choice.
I mean that the bug is *not in* the uninitialized variable; the bug is in the program logic, e.g. one of the code paths doesn't initialize the variable.
So I see uninitialized variables as a good way to find such logic errors, and therefore the advice to always initialize variables is bad practice.
Of course, if you already have a good value to initialize the variable with, do it. But if you don't, it's better to leave it uninitialized.
Moreover, this will not cause safety issues in production builds, because you can use `-ftrivial-auto-var-init` to initialize automatic variables to e.g. zeroes (`-fhardened` will do this too).
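To make that concrete, here is a hypothetical sketch (function and values are my own): the actual defect is the forgotten branch, and zero-initializing `result` would only hide it, whereas leaving it uninitialized gives tools like `-Wmaybe-uninitialized` or MemorySanitizer a chance to point at the broken path.

```cpp
#include <cstdio>

// Hypothetical example: the real bug is the forgotten x == 0 case,
// not the uninitialized variable itself.
int classify(int x) {
    int result;             // intentionally left uninitialized
    if (x > 0)
        result = 1;
    else if (x < 0)
        result = -1;
    // missing else: the x == 0 path never assigns result
    return result;          // reading result here is the symptom, not the cause
}

int main() {
    // With e.g. `g++ -O2 -Wmaybe-uninitialized` the compiler can warn about
    // this; a hardened production build with `-ftrivial-auto-var-init=zero`
    // would make this print 0 instead of garbage, silencing but not fixing it.
    std::printf("%d\n", classify(0));
}
```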
This is indeed exactly correct. This alone is probably the most important reason for most people to use it, as I think most of the millions of C++ developers in the world (and yes, there are apparently millions) are not messing with compiler flags to get the checks that probably should be there by default anyway. The keyword `auto` gives you that.
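A tiny illustration of that point (my own example): `auto` simply cannot produce an uninitialized variable, because the type is deduced from the initializer you are forced to write.

```cpp
int main() {
    int a;          // legal C++: a is left uninitialized; reading it later is UB
    // auto b;      // does not compile: "declaration of 'auto b' has no initializer"
    auto c = 42;    // auto requires an initializer, so c can never be uninitialized
    (void)a; (void)c;
}
```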
If you really need a portable binary that uses shared libraries, I would recommend building it with Nix; you get all the dependencies, including the dynamic linker and glibc.
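Roughly, and assuming flakes are enabled (`hello` is just a placeholder for your own package), the idea is that the build output only references `/nix/store` paths, so the whole runtime closure can be shipped as one unit:

```sh
# Build the package with Nix ("hello" is only a placeholder here)
nix build nixpkgs#hello

# The binary links against /nix/store paths, not the host's /usr/lib
ldd ./result/bin/hello

# Copy the full closure (glibc, dynamic linker, ...) to another machine
nix copy --to ssh://other-host ./result
```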
Yeah, that's a common theme of excuses for both Rust and Nix. It's wrong, though, because almost anyone who can use a computer at all can learn the basics of Vim.
Seeing that flake.nix badge of complexity lets me know a project will be a nightmare to set up and will break every other week. It's usually right next to the Cargo.toml badge with 400 dependencies underneath.
To be honest I don't know what to say; you can use Nix in many ways, and you don't even need to know the language.
The easiest entry point is to just use it like a package manager: you install Nix (which is just a command...) and then you have the whole set of packages available, searchable from here: https://search.nixos.org/packages
nix-shell just downloads programs and adds them temporarily to your PATH.
I don't feel this is any harder than something like `sudo apt install -y xxxxx`, but it's definitely more robust and portable, and it doesn't require sudo.
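For example (the package names are just whatever you looked up on that search page):

```sh
# Install a tool into your user profile, no sudo required
nix-env -iA nixpkgs.ripgrep

# Or drop into a temporary shell where the programs are on PATH,
# without installing anything permanently
nix-shell -p jq ripgrep
```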
If at some point you want to learn the language in order to create configurations or package software, that may require reading a lot more documentation and examples, but for basic use I think it's pretty straightforward and no harder than any other package manager like aptitude, Homebrew or pacman.
Nix with flakes never randomly breaks; I still have projects from 3 or 4 years ago where I can run `nix build` and get them running. Yes, if you update the `flake.lock` this may introduce breakages, but that's expected if you're pinning `nixos-unstable` instead of a stable branch.
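For what it's worth, a minimal sketch of what pinning a stable branch looks like (the package is only a placeholder):

```nix
{
  # Pin a stable release branch instead of nixos-unstable,
  # so updating flake.lock stays a low-risk operation.
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";

  outputs = { self, nixpkgs }: {
    packages.x86_64-linux.default =
      nixpkgs.legacyPackages.x86_64-linux.hello;
  };
}
```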
I’m not sure what you mean by “a nightmare to set up”. You install Nix on your current OS with the determinate.systems installer, and you enter `nix run github:johndoe/project-containing-a-flake-dot-nix-file` to try out the project and have the full reproducible build taken care of by Nix.
Sure, installing packages the proper way requires a little more setup (Home Manager, most likely, plus understanding where the list of packages lives and which command to run to switch configurations), but it's about as trivial as the other complex tasks most of us hackers are capable of (like using `jq` or Vim).
Whoa! This is a revelation. I already loved Nix and used nix-shell extensively, but this is the missing piece: fully reproducible Python scripts without compromise.
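For anyone who hasn't seen it, the trick is presumably the nix-shell shebang pattern, something like this (the `requests` dependency is just an example):

```python
#!/usr/bin/env nix-shell
#!nix-shell -i python3 -p "python3.withPackages (ps: [ ps.requests ])"

# Nix fetches python3 and requests before the interpreter even starts,
# so the script behaves the same on any machine with Nix installed.
import requests

print(requests.get("https://example.com").status_code)
```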
Or install direnv and put your dependencies into a shell.nix, set up the bash hook according to the manual, and create a .envrc with the content "use nix". Then type "direnv allow".
Then you can, for example, use other people's Python scripts without modifying their shebangs.
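A minimal version of that setup might look like this (the Python dependency is just an example):

```nix
# shell.nix
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  packages = [
    (pkgs.python3.withPackages (ps: [ ps.requests ]))
  ];
}
```

Then a one-line `.envrc` containing `use nix`, followed by `direnv allow`, makes direnv load that environment automatically whenever you cd into the directory.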
It is if the script is written badly, gets truncated while it's being downloaded, and fails to account for this possibility.
Look at Tailscale's installation script: they wrapped everything in a function that is called on the last line, so you either download and execute every line, or it does nothing.
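The pattern is roughly this (a simplified sketch, not Tailscale's actual script):

```sh
#!/bin/sh
# Everything lives inside main(); a partially downloaded script only
# defines an incomplete function, and the shell errors out instead of
# executing half of the steps.
main() {
    set -e
    detect_distro
    add_repo
    install_package
}

detect_distro()   { echo "detecting distro..."; }
add_repo()        { echo "adding repository..."; }
install_package() { echo "installing package..."; }

# Only this final line triggers any work. If the download is truncated
# before it, nothing has been executed at all.
main "$@"
```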
This "what if it gets truncated in the middle of the download, and the half-run script does something really bad" objection gets brought up every time "curl | bash" is mentioned, and it always feels like "what if a cosmic ray flips a bit in your memory which makes the kernel erase your hard drive". Like, yes, it could happen in the same way getting killed by a falling asteroid could happen, but I'm not losing sleep over it.
Just living far from major datacenters is enough. I get truncated downloads pretty regularly, maybe a couple times a month or so. The network isn't really all that reliable when you consistently use it across the globe.
It usually happens on large files, though, due to simple statistics, but given enough users it's not hard to imagine it happening with a small script...
That's quite uncommon. Typically your distribution checks that the downloaded source/binary has the correct checksum, and an experienced maintainer has checked the (sandboxed) installation. Here, someone puts an arbitrary script online that runs with your user's permissions, and you hope that the web page isn't hijacked and that some arbitrary dev knows how to write bash scripts.
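That gap can be narrowed a bit by downloading first and verifying a checksum published out of band before running anything; the URL and hash here are placeholders:

```sh
# Download, verify against a published checksum, and only then execute.
curl -fsSLO https://example.com/install.sh
echo "<expected-sha256>  install.sh" | sha256sum -c - && sh ./install.sh
```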