This could be stated much more succinctly using Jobs to be Done (which is referenced in the first few paragraphs):

Your customers don't want to do stuff with AI.

They want to do stuff faster, better, cheaper, and more easily. (JtbD claims you need to be at least 15% better or 15% cheaper than the competition -- which, if we're talking "AI", means the classical ML or manual human alternative.)

If the LLM you're trying to package can't actually solve the problem, no one will buy it, because _using AI_ obviously isn't anyone's _job-to-be-done_.


It could, but under the current system, candidates who are affiliated with major parties (i.e., essentially everyone who ends up winning an election) already need to win the support of their party, and that process is generally opaque and largely controlled by insiders who are often less moderate.

Also, having viable third-party choices puts more pressure on the larger parties to field more widely palatable candidates, or risk losing their majorities.


I just think that, seeing the current gerrymandered districts where I live and the crazy people who come out of the parties, I would rather have voters choose individuals than parties.

If someone doesn’t toe the party line, the party would immediately replace them the next year, and this would give parties even more power.


I do not understand what this could mean.

There are clear formalizations of concepts like Consistency in distributed systems, and there are algorithms that correctly achieve Consensus.

What does it mean to formalize the "Single Source of Truth" principle, which is a guiding principle and not a predictive law?


Here ‘formalize SSOT’ means: treat the codebase as an encoding system with multiple places that can hold the same structural fact (class shape, signature, etc.). Define DOF (degrees of freedom) as the count of independent places that can disagree; coherence means no disagreement. Then prove:

- Only DOF=1 guarantees coherence; DOF>1 always leaves truth indeterminate, so any oracle that picks the ‘real’ value is arbitrary.

- For structural facts, DOF=1 is achievable iff the language provides definition‑time hooks plus introspectable derivation; without both (e.g., Java/Rust/Go/TS) you can’t enforce SSOT no matter how disciplined you are.

It’s like turning ‘consistency’ in distributed systems from a principle into a property with necessary/sufficient conditions and an impossibility result. SSOT isn’t a predictive law; it’s an epistemic constraint: if you want coherence, the math forces a single independent source. And if the same fact lives in both the backend and the UI, the ‘truth’ is effectively in the developer’s head, i.e., an external oracle. Any system with more than one independent encoding leaves truth indeterminate; coherence only comes when the code collapses to one independent source (DOF=1).
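
To make the DOF framing concrete, here’s a loose Python sketch (my own illustration, not the author’s formalism): the class shape is stated exactly once, and everything else that needs it is derived by introspection, so there is only one independent place that can disagree (DOF=1).

    # Hypothetical example: the shape of User is encoded in one place only.
    from dataclasses import dataclass, fields

    @dataclass
    class User:
        id: int
        name: str
        email: str

    # Derived, not re-stated: introspection reads the single authoritative encoding.
    def schema(cls) -> dict:
        return {f.name: f.type.__name__ for f in fields(cls)}

    print(schema(User))  # {'id': 'int', 'name': 'str', 'email': 'str'}

    # The DOF>1 version would hand-write a second copy of the same fact, e.g.
    #   USER_SCHEMA = {"id": "int", "name": "str"}   # silently missing 'email'
    # and nothing in the language forces the two encodings to agree.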


They might not _notice_, but that doesn't mean it's not affecting their ability to use their computer smoothly.

With computers such a huge part of almost everyone's lives now, it's a travesty for one of the largest companies in the world to inflict something so subpar on so many old-style


My favorite concrete example of this is the textually beautiful "xd" crossword puzzle format.

Interesting video story: https://youtu.be/9aHfK8EUIzg (2016)

Data site: https://xd.saul.pw/data


Using words written by other people without disclosure has always been frowned upon. It's called plagiarism.

Plagiarism is bad for a lot of reasons, all of which also apply to the undisclosed use of generative AI.


Livestock emits between 10% and 20% of global greenhouse gases (in CO2 equivalent, 100-year GWP) [1].

In contrast, all data centers (not just AI) currently use less than 1.5% of all electricity, making up less than 0.3% of global emissions [2]. Although recent increases in data center electricity usage are lamentable, even in the short-term future much of this can and, more importantly, _will_ be low-carbon energy, and the ratio should continue to improve with time.

A 1% reduction in livestock emissions is therefore about the same as a 50% reduction in data center emissions.

[1]: https://thebreakthrough.org/issues/food-agriculture-environm...

[2]: https://www.carbon-direct.com/insights/understanding-the-car...
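
A quick back-of-the-envelope check of that last comparison (my own arithmetic, assuming livestock at roughly 15% and data centers at roughly 0.3% of global emissions, per the links above):

    # Rough midpoint figures, assumed for illustration only.
    livestock_share = 0.15        # ~15% of global GHG emissions (CO2e)
    data_center_share = 0.003     # ~0.3% of global GHG emissions

    cut_livestock_1pct = 0.01 * livestock_share        # 0.15% of global emissions
    cut_data_centers_50pct = 0.50 * data_center_share  # 0.15% of global emissions

    print(round(cut_livestock_1pct, 6), round(cut_data_centers_50pct, 6))  # 0.0015 0.0015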


The numbers are what the numbers are, not what you want them to be.

Minimizing cow farts is simply a better focus.


I never said anything about electricity consumption in my original post, my disingenuous friend.


Are you making up a guy to be mad at?


It's only game, why you heff to be mad?


By replacing (some) farmed meat with farmed fungi protein.

Although it's theoretically possible for a disease to infect both fungi and animals, because the biology is so different, the risk is greatly, greatly reduced.

In addition, it may be possible to reduce the use of treatments such as antibiotics, whose current mass application to farmed animals could directly lead to the development of antibiotic resistance in diseases that affect both humans and animals.


Plus, chucking the contents of a few biotanks in case of infection is a hell of a lot better than having to kill and waste millions of birds.

I mean, industrial slaughter isn't a pretty process even in the better plants (which most aren't), but when they come to wipe out the barn, they're not putting animal welfare first.


Ah, good point.


Formal verification is explicitly NOT testing.

It is a method where a computer verifies a proof that the program adheres to its specification for _all_ inputs (subject to whatever limitations the particular method has).

Types are the simplest kind of formal verification, and sufficiently advanced dependent type systems can be used to prove that programs obey arbitrarily complex specifications. However, this can be extremely laborious and requires significantly different skills than normal programming, so it is very rarely done in industry.
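
For a feel of what "verified for all inputs" looks like, here is a tiny toy sketch in Lean 4 (my own example, not tied to anything in the thread): the proof is machine-checked for every possible list, not just the inputs a test suite happens to sample.

    -- A hand-rolled reverse and a proof that it preserves length.
    def myReverse {α : Type} : List α → List α
      | []      => []
      | x :: xs => myReverse xs ++ [x]

    -- The specification holds for *all* lists xs, not just tested ones.
    theorem myReverse_preserves_length {α : Type} (xs : List α) :
        (myReverse xs).length = xs.length := by
      induction xs with
      | nil => simp [myReverse]
      | cons x xs ih => simp [myReverse, ih]

If the definition or the proof is wrong, the checker rejects it; there is no input you could have forgotten to test.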


I think this discussion dismisses the "physics" of writing code, which rewards laziness.

Effects make _the right thing to do_ (proper sandboxing, testability, assertions, ...) the _easiest_ thing to do.

Build scripts aren't sandboxed because sandboxing bash functions is nigh impossible -- not because people don't want to.

The discussion on assertions is especially confusing, because that is exactly what effect systems excel at. The effect of an assertion would be Assert, and you can choose to handle it however you want at a higher level. If you want to crash, handle Assert in main by Exit(1)ing. If you want to reject the request but keep the server alive, handle it with SetResponse(500)!; CloseRequest()!. If you want to ignore it and trundle on, resume at the assertion's continuation.
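
As a rough illustration of that last point (my own Python simulation, not the article's effect system: generators stand in for effectful computations), the caller picks the handler and therefore the policy -- crash, fail just this request, or resume past the assertion.

    # Hypothetical sketch: simulating an Assert effect with generators.
    class Assert:
        def __init__(self, condition: bool, message: str):
            self.condition = condition
            self.message = message

    def handle_request(user_id: int):
        """An 'effectful' computation: it yields Assert effects instead of raising."""
        yield Assert(user_id >= 0, "user_id must be non-negative")
        return {"status": 200, "body": f"profile for {user_id}"}

    def run_crashing(computation):
        """Handler 1: a failed assertion is fatal, like handling Assert with Exit(1)."""
        try:
            effect = next(computation)
            while True:
                if isinstance(effect, Assert) and not effect.condition:
                    raise SystemExit(1)
                effect = computation.send(None)  # resume at the continuation
        except StopIteration as done:
            return done.value

    def run_per_request(computation):
        """Handler 2: a failed assertion becomes a 500 for this request only."""
        try:
            effect = next(computation)
            while True:
                if isinstance(effect, Assert) and not effect.condition:
                    return {"status": 500, "body": effect.message}
                effect = computation.send(None)  # resume at the continuation
        except StopIteration as done:
            return done.value

    print(run_per_request(handle_request(-3)))  # -> {'status': 500, ...}
    print(run_crashing(handle_request(7)))      # -> {'status': 200, ...}

The computation itself never decides what an assertion failure means; the handler at the edge does.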

