Hacker News | jerf's comments

Yes, it absolutely can.

I'm sure the various high-end intelligence agencies have a much better view on this than the public does. There are all kinds of ways of cross-checking the numbers, all by doing things they'd be doing in the normal course of events anyway.

A normal person could probably do a decent job with an AI that isn't too biased in the direction of "trust gov numbers above all else", tracking down and correlating statistics that are too obscure and too difficult to fake. (Example: using statistical population sampling methodology on some popular internet service or something.) The main problem there being that no matter what they do and how careful they are, they'd never be able to convince anyone of their numbers.
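
For a concrete, entirely hypothetical sketch of the kind of cross-check I mean: a capture-recapture estimate over two independent samples of accounts on some big service bounds a population figure without trusting anyone's official total. All the numbers here are made up for illustration:

    # Lincoln-Petersen capture-recapture estimate (illustrative numbers only).
    # Sample a service twice, independently; the overlap bounds the population.
    n1 = 50_000        # accounts seen in the first sample
    n2 = 60_000        # accounts seen in the second sample
    overlap = 12_000   # accounts that showed up in both samples
    estimated_population = n1 * n2 / overlap
    print(f"~{estimated_population:,.0f} accounts")   # ~250,000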


I dunno about everyone else but when I learn more about what a model is and is not useful for, my subjective experience improves, not degrades.

Not when the product is marketed as a panacea.

Dying on the exact same frame, or just generally in the same spot?

In the case of the latter, my first thought would be thermals. Different video codecs have significantly different decoding costs, and may also stress different parts of your system. You could check for that by starting the same video partway through and seeing whether it dies after the same amount of playback time, or by jumping to just before the point where it dies and seeing if it plays through.
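
If you want to check the thermal theory directly, something like this rough sketch can log temperatures while the video plays (it assumes Linux plus the psutil package; sensors_temperatures() isn't available on every platform):

    # Rough thermal logger: run in a terminal while the video plays.
    # Assumes Linux + psutil; psutil.sensors_temperatures() is Linux/BSD only.
    import time
    import psutil

    while True:
        temps = psutil.sensors_temperatures()
        readings = [
            f"{name}:{sensor.current:.0f}C"
            for name, sensors in temps.items()
            for sensor in sensors
        ]
        print(time.strftime("%H:%M:%S"), " ".join(readings))
        time.sleep(5)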

If by "downloaded" you mean The High Seas, those who provision the high seas are often on the cutting edge of using codecs with every last feature turned on to make the videos smaller to squeeze every last bit out of the encodings that they can, which can make them unusually expensive to decode. Or so I've heard.


The major problem with sticking an Android tablet onto exercise equipment is the difference in life spans. Android tablets are generally going to last you 4-5 years. Weight equipment should be able to last decades. There is some simple & cheap hardware that can last decades, but it is legitimately harder to program.

Even worse was an article some months back about Android tablets hooked to heating & cooling systems expected to last 20 years. There's no way those tablets are making it that long at scale.


> Weight equipment should be able to last decades.

"should" or "actually can"? Do you have references to show that's the actual lifespan of the equipment, mechanically?


Weight training equipment lasts decades all the time. It's just big piles of metal, it's not hard to get right.

What actually prompted the engineering-CYA "should" is that if the Android tablet is controlling some sort of robotic system for selecting weight sizes, that system might have an expected life span on par with the tablet's, being a physical thing moving some pins or something around in a potentially hostile user environment. That'll break long before anything else would.


So you don't have a reference.

I'm just going to ignore this.


If you are the sort of person who needs a reference for "weight equipment lasts a long time", feel free. Whatever guilt and shame you think I should be feeling over such a claim, believe me, I don't. I'm more in the "feeling pity for you" department here; I've been around enough to know what kind of person types messages like this.

The golden ratio is very mathematically interesting and shows up in many places. Not as ubiquitous as pi or e, but it gets around.

I find the aesthetic arguments for it very overrated, though. A clear case of: a guy says a thing, some other people say it too, and before you know it it's "received wisdom" even though it isn't particularly true. Many examples of how important the "golden ratio" supposedly is are simply wrong; it's not actually the golden ratio when you measure it, or it's nowhere near as important as presented. You can also squeeze more things into being a "golden ratio" if you are willing to let it be off by, say, 15%. That creates an awfully wide band.
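
Just to put numbers on how wide that band is (quick back-of-the-envelope, with candidate ratios I picked for illustration):

    # How wide a +/-15% band around the golden ratio actually is.
    phi = (1 + 5 ** 0.5) / 2                   # ~1.618
    low, high = 0.85 * phi, 1.15 * phi
    print(f"band: {low:.3f} to {high:.3f}")    # band: 1.375 to 1.861

    # Common ratios that land inside that band.
    candidates = {"3:2": 1.5, "16:10": 1.6, "16:9": 16 / 9, "7:4": 1.75}
    print([name for name, r in candidates.items() if low <= r <= high])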

Personally I think it's more a matter of: there is a range of useful and aesthetic ratios, and the "golden ratio" happens to fall in that range, but calling it the "optimum" just because it's the golden ratio is often more an imposition on the data than something that comes from it.

It definitely does show up in nature, though. There are solid mathematical and engineering reasons why the golden angle derived from it is the optimal angle for spacing leaves and other growth patterns, for instance. But there are other cases where people "find" it in nature where it clearly isn't there... one of my favorites is the sheer number of diagrams of the Nautilus shell, which allegedly follows the "golden ratio", where the diagram itself disproves the claim by being nowhere near a good fit to the shell.
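
For the leaf case, the usual argument (sketched here with Vogel's floret model, which is my choice of illustration) is that rotating by the golden angle, 360/phi^2 or about 137.5 degrees, never settles into a repeating pattern, so successive leaves or seeds shade each other as little as possible:

    import math

    # The golden angle: the turn between successive leaves/seeds in many plants.
    phi = (1 + math.sqrt(5)) / 2
    golden_angle = 360 / phi ** 2              # ~137.508 degrees
    print(f"golden angle: {golden_angle:.3f} degrees")

    # Vogel's model: seed n sits at angle n*golden_angle, radius ~sqrt(n).
    # Because the angle is "maximally irrational", the points never line up
    # into spokes, which is the packing/lighting argument in a nutshell.
    points = [
        (math.sqrt(n) * math.cos(math.radians(n * golden_angle)),
         math.sqrt(n) * math.sin(math.radians(n * golden_angle)))
        for n in range(1, 200)
    ]
    print(points[:2])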


This video helped me solidify my opinion that the Golden Ratio is no more attractive or appealing than any other fraction or ratio.

https://www.youtube.com/watch?v=AofrZFwxt2Y


I've never seen that one, but yeah, that is very definitely what I was getting at. Very in line with my thinking. Thank you for the expansion.

It's counterintuitive, but something becoming easier doesn't necessarily mean it becomes cheap. Programming has arguably been the easiest engineering discipline to break into by sheer force of will for the past 20+ years, and the pay scales you see have already adapted to that reality.

Empowering people to do 10 times as much as they could before means they hit 100 times the roadblocks. Again, in a lot of ways we've already lived in that reality for the past many years. On a task-by-task basis programming today is already a lot easier than it was 20 years ago, and we just grew our desires and the amount of controls and process we apply. Problems arise faster than solutions. Growing our velocity means we're going to hit a lot more problems.

I'm not saying you're wrong, so much as saying it's not the whole story or the only possibility. A lot of people today are kept out of programming just because they don't want to do that much on a computer all day, for instance. That isn't going to change. There are still going to be skills involved in being better than other people at getting the computers to do what you want.

Also, on a long-term basis we may find that while we can produce entry-level coders who are basically just proxies to the AI by the bucketful, it may become very difficult to advance in skill beyond that, and those who are already over the hurdle of having been forced to learn the hard way may end up with a very difficult-to-overcome moat around their skills, especially if the AIs plateau for any period of time. I am concerned that we are pulling up the ladder in a way the ladder has never been pulled up before.


The Great Lakes have a management principle that is basically "You can use the water of the Great Lakes by permission as long as the water remains in the watershed." And permission is not automatic either.

The reason for that, to a large degree, is that the Great Lakes area looked over at the Southwest, which wasn't even as bad at the time as it is now, did some math, and worked out that if the Great Lakes tried to supply the Southwest it would cause a noticeable drop in the water level. I'm sure the drop would be even bigger now.

The problem is, the Great Lakes aren't just some big lakes with juicy fresh water that can be spent as desired. They are also international shipping lanes. They make Detroit, Chicago, and a whole bunch of other cities and places de facto ocean ports. Ocean ports are very, very valuable. There are also numerous other port facilities all along the Great Lakes, often relatively in the middle of nowhere but doing something economically significant. This is all maintained by very large and continual dredging operations to keep the lanes open. Dropping the water levels would destroy these ports and make the dredging operations go from expensive to impossible.

So, getting large quantities of water out of the Great Lakes to go somewhere isn't just a matter of "the people who control it don't want to do that", which is still true and a big obstacle on its own. The Southwest, in asking for that water, is also asking multiple major international ports to just stop being major international ports. That's not going to happen.


There's an even bigger problem if you're talking about the Southwest in general: huge parts of it are thousands of feet above the Great Lakes. The energy costs of moving water horizontally are probably manageable; pumping millions of acre-feet 5,000 feet vertically is almost certainly not (no matter what energy source you suggest using for it).
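
Back of the envelope on just the vertical part (my numbers, ignoring friction, pump losses, and evaporation, all of which make it worse):

    # Energy just to lift water 5,000 ft, ignoring friction and pump losses.
    ACRE_FOOT_M3 = 1233.48             # cubic meters per acre-foot
    RHO = 1000.0                       # kg per cubic meter of water
    G = 9.81                           # m/s^2
    LIFT_M = 5000 * 0.3048             # 5,000 ft in meters

    joules_per_af = ACRE_FOOT_M3 * RHO * G * LIFT_M
    kwh_per_af = joules_per_af / 3.6e6
    print(f"{kwh_per_af:,.0f} kWh per acre-foot")              # ~5,100 kWh

    # A million acre-feet per year -- a fraction of what the Southwest
    # actually uses -- is already on the order of terawatt-hours of lift.
    print(f"{kwh_per_af * 1e6 / 1e9:.1f} TWh per million acre-feet")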

"why go to the shed"

A good question but there's a good answer: Debugged and tested code.

And by that, I mean the FULL spectrum of debugging and testing. Not just unit tests, not even just integration tests, but, is there a user that found this useful? At all? How many users? How many use cases? How hard has it been subjected to the blows of the real world?

As AI makes some of the other issues less important, the ones that remain become more important. It is completely impossible to ask an LLM to produce a code base that has been used by millions of people for five years. Such things will still have value.

The idea that the near future is an AI-powered wonderland of everyone getting custom bespoke code that does exactly what they want and everything is peachy overlooks this problem. Even a (weakly) superhuman AI can't necessarily anticipate what the real world may do to a code base. Even if I can get an AI to make a bespoke photo editor, someone else's AI photo editor that has seen millions of person-years of usage is going to have advantages over my custom one that was just born.

Of course not all code is like this. There is a lot of low-consequence, one-off code, with all the properties we're familiar with on that front: there are no security issues because only I will run this, bugs are of no consequence because it's only ever going to be run across this exact data set that never exposes them (e.g., the vast, vast array of bash scripts that will technically do something wrong with spaces in filenames but run just fine because there aren't any). LLMs are great for that and unquestionably will get better.

However there will still be great value in software that has been tested from top to bottom, for suitability, for solving the problem, not just raw basic unit tests but for surviving contact with the real world for millions/billions/trillions of hours. In fact the value of this may even go up in a world suddenly oversupplied with the little stuff. You can get a custom hammer but you can't get a custom hammer that has been tested in the fire of extensive real-world use, by definition.


Isn't SQLite a de facto standard? Seems like it to me. If I want an embedded SQL engine, it is the "nobody got fired for selecting" choice. A competitor needs to offer something very compelling to unseat it.

I mean as in: Most web stacks do not default to sqlite over MySQL or postgres. Why not? Best default for most users, apparently.

I think in the past it was more obvious. Rails switched to SQLite as the default somewhat recently.

Yeah, that's the one prominent example but, like you said, also just rather recently. Since "the network is slow, duh" has always been true, I wonder why.

My guess would be that performance improvements (mostly hardware from Moore's law and the proliferation of SSDs, but also SQLite itself) have led to far fewer websites needing to run on more than one computer, and most are fine on a $5/month VPS.

And stuff like https://litestream.io/ or SQLite adding STRICT mode.
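
For what it's worth, STRICT is just a keyword on the table definition; here's a rough sketch via Python's sqlite3 module, assuming the underlying SQLite library is 3.37+ (older ones will reject the CREATE TABLE):

    import sqlite3

    # STRICT tables (SQLite 3.37+) reject values that don't match the declared
    # type, instead of silently storing them the way ordinary tables do.
    conn = sqlite3.connect(":memory:")
    print("SQLite library version:", sqlite3.sqlite_version)

    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, age INTEGER) STRICT")
    conn.execute("INSERT INTO users (id, age) VALUES (1, 42)")         # fine
    try:
        conn.execute("INSERT INTO users (id, age) VALUES (2, 'old')")  # rejected
    except sqlite3.Error as e:
        print("rejected by STRICT:", e)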



So what is the solution? Is the author demanding that people work for them for free to do the sustainability work for them? Because that sure sounds like the only way to "resolve" the complaint.

"start treating it as what it often is: a refusal to do the harder social work in #FOSS"

Your ending is missing something... "a refusal to do the harder social work that I want you to do in #FOSS".

But I didn't promise that. Nobody promised that. FOSS is an unparalleled gift of free work, and not a single line of it has formed an obligation on my part to help anyone who wants to come along and make it do something different. You are welcome to do that, but I have no obligation on any level to come along and help you "sustain" your own work. No legal obligation, no moral obligation, no community obligation, no reciprocal obligation, no Kantian imperative obligation, no obligation whatsoever. If anything, you owe them, not the other way around; any other read of the ethical situation is utterly absurd.

You want "more social work" done, you feel free to do it. Don't be shocked when I'm not interested in helping.

This is just a demand for more free work from people who have already handed you the result of more free work than any other collection of work in human history. It is deeply ungrateful to demand yet more.


It seems that many people lean into the "community" aspect of open source. In real communities there are webs of mutual responsibility. If you use open source to fill the role of community in your life, it makes sense psychologically that you would project moral stakes or obligation onto the maintainers. But this is really not fair to the maintainers who don't view their work that way.
