naikrovek's comments

Remember this article when you get upset that your own customers have come to rely on behavior that you told them explicitly not to rely on.

If it is possible to figure something out, your customers will eventually figure it out and rely on it.


Once a system has a sufficient number of users, it no longer matters what you "explicitly" promised in your documentation or contract.

Hyrum’s Law: all observable behaviors of your system will eventually be depended on by someone.

Even if you tell users not to rely on a specific side effect, once they discover it exists and find it useful, that behavior becomes an implicit part of your system's interface. As a result, engineers often find that "every change breaks someone’s workflow," even when that change is technically a bug fix or a performance improvement.
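A minimal Python sketch of how this plays out (the function and names here are hypothetical): the library only promises a mapping, but a client quietly depends on the iteration order it can observe.

```python
# Hypothetical library function: the docs promise a dict of name -> score
# and say nothing at all about ordering.
def lookup_scores(names):
    return {name: len(name) for name in names}

# A client that depends on the observable (but unpromised) insertion order.
# Under Hyrum's Law, ordering has now become part of the de facto interface:
# switching to sorted or hashed ordering is a breaking change for this user,
# even though the documented contract never mentioned order.
scores = lookup_scores(["ada", "grace", "alan"])
first_entry = next(iter(scores))  # silently assumes "ada" comes first
```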

Reliance on unpromised behavior is something I was also introduced to as Kranz’s Law (or Scrappy's Law), which asserts that things eventually get used for their inherent properties and effects, regardless of their intended purpose.

"I insisted SIGUSR1 and SIGUSR2 be invented for BSD. People were grabbing system signals to mean what they needed them to mean for IPC, so that (for example) some programs that segfaulted would not coredump because SIGSEGV had been hijacked. This is a general principle — people will want to hijack any tools you build, so you have to design them to either be un-hijackable or to be hijacked cleanly. Those are your only choices." —Ken Arnold in The Art Of Unix Programming
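A short Python sketch of the "hijacked cleanly" option Arnold describes (the handler's meaning is made up for illustration): SIGUSR1 exists precisely so an application can assign it a private meaning without stealing a signal, like SIGSEGV, that already has one.

```python
import os
import signal

received = []

# Application-defined meaning for SIGUSR1: here it just records delivery,
# but in practice it might trigger a config reload, log rotation, etc.
def on_usr1(signum, frame):
    received.append(signum)

signal.signal(signal.SIGUSR1, on_usr1)

# Normally another process would send this; we signal ourselves to demo it.
os.kill(os.getpid(), signal.SIGUSR1)
```

This is POSIX-only; SIGUSR1/SIGUSR2 do not exist on Windows.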


Obligatory XKCD for this

https://xkcd.com/1172/


Amtrak is not for someone who simply wants to get from A to B. I suspect a 50-day bus trip would be the same.

When I take Amtrak, it’s because I want to look out of a window for a few dozen hours and see something new (to me) every time I look out the window.

It’s probably the bus trip that they want, and not simply “go to India.”


> I’ve said many times before that I think Finder is the worst default file manager of any popular desktop environment.

[GNOME enters the chat]: "That's nothing, I'm way worse!"


When using Finder on macOS, I often wish I had something as nice, consistent, and usable as Nautilus.

Finder is genuinely horrible. It’s obvious that no one at Apple cares about files anymore, or about the people who work with them.

We’re all supposed to consume cloud these days or so it seems.


My go-to example would be the long-standing issues with SMB support in Finder. All operations are very slow, and search is almost unusably so. Operations that are instant on every non-Apple device take ages on a Mac. I first ran into these issues 7 years ago when I set up my NAS, and they persist to this day. I tried all the random suggestions and terminal commands, but eventually gave up on trying to make it perform as it does on Linux.

With Apple's focus on cloud services, fixing the bugs that prevent the user from working with their local network storage runs contrary to their financial incentives.


Is it actually, though? It’s cool to criticise Nautilus, but at worst it’s only as bad as Finder. Which shouldn’t be surprising, given how much it’s styled to look like Finder.

However, in my personal opinion, Nautilus’s breadcrumb picker does give it an edge over Finder.

So I stand by my comment that Finder is the worst.


Nautilus opens a new window for every folder you enter. Finder does not.

That used to be a preference, and the last time I used it, it no longer was. It is forced on because that’s how the GNOME developers thought you should use it… “Our way or the highway!” — GNOME devs.

Finder wins based on that alone. Finder wins so completely because of that one single thing that I’ll never voluntarily use GNOME again.


design for design's sake is bad, and that's what Liquid Glass is. There was no thought behind it.

It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible? Now, with that in mind, consider (just for a moment) why people might think that UX people don't know what they're doing.


> It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible?

Because UI/X teams were separated from engineering. (Same thing happened with modern building architecture)

It's fundamentally impossible to optimize if you're unaware of physical constraints.

We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult. (Looking at you, Adobe and Figma...)


> Same thing happened with modern building architecture

Yes. Yes, it has. I'm currently in the midst of a building project that's ten months behind schedule (and I do not know how many millions of dollars over budget), and I'd blame every one of the problems on that. I - the IT guy - was involved in the design stage, and now in construction (as in, actually doing physical labor on-site), and I'm the only person who straddles the divide.

It's utterly bizarre, because everyone gets things wrong - architects and engineers don't appreciate physical constraints; construction crews don't understand functional or design considerations - so the only way to get things right is for someone to understand both, but (apart from me, in my area - which is why I make sure to participate at both stages) literally no one on the project does.

Seen from a perspective of incentives I guess I can understand how we got here: the architects and engineers don't have to leave their offices, and are more "productive" in that they can work on more projects per year, and the construction crews can keep on cashing their sweet overtime checks. Holy shit, though, is it dispiriting to watch from a somewhat detached perspective.


Agreed. The further you are away from how a computer works internally, the worse your product for a computer will be.

We have convinced ourselves as an industry that this is not true, but it is true.


> We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult.

I don’t think designers who don’t code are really a problem. They just need to dogfood, and be led by someone who cares (and dogfoods as well).


In the case of Apple, I really doubt its designers don't dogfood. Do you expect them to have Android phones and Linux desktops?

I would think like you, but then some of their design decisions are truly baffling. I like the idea of Liquid Glass, but there are thousands of rough edges that scream lack of care.

I have a strong feeling the people working on and approving Liquid Glass didn't dogfood it in dark mode, because it just looked BAD in the first builds available.

I sometimes wonder if anyone in charge at Apple uses Apple devices the way I do. I expect they each have one consistently-Apple, high-end setup, and it probably works very well for their style. Some things are great, but others are insane, and it seems like that happens most when you use things like non-Apple monitors, or don't type a certain way on the phone, or don't drive the same car.

Switching windows between two non-Apple monitors after waking from sleep is wildly unpredictable and has insane UX, like windows resizing themselves after a drag.

My CarPlay always starts something playing on my car speakers, even when I wasn't listening to anything before connecting. It's so off it's comical.

The iPhone alarm will go off like normal, loudly from the speaker, even if you're currently on the phone and have it up to your ear. This has been a problem since my very first iPhone.

There has long been a bug where plugged-in physical headphones are sometimes unrecognized after waking from sleep, even if they worked fine going into sleep. I checked once, in probably 2014, and Apple's official response was that it literally wasn't physically possible, despite all of us experiencing it. The bug was ancient even at that time, and more than ten years later my M4 MacBook Pro STILL DOES IT.

Apple and Apple fanboys seem to take the stance that these are all user error on my part (remember the "you just aren't a Mac person" era?). I bet some of these are configurable with settings deep in some menu somewhere, so from a certain perspective that's right, but that also underscores my point about the limitations of myopic dogfooding.

As a fun aside, the UX for turning on the VoiceOver tutorial is the worst thing I've ever experienced on an Apple device. I was laughing out loud trying to figure out how to get out of it instead of finishing the unknown remaining steps. I feel bad for folks who need that accessibility in order to be effective.


It’s amazing what you can do when you’re not afraid to write your own tools.

They should have looked at Plan9 and the Rio window manager there.

I don’t know how GPU acceleration would have fit in, but I bet it would have been trivial provided the drivers were sufficient.

All of Rio in Plan9 is 6K lines of code and it’s a more powerful display protocol and window manager (all of the fundamentals are there but none of the niceties) than anything else I’ve ever seen.

The de facto way to remote into a Plan9 system from any OS, even today, is to use a client-side program that implements it all the same way Plan9 does.


The beauty with Free Software and Linux distros is that "they" don't have to do it, anyone who wants to (including you!) can do it.

In theory. In practice, every app is designed for X11 or Wayland. Building your own means you need to follow what most people use anyway if you want any apps to work on your system, or rewrite every app yourself.

5-10 years ago there were no apps designed for Wayland. If you build it they (app developers) might actually come!

Yep, that's why I stuck with X11, as it still does everything I need it to do. It's actually never crashed since they stopped fixing it.

Site is down, I think. :(


I add em dashes to everything I write now, solely to throw off the people who look for them. Lots of editors add them automatically when you type two sequential dashes between words — a common occurrence, like that one. And this is Chrome on iOS doing it automatically.

Ooh, I used “sequential”, ooh, I used an em dash. ZOMG AI IS COMING FOR US ALL


Anyone demonstrating above a high-school vocabulary/reading level is obviously a machine.


Ya—in fact, globally replaced on iOS (sent from Safari)

Also for reference: “this shortcut can be toggled using the switch labeled 'Smart Punctuation' in General > Keyboard settings.”


I also use em dashes; this is about how tonally weird the thing is.


> I'm really sorry to have to ask this, but this really feels like you had an LLM write it?

Ending a sentence with a question mark doesn’t automatically make your sentence a question. You didn’t ask anything. You stated an opinion and followed it with a question mark.

If you intended to ask if the text was written by AI, no, you don’t have to ask that.

I am so damn tired of the “that didn’t happen” and the “AI did that” people when there is zero evidence of either being true.

These people are the most exhausting people I have ever encountered in my entire life.


You're right. Unfortunately they are also more and more often right.


Small English nitpick:

> ff slow down video.mp4 by 2x

How do you slow something down by 2x? x is a multiplier. 2 is a number greater than 1. Multiplying by a number greater than 1 makes the result LARGER.

If you’re talking about “stretch movie duration to 2x”, say that instead.

Saying something is 2x smaller or 2x shorter or 2x cheaper doesn’t make sense. 2x what? What is the unit of “cheap” or “short” or “small”?

How much is “1 slow down”? How many “slow down” are in the movie where you want twice as many of them? Doesn’t make sense does it? So how can something be slowed by 2x? That also doesn’t make sense.

I know what is trying to be said. I know what is meant. Please just say it right. Things like this throw us autistic people for a freaking loop, man. This really rustles our jimmies.

Language is for communicating. If we aren’t all on the same page about how to say stuff, you spend time typing and talking and writing and reading and your message doesn’t make it across the interpersonal language barrier.

I don’t want to see people wasting their time trying to communicate good ideas with bad phrasing. I want people to be able to say what they mean and move on.

I also don’t want to nitpick things like this, but I don’t want phrases like “slow down by 2x” to be considered normal English, either, because they aren’t.


> Small English nitpick:

> 2x? x is a multiplier.

Translation of English is often problematic because of the multiple valid interpretations of simple words, and concepts that have many synonyms.

The solution here is to use arithmetic to supersede English. It may then become apparent that what is meant is x as a denominator.

Translate x into 'times', and then think of 'times' not as strictly multiplication but instead as an iteration (which, after all, is what multiplication is), and that might get you closer to what is meant, which is a standard arithmetic inversion of multiplication to division.

> Saying something is 2x smaller or 2x shorter or 2x cheaper doesn’t make sense

It does, if you do the inversion. Something 2 times smaller is half (1/2) as big.

Two ways of saying the same thing is half the fun of learning English!
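A quick arithmetic sketch of that inversion (the numbers are made up for illustration): "slowed down by 2x" multiplies the speed by 1/2, which multiplies the duration by 2.

```python
factor = 2
original_duration_s = 60.0   # hypothetical one-minute clip

new_speed = 1.0 / factor                         # playback at half speed...
new_duration_s = original_duration_s * factor    # ...so it takes twice as long
```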


“x” is a multiplier in your first example, and an inversion in your second?

Pick one.

Or just phrase things correctly to begin with.


Reminds me of something Steve Mould mentioned in a video, about a claim in a book: "The temperature outside an aeroplane is six times colder than the temperature inside a freezer."

https://www.youtube.com/watch?v=C91gKuxutTU - Stand-up comedy routine about bad science


Isn’t it somewhat common to say something like “slow this down by a factor of 2”?

