dontlaugh's comments | Hacker News

They don’t care, they’re defunding Xbox and even the Windows team is hollowed out.

Back when the rumour was that Windows 10 would be the last Windows, I don't think anyone expected the reason to be that Win11 would be so unbearable it would finally drive users to Linux... but here we are. RIP.

If you want to play games with friends, you have to play whatever the group plays. This is especially problematic as the group tries out new games, increasing the chance you can’t join because you’re not on Windows.

Personally I'd be interested to see what would happen if Sony/MS did what they could to make the keyboard/mouse experience as good as possible on their consoles (I'm writing from a position of ignorance on the state of mouse/keys with current consoles) and encouraged developers to offer a choice of inputs, so that the locked-down machines become the place with the highest confidence in no/low cheaters. If other people want to pay through the nose to go beyond what consoles offer on the detail/resolution/framerate trifecta then I'm sure they could do so, but I really don't see how you lock down an open platform. That challenge has been going on for decades.

> I'm writing from a position of ignorance on the state of mouse/keys with current consoles

I'm far from an authority on this topic but from my understanding both Sony/MS have introduced mkb support, but so far it looks to be an opt-in kind of thing and it's still relatively new.


All major consoles support keyboard & mouse or similar.

The problem is more the audience. Console players generally expect to be able to just connect the console to the TV, sit on the sofa and play with the official controller. That’s all games are required to support to be published on the platform.

Even if you were willing to play at a desk, you’d be matchmaking into a special (and small) mouse pool in the console version of the game. Anyone willing to go through so much faff will accept the extra annoyances of a PC, even with kernel anti-cheat.


Well, Nintendo's latest console comes with two mice that you can both use at the same time even.

This really depends on the friends you have. I've never encountered this limitation because no one in my friend group plays competitive ranked games. Basically anything with private sessions doesn't require anticheat, so Valheim, RV There Yet, Deep Rock Galactic, etc. all work fine.

Sure, that helps.

But even then, when everyone is trying out a new indie game there’s a chance it won’t work on non-Windows. It’s happened to me.


Yes, but Linux really has gotten a lot better in recent years. At least whatever runs on Steam. I almost never had any problems with newer indie games.

I think indies are safe. The potential problem I can see lying ahead - at least for me - is Battlefield.

My friends are understanding that I don't play games with rootkit anti cheat (whether on Linux or Windows). There are enough games that we can play other games together still, and when they want to play the games with such anti-cheat (e.g. Helldivers 2) they simply play without me. No big deal.

I am playing Helldivers 2 on Linux right now. Works perfectly. It crashes less often than it does for my friends who play on Windows!

Helldivers 2 works on Linux though? One of my buddies uses Linux and he played with us all the time.

I've read that the Helldivers 2 kernel component only runs on a normal Windows install; you should be able to play on Linux via Wine/Proton without any of that.

More like it only happened because Sony restricted hardware access under Linux. If they had allowed GPU access, there would have been no motivation to attack the hypervisor.

Which will also become a historical artifact as new protocols are made to use little endian.

For which protocols? TCP/IP itself is network byte order all the way down to the bottom of the jar.

For example, Cap'n Proto and QUIC are both little endian.

TCP is becoming increasingly less relevant, although I don't know if it'll ever actually disappear.


Cap'n Proto and QUIC are layer 6 and 7 (presentation and application protocols, respectively). QUIC is built on top of UDP.

Layers 3-4 (network, transport) are both big-endian - IP packet headers and TCP/UDP headers use big-endian format.

This means you can't have an IP stack (let alone TCP/UDP, Quic, Capn Proto) that's little-endian all the way through without breaking the internet.
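
For concreteness, here's a minimal Python sketch of what "network byte order" means on the wire (purely illustrative, not taken from any particular stack):

    import struct

    port = 443

    # TCP/UDP headers carry ports in network byte order (big-endian),
    # so the high byte goes first on the wire regardless of the host CPU.
    wire = struct.pack("!H", port)       # b'\x01\xbb' on every platform
    native_le = struct.pack("<H", port)  # b'\xbb\x01' (little-endian layout)

    assert wire == bytes([0x01, 0xBB])
    assert struct.unpack("!H", wire)[0] == port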

Outside the webdev bubble, it's pretty much QUIC that is irrelevant - it's just another UDP-based application protocol.


UDP is an implementation detail of QUIC, just a way to give IP-ish functionality to userspace. In practice, QUIC is a TCP alternative.

The OSI layer model is not necessarily as relevant as it used to be.


You're kind of saying "look over here!" but I'm not that easily distracted. You said "Which will also become a historical artifact as new protocols are made to use little endian". It's never going to become a historical artifact in our lifetimes. As the peer poster pointed out, QUIC itself has big-endian header fields. IPv4/IPv6 both use big-endian at layer 3.

The OSI layer model is extremely relevant to the Cisco network engineers running the edges of the large FAANG companies, hyperscalers etc. that connect them to the internet.


I was wrong about QUIC, for some reason I was sure I'd read it's little-endian.

I'm just pointing out that UDP is an extremely thin wrapper over IP and the preferred way of implementing new protocols. It seems likely we'll eventually replace at least some of our protocols and deprecate old ones, and I was under the impression the new ones tended to be little-endian.


Foolishly, QUIC is not little-endian [1]. The headers are defined to be big-endian. Though, obviously, none of UDP, TCP, or QUIC define the endianness of their payload so you can at least kill it at that layer.

[1] https://www.rfc-editor.org/rfc/rfc9000.html#name-notational-...
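
To make the claim concrete: QUIC's variable-length integers put two length bits up front and the value in network byte order. A rough Python sketch (the decoder is my own illustration; the test vectors are the examples from RFC 9000, Appendix A.1):

    def decode_varint(buf: bytes) -> tuple[int, int]:
        # QUIC variable-length integer (RFC 9000, section 16):
        # the two most significant bits of the first byte give the total
        # length (1, 2, 4 or 8 bytes); the remaining bits hold the value
        # in big-endian (network) byte order.
        first = buf[0]
        length = 1 << (first >> 6)    # 1, 2, 4 or 8 bytes
        value = first & 0x3F          # strip the two length bits
        for b in buf[1:length]:
            value = (value << 8) | b  # big-endian accumulation
        return value, length

    # Worked examples from RFC 9000, Appendix A.1
    assert decode_varint(bytes.fromhex("25")) == (37, 1)
    assert decode_varint(bytes.fromhex("7bbd")) == (15293, 2)
    assert decode_varint(bytes.fromhex("9d7f3e7d")) == (494878333, 4)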


Oh really? I must’ve misread.

Over 800ms is not even a little fast. I’m on WiFi to ADSL, and light static websites are way faster than that.

Exactly, Gnome/Linux or KDE/Linux would make a lot more sense.

Both are being baked:

https://distrowatch.com/table.php?distribution=gnomeos

https://distrowatch.com/table.php?distribution=kdelinux

The question is if either will catch any interest and if so, what will happen to regular distributions.


Except that it can be both and more: you can have Gnome, KDE, and other DEs and libraries installed and use apps based on all of them simultaneously.

Sure, although every distro has a default.

systemd/Linux maybe? Lots of things are more significant than GNU, either way.


XP was arguably better.

Exactly, even the second one was a poor caricature.


That film had many problems, but the acceptable frame rate was not one of them. Most criticism wasn’t about that.


True but there was specific criticism about how the framerate made it far too easy to see the parts of the effects, sets and costumes that made it clear things were props and spoiled the illusion. Maybe we just require a new level of quality in set design to enable higher frame rates but it clearly has some tradeoff.


I think that’s definitely the case with 4K, and we’ve seen set detail design drastically improve lately as a response.

I don’t see how it’s the case for frame rate, except perhaps for CGI (which has also improved).

I think just like with games, there’s an initial surprised reaction; so many console-only gamers insisted they couldn’t see the point of 60 fps. And just like with games, it only takes a little exposure to get over that and begin preferring it.


If films were shot at a decent enough frame rate, people wouldn’t feel the need to try to fix it. And snobs could have a setting that skips every other frame.

Similar is the case for sound and (to a much lesser extent) contrast.

Viewers need to be able to see and hear in comfort.


If you think this is about snobbery, then I'm afraid you've completely misunderstood the problem.

This is more comparable to color being turned off. Sure, if you're completely colorblind, then it's not an issue. But non-colorblind people are not "snobs".

Or if dialog is completely unintelligible. That's not a problem for people who don't speak the language anyway, and would need subtitles either way. But people who speak English are not "snobs" for wanting to be able to understand dialog spoken in English.

I've not seen a movie filmed and played back in high frame rate. It may be perfectly fine (for me). In that case it's not about the framerate, but about the botched interpolation.

Like I said in my previous comment, it's not about "art".


There is no such thing as the soap opera effect. Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.

People like you insisting on 24 fps causes people like me to unnecessarily have to choose between not seeing films, seeing them with headaches or seeing them with some interpolation.

I will generally choose the latter until everything is at a decent frame rate.


> There is no such thing as the soap opera effect.

What has been asserted without evidence can be dismissed without evidence.

I'll take the Pepsi challenge on this any day. It looks horrible.

> Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.

Can you give an example of ANY movie that survives TV motion interpolation settings? Billion dollar movies by this definition don't have good quality sets and makeup.

E.g. MCU movies are unwatchable in this mode.

> People like you insisting on 24 fps

I don't. Maybe it'll look good if filmed at 120fps. But I have seen no TV that does this interpolation where it doesn't look like complete shit. No movie on no TV.

Edit: I feel like you're being dishonest by claiming that I insist on 24 fps. My previous comment said exactly that I don't, already, and yet you misrepresent me in your very reply.

> causes people like me to unnecessarily [… or …] seeing them with some interpolation

So you DO agree that the interpolation looks absolutely awful? Exactly this is the soap opera effect.

I know that some people can't see it. Lucky you. I don't know what's wrong with your perception, but you cannot simply claim that "there's no such thing" when it's a well known phenomenon that is easily reproducible.

I've come to friends' houses and as soon as the TV comes on I go "eeew! Why have you not turned off motion interpolation?". I have not once been wrong.

"There's no such thing"… really… who am I going to believe? You, or my own eyes? I feel like a color blind person just told me "there's no such thing as green".


I agree with you that the interpolation isn’t ideal, I’m not praising it. It’s merely a necessity for me to not get headaches. It’s also much less noticeable on its lowest settings, which serve just to take the edge off panning shots.

The “soap opera effect” is what people call video at higher than 24 fps in general, it has nothing to do with interpolation. The term has been used for decades before interpolation even existed. You seem to be confused on that point.

Source video at 120 looks no worse than at 24, that’s all I’m saying.


Yeah, but soap opera effect also isn't only framerate either.

Earlier video cameras exposed the pixels differently, sampling the image field in the same linear fashion that it was scanned on a CRT during broadcast. In the US this was also an interlaced scanning format. This changes the way motion is reproduced. Film will tend to have a global motion blur for everything moving rapidly in the frame, whereas video could have sharper borders on moving objects, but other distortions depending on the direction of motion, as different parts of the object were sampled at different times.
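
As a toy illustration of that sampling difference (made-up numbers, nothing to do with any real camera): with a rolling shutter each scanline is exposed a little later, so an object moving horizontally comes out sheared rather than uniformly blurred.

    # Rolling-shutter sketch: each row is sampled line_time later than the
    # previous one, so a horizontally moving edge lands at a different x
    # on every row and the object appears skewed.
    rows = 8          # scanlines in this toy frame
    line_time = 1.0   # time between successive rows (arbitrary units)
    speed = 2.0       # horizontal speed of the object (pixels per unit time)

    def edge_position(t):
        return 10 + speed * t  # object's leading edge at time t

    for y in range(rows):
        t = y * line_time
        print(f"row {y}: edge at x = {edge_position(t):.0f}")

This prints an edge that drifts from x=10 to x=24 down the frame, which is the per-row skew described above.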

Modern digital sensors are somewhere in between, with enough dynamic range to allow more film-like or video-like response via post-processing. Some are still rolling shutters that are a bit like traditional video scanning, while others are full-field sensors and use a global shutter more like film.

As I understand it, modern digital sensors also allow more freedom to play with aperture and exposure compared to film. You can get surprising combinations of lighting, motion blur, and depth of field that were just never feasible with film due to the limited sensitivity and dynamic range.

There are also culturally associated production differences. E.g. different script, set, costume, makeup, and lighting standards for the typical high-throughput TV productions versus the more elaborate movie production. Whether using video or film, a production could exhibit more "cinematic" vs "sitcom" vs "soapy" values.

For some, the 24 fps rate of cinema provides a kind of dreamy abstraction. I think of it almost like a vague transition area between real motion and a visual storyboard. The mind is able to interpolate a richer world in the imagination. But the mature techniques also rely on this. I wonder whether future artists will figure out how to get the same range of expression out of high frame rate video or whether it really depends on the viewer getting this decimated input to their eyes...


You have never seen a movie at 120fps. Gemini Man exists at 60fps and that is as close as you are going to get. That blu-ray is controversial due to that fps. I thought it was neat, but it 100% looks and feels different than other movies.

Thanks. I'll give it a try.

> You seem to be confused on that point

Please stop repeatedly misrepresenting what I said. This is not reddit.

I have repeatedly said that this is about the interpolation, and that I'm NOT judging things actually filmed at higher framerates, as I don't have experience with that.

> Source video at 120 looks no worse than at 24, that’s all I’m saying.

Again, give me an example. An example that is not video games, because that is not "filmed".

You are asserting that there's no such thing as something that's trivially and consistently repeatable, so forgive me for not taking you at your word that a 120fps filmed movie is free of soap opera effect. Especially with your other lying.

So actually, please take your misrepresentations and ad hominems to reddit.

Edit: one thing that looks much better with motion interpolation is panning shots. But it's still not worth it.


There is plenty of 50/60 and even 120 footage out there, some is even popular. I’m sure you can find it yourself.

I don’t see what I lied about or what Reddit has to do with anything. I will definitely stop replying to someone so needlessly aggressive.


So true, everybody else is wrong and you're right.


Getting headaches from low frame rate is rare, I guess. I only know a few others with this problem.

But preferring high frame rate is common, as evidenced by games and the many people who use TV interpolation features.


There is no evidence that people prefer high frame rate movies. Motion interpolation on TVs is set on by default, not a conscious choice the end user is making.
