Hacker News | ipdashc's comments

Bit of a clickbaity way of phrasing it, but I'm also curious what the result was. From googling, I don't see any stories about recent changes to the calculator app, other than a few features like graphing.

I'm sorry, I wasn't intentionally trying to write clickbait, I was just agreeing with the parent and did not consider how it would come off to other parties.

What happens is the calculator window pops up ~immediately, but the entire contents of the window are a stupid logo--for at least 5 full seconds--until the UI elements load and you can actually use the calculator to calculate things.

The most basic thing our PCs do is calculate. The Intel 4004 was designed... for a calculator. calc.exe, that once-snappy, lightweight native Win32 application, is now apparently some Electron abomination with a footprint the size of Windows 98 and a launch time to match.


No worries! Sorry, I phrased that rudely / like an accusation.

That makes sense though. Yeah, it is really depressing. I guess they just don't prioritize start time at all. The hilarious part is, like... Blender on my computer starts up almost instantly! Versus a calculator...


I just tried it on regular Windows 11 Pro and it just opened the calculator.

I bet the friend just pressed the Windows key, and typing "calc" and quickly pressing enter caused Bing to search for calc instead. A common failure, because Windows' start-menu search/load/discovery is a total mess.

Even in this case it opens the calculator. Web search results are further down.

If you are searching for something for the first time (or after the cache invalidates), it seems like it prioritizes search sources that have already completed.

On my computer, that means web search almost always completes first. So most of the time, if I type in something "new" and don't wait, it'll bring up Bing.

Sometimes it looks like "downloads folder" file search completes before Installed app search completes, because on one occasion I typed in an app's name and it launched the INSTALLER for the app.

Once all the searches resolve, it behaves "as expected". I'd be very surprised if you don't have the same symptoms I'm describing. Why is your computer behaving differently from every Win11 install I've ever interacted with?


I just tried a search for "downloads" and the first result was "Downloads folder privacy settings". I never search for that so it wasn't cached. I even pasted in the query to give it less time to search before pressing enter.

I don't think I've changed any settings for search. Everything is still enabled. There are over 250,000 items in the search index, so I haven't removed indexed locations. My computer is pretty much a high-end gaming PC using a last-generation CPU and GPU. But really, I've never seen this behavior anywhere - including on my very basic laptop. Maybe I could see this happening on computers that are still using an HDD, but I haven't tested that.


My thoughts as well. And it's not like we're talking about taking random people off the street and teaching them to program, it's just a UI framework. And the stuff people are talking about in here isn't IDEs or CAD suites, it's like... the calculator app and the start menu. What kind of devs is MSFT hiring and paying $200k a year that can't learn a UI framework?

I know the "basically" is probably doing a bunch of heavy lifting, but dang, that's still awesome to think about. I didn't know hardware development was at the point where a hobby project CPU, apparently mostly developed by one guy, can realistically end up in a mass produced product like that.

Quick edit: sounds like "basically" wasn't doing that much heavy lifting after all, wow https://www.raspberrypi.com/news/risc-v-on-raspberry-pi-pico...


I'm in the same boat as them, I honestly wouldn't care that much if all my health data got leaked. Not saying I'm "correct" about this (I've read the rest of the thread), just saying they're not alone.

It's always been interesting to me how religiously people manage to care about health data privacy, while not caring at all if the NSA can scan all their messages, track their location, etc. The latter is vastly more important to me. (Yes, these are different groups of people, but on a societal/policy level it still feels like we prioritize health privacy oddly more so than other sorts of privacy.)


Seems a little harsh and unkind over what's just a fun article. It's not a news publication or a textbook, it's Phrack, lol. I thought it was neat.

It's a blog post. And the criticism is on a message board... It's par for the course.

Classifying Phrack as a blog is about as accurate as classifying future interest payments as liabilities.

Yeah, that seriously whiplashed me too; I'm genuinely confused. Google Meet has always worked completely fine for me: good performance, works well on mobile, Firefox, etc. Nothing special, but it works. Probably my favorite of all the meeting apps.

Teams, meanwhile, is absolutely my least favorite: it takes forever to load, won't work in Firefox, nags me to download the app, and has a confusing UI. I don't think I've ever heard anyone say they like Teams.


I'm too young to have used VB in the workforce, but I did use it in school, and honestly off that alone I'm inclined to agree.

I've seen VB namedropped frequently, but I feel like I've yet to see a proper discussion of why it seems like nothing can match its productivity and ease of use for simple desktop apps. Like, what even is the modern approach for a simple GUI program? Is Electron really the best we can do?

MS Access is another retro classic of sorts: despite its many flaws, nothing has risen to fill its niche other than SaaS webapps like Airtable.


You can add Macromedia Flash to that list - nothing has really replaced it, and as a result the world no longer has an approachable tool for building interactive animations.

https://www.youtube.com/watch?v=hnaGZHe8wws

This is a nice video on why Electron is the best you might be able to do.


Thanks for the link - this is a cool video, though it seems like it's mostly focused on the performance/"bloat" side of things. I do agree that's an annoying aspect of Electron, and I do think his justifications for it are totally fair, but I was thinking more about ease of use, especially for nontechnical people / beginners.

My memory of it is very fuzzy, but I recall VB being literally drag-and-drop, and yet still being able to make... well, acceptable UIs. I was able to figure it out just fine in middle school.

In comparison, here's Electron's getting started page: https://www.electronjs.org/docs/latest/ The "quick start" is two different languages across three different files. The amount of technologies and buzzwords flying around is crazy, HTML, JS, CSS, Electron, Node, DOM, Chromium, random `charset` and `http-equiv` boilerplate... I have to imagine it'd be rather demoralizing as a beginner. I think there's a large group of "nontechnical" users out there (usually derided by us tech bros as "Excel programmers" or such) that can perfectly understand the actual logic of programming, but are put off by the amount of buzzwords and moving parts involved, and I don't blame them at all.

(And sure, don't want to go in too hard on the nostalgia. 2000s software was full of buzzwords and insane syntax too, we've improved a lot. But it had some upsides.)

It just feels like we lost the plot at some point when we're all using GUI-based computers, but there's no simple, singular, default path to making a desktop GUI app anymore on... any, I think, of the popular desktop OSes?


You are totally right. Going even further back, in the days of Turbo Pascal, you could include graphics.h and get a very cool snake game going within half an hour. Today, doing anything like that takes a week of advanced setup. Someone wanted to recreate that experience today and came up with this: https://github.com/dascandy/pixel

But as you can see, a lot of boilerplate had to be written to make this possible.

https://github.com/dascandy/pixel/blob/master/examples/simpl...

See the user example, and then look at src for the boilerplate.

In the old days, you could easily write a full operating system from scratch on an 8051 while using PS/2 peripherals. Today, all peripherals are USB, and the USB 2.0 standard is 500 pages long.

I also agree that we have left behind the idea of teaching programming, or at least removed it from the mainstream.


> ex big tech

I mean, this seems like a pretty big thing to leave out, no? That's where all the crazy high salaries were!

Also, there are still legacy places that more or less build software like it's 1999. I get the impression that embedded, automotive, and such still rely a lot on proprietary tools, finicky manual processes, low level languages (obviously), etc. But those are notorious for being annoying and not very well paid.


I'm talking about what I perceive to be the median salary/conditions, with big tech being only a part of that. My point is more that, back in that period, good salaries could be had outside big tech too, even at the boring standard companies you mention. I remember banks, insurance, etc. paying very well for an SWE/tech worker compared to today - the good opportunities seemed more distributed. For example, the contract rates for some of the people we hire haven't really changed in 10 years for developers. Now, at best, they are on par with other professional white-collar workers, and the competition seems fiercer (e.g. 5 interviews for a similar salary, with leetcode games rather than experience-based interviews).

Making software easier and more abstract has allowed less technical people into the profession, allowed easier outsourcing, and meant more competition/interview prep to filter people out (even if those skills are not used in the job at all), more material for AI to train on, etc. To the parent comment's point, I don't think it has boosted salaries and/or conditions on average for the SWE - in the long run (10+ years) it could be argued that economically the opposite has occurred.


IIRC, this happened in Washington state: https://www.eff.org/deeplinks/2025/11/washington-court-rules...

And as a result, they got rid of the cameras. Funny how that works!


Yeah, I haven't read the WSJ article, but I did read the original Anthropic experiment and I feel like the author is catastrophizing a bit much here. This is effectively just something they did for fun. It's entertaining and a funny read. Not everything has to be the end of the world.


The point is that it's an ad. No company spends money on a joke just to make a joke. Not the end of the world, although it's interesting that all the end of the world stuff comes directly out of that joke and its universe as it were. Take the joke seriously and extend its logic as far as it will go, and you get the end of the world. It's a thought experiment, or that's how I read it anyway.


I think the phrase we're looking for is "publicity stunt". Seems a fairly harmless and self-effacing one at that.


> The point is that it's an ad.

Sure, but like the other guy said, that's the point of publicity stunts. It doesn't even have to be specific to a company/ad, any silly thing like this is going to sound crazy if you take it seriously and "extend its logic as far as it will go". Like seeing the Sony bouncy balls rolling down the street ad and going "holy shit, these TV companies are going to ruin the world by dropping bouncy balls on all of us". It's a valid thought experiment, but kind of a strange thing to focus on so sternly when it's clearly not taking itself seriously, especially compared to all the real-world concerning uses of AI.

(And it is pretty funny, too. If anything I think we'd all prefer more creative ads like this and the bouncy ball one, and less AI-generated doomer Coke ads or such.)


To be fair, they could have done an experiment on a transaction that typically requires a person in the loop, rather than choosing a vending machine, which already doesn't require a person in the loop for the transaction.


You’re thinking of replacing a vending machine with a chatbot, which indeed doesn’t make much sense. The experiment was replacing the management of the machine. It’s not crazy to think that, money being no object, it would be great to have a person who hangs around the machine, whose job it is to ask customers what kinds of things they might want to buy from it, conduct trials, tinker with pricing, do promotional deals, etc. But it’s of course impractical to have an FTE per machine to do that. The idea of this experiment was to see if that could be done with Claude. And of course, as others have pointed out, it’s a simple, cheap, low-stakes version of “a business” suitable for experimenting with.


If I recall, the idea was the AI taking the role of the vending machine manager, choosing and restocking products and such. Anything on top of that was, I assume, just added for fun.


>I feel like the author is catastrophizing a bit much here.

I feel like he's catastrophizing the ordinary amount for an anti-AI screed - probably well below what the market expects. At this point you basically have to sound like Ed Zitron or David Gerard to stand out from the crowd.

AI is boiling the oceans, and you're worried about a vending machine?

