They all do this at some point. Claude loves to delete failing tests if it can't fix them, or to delete code that won't compile if it can't figure it out.
Huh. A while back I gave up fighting with Claude Code to get it to cheat the ridiculous Home Assistant pre-run integration checklist so I could run some under-development code, and ended up doing it myself.
AOT is unique because you want to compile with all the capabilities your device has, so some compilation still has to be done, especially when processors have brand-new instructions that make operations significantly more efficient.
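For what it's worth, here's a minimal C sketch of the tradeoff being described, assuming x86-64 and GCC/Clang builtins (the function names are illustrative, not from any real codebase): a single pre-built binary either targets the lowest common denominator or carries multiple versions of hot functions and dispatches at run time, whereas compiling for the exact device lets the compiler use everything the hardware has.

    /* Illustrative sketch: one function compiled twice, dispatched by
       CPU capability at run time. GCC/Clang on x86-64 assumed. */
    #include <stddef.h>
    #include <stdint.h>

    /* Baseline version: safe on any x86-64 CPU. */
    static int64_t sum_baseline(const int32_t *v, size_t n) {
        int64_t s = 0;
        for (size_t i = 0; i < n; i++) s += v[i];
        return s;
    }

    /* Same code, but compiled with AVX2 enabled so the compiler
       can vectorize it using the newer instructions. */
    __attribute__((target("avx2")))
    static int64_t sum_avx2(const int32_t *v, size_t n) {
        int64_t s = 0;
        for (size_t i = 0; i < n; i++) s += v[i];
        return s;
    }

    /* Runtime dispatch: pick the fastest version this CPU supports. */
    int64_t sum(const int32_t *v, size_t n) {
        if (__builtin_cpu_supports("avx2"))
            return sum_avx2(v, n);
        return sum_baseline(v, n);
    }

AOT-compiling on (or for) the specific device sidesteps the dispatch entirely, which is why the target hardware's capabilities matter.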
That's not the approach they're referring to; iOS doesn't support that. They're referring to delivering the compiled native code as part of the app package.
People were actually right about touchscreens and mobile phones.
They never did replace the productivity use cases. They replaced a lot of casual use cases, and created a bunch of new ones, mostly around media consumption.
But if you go to an office anywhere in the world and look around, it's not people on their phones. It's a sea of desktop computers, like it's 1995. Even at Apple. Not because everyone is behind the times, but because we truly did find the perfect form factor, and have chosen to refine it.
Apple Vision Pro won't replace the productivity suite, just as the iPhone didn't. And it won't replace the iPhone, because it's far bigger and less convenient. So I'm not sure where that leaves it.
The market agrees, for the most part: VR goggle interfaces just aren't taking the world by storm. When it came out I thought I'd wait for the iteration two years later (the AVP 3 or whatever), since by then they'd have worked out the kinks and it would be a solid computing platform. It's now two months shy of two years since general availability of the AVP, and it's essentially identical to the initial release apart from a minor chip upgrade. It's a dead product line.
If someone cracks “smart glasses” that’s the next smartphone-size market and revolution, guaranteed, no question about it.
VR headsets ain’t it, but I’m convinced the reason every company is working on them, and developing AR stuff for their traditional devices (which are terrible for AR), is that they don’t want to still be at the starting line if someone figures out smart glasses.
This is the “answer” hiding in plain sight, and I agree. The iPhone is the beating heart of the modern Apple empire. Tim Cook has been a vocal proponent of AR since the summer of Pokemon Go. That, combined with Meta getting traction with their Ray-Ban line, is almost certainly at the center of an overarching internal strategy at Apple to ensure they are positioned to maintain or even grow their position as end-user mobile computing form factors shift beyond the traditional smartphone. Getting the UX and app ecosystem visually ready is what ‘caused’ Liquid Glass.
Grandparents also said it about a lot of technologies that actually were worse and didn’t survive. Those just aren’t around anymore to be the subject of survivorship bias.
I’m not sure when we started dismissing the elderly’s advice as “just complaining because they’re old,” but it seems we’re hell-bent on reinventing the wheel of misfortune with every generation.
If old people complain about something, maybe they have a point?
If they're still complaining about something that's around, do they have a point? How do we know? What things have survived the fires of testing and should just be accepted, and what things can be groused about as bad?
It sounds plausible, but only in the shallowest “yeah, make ‘em look the same” way. Just like when they started shipping the Catalyst-based Mac versions of Messages, Photos, etc., so that they’d look the same as the iOS apps (and no doubt so they could reuse some code from there instead of wasting developers on the Mac platform they hate).
It’s not as though anything about Liquid Glass makes a meaningful difference in usability.
I think this goes deeper. Transparency is clearly not a good fit for desktop or mobile apps, but imagine smart glasses where every app completely blocks your view of the things behind it. It just wouldn't work.
To move around safely with smart glasses on your face, apps need to be semi-transparent from day one. It's not about superficial stylistic similarities this time. And it's not primarily about design either.
This is absolutely about core usability, just not for macOS or iOS.
You seem to have a more solid idea than I do of what a Vision-like device is for. As far as I know, it’s for approximately nothing. I have no opinion on what I’d use $4,000 AR goggles for besides the world’s most expensive way to watch Netflix on a plane, or the second-most-expensive monitor you can buy for your Mac (Apple’s hilarious $6K 6K monitor being the first, of course).
So I don’t think I necessarily buy that apps have to have any transparency at all. If I’m walking around doing things in the real world with a Vision Pro on my head, that itself beggars belief to me. It’s wildly impractical for that, with its 2-hour battery life, hefty weight, and hilarious appearance, and all of those will continue to be true long after the 5-year window in which the “26” OS aesthetic will likely persist.
So, might some future glasses or something benefit from transparency? Maybe. But if I find myself walking down the street with a screen on my face, I’d personally prefer to just close the apps that I don’t need, rather than look through them. If the glasses are going to highlight place names, people’s names, etc. they can do that with text floating in midair, like a subtitle.
>You seem to have a more solid idea than I do of what a Vision-like device is for.
I don't. I'm just guessing what Apple may have in mind.
>But if I find myself walking down the street with a screen on my face, I’d personally prefer to just close the apps that I don’t need
Of course, but what about the apps you do need? Say you're in a shop, taking notes, browsing the shop's website, scanning barcodes with something like the Yuka app, maybe even keeping an eye on messages at the same time.
I kept wondering what the point was of covering things in this semi-transparent sludge that doesn't actually let you see through it but still makes the things in the foreground harder to see.
Well, here's your answer. Avoiding collisions and maybe getting a vague idea of where we want to turn next.
Note that I'm not saying this is a good idea. It's just what I think Apple has in mind. I don't think we can know at this point how or if we really want to use smart glasses.
More likely the UX team touched AVP last, so some of the design language influenced what they were building.
The goal is most likely to unify the experience around iPadOS, so that one codebase ports down to the phone and watch and over to the Mac and AVP.
The delta between Mac and iPad UX elements goes down every release. The latest one gave the iPad a menu bar and multi-window support.
Looking at it from a certain angle: at a lot of large companies, the iOS codebase is the only one with a dedicated native team. They might not even create larger views for a native iPad version, and may instead ship Electron for the macOS release. Apple is trying to recruit the native mobile team to support native releases for the whole ecosystem.
It would be interesting if native Swift apps worked and felt better than Electron apps.
But they are not much better, and only consume less RAM (not that big a deal outside of Apple hardware).
VisionOS doesn't actually have the degree of transparency of Liquid Glass, though, which makes the whole thing particularly weird. It has a much more opaque frosted glass effect.
It would have helped if it wasn't $3,500, if they had embraced games, and if they weren't expecting developers to buy such devices for so little return on development cost. It was also released at a time when most headsets were already on the downswing of yet another VR headset cycle.
Are there even enough active Vision Pro users to make the $3,500 back selling an app for it, not even counting the cost to develop it, or Apple's 30% App Store tax?
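Quick back-of-envelope, assuming a hypothetical $10 app: Apple's 30% cut leaves $10 × 0.70 = $7 per sale, so you'd need $3,500 / $7 = 500 sales just to cover the headset, before a single hour of development time is paid for.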
How about porting "I Am Rich" to the Vision Pro, and it could just show a glowing red orb floating in front of your face.
IMO one of the big misses for launch, and one of the most untapped markets for VR/AR, was business analytics & visualization. Any manager worth flying to a corporate retreat is worth getting a capex-treatable top-of-the-line device to see an extra dimension of data breakdowns. There would be a trendiness factor here, too, much like how every executive needed a Blackberry back in the day.
> During Tableau Conference 2024 in San Diego, we recruited 22 attendees to help us assess the usability, learnability, and potential utility of Tableau on the visionOS platform, along with broader perspectives on the potential for HMDs to create engaging experiences around data. Participants were tasked with a series of analytical exercises using one of three datasets. These tasks included specifying filter settings, changing data fields, and interpreting trends across various visualizations, such as bar charts, line charts, and a 3D globe. Examples of tasks included identifying the country with the highest CO2 emissions in Asia and determining when poultry production first exceeded beef production in South America.
If you want to launch a $3000 device properly, why are you making Tableau do this themselves?
Except Liquid Glass looks nothing at all like visionOS. If they had just taken a carbon copy of the visionOS UI and put it on the Mac and iPhone, I doubt there would have been any controversy. In visionOS, buttons don't look like they hover way above the UI, and sidebars and toolbar buttons are indented; they don't scream "LOOK AT ME!"
I would be shocked if Apple was making any product decisions to benefit visionOS at the expense of anything else. It’s so abundantly clear that the Vision Pro was a failure; it would be a horrible mistake to sacrifice anything to try to save it at this point. I think Apple is done with that experiment.
In 1998, if you could stick a JVM on it, you called it the Java Thing. Java was hot hot hot. Hell, they named JavaScript after Java for no reason at all except some syntactic family resemblance. It was the Java and XML era. Today they'd call it an AI ring or something.
Google had PageRank, which gave them much better quality results (and they got users to stick with them by offering lots of free services, like Gmail, that were better than existing paid services). The difference was night and day compared to the best other search engines at the time (WebCrawler was my go-to, then sometimes AltaVista). The quality difference between "foundation" models is nil; even the huge models they run in datacenters are hardly better than local models you can run on a machine with 64GB+ of RAM (though faster, of course). As Google grew, it got better and better at giving you good results and fighting spam, while other search engines drowned in spam and were completely ruined by SEO.
PageRank wasn't that much better. It was better and the word spread. Google also had a very clean UI at a time where websites like Excite and Yahoo had super bloated pages.
That was the differentiation. What makes you think AI companies can't find moats similar to Google's? With the right UX and the right model, a winner can race past everyone.
I remember the pre-Google days when AltaVista was the best search engine, just doing keyword matching, and of course you would therefore have to wade through pages of results to hopefully find something of interest.
Google was like night & day. PageRank meant that typically the most useful results would be on the first page.
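For anyone who never looked under the hood, the core idea fits in a toy program: a page's rank is fed by the ranks of the pages linking to it, so well-linked pages rise regardless of keyword stuffing. A rough sketch in C, using power iteration on a made-up four-page web (real PageRank also handles dangling pages and a web-scale link graph):

    #include <stdio.h>

    #define N 4          /* pages in our toy web */
    #define DAMPING 0.85 /* the standard damping factor */
    #define ITERS 50     /* enough for this tiny graph to converge */

    int main(void) {
        /* links[i][j] = 1 means page i links to page j */
        int links[N][N] = {
            {0, 1, 1, 0},
            {0, 0, 1, 0},
            {1, 0, 0, 1},
            {0, 0, 1, 0},
        };
        double rank[N], next[N];
        for (int i = 0; i < N; i++) rank[i] = 1.0 / N;

        for (int it = 0; it < ITERS; it++) {
            /* Everyone gets a small baseline share... */
            for (int j = 0; j < N; j++) next[j] = (1.0 - DAMPING) / N;
            /* ...plus a cut of the rank of every page linking to them. */
            for (int i = 0; i < N; i++) {
                int out = 0;
                for (int j = 0; j < N; j++) out += links[i][j];
                for (int j = 0; j < N; j++)
                    if (links[i][j]) next[j] += DAMPING * rank[i] / out;
            }
            for (int j = 0; j < N; j++) rank[j] = next[j];
        }
        for (int j = 0; j < N; j++) printf("page %d: %.3f\n", j, rank[j]);
        return 0;
    }

Page 2, which every other page links to, ends up with the highest score. Pure keyword matching gives you no such signal, which is why the jump from AltaVista-style results felt like night and day.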