I disagree with this article and what it attempts to do: frame the acquisition using a conjecture. The only thing to “believe” is the author's reasons - which are flimsy, because they are the very thing we need to be critical of.
I don’t know why the acquisition happened, or what the plans are. But it did happen, and for this we don’t have to suspend disbelief. I don’t doubt Anthropic has plans that they would rather not divulge. This isn’t a big stretch of imagination, either.
We will see how things play out, but people are definitely being displaced by AI software doing their work, and people are genuinely more productive with these tools. I know I am. The user counts of Claude Code, Gemini and ChatGPT don’t lie, so let’s not kid ourselves.
The reason that formatting is not used is that it’s neither useful nor true. The table in the article is far more relevant to the person optimizing things. How many of those I can hypothetically execute per second is a data point for the marketing team. Everyone else is beholden to real-world data sets, and to data reads and fetches whose timings are widely distributed.
Some of this can be reduced to a trivial form - that is, practiced in reality at a reasonable scale - by getting your hands on a microcontroller. Not an RTOS or Linux or any of that, just a bare microcontroller with no OS. Learn it, learn its internal fetch architecture, get comfortable with its timings, and watch the latency numbers go up when you introduce external memory such as SD cards and the like. Learning to read the assembly listing and see how the instruction cycles add up in the pipeline is also good, because then you at least know what is happening. That makes it much easier to apply the same careful mentality here, which is ultimately what this whole optimization game is about: optimizing where time is spent, with what data. Otherwise, someone telling you that so-and-so takes nanoseconds or microseconds will sound alien to you, because you aren’t normally exposed to an environment where you regularly count clock cycles. So consider this a learning opportunity.
Just be careful not to blindly apply the same techniques to a mobile or desktop class CPU or above.
A lot of code can be pessimized by golfing instruction counts, hurting instruction-level parallelism and microcode optimizations by introducing false data dependencies.
Compilers outperform humans here almost all the time.
Compilers massively outperform humans if the human has to write the entire program in assembly. Even if a human could write a sizable program in assembly, it would be subpar compared to what a compiler would write.
This is true.
However, that doesn't mean that looking at the generated asm / even writing some is useless! Just because you can't globally outperform the compiler, doesn't mean you can't do it locally! If you know where the bottleneck is, and make those few functions great, that's a force multiplier for you and your program.
It’s absolutely not useless, I do it often as a way to diagnose various kinds of problems. But it’s extremely rare that a handwritten version actually performs better.
An old approach to micro-optimization is to look at the generated assembly and try to achieve the same thing with fewer instructions. However, modern CPUs execute multiple instructions in parallel (out-of-order execution), and this mechanism relies on detecting data dependencies between instructions.
This means the shorter sequence of instructions is not necessarily faster, and it can in fact make the CPU stall unnecessarily.
The fastest sequence of instructions is the one that makes the best use of the CPU’s resources.
I’ve done this: I had a hot loop and I discovered that I could reduce instruction counts by adding a branch inside the loop. Definitely slower, which I expected, but it’s worth measuring.
It is not about outperforming the compiler - it’s about being comfortable with measuring where your clock cycles are spent, and for that you first need to be comfortable with the clock-cycle scale of timing. You’re not expected to rewrite the program in assembly. But given an instruction, you should have a general idea of what its execution entails and where the data is actually coming from. Reads over different buses mean different timings.
Compilers make mistakes too and they can output very erroneous code. But that’s a different topic.
It’s a milquetoast rant but I got nothing - the employee is right. You should prepare for the world and stop acting so shocked. You had decades to call out journalists for being paid mouthpieces but you didn’t because they spewed nonsense that you agreed with and benefitted you.
Now the shoe is on the other foot. Prepare for what happens next. FAFO.
You’re welcome to try but something tells me people resort to these satirical takes precisely because they (you) are powerless to do anything of significance.
You are welcome to continue posting nonsense but the world will move forward with AI with or without you.
Not who you were asking but my reasons for thinking Brave is a joke.
First, they're a crypto/adtech company, which is a type of company I wouldn't trust to run my browser. And this has resulted in them doing things in the past like:
Their rewards crypto was not opt-in for creators, making it look like creators were openly asking for donations in Brave's cryptocurrency without their consent. They had to change this after complaints:
https://brave.com/blog/rewards-update/
They criticise the effectiveness of ad-block testing websites and urge people to use and trust privacytests.org instead, failing to mention the conflict of interest: privacytests.org is run by a Brave employee.
https://brave.com/blog/adblocker-testing-websites-harm-users...
It's a bit of a generic complaint, but quite apt for the subject matter. Mission creep kills projects, and that's true across a broad range of activities.
More specifically in the case of software, egos kill projects, and expanding the scope of your project to include broader economic or social causes usually does the same.
This correlates with a huge change in nerd culture: pseudonymity used to be much more common and encouraged, with people's real-life identities or views not really taken into account. ("on the internet, nobody knows you're a dog")
Social media happened, and now most people use their real-world identities and carry their real-life worldview into the internet.
This massively increased internet toxicity and eroded interpersonal trust, and Eich is a good example of that: auxiliary things dredged up about someone and used as a cudgel against them for their real or perceived transgressions.
The end result is that effective project management has become a rare breed and we see all these colossal failures like Firefox...
Yup. And if you dared to bring this up in the comments (i.e. posted your own rewrite of a title/post), you’d get reminded of the guidelines and downvoted/flagged. Because fuck honesty - we are here for clicks and engagement.
This is a good step. Next: disclose financial incentives and other motives just to nip it in the bud.
well, I think OP is quite funny and I really enjoyed it, but it definitely goes against the entire idea of approaching things in good faith. I'm sure some or even many of them are sadly accurate, but if reinterpreting things people say through that lens became the behavioral norm on HN I think it would quickly destroy everything many people love about this place. Just my 2 cents of course.
Your analysis is out of date, I think. This has already happened. Poor NXP just got their asses handed to them by the PRC. The fab they have in Italy looks nice, but the PRC has many of those.
Also Texas Instruments, STMicro, Onsemi, Microchip Tech - what the PRC has done in mature nodes over the last few years, it will likely also do at the leading edge. IMO there's an argument that, since the leading edge will definitely be strategically bifurcated, PRC and Western semi can pseudo-collude to maintain higher margins, especially if the PRC wants to claw back its investment. But if Western semi continues to drive economy/growth, there's also an incentive to weaponize margins.
Somebody screenshot this please. We are looking at comedy gold in the next 3 years and there’s no shortage of material.