What kind of founding ethos doesn't allow tracking internal latency? Is their founding ethos "Never Admit Responsibility"? "Never Leave a Paper Trail"?
This company's official ethical foundation is "Don't Get Caught."
From the Wikipedia article about IEX: "It was founded in 2012 in order to mitigate the effects of high-frequency trading." I can see how they wouldn't want to track internal latency as part of that, or at least not share those numbers with outsiders. That would just encourage high-frequency traders again.
One would hope for a more technical solution to HFT than willful ignorance lol. For example, they could batch up orders every second and randomize them.
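For what it's worth, the mechanics of that are trivial. Here is a rough Python sketch of a batch-and-randomize loop; the Order type, the match() callback, and the one-second window are all made up for illustration, and none of this reflects how IEX or any real venue actually works:

    import queue
    import random
    import time
    from dataclasses import dataclass

    @dataclass
    class Order:            # hypothetical order record; a real venue carries price, side, etc.
        order_id: int
        symbol: str

    def run_batch_auction(incoming, match, batch_seconds=1.0):
        """Collect orders for a fixed window, then process them in random order,
        so arriving microseconds earlier confers no advantage within a batch."""
        deadline = time.monotonic() + batch_seconds
        batch = []
        while time.monotonic() < deadline:
            try:
                batch.append(incoming.get(timeout=0.01))  # incoming: queue.Queue of Orders
            except queue.Empty:
                pass
        random.shuffle(batch)   # arrival order inside the window is deliberately discarded
        for order in batch:
            match(order)        # placeholder for the venue's actual matching logic

The point is just that shuffling within the window kills the sub-second speed race, at the cost of up to a second of added latency for everyone.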
I worked in HFT (though I'm now completely out of fintech and have no skin in the game). "Flash Boys"-style traditional HFT is already dead; the trade collapsed in 2016-2018, when larger institutions got less dumb about order execution and several HFTs "switched sides" and basically offered "non-dumb order execution" as a service to any institution that couldn't play the speed game itself. Look at how Virtu's revenue shifted from mostly trading to mostly order execution services over that period.
Flash Boys was always poorly researched and largely ignorant of actual market microstructure and of who the relevant market participants were, but it also aged quite poorly: all of its "activism" was useless because the market participants all smartened up on their own, purely profit-driven.
If you want to be an activist about something, the best bet for 2026 is probably the fact that so much volume is moving off the lit exchanges into internal matching, which degrades the quality of the price discovery happening. But honestly, even that's a hard sell, because much of that flow is "dumb money" that just wants to transact at the NBBO.
Actually, here's the best thing to be upset about: apps gamifying stock trading / investing into basically SEC-regulated gambling.
This is what should happen, because the game actually being played is to profit off those who cannot react fast enough to news events, rather than off those who mispriced their orders.
Or leave things in place, but put a one-minute transaction freeze on during binary events, and fill the order book coming out of the pause with no regard for when an order was placed: just a random allocation of order fills.
These funds would lose their shit if they had to go back to knowledge being the only edge rather than speed and knowledge.
This isn't a good approach because it assumes there are no market makers on trading venues, and that they (as well as exchanges) do not compete for order flow. Also, maybe you haven't noticed, but stocks are often halted during news announcements at regulatory request, so such pauses are already in place and are designed to maintain market integrity, not disrupt it with arbitrary fills.
The availability of exclusive content should not be the point over which streaming services compete. The same content should be available on all streaming services, and they should compete on the quality of their delivery and discoverability technology.
Content producers must not be vertically integrated with content distributors.
You might not be surprised, but you should still be shocked. Being struck by a heavy weight will shock you even if you expected it. We are allowed to be shocked by things that we abhor even when we understand their causes and probability distribution. Not being shocked suggests you no longer despise it.
For me, it's tracing code/pipelines to figure out how a result was produced, typically in the context of that result being wrong somehow. Go To Definition is the most useful function in any editor.
I'm always surprised by how frequently colleagues don't think to do this and are left helpless.
This reminds me of my further theory that everyone needs one 'heavy' and one 'light' technique. The 'light' technique is something that often works well as a heuristic and can be an effective unit of iteration. The 'heavy' technique is something that you can fall back on in difficult cases, something that can reliably solve hard problems, even if it's slow.
Sometimes the heavy technique is: just ask someone else. ;)
You jest, but that's how my sister gets through life, and it's always fascinated me.
She's incredibly intelligent, but more importantly she's a phenomenal social networker. She always has someone to call to ask about any question or solve any problem that comes up in life, and she's great at connecting these people with each other when they have problems of their own - so they all want to help her with whatever she needs, just to gain access to a pool of people they themselves can talk to.
What do you do with a skillset like that? I honestly don't know - something in leadership, probably, something where finding the right people and setting them to work is the most important skill of the job.
That wasn't in jest. I worked in a place where this was the norm. Nothing was properly documented; instead, everyone would just ask and answer questions in chats, and somehow this actually kept velocity high.
Found it really hard to adjust to that. I'm the kind of person that prefers to research things on my own, find hard references and understand context. But there, this was the wrong approach.
I have both felt and seen this at work, and I would add to this the meta-technique of binary search. Once it's added to your light and heavy techniques, you can solve what seems intractable at first glance faster than many people can even orient themselves to the problem.
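In case the shape of it isn't obvious outside of git bisect, here's a small Python sketch of the idea. The names (candidates, is_bad, and the commented-out helpers) are made up; the only assumption is that the sequence is "good" up to some point and "bad" from then on:

    def find_first_bad(candidates, is_bad):
        """Return the first element for which is_bad() is True, assuming a
        single good-to-bad transition somewhere in the sequence."""
        if not candidates:
            return None
        lo, hi = 0, len(candidates) - 1
        if not is_bad(candidates[hi]):
            return None                  # nothing in the range is bad
        while lo < hi:
            mid = (lo + hi) // 2
            if is_bad(candidates[mid]):
                hi = mid                 # first bad element is at mid or earlier
            else:
                lo = mid + 1             # first bad element is after mid
        return candidates[lo]

    # Example: which commit (or config change, or input size) broke things?
    # commits = get_commits_between("v1.2", "HEAD")        # hypothetical helper
    # culprit = find_first_bad(commits, build_and_test_fails)

Each check halves the search space, which is why it makes huge problems feel small.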
Likewise. I don't always do this, but for problems that cost me much time or effort, I like to try to make sure that, if I wanted to recreate the bug or problem from scratch, I'd know exactly how to write it.
Writing and understanding working correct software is, it turns out, a rather different skill from that of writing and understanding broken (or confusing) software. I'd also wager good money that the latter skill directly builds the former.
It's amazing that a lot of new developers don't know how to use debuggers at all! I'm not even talking about using command-line gdb, but just the basic "Set Breakpoint" feature and attaching to processes.
Your supply chain superficially has fewer entries, but it is not smaller. The way you're counting the number of suppliers is heterogeneous: ChatGPT has a bigger surface area than 20 individuals.
Your supply chain is smaller in the sense that every person or organization you obtain code from is similar to a vendor, just an unpaid one. They are a separate entity your business depends on.
If we replace code written by 20 of those organizations with code written by ChatGPT, we've gone from 20 code "vendors" we don't know much about who have no formal agreement to speak of with us, to 1 tools vendor which we can even make an enterprise agreement with and all that jazz.
So whatever else the outcome may be, from this perspective it reduces uncertainty to quit using random npm packages and generate your utility classes with ChatGPT. I think this is the conclusion many businesses may reach.
What enterprise agreement would you make with OpenAI that would make you feel better about the supply chain? Seems to me you just get stochastic output that may or may not be battle-tested code, without guarantees either way.
Simple means not complex, which means not composed of even simpler parts. A twelve-step plan is literally a list of simpler parts, which makes it not simple. Most things are complex, since there are so many ways to combine simples. It is an ironic title.
I disagree. Most things are complex, yet most things are also simple.
Don't forget that words are overloaded so they only mean things in context. Words are both simple and complex because of this.
As an example: the rules of Conway's Game of Life are simple. The outcomes are complex. The rules cannot even be decomposed further, making them first principles of that universe.
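To make that concrete, the complete rule set fits in a few lines of Python (a standard set-based sketch, not tied to any particular implementation):

    from collections import Counter

    def life_step(live_cells):
        """One generation of Conway's Game of Life; live_cells is a set of (x, y)."""
        neighbour_counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live_cells
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {
            cell
            for cell, n in neighbour_counts.items()
            # birth with exactly 3 live neighbours, survival with 2 or 3
            if n == 3 or (n == 2 and cell in live_cells)
        }

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}  # a pattern that travels forever
    print(life_step(glider))

A handful of rules, yet iterating them produces gliders, oscillators, and even Turing-complete machinery.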
> means not composed of even simpler parts
These would really be "first principles", which are a form of simplicity: being the simplest something can be. Yet that sentence itself conveys that "simple" depends on context and sits on a continuum.
This relationship of "simple yet complex" is quite common. We could say the same thing about chaotic systems like the double pendulum. It is both simple and complex.
I think what they're getting at is that they sometimes use composition of functions in places where other people might call the underlying functions one after another in a single procedure and keep intermediate results.
At the end of the day, you're all describing different ways of keeping track of the intermediate results. Composition just has you drop the intermediate results when they're no longer relevant, and you can decompose if you want the intermediates back.
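A trivial illustration of the two styles, with made-up helpers (normalize and tokenize stand in for whatever the real steps are):

    def normalize(text):
        return text.strip().lower()

    def tokenize(text):
        return text.split()

    # Composition: intermediates are consumed immediately and never named.
    def pipeline(text):
        return tokenize(normalize(text))

    # Decomposed: same steps, but the intermediates are named and inspectable,
    # which is what you want when tracing how a wrong result was produced.
    def pipeline_debuggable(text):
        normalized = normalize(text)
        tokens = tokenize(normalized)
        return tokens

Both compute the same thing; the only difference is whether the intermediate values have names you can look at.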