Hacker News | g-mork's comments

drop capital gains tax on EU stonks and watch the flight. there are a million ways they could make this attractive

If you are rich enough, it isn’t rocket science to avoid capital gains taxes in the EU. And by rich enough, I mean just a few hundred K. (See the related FIRE Reddit boards.)

> drop capital gains tax on EU stonks

Right, because it's not like France already has a large primary deficit or anything.


*confused sounds in voiced glottal fricative* What capital gains tax?

In general those things work pretty badly. Someone will just set up an EU company that owns US shares to do a tax arbitrage.

the BRICS Unit was proposed to resolve the "Triffin dilemma", which is that the structural deficit necessary to maintain reserve status inevitably erodes trust in the currency over time. China's economy also relies on exports which would not be helped by reserve status. which is to say an eventual replacement may not be the yuan or any other single currency

Or the Bancor[1], like how Bretton Woods was originally envisioned.

1. https://en.wikipedia.org/wiki/Bancor


dont forget the plethora of middleman chat services with liberal logging policies. i've no doubt there is a whole subindustry lurking in here

i wasn't judging, i was asking how it works. why would openai/anthropic/google let a competitor scrape their results in sufficient amounts that it lets them train their own thing?

I think the point is that they can't really stop it. Let's say that I purchase API credits, and then resell them to DeepSeek.

That's going to be pretty hard for OpenAI to detect, and even if they do figure it out and stop me, there will be thousands of other companies willing to do that arbitrage. (Just for the record, I'm not doing this, but I'm sure people are.)

They would need to be very restrictive about who is and isn't allowed to use the API, and that would kill their growth, because customers would just go to Google or another provider that is less restrictive.
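To make the mechanism concrete, here's a minimal sketch of what such a reselling proxy does (all names and the header layout here are made up for illustration, not any real provider's API): it swaps the client's key for the reseller's upstream key, and incidentally gets to keep a copy of all the traffic passing through.

```python
def proxy_request(client_headers: dict, body: bytes, upstream_key: str, log: list):
    """Forward a client's API request under the reseller's own credentials."""
    headers = dict(client_headers)
    # bill the call to the reseller's account, not the client's
    headers["Authorization"] = f"Bearer {upstream_key}"
    # "liberal logging": the middleman sees every prompt that passes through
    log.append(body)
    return headers, body
```

From the upstream provider's side this traffic looks like one legitimate, heavy customer, which is why it's hard to distinguish from ordinary growth.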


Yeah but are we all just speculating or is it accepted knowledge that this is actually happening?

Speculation, I think, because, for one, those supposed proxy providers would have to offer some kind of pricing advantage over the original provider. Maybe I missed them, but where are the X0%-cheaper SOTA model proxies?

Number two, I'm not sure random samples collected across even a moderately large number of users make a great base of training examples for distillation. I would expect they need more focused samples over very specific areas to achieve good results.


Thanks. In that case my conclusion is that all the people saying that these models are "distilling SOTA models" are, by extension, also speculating. How can you distill what you don't have?

The only way I can think of is paying for synthesizing training data using SOTA models yourself. But yeah, I'm not aware of anyone publicly sharing that they did, so it's also speculation.

The economics probably work out, though; collecting, cleaning, and preparing original datasets is very cumbersome.

What we do know for sure is that the SOTA providers are distilling their own models, I remember reading about this at least for Gemini (Flash is distilled) and Meta.
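For reference, distillation in the classic (Hinton-style) sense trains a student model to match the teacher's temperature-softened output distribution rather than hard labels. A toy sketch of the loss term, with made-up logits purely for illustration:

```python
import math

def softmax(logits, T=1.0):
    # temperature-softened softmax; higher T flattens the distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes
    z = [x / T for x in logits]
    m = max(z)
    e = [math.exp(x - m) for x in z]
    s = sum(e)
    return [x / s for x in e]

def distill_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) over the softened distributions -- the term
    # a student minimizes in classic knowledge distillation
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

This assumes access to the teacher's logits, which is exactly what an outside lab scraping API text output doesn't have; distilling from sampled text alone is a much weaker signal.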


OpenAI implemented ID verification for their API at some point and I think they stated that this was the reason.

Sure fooled me. I follow his Twitter account and there isn't much he hasn't gotten building with it at this point. UX comes later. Amazing that it's the random work of one person.

The author wrote WebKit’s allocator and worked on JavaScriptCore for over a decade. I really enjoyed his posts on the WebKit blog over the years like this one on the concurrent garbage collector (2017) https://webkit.org/blog/7122/introducing-riptide-webkits-ret...

I don’t think it’s so much fil-c itself; from the looks of the diff it’s essentially a new platform. That generally requires porting existing software, as you can see from the posted diff.

We still do that, it's just that realtime code review basically becomes the default mode. That's not to say it isn't obvious there will be a lot fewer of us in future. I vibed about 80% of a SaaS at the weekend, with a very novel piece of hand-written code at the centre of it; I just didn't want to bother with the rest. I think that ratio is about on target for now. If the models continue to improve (although that seems relatively unlikely with current architectures and input data sets), I expect that could easily keep climbing.

I just cutpasted a technical spec I wrote 22 years ago, and spent months on, for a language I never got around to building out; Opus zero-shotted a parser, complete with tests and examples, in 3 minutes. I cutpasted the parser into a new session and asked it to write concept documentation and a language reference, and it did. The best part is that after asking it to produce uses of the language, it's clear the aesthetics are total garbage in practice.

Told friends for years long in advance that we were coal miners, and I'll tell you the same thing. Embrace it and adapt


It's probably not, if you sit on Twitter at all you'll see posts about Pentagon pizzas almost once a week. Maybe it does measure something about the Pentagon, but I doubt it's anything close to as precise a measure as folk like to believe.


Just for sheer geekery's sake probably the ISDN talk.

For OMG eye opening factor the FreeBSD jails talk (how the hell is this thing still so buggy?) and the talk on unencrypted satellite links

For excellent follow-along value and dedication to ridiculously pointless cause the Freebox talk. "Technically I don't own this box so instead of risking damaging it I'm going to take the extremely long and entertaining route around, somehow involving Doom WAD files"

For showmanship probably the Tegra talk


> For OMG eye opening factor the FreeBSD jails talk (how the hell is this thing still so buggy?)

Because everything that complex is going to be that buggy.

Even as the bugs they find get fixed, a roughly constant number of them remains.


Linus's law says 'given enough eyeballs, all bugs are shallow', but compared to Linux, there are not many eyes looking at FreeBSD.


Linus has said a lot of stuff over the years and not all of it was on the money. Still, he did a lot of good and I'm very grateful for it, Linux has been my daily driver for almost two decades now (basically from when I stopped using SGI because there was no point any more).

But bugs in large codebases will always be a thing, and even though the eyes looking at FreeBSD are very, very good eyes, indeed there are not enough of them. The more interesting thing here is that they picked a really hard target. If they had done the same with Linux I would expect the number of bugs to be quite a bit higher.


That "many eyes" theory has failed us many times. For example, OpenSSL's heartbleed or the recent React RCEs.


”Most bugs are shallow” is more like it. One could also argue about the number of eyes actually looking at certain parts.


Some kind of cargo plugin that transforms all references in the project into pointers and casts prior to feeding to rustc would probably be the best practice and highly maintainable route I'd go. like "cargo expand" but with a fancy catchier name that encourages new users to rely on it. "cargo autofix" might work
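Purely hypothetical, since no such cargo plugin exists, but the source-to-source rewrite the comment imagines would turn the first form below into something like the second before handing off to rustc:

```rust
// Safe form: an ordinary mutable reference.
fn double_safe(x: &mut i32) {
    *x *= 2;
}

// "Expanded" form the imagined plugin might emit: the reference becomes a
// raw pointer plus an unsafe dereference, same behavior, borrow checker gone.
fn double_raw(x: *mut i32) {
    unsafe { *x *= 2 }
}

fn main() {
    let mut a = 21;
    double_safe(&mut a);
    let mut b = 21;
    double_raw(&mut b as *mut i32);
    println!("{} {}", a, b);
}
```

The joke, of course, is that this throws away exactly the guarantees people use Rust for.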


Some of the more "celebrityish" talks tend to be popular by reputation, but content is often reused a lot, e.g. "10 years of Dieselgate" kind of falls into that. Watched the original, and the followup, and I think also the followup-followup, eventually it's worth checking out new topics instead, even though the presenters could not be faulted in any way.

All of these looked good to me this year: https://halfnarp.events.ccc.de/#e72b9560a7c729d1b38c93ef18a5...


do you really suppose replicating the technical requirements of a security-sensitive company of this size in-house would be so easy? I've been doing infrastructure for 25 years and wouldn't want anywhere near this project. but what you will no doubt find is a pool of overconfident volunteers creating exactly the kind of risk outsourcing the problem allowed them to avoid in the first place


The way I understand it today is: when I board an Airbus I enter a hybrid of a mechanical and a digital machine. I understand there is a lot of complex and sensitive software embedded/hosted on that plane that hopefully is not gonna kill me.

So computers are actually core to their business. They probably almost invented things like PLM too.

Nothing Airbus does is easy; that is why there are only about two companies like it in the world. So I do not see why their hosting has to be outsourced...

