The scope and complexity of the change is crucial here. There is a sweet spot where a spec-driven agent offers great value: the change is small and simple enough that you can describe it reliably with a spec (i.e. you have a complete understanding of how to do it), while implementing it yourself is a lot more work than writing said spec plus reviewing and correcting the agent.
For me currently this sweet spot is TINY. It's so small that my usage of Claude Code has dropped to almost none. It's simply more practical to let myself have the agency and drive the development, while letting AI jump in and assist when needed.
Comes to Germany, does not like it, takes a picture giving the middle finger to the Bundestag. This smells like the low-effort ragebait content I come to HN in order to avoid!
I was initially dismissive of the photo, but it is part of a series, titled "Study of Perspective", that Ai Weiwei has been adding to since 1995. He has been giving the finger to many symbols of imperialism and authoritarianism around the world for 30 years.
If one picture is unimaginative and vulgar, I don't see why doing it for 30 years somehow makes it "art". It makes me think he's just angry and rude, just as I'd think of someone who goes around flipping people off for 30 years. From a post-modern perspective that's "speaking truth to power" or something, but that's just sophistry.
While I chuckle at the fact that Ai Weiwei chose to dump on Germany while enjoying unbothered refuge here, this is the sentence I disliked the most.
It reeks of over-thinking and philosophical elitism.
As someone who was raised by parents born in Communist Romania: talk to anyone born under an authoritarian regime and they’ll tell you what the absence of freedom feels like.
When it comes to assessing freedom, I’d stick closer to the German-Romanian Nobel laureate in literature, Herta Müller.
Coincidentally, just like Ai Weiwei, she’s been living here in Berlin and seems not to feel particularly unfree.
When I was younger I had a few different sources for finding music - a couple of friends who were really into music; I knew they were investing time in searching for it, so I always wanted to hear what they recommended, even if it didn’t match my taste. There was also a curated website and a forum dedicated to alternative genres, like hardcore or post-rock and other “edgy” stuff, where I liked to hang out. I knew this was where people really passionate about music gathered, and it was interesting to see what they liked and what they recommended. It was always driven by community, by people I liked or loved, or whose judgement I trusted.
Needless to say you get none of that with algorithms. Spotify does recommend some good songs for me regularly and I often add them to “liked” but it’s much lonelier now. Music used to connect me with other people and now it’s just me and my Spotify.
Pretty much listened to what "my crowd" in college listened to. It branched out in various other directions over time--some by organic discovery via music festivals and the like, some via friends. Mostly I don't concern myself too much with "discovery" these days.
Mixcloud has been great for this for me. So many people post their mixes and radio shows there that there is always something new to explore, and searching for something slightly off the beaten path that I know I like leads me to people using it in a mix, so I know we're at least partly on the same wavelength when I start to listen.
And then eventually you end up with a list of mixtape makers/DJs/radio show hosts you trust, which is cool; it really feels like a world radio show at times.
How do you make money by developing a web browser? You build this immensely complex piece of software and then have no choice but to distribute it for free. It seems like with the current browser landscape the only viable business model for companies building browsers is to make your money elsewhere while investing some of it into the browser development.
Eh, more the wider cultural effects of them. Vibe coding, everyone now creating kitchen sink apps that do everything under the sun, k8s everywhere, agents everywhere. It feels like a big part of the industry has lost all focus and is sprinting towards some endless vague panacea instead of focusing on solving specific well defined business problems.
It’s always been a bit like this but it feels particularly bad since AI hit mainstream coding tooling. Just my 2c :)
I think we need to come to terms with the reality of coding challenges in the interview process. I know I hate them personally, and I dread having to interview again because I'll need to open LeetCode and remember how to do stupid shit like DFS on a graph or manipulating linked lists. At the same time, a SWE position is opening soon in our company and we'll have to somehow filter people, and the job market is such that we'll get MANY applicants, most of them probably wrong for the job. I will probably end up giving them coding challenges (not necessarily LeetCode-style, but some coding challenge for sure), because I need a way to gauge their problem-solving and coding skills. I don't know a better way to do it in the condensed time frame of a 1-hour Zoom call.
The stories of geniuses suffering from depression and other mental illnesses sure make remarkably interesting reads. It’s a pity he didn’t get psychiatric help; with it, this could have been a boring story of an aging scientist taking care of his plants.
I’m sorry if this is a stupid question, but I want to ask it because I see the same sentiment across HN and other forums and I’m legitimately confused.
If we don’t hijack privacy in messaging, how do we fight crime happening on a message platform? If government doesn’t have access to message contents, what’s stopping criminals from using the platform and never getting tracked down? Or proven guilty, since all the proof is safely encrypted? Aren’t we hurting ourselves by being so obsessed with privacy? Again, I apologize for my ignorance and am curious.
The criminals absolutely will move away to something that is outside government control. Many such apps already exist and you can run what you like on Android phones with custom ROMs.
Think about it - if you're a criminal, and you know about chat control, why would you risk your chats being leaked at all? Why wouldn't you use a different app that you know to be more secure? (This already happens for any serious crime, btw.)
It's precisely the law-abiding people whose privacy will be invaded, for no gain.
Criminals don't care about what is "legal"; that's what makes them criminals in the first place. And in any case, what are you going to do? Ban encryption entirely?
Perhaps we also should ban mathematics and books while we are at it?
After all, criminals can read chemistry books and learn how to make explosives.
Apparently the cartels in Mexico have discovered a way to mix crystal meth with gasoline, put it in a car's tank, drive it across the border, then separate it from the gas for distribution. The process is supposedly currently unknown to science (I’m no chemist so can’t be sure).
> If we don’t hijack privacy in messaging, how do we fight crime happening on a message platform?
Compare it to: if we don't put cameras and microphones in everybody's houses, how do we fight domestic violence?
You can't control everything, and you shouldn't want to. Giving a certain small group control over a much larger group is not a good idea, because you can never know that that small group will handle their power responsibly.
And domestic violence and crime happening on messaging platforms can still be dealt with in the traditional way: through our court system. And that happens and it works and it is fair (at least in essence, not counting corruption).
Drug manufacturers, like weed farmers (in countries other than the US, for one)? The only reason I am able to use weed as an example is because it pretty much has been normalized, but what about other drugs, say, psychedelics, ketamine, opiates? I would rather not get into the War on Drugs here, anyways.
I'm seeing more value in your comment than, apparently, the other people who have answered it here.
I think you are 100% right that above-the-law communication is not good for society. This should be obvious. At the same time, allowing government to be able to spy everywhere is also not good for society. Also obvious. The correct solution is therefore something in the middle.
I'm not convinced by the other arguments here, which usually contain a hint of slippery-slope, what-aboutism or false-dichotomy fallacies.
The proposed legislation is terrible: it is not balanced and does not contain any safeguard to avoid abuse.
However, that does not mean that the equally terrible situation of giving criminals an easy way to avoid justice is a good solution either.
Personally, I think that a good system would be a distributed one where several independent justice organizations share the set of keys needed to decrypt (for example, a message can be decrypted only if Amnesty International, Interpol and the Austrian Justice Department put their 3 keys together, each individual key being useless on its own). In this model, abuses are almost impossible while obvious crime can still be investigated. I don't know any argument that really works to say that this model is not always better than free-for-all anonymised messaging.
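A minimal sketch of that multi-key idea, using plain n-of-n XOR secret splitting (the organisation names are just the placeholders from the comment above; a real deployment would presumably use proper threshold cryptography rather than this toy):

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """n-of-n splitting: XOR-ing all n shares reconstructs the secret."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def combine_shares(shares: list[bytes]) -> bytes:
    """Any missing share leaves the result indistinguishable from random."""
    return reduce(xor_bytes, shares)

# Hypothetical illustration: a per-message key split across three bodies.
message_key = os.urandom(32)
amnesty_share, interpol_share, austria_share = split_secret(message_key, 3)
assert combine_shares([amnesty_share, interpol_share, austria_share]) == message_key
```

The point of the sketch is only that no single share is useful on its own; decryption requires every party to cooperate.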
> I think you are 100% right that above-the-law communication is not good for society. This should be obvious. At the same time, allowing government to be able to spy everywhere is also not good for society. Also obvious. The correct solution is therefore something in the middle.
That conclusion is logically wrong and does not follow from the premises. You are decrying fallacies in other arguments while making them yourself.
> The proposed legislation is terrible: it is not balanced and does not contain any safeguard to avoid abuse.
Then the logical course of action is to try to stop this legislation. Everything else about the argument is irrelevant right now; what matters is the immediate, very bad thing in front of us that we can do something about. Eliminate that and then we can have a reasoned discussion about what the proper approach is.
> several independent justice organizations share the set of keys needed to decrypt
There’s no such thing as keys that only good guys have access to. It has been shown time and again that someone with access will abuse it or be tricked.
I won’t go long on this point, however, as I was not familiar with your specific example. I’ll read up more on it. But again, that’s a conversation that matters later.
> That conclusion is logically wrong and does not follow from the premises.
I am not saying "the good argument is moderation" (argument to moderation fallacy), I'm saying "the two extremes are obviously wrong, and, it turns out, the middle is smarter".
The argument to moderation fallacy is when you are saying that the good answer is good because it is in the middle. I don't do that, I find the good answer and I just state that it happens to be in the middle.
I guess you are also committing a fallacy: "every solution in the middle is wrong because it can only be the result of an argument to moderation". That is obviously incorrect; there are plenty of solutions that happen to be good and also happen to be in the middle.
> Then the logical course of action is to try to stop this legislation.
First, I'm not saying that we should not stop this legislation. I'm just answering a comment that asks "and then what", which is a discussion that we are, I hope, allowed to have.
But secondly, a very good way to stop this legislation is by proposing something that checks all the boxes used to justify this legislation while having far better safeguards.
What is your strategy? To tell people who are worried "yeah, well, too bad for you"? Or to say "oh, I understand your point, why not this solution, which does what you want and also avoids what I'm afraid of"?
Of course, we both know that one reason this legislation exists is because governments want to spy on us. But if we propose something that satisfies all their justifications, they will have to either drop the pretence and openly admit that they want to spy (and lose the support of people who are worried), or accept the solution where they cannot spy.
> There’s no such thing as keys that only good guys have access to. It has been shown time and again that someone with access will abuse it or be tricked.
This argument is a footgun: if indeed you cannot trust anyone, then EVERY online communication is already compromised. Your phone is full of spyware, even when you choose the most trustworthy one (because your point is that trustworthy ones don't exist), your software and servers are full of back doors, your internet provider and all your VPNs are recording your communication, and even if you manage to get through all that, your interlocutor will not (and your interlocutor themselves is not a good guy).
But then, I'm not saying everyone is a good guy, I'm saying that if we share DIFFERENT keys, each key being different and necessary to decrypt (think of a door with several different locks, each needing a different key), the probability that ALL THE GUYS are bad guys is exponentially low.
If the probability of them being a bad guy or being tricked is 10%, then the probability that a 2-key system is failing is 1%, the probability that a 3-key system is failing is 0.1%, the probability that a 4-key system is failing is 0.01%, ...
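For what it's worth, here is the same arithmetic spelled out, under the stated assumption that each key holder is compromised (or tricked) independently with probability p; the whole scheme only fails when all of them are:

```python
# Failure of an n-key scheme requires every independent holder to be compromised.
p = 0.10                         # assumed per-holder compromise probability
for n in (1, 2, 3, 4):
    print(n, f"{p ** n:.2%}")    # 10.00%, 1.00%, 0.10%, 0.01%
```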
> But again, that’s a conversation that matters later.
That's a fallacy. YOU are spending your time answering my comment instead of working to stop this legislation. When I check your account, I can see that you are also posting comments on "Getting 50% (SoTA) on Arc-AGI with GPT-4o" or "Show HN: Paste2Download – No Login, No Ads, Downlo..." instead of stopping this legislation.
Then, suddenly, when some people are having a deeper discussion that can help pull the rug out from under the bad guys' feet, you are, incorrectly, arguing that the best strategy would be to not propose any alternative and antagonize the innocent people that are being fooled by the bad guys.
Also, these reflections on alternative approaches have existed for a while (Chaum's idea is almost 10 years old). Bad legislation on the subject reappears regularly. It is time we made progress instead of pretending that we never have time for a deeper reflection, which is obviously not true.
> The argument to moderation fallacy is when you are saying that the good answer is good because it is in the middle. I don't do that, I find the good answer and I just state that it happens to be in the middle.
If that’s your position you should remove the word “therefore” from your final sentence. Because that word means that the conclusion was drawn from the previous statements.
> if indeed you cannot trust anyone
That’s not what I said. Though admittedly my argument was too compressed and assumed the reader would understand I’m referring to the often used “good guy law enforcement” arguments. My fault for not having been clearer, I went for brevity.
> That's a fallacy. YOU are spending your time answering my comment instead of working to stop this legislation.
I don’t see how that’s a fallacy. Which one is it? You could maybe call it hypocritical or inconsistent, but none of those are fallacies. Furthermore, the point—which feels ridiculous that it needs to be spelled out—is not that you need to be fighting the legislation 24/7, but that when discussing it you should strive to focus on what it is, not what it could or should be.
> you are, incorrectly, arguing that the best strategy would be to not propose any alternative
Again, that is not what I said. Though in this instance you seem to be taking a bad faith position. I have said twice that it’s worth having the conversation. I even said I’d read up more on your example because I wasn’t familiar with it. You’re misconstruing my argument in a way that feels really dishonest.
> and antagonize the innocent people that are being fooled by the bad guys.
Especially here. This part is just plain absurd and an attack with zero basis in reality.
> If that’s your position you should remove the word “therefore” from your final sentence. Because that word means that the conclusion was drawn from the previous statements.
What if I had said "the solution on the left is obviously bad, the solution in the middle is obviously bad, therefore the solution is on the right"? That would obviously not be an argument to moderation fallacy, and yet the logic behind the word "therefore" stays the same. So the "therefore" does not imply "it is good because it is the middle"; it does not imply I've chosen the middle simply because it is the middle.
The "therefore" simply means that if I've explored different options and they are bad, it would be clever to consider another one. It does not mean that the middle solution is chosen _solely_ because it is the middle one.
> I’m referring to the often used “good guy law enforcement” arguments
The solution I'm proposing is not to give keys to law enforcement.
> I don’t see how that’s a fallacy. Which one is it?
A fallacy is incorrect reasoning in an argument that looks correct superficially. That's what you have done here: there is no logical ground linking your counter-argument to my argument; nothing in your counter-argument implies my argument is incorrect. I'm not going to play fallacy golf; it's usually a sign of losing the forest for the trees.
> Furthermore, the point—which feels ridiculous that it needs to be spelled out—is not that you need to be fighting the legislation 24/7, but that when discussing it you should strive to focus on what it is, not what it could or should be.
That's a terrible strategy. It's basically: "I don't understand the context, I don't know what this bad legislation tries to solve, I don't know what the people who push for this legislation want, I don't understand how the bad aspects of this legislation appeared or how to remove them".
Again, I'm proposing a solution that is difficult to say no to for honest people who were tricked into thinking the bad legislation was the only way. You propose nothing; you just say "no" and antagonize your interlocutors. Who do you think is more effective at potentially making this legislation fail?
> I have said twice that it’s worth having the conversation
Exactly, and this discussion is happening now, and yet you are saying "it's not the time to have it". That is incorrect; there is absolutely no reason not to have this discussion now, and it is very, very useful for fighting this legislation.
> Especially here. This part is just plain absurd and an attack with zero basis in reality.
You realise that in this discussion, all you have done is to attack SOMEONE FROM YOUR SIDE, with the argument that they should not use their brain and try to find a solution.
Let's also note that during this discussion, you haven't talked at all about what this legislation is; what we are arguing about now is basically what the best strategy to take it down would be. Your answer to that seems to be "the best strategy is to not discuss strategy, because we can only discuss what this legislation is", which in itself does not make sense.
You want to talk about what this legislation is? Take a page from your own book, stop arguing with me, and let the people who want to think about the situation and design clever ideas for a win-win outcome do what they want.
I’ll be honest, I didn’t read most of that last message yet. No disrespect meant, I’m just tired and don’t think continuing will be a healthy use of time. For either of us.
> You realise that in this discussion, all you have done is to attack SOMEONE FROM YOUR SIDE, with the argument that they should not use their brain and try to find a solution.
I did read this part, as the all caps caught my attention. I did not attack you. Disagreeing with parts of your argument in no way reflects on you. Still, my words have seemingly affected you negatively and for that I apologise as it was not my intention. I wish you a genuinely pleasant week.
> I think you are 100% right that above-the-law communication is not good for society. This should be obvious. At the same time, allowing government to be able to spy everywhere is also not good for society. Also obvious. The correct solution is therefore something in the middle.
The correct solution should be something in the middle. Old-fashioned wiretapping, with a warrant and the need to dedicate staff to installing and monitoring the tap is basically okay. The problem is that the mathematics of cryptography and the scaling inherent to information technology mean that only all-or-nothing solutions are possible. If the cryptography is intentionally broken, it's broken not just for law enforcement, it's broken for everyone. If law enforcement has a backdoor they can use with a warrant, they're capable of using it without a warrant, and probably will. And if their special keys get leaked, then again, the encryption is broken for everyone.
Like you point out, secret sharing is one way of getting around this in principle. But governments would never make their access dependent on an NGO; in practice I'm sure they'd only agree to secret sharing schemes where the separate parties were separated only by nominal bureaucratic firewalls, and then you're back to the original problem.
Right now, there are encrypted communication solutions that exist. None of them have been built by a government. We can build Chaum's network and start to use it. If we do that, governments will have to either accept this network as legal to use, or admit that they don't care about kids and crime but just want to spy.
And sure, some governments will admit that, but it's not by chance that right now the bad legislation is justified by "for the children" instead of "because we want to spy on you": admitting that would make the legislation obviously harmful, and it would be stopped even more easily.
There are some reasons why a network like the one proposed by Chaum is not used. One is that it's not easy to put in place, similar to how difficult it was to build an ethical journalism network, for example. A second is that some people don't want such a network, either because they want to do illegal things or because they want to spy on citizens. But another reason is the childish mentality of being instantly against any idea that does not fit into the "100% anonymity" of the Silicon Valley techno-libertarian (not saying that everyone who is against it is like that, but some who are against it are indeed like that).
Replace "chat" with "speak", and suddenly it is trivially obvious to everyone how horrific this is:
> If we don’t hijack privacy in speech, how do we fight crime happening in private conversations? If [the] government doesn’t have access to what you say at home, what’s stopping criminals from using their homes and never getting tracked down? Or proven guilty, since all the proof was said behind closed doors? Aren’t we hurting ourselves by being so obsessed with privacy?
Should we be obsessed with privacy, or should we let the government put microphones in every house just in case there are paedophiles talking about their sex acts and hence getting away with it?
Similar arguments can be made by substituting other things that were traditionally considered the domain of only authoritarian dictatorships, such as opening all letters and reading them before they're delivered by the postal service, or keeping tabs on what books you borrow from the library.
I grew up in one of those countries, and I can tell you that it's not at all nice that they tracked what you photocopied, you know, just in case you wanted to print out anti-party ("one" party!) propaganda... I mean... something... something distributing child porn. Yeah, that's it. That's the reason.
Black-and-white printers can't implement the yellow dots, at least, and even with color printers it is not literally every one of them.
Adding tracking information, while it shouldn't be happening, is also many steps away from e.g. the printer analyzing everything you print and reporting to the government if it is something unapproved.
Think of terrorists. Suppose your country has banned or restricted selling guns and explosives. Do you think this will stop terrorists? No, it'll only stop normal citizens from doing any of this stuff... and they are not really affected, since they are not terrorists anyway.
Terrorists, on the other hand, will find illegal ways to get around the restrictions. It's the same with encryption - (open-source) tools with e2ee are already broadly available, sideloading is available - nothing will stop terrorists or other criminals from just installing those and continuing to do what they want.
On the other hand this chat control opens a huge area of opportunities for govts to spy on citizens or maybe journalists/other politicians that they dislike. And this is if we assume the system doesn't have bugs that would allow third parties/hackers to break it and get all the info by themselves, or bugs that can trigger a false-positive event.
That's why it's a bad idea. Criminals will find ways around the limits, governments will get new tools for suppression (even if the current government is 'good', what if the next one is ultra-conservative or radically nationalist - do you think they won't use these new tools?), and normal people are basically left without any privacy.
You shouldn't fight crime with other crime; that makes you no better than the other criminals, doesn't solve the problem of crime but only increases it, and doesn't make society safer. If you want crime to stop, you should at least set a good example and deal with it in a responsible manner, not deal with it by being a criminal too. Otherwise you just divide society even more. It doesn't help anybody. You can't expect other people to cease their crime if you're committing it yourself. A better society starts with being responsible.
Crime should not be approved of, and "crime fighting crime" shouldn't magically get an exception.
"But crime X is way worse than crime Y fighting it".
Crime should certainly be punished, but you cannot punish someone before you can prove their wrongdoing. And we have a court system for that.
Punishing people before they're proven guilty of wrongdoing is criminal in itself. You shouldn't give a certain group of people allowance to put prison collars on others who haven't committed any crime. At least not if you want to live in a free society.
> Aren’t we hurting ourselves by being so obsessed with privacy?
You’re hurting yourself more by being too lax with it. Remember that a crime is whatever the law says it is. So if an authoritarian government makes it so criticising them becomes a crime and has access to all your communication, good luck ever breaking that cycle. You can use other examples, like making homosexuality illegal.
These are real examples that real governments (or people with a good chance of being elected) want.
Remember that the ultimate goal of these laws is never to “protect the children”—that’s just the convenient given reason, because how could you be against that—but to exert more control over the populace and cement the position of those in power. Even if the current government employs the technology only for good—highly unlikely—you don’t know about the next one.
How do we fight crime happening in houses? If government agents don't have access to house contents, what's stopping criminals from using locks and curtains in their homes to hide their illegal activities and never getting tracked down?
If a message platform is gov controlled, criminals will use other alternatives that are not monitored, which they probably already do.
What you end up with is the government having all the information about private citizens, which can be used against us.
For example, the government of Spain constantly shares the private data of citizens close to its political adversaries, for political reasons. Also, I don't want people knowing what I share with my girlfriend, regardless of whether they do anything with it publicly or not.
It’s not a stupid question. But to think that it will make people safer is doublethink. Right-wing dictatorship is on the rise across Europe and the world. No one knows the future, but it’s sure looking bleak. We are all making ourselves less secure against the future.
And with AI… oof. AI will get to a point where it takes over, and decisions like these help it get there. We are destroying our future fast.
You realize criminals can stack infinitely many layers of encryption onto any compromised (govt-controlled) channel right? So how is "Chat Control" supposed to be the solution??
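To make the layering point concrete, here is a minimal sketch (assuming the third-party `cryptography` package): the sender encrypts the message with a key shared out-of-band before handing anything to the platform, so whatever scanning the platform does only ever sees ciphertext.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Key agreed out-of-band between the two parties, never sent via the monitored app.
key = Fernet.generate_key()

# This ciphertext is all the monitored platform (and any scanner on it) ever sees.
ciphertext = Fernet(key).encrypt(b"meet at the usual place")
print(ciphertext)

# The recipient, holding the same pre-shared key, recovers the plaintext.
print(Fernet(key).decrypt(ciphertext))
```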
> So how is "Chat Control" supposed to be the solution??
Who said it is supposed to be the solution?
Almost no crime problem has "the" solution. Instead, reducing crime is almost always a matter of a variety of measures that each make the crime a little less likely.
This resonates strongly with me because I just had to nuke an abstraction invented by my colleague, which solved a small DRY issue but introduced a big change-difficulty issue.
All we had to do was some repetitive work on the values of a dictionary (stringify and lowercase). We ended up with an abstraction: a dictionary with smart value-conversion behaviours, which brought pain every time the business wanted some custom behaviour added (e.g. don't lowercase this property, make that property human-readable, etc.). Younger me would have kept piling complexity onto this abstraction. Modern me just duplicated a few `str(..).lower()` calls, removed the whole thing, and went home happy.
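Roughly what that looked like, as a sketch with invented names (the real code was more involved):

```python
# Before: a "smart" dict that converted values on access; every new business
# exception (don't lowercase this key, humanise that one...) had to be wired in here.
class SmartValueDict(dict):
    def __getitem__(self, key):
        return str(super().__getitem__(key)).lower()

# After: plain dicts, with the conversion simply repeated at the call sites that need it.
def normalized(record: dict) -> dict:
    return {key: str(value).lower() for key, value in record.items()}

print(normalized({"Region": "EU-West", "Tier": 2}))  # {'Region': 'eu-west', 'Tier': '2'}
```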
I get the benefit of offloading the admin costs onto your consumers, but, as always, the devil is in the details. I've met both cases where a library should've been a service, and vice versa. This advice is way too broad and abstract to be practical.