In general, in pre-industrial societies families built their own houses rather than "society" building them for them. Of course this was because 1) many tribal societies had no concept of land ownership, so you could just build wherever no one else was using the land, and the dwellings were often temporary for a season or two anyway; 2) in later feudal societies, where land ownership did exist, land was mostly owned by a nobleman who allowed his serfs to build their cottages on it.
I'm not sure "keeping out outsiders" is a bug. The US is experiencing what it is like to be governed by an outsider with no previous political experience and who thinks things like "laws" don't apply to him, and who thinks experts can't be trusted and puts unqualified people in charge of the military, science and health. Politicians need to develop -- they should start with a local position, and "graduate" to a national-level position before they even attempt to rule a nation.
Notably, we could do that while still abolishing first past the post. Requirements for holding a previous position could be added while simultaneously reforming the federal (and hopefully also state) systems to be compatible with multiple parties. I imagine it would be sufficient for each level to require a single term served at the previous level - city or county, state, and federal.
The downside is encouraging career politicians, but the upside is that if you can't win increasingly high stakes elections over a period of 10 years or so then you probably have no business being the president of a country this size.
I think this take highlights one of the core problems our democracy faces - winning elections and governing effectively are entirely different skill sets. These things may even be, in part, antithetical.
I merely intended it as a reasonably general proxy for relevant experience whose ruleset would be difficult to weaponize. I agree that in theory there almost certainly must be better methods than elections by which to select legislators, leaders, and other officials. However, I'm not aware of any in practice, particularly when the inevitability of bad-faith attempts to abuse the system is taken into account.
And yet it is the one part of the UK that actually has a language that is spoken by a non-trivial percentage of the population (unlike NI or Scotland, where only a tiny percentage can speak their Celtic tongue).
> And yet it is the one part of the UK that actually has a language that is spoken by a non-trivial percentage of the population
98% of the UK population can speak English, so I'm not sure where you got that idea. Clearly every part of the UK (maybe some small, uncelebrated village breaks the rule) has a language spoken by virtually the entire population of that region.
> (unlike NI or Scotland where a tiny percentage can speak their Celtic tongue)
If you are struggling to say that England is the only country in the UK that sees most of its population still speak the language of its ancestral roots, then I suppose that's true, but when English is the most commonly used natural language across the entire world I'm not sure that is much of a feat.
What does any of this have to do with the discussion at hand?
Wales/Welsh doesn't jibe with the conditions set. Perhaps you missed "non-trivial percentage"? The Outer Hebrides is a part of the UK where ~50% of residents speak Scottish Gaelic, never mind England and its English dominance, so clearly ~30% is still considered within the trivial range. Otherwise "the one part" doesn't work, since many parts of the UK fit the bill.
The thing is, they've purchased so many historic pubs that if you refuse to drink at one, that's a choice. I'm not saying it's a terrible choice, but it's a choice that bars you from an awful lot of pubs.
People should just be into slide rules, period. Particularly in the West. We are always so amazed when people in Asia using abacuses beat people with calculators, but the West had its own mechanical computing device, and like the abacus it can beat a calculator if used well.
Fine, that would at least teach them that LLMs are doing a lot more than "predicting the next word", given that they can also be shown that a Markov model can do the same in about 10 lines of simple Python, with no neural nets or any other AI/ML technology.
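To make that concrete, here is a minimal sketch of such a model: a first-order Markov chain over word pairs, trained on a toy string (the corpus and function names here are just illustrative):

    import random
    from collections import defaultdict

    # Map each word to the list of words observed to follow it.
    def train(text):
        model = defaultdict(list)
        words = text.split()
        for a, b in zip(words, words[1:]):
            model[a].append(b)
        return model

    # "Predict" the next word by sampling from the observed followers.
    def next_word(model, word):
        followers = model.get(word)
        return random.choice(followers) if followers else None

    model = train("the cat sat on the mat and the cat slept")
    print(next_word(model, "the"))  # e.g. "cat" (2/3 odds) or "mat"

No learning beyond counting, yet it "predicts the next word"; the contrast with what an LLM actually does is the whole lesson.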
Read the famous "Story of Mel" [1] about Mel Kaye, who refused to use optimizing assemblers in the late 1950s because "you never know where they are going to put things". Even in the 1980s you used to find people like that.
I don't think that does count against the narrative? The narrative is just that each time we've moved up the abstraction chain in generating code, there have been people who have been skeptical of the new level of abstraction. I would say that it's usually the case that highly skilled operators at the previous level remain more effective than the new adopters of the next level. But what ends up mattering more in the long run is that the higher level of abstraction enables a lot more people to get started and reach a basic level of capability.

This is exactly what's happening now! Lots of experienced programmers are not embracing these tools, or are, but are still more effective just writing code. But way more people can get into "vibe coding" with some basic level of success, and that opens up new possibilities.
The narrative is that non-LLM adopters will be left behind, lose their jobs, are Luddites, yadda yadda yadda because they are not moving up the abstraction layers by adopting LLMs to improve their output. There is no point in the timeframe of the story at which Mel would have benefitted from a move to a higher abstraction level by adopting the optimizing compiler because its output will always be drastically inferior to his own using his native expertise.
That's not the narrative in this thread. That's a broader narrative than the one in this thread.
And yes, as I said, the point is not that Mel would benefit, it's that each time a new higher level of abstraction comes onto the scene, it is accessible to more people than the previous level. This was the pattern with machine code to symbolic assembly, it was the pattern with assembly to compiled languages, with higher level languages, and now with "prompting".
The comment I originally replied to implied that this current new abstraction layer is totally different than all the previous ones, and all I said is that I don't think so, I think the comparison is indeed apt. Part of that pattern is that a lot of new people can adopt this new layer of abstraction, even while many people who already know how to program are likely to remain more effective without it.
Assembly was a "high level" language when it was new -- it was far more abstract than entering in raw bytes. C was considered high level later on too, even though these days it is seen as "low level" -- everything is relative to what else is out there.
There is a similar problem with genomic sequencing: when new technologies began to replace traditional Sanger sequencing about twenty years ago, they were (and still are) called "next generation sequencing" (NGS). But the field is still advancing.