I concur with you (that this is an excellent introduction)!
Imo, your suggestions are more for intermediate/advanced active listeners who need to interact with folks as part of their job (e.g. bartenders, reporters, middle managers...).
Still, I feel being repetitive (e.g. 'It sounds like XYZ...is that right?') is better than nothing. Sometimes, training wheels aren't bad when learning how to ride a bike.
Author here. Exactly: “it sounds like” etc. are training wheels. Use them while you figure out how to do the technique. And yes, when you’re learning, it can sound stilted. As you master it, you don’t need to use those exact phrases any more.
I think the problem statement is: How do you know when to Let Go of the current boulder?
The poem suggests many, many possible whens. Here's one: "unless it comes out of / your soul like a rocket,".
Unfortunately (or fortunately), in life there is no methodology for proving that a given search is futile (the way we can prove a problem is NP-complete)... so we have to take our chances and choose. I believe that's the beauty of life: choice.
This is correct. Delving into cognitive load without talking about germane load disqualifies this article: germane load is similar to extraneous load in terms of effort, but it's beneficial, because that effort is what builds the coder's ability to read the code.
The examples are good, but readers shouldn't come away thinking that all effortful code is bad (e.g. Haskell is extremely hard to read at first, yet every developer who uses it swears its very high cognitive load is intrinsic).
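To make that distinction concrete, here's a toy Python sketch (the function names and numbers are invented for illustration): both versions take effort to read, but version A's effort is extraneous, while whatever effort remains in version B is the pricing rules themselves, i.e. intrinsic/germane load.

    # Version A: extraneous load -- the reader's effort goes into decoding
    # the author's terseness, not the problem itself.
    def d(p, c):
        return p * (1 - (0.1 if c else 0)) * (0.95 if p > 100 else 1)

    # Version B: the remaining effort is intrinsic -- the pricing rules are
    # what you have to think about, and no rewrite can remove that.
    LOYALTY_DISCOUNT = 0.10
    BULK_DISCOUNT = 0.05
    BULK_THRESHOLD = 100

    def discounted_price(price: float, is_loyal_customer: bool) -> float:
        qualifies_for_bulk = price > BULK_THRESHOLD  # based on the pre-discount price
        if is_loyal_customer:
            price *= 1 - LOYALTY_DISCOUNT
        if qualifies_for_bulk:
            price *= 1 - BULK_DISCOUNT
        return price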
I read the page https://www.succeedsocially.com/morefun. Here are my initial impressions. Pros: it identifies several important pain points and gives several decent examples. Cons: being a truly fun person is all about reaction, reaction, reaction. Fun people react authentically (while censoring their ahole side because you don't want to be fun but unlikable), ridiculously (while reading the room), and intelligently (playing to the top of the crowd's intelligence).
Consider the intended audience though. This is for people who are lost and need perspective and concrete steps for improving. Compared to all the "fake it 'till you make it" or "just stop caring" type of advice, it's helpful.
> Fun people react authentically (while censoring their ahole side because you don't want to be fun but unlikable)
But here you explain exactly what is difficult. It's like walking a tightrope while someone tells you not to fall to the left and, by the way, also not to the right.
Hmmm what is a simple "hello world" project in chip design?
In computer science courses, that's as simple as a println().
In machine learning courses, that's training on the MNIST dataset to do character recognition.
In electrical engineering, that's buying a Raspberry Pi to blink an LED.
In chip design ... ChatGPT says to design a 1-bit full adder in Verilog?
...
I understand why the article thinks the market is looking for graduate education. Designing even a simple chip requires an *initial investment* (as with all hardware startups, really). This is different from software, where one can simply launch a web app with a container hosted on one's preferred cloud provider...
... That said, with LLMs lowering the barrier to entry for software even further (e.g. vibe coding), might we see more hardware startups/innovations?
FPGA dev boards are cheap nowadays, and you can start coding in a hardware description language with a simulator. The ChatGPT answer of doing a 1-bit full adder as "hello world" makes sense.
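For the curious, the behavioural spec of that "hello world" is tiny: sum = a XOR b XOR cin, and carry-out is the majority of the three inputs. Here's a rough Python sketch of a reference model plus the exhaustive check a testbench would perform (just an illustration of the idea, not actual Verilog or anyone's real flow):

    from itertools import product

    def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
        """Reference model of a 1-bit full adder: returns (sum, carry_out)."""
        total = a + b + cin
        return total & 1, (total >> 1) & 1

    # Only 2**3 input combinations, so "verification" here is just an
    # exhaustive comparison against the arithmetic definition.
    for a, b, cin in product((0, 1), repeat=3):
        s, cout = full_adder(a, b, cin)
        assert s + 2 * cout == a + b + cin, (a, b, cin)

Even at this toy scale, the checking code is about as long as the design itself, which previews the verification-heavy points made elsewhere in the thread.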
You are obviously not going to etch silicon at home, but the design part is rather accessible as far as hardware goes.
You can absolutely etch silicon at home. Processes like wet etching (KOH, HF), reactive ion etching (RIE), laser ablation, and even electron beam lithography using repurposed CRTs are all viable at the DIY scale.
They're not used in high-volume manufacturing (you’re not replacing ASML), but they’re solid for prototyping, research, and niche builds.
Just don’t underestimate the safety aspect—some of these chemicals (like HF) are genuinely nasty, and DIY high voltage setups can bite hard.
You're not hitting nanometer nodes, but for MEMS, sensors, and basic ICs, it’s totally within reach if you know what you’re doing.
It's actually pretty scary how far this guy got: 10nm. Makes you wonder if something as powerful as an Intel N100, famously a current 10nm chip, plus DDR4 memory, could be made like this. That would support quite a bit in terms of AI already, even if transformers are probably a bit much to ask.
> LLMs lowering the barrier to entry for software even further
Getting to your first wafer costs something like $250k and upwards in fab costs, depending on which process you're using. Hence much of the chip design effort is already spent on verification; it's probably over 50% by now. This is the exact opposite of vibes, because mistakes are expensive.
Business-wise, it's a pretty tough B2B sale because you're selling into other people's product development pipelines. They need to trust you, because you can sink their project at a cost way over and above that of the actual parts.
Edit: I cannot emphasise enough how much more conservative the culture is in chip design and EE more broadly. It belongs to a world not just before "vibe coding" but before "web 2.0". It's full of weird, closed-source, very expensive tooling, and is built on a graveyard of expensive mistakes. You've got to get the product 100% right on the first go.
Well, maybe the second go; production silicon is usually a "B" rev. But that's it. Economics dictate that you then need to be able to sell that run for a few years before replacing it with an upgraded product line.
The rule of thumb I use for chip design is that verification takes at least 2/3 of development. Sometimes more. 50% would be nice, but I think it's optimistic.
Verification is indeed the majority of the time spent. Unlike ordinary programs, Verilog and VHDL (and higher-level things like Chisel) aren't executed serially by the hardware they describe, the way code runs on a von Neumann machine. Hello World for a chip isn't designing the circuit, or simulating the circuit, or synthesizing the circuit to some set of physical primitives. No, it's proving that the circuit will behave correctly under a bunch of different conditions. The less commoditized the product, the more important it is to know the real PDK, the real standard cell performance, what to really trust from the foundry, etc. Most of the algorithms to assist in this process are proprietary and locked behind NDAs. The open-source tools are decades behind the commercial ones in both speed and correctness, despite heavy investment from companies like Google.
And so my point: the place where people best know how to make chips competitively in a cutthroat industry is NOT in schools, but in private companies that have signed all the NDAs. The information is literally locked away, unable to diffuse into the open where universities efficiently operate. Professors cannot teach what they don’t know or cannot legally share.
Chip design is a journeyman industry. Building fault-tolerant, fast, power-efficient, correct, debuggable, and manufacturable designs is table stakes, because if you can't, there are already a ton of chip varieties available. Don't reinvent the wheel: the intersection of logic, supply-chain logistics, circuit design, large-scale multi-objective optimization, chemistry, physics, materials science, and mathematical verification is unforgiving.
I think you would just buy a cheap FPGA board and use that, wouldn't you? No need to do a full chip until you know what you are doing. That would be like building a server farm just to run your software hello world.
> Few of Twitter's most vocal posters spend time reading contemporary poetry collections, attending readings, or tracing the evolution of forms.
I recently got hooked on contemporary (i.e. modern) poetry. I fully understand why modern poetry seems hard to understand.
I believe most people innately love simple and deep modern poems.
If you like poetry related to nature, check out Ada Limon (1)
If you like poetry related to medicine and life, check out the ACP poetry prize (2)
As a cloud security analyst who is thinking of going back to coding or DevSecOps, if I'm honest with myself there is nothing new here that I haven't seen before... (This is not a criticism or anything. If anything, the problem is me: whether I can allocate the time to learn this, or use Anki to retain it.)
I appreciated the article for emphasising memorising definitions and statements of theorems... but not proofs. For proofs, a general outline would be sufficient.
For proofs, I find it a good idea to memorise (or at least implicitly retain) the reason a result is true. So, yes, an outline, but minus any of the implementation details of the proof. I kind of think every book in the definition-theorem-proof style should really be definition-theorem-reason-proof.
The reason part being essentially a one or two line natural language summary of ‘why the proof works’ — something that is almost always possible and is enlightening and conducive to efficient memorisation, but that for some reason is very rarely written down explicitly.
I think a better word is "motivation" -- why we chose this option at this juncture instead of many other options. Yes, it's a "reason", but "reason" already means something else.
The "Reason" as result is true is that it follows from the previously established axioms via logical reasoning.
Motivation is important too, but it’s not what I meant. A very simple example would be
Theorem: Every subspace Y of a second-countable topological space X is second-countable.
Reason: Intersecting each set in a basis for X with Y yields a basis for Y.
Proof: [formal symbolic stuff involving open sets and unions, and mentioning cardinality, etc.]
(I’m not claiming ‘reason’ is the best word for this — it probably isn’t. But it’s not the same thing as motivation.)
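To spell out the 'reason' above in symbols (this is the standard construction, written from memory rather than quoted from any particular textbook):

    If $\mathcal{B}$ is a countable basis for $X$, then
    $\mathcal{B}_Y = \{\, B \cap Y : B \in \mathcal{B} \,\}$
    is a basis for the subspace topology on $Y$, and it is countable
    because $B \mapsto B \cap Y$ maps $\mathcal{B}$ onto $\mathcal{B}_Y$.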
> The "Reason" as result is true is that it follows from the previously established axioms via logical reasoning.
One could argue this is not the reason a result is true; it’s the reason we know it’s true. The fact that true statements follow from established truths by logical reasoning is more a property of the formal system (which hopefully is sound and consistent) than it is to do with the notion of truth itself.
It depends on whether you want to be able to prove new things by yourself or not. If you do, then you definitely need to understand/recall all of the whys of every section of the proof; they are all there for a reason. If you don't, you just want the intuition for why the whole theorem is true.
You should definitely memorize most of the "basic" (and short) proofs in some field you are super interested in. For the intermediate and advanced proofs, only the outline is sufficient.
> For him, sitting down for twenty minutes is a much more consistent tool for maximizing energy - even compared to sleep or coffee.
Firstly, I'm glad JS (and OP) see very positive effects from meditation, but I'm highly skeptical that meditation is more powerful than coffee (assuming sleep has diminishing returns). I doubt this generalises to most of us.
> Even within my new sessions, I can already feel a difference: after ten minutes, I reach a flow state that just feels great.
> Whereas before I’d be prone to fall into vicious spirals of self-consciousness and unease, a streak of meditation would allow me to calmly, warmly, and directly engage with people. If an awkward moment arose, I wouldn’t feed my internal fire with negative self-talk, but rather look outwards with an internal smile, wait for the moment to pass, and find a clever prosocial solution. But again, that explanation understates the magic.
Secondly, again, I'm happy OP found the answer to the biggest wall for new meditation practitioners: "how do I know if meditation is working???". I haven't found it myself. Brain-state stuff is more art than science, I believe.