I use gptel[0] with my denote[1] notes, plus a tool that can search, retrieve tags, grep, and create notes (in a specific subfolder). It's been good enough as a memory for me.
An example of why a basic understanding is helpful:
A common sentiment on HN is that LLMs generate too many comments in code.
For good reason -- comment sparsity improves code quality, due to the way causal transformers and positional encoding work. The model has learned that real, in-distribution code carries meaning in structure, naming, and control flow, not dense commentary. Fewer comments keep next-token prediction closer to the statistical shape of the code it was trained on.
Comments aren’t a free scratchpad. They inject natural-language tokens into the context window, compete for attention, and bias generation toward explanation rather than implementation, increasing drift over longer spans.
The solution to comment spam isn’t post-processing. It’s keeping generation in-distribution. Less commentary forces intent into the code itself, producing outputs that better match how code is written in the wild and keeping the model in more realistic regions of its training distribution.
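For what it's worth, "comment spam" is easy to measure. A minimal sketch using Python's stdlib tokenize module; `comment_density` is a name I made up, and token-fraction is only a rough proxy for the natural-language load comments add to the context window:

```python
import io
import tokenize

def comment_density(source: str) -> float:
    """Fraction of meaningful tokens that are comments (rough proxy)."""
    tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))
    # Drop pure-layout tokens so whitespace doesn't skew the ratio.
    layout = (tokenize.NL, tokenize.NEWLINE, tokenize.INDENT,
              tokenize.DEDENT, tokenize.ENDMARKER)
    meaningful = [t for t in tokens if t.type not in layout]
    if not meaningful:
        return 0.0
    comments = sum(1 for t in meaningful if t.type == tokenize.COMMENT)
    return comments / len(meaningful)

sparse = "def add(a, b):\n    return a + b\n"
chatty = (
    "# This function adds two numbers\n"
    "def add(a, b):\n"
    "    # Return the sum of a and b\n"
    "    return a + b\n"
)
print(comment_density(sparse))  # 0.0
print(comment_density(chatty))  # ~0.14
```

Running something like this over generated vs. human-written code in your repo is a quick way to check whether your prompts are actually pulling output back in-distribution.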
1.▲
Show HN: A songwriting DAW built entirely inside an Org-mode buffer(emacs.org)
432 points | 2 hours ago | 89 comments
2.▲
Why I'm still using Soulseek to trade prompt-engineered MIDI files
156 points | 4 hours ago | 42 comments
I admit I hastily tried clicking before realizing these were fake
I find it a surprisingly similar mindset to songwriting: a lot of local-maxima searching and spaghetti flinging. Sometimes you hit a good groove and explore it.
https://g.co/gemini/share/6aa102571d75