
What is just awesome, to a tinkerer, is the great can-do spirit people had then. You couldn't buy it, so you made it yourself. They were going to do a helicopter, but for the lack of a suitable engine. As if the rest was just a solved problem, and may well have been, given the background of one of them.

And then to build a working hot air balloon that even looks pretty cool, entirely in clandestine conditions with improvised materials. In a museum in Germany you can even see a homemade twin-engine airplane that was planned for an escape attempt (that didn't happen, maybe just as well) ( https://en.wikipedia.org/wiki/Wagner_DOWA_81 ). Just incredible technical competence everywhere, that fades when the need is gone, when absolutely everything you could want is just an Amazon order away.


I feel like hard situations force people to adapt. This was not a unique time in history; it continues today in war zones and places like Ukraine. It is only a few comfortable western countries that have the luxury of everything being a click and a day's delivery away.

Opposite experience here. My aging mom had been on Windows XP for years and years, and then someone gave her a cast-off laptop running Windows 10.

That was such a culture shock. Endless pop-ups to do this and subscribe to that and so on. And it has gotten worse since then of course.

Instead I set her up on a nice mature Linux desktop - Mate - and that was fine. Chrome, Thunderbird and not much else. And solid reliable and nobody reaching in from the cloud with the latest attempts to monetize something or push AI onto you or whatever. You turn on (unsuspend) the computer and it's the same computer it was yesterday, working just the same.


My read on this is that Whirlwind was radical because it was 16-bit parallel. Previously, anything that got a computing machine working at all would do, and bit-serial is pretty natural. This one was designed from the get-go for speed.

I think it's even more than that - it birthed SAGE and many other descendants. The wild thing is how readable and recognizable the ISA is. Someone ought to build an emulator...
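
Just to show the shape of such an emulator, here's a toy sketch in Python of a 16-bit single-address accumulator machine. The word layout (5-bit opcode, 11-bit address, 2048 words of storage) matches Whirlwind, but the mnemonics and opcode numbers below are illustrative placeholders, not the historical instruction set:

    # Toy 16-bit single-address accumulator machine, Whirlwind-flavoured.
    # Word layout: 5-bit opcode, 11-bit address. The opcode values are
    # made-up placeholders, not the real Whirlwind assignments.
    MASK16 = 0xFFFF
    CA, AD, TS, SP, CP, HLT = range(6)  # hypothetical opcode numbers

    def run(mem):
        """Fetch-decode-execute over a 2048-word memory image."""
        acc, pc = 0, 0
        while True:
            word = mem[pc]
            op, addr = word >> 11, word & 0x7FF
            pc += 1
            if op == CA:        # clear accumulator, then add memory word
                acc = mem[addr]
            elif op == AD:      # add memory word to accumulator
                acc = (acc + mem[addr]) & MASK16
            elif op == TS:      # transfer accumulator to storage
                mem[addr] = acc
            elif op == SP:      # unconditional jump
                pc = addr
            elif op == CP:      # jump if accumulator negative (sign bit set)
                if acc & 0x8000:
                    pc = addr
            elif op == HLT:     # not a real instruction; just stops the toy
                return acc

    # Tiny demo: add the words at addresses 10 and 11, store the sum at 100.
    mem = [0] * 2048
    mem[0] = (CA << 11) | 10
    mem[1] = (AD << 11) | 11
    mem[2] = (TS << 11) | 100
    mem[3] = (HLT << 11)
    mem[10], mem[11] = 3, 4
    print(run(mem))  # -> 7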

Shannon's information theory assumes optimal coding, but doesn't say what the optimal coding is.

Anyway if a single observer who lies 20% of the time gives you 4 out of 5 bits correct, but you don't know which ones...

And N such observers, where N>2, give you a very good way of getting more information (best-of-3 voting etc), to the limit, at infinite observers, of a perfect channel...

then interpolating for N=2, there is more information here than for N=1. It just needs more advanced coding to exploit.


I don't have the math for this, but a colleague does, and after spending a few minutes in Matlab he came up with about 0.278 bits/flip of channel capacity (Shannon) for the single observer, and I think around 0.451 bits/flip for the dual observers. That's the theoretical capacity with optimal coding. Whatever coding schemes need to be employed to get there, i.e. what redundancy to add to the bit stream... that's the hard part.
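
For anyone who wants to check the numbers, here's a quick back-of-the-envelope version in Python, assuming each observer independently flips the bit 20% of the time. The single-observer figure reproduces the 0.278 above; the two-observer figure computed this way comes out in the same ballpark as the one quoted:

    from math import log2

    def H(p):
        """Binary entropy in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    p = 0.2  # each observer lies 20% of the time

    # Single observer: binary symmetric channel, capacity = 1 - H(p).
    c1 = 1 - H(p)

    # Two independent observers: when they agree (prob p^2 + (1-p)^2) the
    # residual error rate is p^2 / (p^2 + (1-p)^2); when they disagree,
    # the pair says nothing about the bit.
    agree = p**2 + (1 - p)**2
    c2 = agree * (1 - H(p**2 / agree))

    print(f"single observer: {c1:.3f} bits/flip")  # ~0.278
    print(f"two observers:   {c2:.3f} bits/flip")  # ~0.46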

"get not you might it but"

"Erich Von Daemlichen" in typical schoolboy wordplay when I was a kid in Germany in the 1970s - when this nutjob stuff was still current, and frustratingly believed by otherwise sensible adults.

Please don't comment like this on HN, no matter who or what it's about. The guidelines make it clear we're trying for something better here.

https://news.ycombinator.com/newsguidelines.html


The 1541 is a computer, as defined by "can load and run a program". Enough protocol exists on the stock drive/IEC bus/software to do this. Fast load programs used this and I'm sure some copy protection schemes did.

But it's a computer in the same way that a bare-bones microcontroller with an ARM core is - say, the one in your car key fob. Sure, the CPU is capable, but it's paired with just enough ROM and RAM to do the job it needs to do. And in the 1541's case that was only 2KB of RAM.


Unix oldtimer here (first exposure: 1987). A lot of copy/pasting is at the shell prompt. Aside from being super lightweight - just select something in previous output, e.g. a file path, middle click, and done - what about the key bindings? All the world uses ^C for copy, but that already does something conflicting at the Unix shell prompt.

I have to admit that I do feel like an oldtimer though. What I do at the shell prompt, others do in VS Code, and probably 10x faster once they're good at the GUI. So maybe super-lightweight copy/paste at the shell prompt just doesn't matter that much any more.


That is also the one good thing about Windows' command line: you can right-click there to copy and paste, which is nice. The rest sucks.

I cannot stand the Windows command-line user experience. The Linux method actually has two separate buffers that allow different content to be copied and pasted at the same time.

Say I've used Ctrl+C to copy something but also need a second piece of text: I can highlight it and paste it with the middle mouse button, while the Ctrl+C content stays intact and can still be pasted with Ctrl+V.

On Windows you must destroy the Ctrl+C content and replace it with what the middle mouse would have handled, then go back to the first source to copy and paste it again.
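
To make the two-buffers point concrete, here's a little Python sketch. It assumes an X11 session with the xclip utility installed; PRIMARY is the highlight/middle-click selection, CLIPBOARD is the explicit Ctrl+C/Ctrl+V one, and they hold different content at the same time:

    import subprocess

    def set_selection(text, selection):
        """Put text into the given X selection ('primary' or 'clipboard')."""
        subprocess.run(["xclip", "-selection", selection],
                       input=text.encode(), check=True)

    def get_selection(selection):
        """Read the given X selection back."""
        out = subprocess.run(["xclip", "-selection", selection, "-o"],
                             capture_output=True, check=True)
        return out.stdout.decode()

    set_selection("copied with Ctrl+C", "clipboard")
    set_selection("highlighted for middle-click", "primary")
    print(get_selection("clipboard"))  # -> copied with Ctrl+C
    print(get_selection("primary"))    # -> highlighted for middle-click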


You want a clipboard manager/history. You are using middle-button paste as a workaround for how hard it is to find a good clipboard manager (I'm not sure one exists...)

I have and use all three on Linux. I only use Windows at work, where IT is strict.

Tangential - what do people do faster in VS Code than in the terminal?

The whole "integrated development" experience. Take it or leave it, but old farts like me go all the way back to poring over code on printouts since your only window into it was one file at a time in an 80x25 terminal - not terminal window, actual terminal or, by then, terminal emulator.

That does affect later habits like, for example, hating the information overload from syntax highlighting. And don't even get me started on auto-indent.

Whereas younger colleagues, whose habits were formed in the era of much more sophisticated tools, have dozens of files open in tabs and can follow, say, a "where is this defined" or "where is this driven" (this is RTL code, not normal software) in an instant. Keep in mind some oldtimers had really fancy emacs setups that could do that, and vi users had things like ctags.


They imagine that they're being more efficient.

I argue the opposite! Phone cameras, while hardly perfect, are easily on par with the sort of cameras people used to do street photography with, and improving constantly.

I too remember the "no photos" rules - in the pre-smartphone era. Technically you weren't even supposed to bring a camera in to the workplace (though this was mostly unenforced).

Now you can take pictures and videos of everything, willy nilly, and nobody bats an eyelash. With a camera that you always have with you, whether you anticipated taking photos that day or not.

And yeah, you can't play shallow-focus games (notwithstanding that the phone will fake shallow focus with an algorithm). And you don't get real zoom (pinch zoom doesn't count).

Oh, on the "real camera" front. Show up with a Canon SX30 ("big" camera, lots of glass in front) and people might notice. But show up with an SX210 (these are cameras I happen to have) and you can get great stealth shots with its 14x zoom but no one the wiser. It's just a small point and shoot, harmless, right? This thing is leaps and bounds more capable than a camera that size back in the pre-digital days.

I'll bet a GoPro will get a pass too.


This used to actually work, at least on some sites. The text would load first, then it would reformat as the fonts and CSS assets were loaded. Ugly and frustrating, which is probably why you now don't get content until the eye candy is all ready.

But the progressive, text-first loading was readable from the get-go, even if further downloads stalled.


