
Yep, same for me. The knife you can take anywhere without alarming people.

Nicely made and always useful.


I'd nursed a foot callus for years that hurt badly when I walked barefoot. Weeks ago, sitting on the locker room bench, I hit my limit. In desperation I pulled out my pocket knife to do some field surgery. A few minutes into it I glanced up to see two guys sitting across the room staring at me open-eyed as I dug into my foot with the tip of that pointy knife (8.5" with 3.5" blade)! I just smiled and dug that sucker out.

Should have gone after that callus a year ago! Amazing how such a tiny thing can aggravate.

But you're right about a knife alarming people. Years ago in another life I opened a similar knife to cut a cable and my boss literally jumped backward and exclaimed in fear. But he came from a place where, when someone pulls out a knife someone else usually gets stabbed.


> staring at me open-eyed

They were probably just envious you were rocking a Kershaw Iridium Dessert Warrior. Which also comes in at under $100. And the Iridium family are pretty nice knives.

https://www.bladehq.com/item--Kershaw-Iridium-Dessert-Warrio...


I've never spent more than $40 on any knife. The one I spoke of was a cheap S&W from AutoZone (the checkout line "specials" bin) for ~$13 IIRC.

And FWIW I fear if I cut myself with that Kershaw I might grow a pussy.


That is an amazing paint scheme.

I use my knife like a fidget toy. Not usually in public, but one time a sales guy came in and it was just me and him. He's basically a friend.

I flipped the knife out and his eyes got huge, his arms went out sideways and he got in a football stance.

After he calmed down, he told me he was actually attacked with a knife when he was a kid.

Not long after, I finally wore out the fastener on that knife (a Buck). Luckily I had already bought a twin for backup.


sejje says >I flipped the knife out and his eyes got huge, his arms went out sideways and he got in a football stance.<

That seems unusual: if I were afraid in that situation I would flee. His was a gutsy, dangerous, but certainly unexpected move!

What did you do in response: say "16-32-HIKE"?


Tangentially, if that callus was a plantar callus (circular with a painful point in the center), you can get sticky pads with salicylic acid from the drugstore that will gradually destroy it. Much safer than digging into your foot with a knife, but I'm glad to hear it worked for you!

Thank you, this is all very useful!

Yes, I didn't know WTF was there, but over the years it had grown beyond annoying, becoming so painful I couldn't tolerate it. I thought perhaps something (a splinter, piece of glass or steel, etc.) had become embedded in my foot. I was determined to dig it out. I'm tall and not flexible so I cannot easily see all of the bottom of my foot. But I can reach it.

The callus was surprisingly small (~1/2") and came out in one piece after about 10 minutes of work. Nothing embedded. No bleeding, just a lot of knife-wiggling. The bottom of the foot is really tough!



My school got given one and the science teacher swapped the motorcycle battery for a car battery.

It was great going around the playground.


I used to think that.

I really don't care about most new phone features and for my laptop the M1 Max is still a really decent chip.

I do want to run local LLM agents though and I think a Mac Studio with an M5 Ultra (when it comes out) is probably how I'm going to do that. I need more RAM.

I bet I'm not the only one looking at that kind of setup now who was previously happy with what they had.
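
For a rough sense of why RAM is the gating factor, here's a back-of-envelope sketch in Python. The parameter counts and 4-bit quantization are illustrative assumptions, and real usage needs extra headroom for the KV cache and runtime overhead:

    # Rough RAM needed just to hold quantized LLM weights.
    # Illustrative numbers only; KV cache and runtime overhead come on top.
    def weight_ram_gb(params_billions: float, bits_per_weight: float) -> float:
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    for b in (8, 70, 120):
        print(f"{b}B params @ 4-bit ~= {weight_ram_gb(b, 4):.0f} GB")
    # 8B   ->  4 GB: fits on most laptops
    # 70B  -> 35 GB: wants 64GB+ of unified memory
    # 120B -> 60 GB: Mac Studio territory

That arithmetic is why a big unified memory pool looks so attractive for this.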


Apple has made some good progress on memory sharing over Thunderbolt. If they could get that ironed out you could maybe run a good LLM on a cluster of Mac minis. You can't do it today, but people are working on it; one guy may have gotten it to work, but it's not ready for prime time yet.

> Apple has made some good progress on memory sharing over Thunderbolt

The only reason that Thunderbolt exists is to expose DMA over a tunneled PCIe channel. I'd hope they've made progress on it; Thunderbolt has only been around for fourteen years, after all.


I've seen the AI-8850 LLM Acceleration M.2 Module advertised as an LLM accelerator for the Raspberry Pi (you need an M.2 HAT for it).

That's also limited to 8GB of RAM, so again you might be better off with a larger 16GB Pi and using the CPU, but at least the space is heating up.

With a lot of this stuff it seems to come down to how good the software support is. Raspberry Pis generally beat everything else for that.
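
If you did fall back to the CPU, a minimal sketch of what that looks like with the llama-cpp-python bindings is below; the GGUF filename is a placeholder, and n_threads would need tuning for the Pi you're on:

    # CPU-only inference sketch using llama-cpp-python.
    # "model.Q4_K_M.gguf" is a placeholder for any small quantized model.
    from llama_cpp import Llama

    llm = Llama(
        model_path="model.Q4_K_M.gguf",
        n_ctx=2048,    # context window
        n_threads=4,   # match the Pi's core count
    )
    out = llm("Q: What is a Raspberry Pi? A:", max_tokens=64)
    print(out["choices"][0]["text"])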


True concurrency (like JRuby), fast performance, and thread-safe hashes. What's not to love?

I love that Ruby has three high-quality implementations: MRI, JRuby, and TruffleRuby.


I think a lot of that comes down to cost.

If we can drop the price of electricity enough it will naturally become the favoured choice for heating and transportation too.


Memento mori. Death is inevitable, but worrying constantly about it, whether your own or your loved ones', is no way to live.

As I get older and as my parents get older I take comfort from that.


Textbook enshittification from YouTube. You'll watch what we want you to watch.

That ignores the possibility that local inference gets good enough to run without a subscription on reasonably priced hardware.

I don't think that's too far away. Anthropic, OpenAI, etc. are pushing the idea that you need a subscription, but if open-source tools get good enough they could easily become an expensive irrelevance.


My concern is that inference hardware is becoming more and more specialized and datacenter-only. It won’t be possible any longer to just throw in a beefy GPU (in fact we’re already past that point).

Yep, good point. If they don't make the hardware available for personal use, then we wouldn't be able to buy it even if it could be used in a personal system.

There is that, but the way this usually works is that there is always a better closed service you have to pay for, and we see that with LLMs as well. Plus, you currently need a very powerful machine to run these models at anywhere near the speed of the PaaS systems, and I'm not convinced we'll see the Moore's-law-style jumps required to get that level of performance locally, not to mention the massive energy requirements.

Perhaps I'm wrong, but we don't see the jumps in processing power we used to get in the 80s and 90s from clock speed increases; you can only go so small, and we're getting pretty close to the limit, so the clock speed of most CPUs has stayed pretty much the same for a long time. As LLMs are essentially probabilistic in nature, they do open up options not available to current deterministic CPU designs, so that might be an avenue which gets exploited to bring this to local development.

> there is always a better closed service you have to pay for

Always? I think that only holds for a certain amount of time (different for each sector), after which the open stuff is better.

I thought it was only true for dev tools, but I had to rethink it when I met a guy (not especially technical) who runs open-source firmware on his insulin pump because the closed-source stuff doesn't give him as much control.


From some comments I read in this thread, costs could be around 100-500k USD to get anywhere near current frontier models. My concern is that the constant price reductions we saw in cost per transistor (either storage or logic) over the last ~three decades are over, and that the cost per transistor will only go up!

Local inference is already very good on open models if you have the hardware for it.

Yep, I agree; I think people haven't woken up to that yet. Moore's Law is only going to make that easier.

I’m surprised by how good the models I can run on my old M1 Max laptop are.

In a year’s time open models on something like a Mac Studio M5 Ultra are going to be very impressive compared to the closed models available today.

They won’t be state of the art for their time but they will be good enough and you’ll have full control.
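
For anyone who hasn't tried it, this is roughly all it takes on Apple Silicon today, assuming the mlx-lm package; the model name is a placeholder for whichever quantized mlx-community model fits your RAM:

    # Local inference sketch on Apple Silicon with mlx-lm.
    # The repo name is a placeholder; pick a model that fits your memory.
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
    text = generate(model, tokenizer,
                    prompt="Why does unified memory help local LLMs?",
                    max_tokens=128)
    print(text)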


> on reasonably priced hardware.

Thank goodness this isn't a problem!

