Hacker News | quanticle's comments

Is it just me, or does Ed Huang skip over the most important part of database design: actually making sure the database has stored the data?

I read to the end of the article, and while having a database as a serverless collection of microservices deployed to a cloud provider might be useful, it ultimately will be useless if this swarm approach doesn't give me any guarantees about how or if my data actually makes it onto persistent storage at some point. I was expecting a discussion of the challenges and pitfalls involved in ensuring that a cloud of microservices can concurrently access a common data store (whether that's a physical disk on a server or an S3 bucket) without stomping on each other, but that seemed to be entirely missing from the post.

Performance and scalability are fine, but when it comes to databases, they're of secondary importance to ensuring that the developer has a good understanding of when and if their data has been safely stored.


Excellent point. Many discussions here do not emphasize transactional guarantees enough, and most developers writing front-ends should not have to worry about programming to address high write contention and concurrency to avoid data anomalies.

As an industry, we've progressed quite a bit from accepting data isolation level compromises like "eventual consistency" in NoSQL, cloud, and serverless databases. The database I work with (Fauna) implements a distributed transaction engine inspired by the Calvin consensus protocol that guarantees strictly serializable writes over disparate globally deployed replicas. Both Calvin and Spanner implement such guarantees (in significantly different ways) but Fauna is more of a turn-key, low Ops service.

Again, to disclaim, I work for Fauna, but we've proven that you can accomplish this without having to worry about managing clusters, replication, partitioning strategies, etc. In today's serverless world, spending time managing database clusters manually involves a lot of undifferentiated, costly heavy lifting. YMMV.


I agree that reliably persisting data is table stakes for a database, and I'd assume Ed takes for granted that it needs to work. Obviously there's lots of non-trivial stuff there, but this post seems to be more about database product direction than the nitty-gritty technical details of fsync, filesystems, etc.


Also, most of the "action" in this sphere is for the "super-rich" customer: assume more than one machine, lots of RAM, fast I/O, and fast networks. And this means it runs on AWS or some other "super-rich" environment.

There, you can $$$ your way out of data corruption. You can even lose all the data, if you have enough replicas and backups.

Not many are in the game of SQLite.

This is the space I wish to work in more. I think you can not only do better than the high end, it's also more practical all around: if you commit to a DB that depends on running in the cloud (to mask that it's not that fast, to mask that it's not that reliable, and mostly to extract more $$$ from customers), then when you NEED a portion of that data locally you're screwed, and then you use SQLite!


    There, you can $$$ your way out of data corruption. You can even lose all the data if you have enough replicas and backups.
That's absolutely not true. All the money and all the backups and redundancy in the world won't save you if the data doesn't make it to persistent storage. Even in a totally closed AWS environment, the fallacies of distributed computing [1] still hold. Was there a network connectivity glitch? A latency spike? What happens when two connections attempt to write to the common data store at the same time?

You can't buy your way out of having to deal with the fundamental problem of, "How do I provide the illusion of a single unified system for a highly distributed swarm of microservices?"

[1]: https://en.wikipedia.org/wiki/Fallacies_of_distributed_compu...
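The "two connections write at the same time" failure mode above is easy to sketch. This is a hypothetical, deliberately simplified model (the dict stands in for a shared data store, and the interleaving is forced by hand rather than by real concurrency), but it shows the classic lost-update anomaly that a transactional store has to prevent:

```python
# Hypothetical in-memory "store" standing in for a shared data store.
store = {"balance": 100}

def commit_withdrawal(snapshot, amount):
    # Each "connection" computes its write from its own stale read.
    store["balance"] = snapshot - amount

a_read = store["balance"]       # connection A reads 100
b_read = store["balance"]       # connection B reads 100, before A commits
commit_withdrawal(a_read, 30)   # A writes 70
commit_withdrawal(b_read, 50)   # B writes 50, silently clobbering A's write

print(store["balance"])         # 50, not the 20 a serial execution would give
```

A real database avoids this with locking, MVCC, or serializable transaction ordering; the point is that no amount of replicas or backups helps, because every replica faithfully stores the wrong answer.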


> That's absolutely not true.

In the absolute case, yes. But large companies have lost a lot of data, and thanks to $$$ they survived it.

In those scenarios, you pay a lot to stay out of trouble, but you can also pay to overcome trouble ("pay bigly for lawyers to follow the law and stay out of trouble, or to break the law and get away with it").

It's not ideal, and there's a point where that could break badly, but at a certain size, fatal software failures that would destroy a small company are just "Thursday" for somebody big.

BTW: I don't like this; I prefer to make software solid. But everybody runs on C, JS, MongoDB, etc., and this shows you can survive a massive crash...


Testing delete code against production isn't that big of a deal. If you screw up, the "oops" is immediate and obvious, as is the recovery strategy — restore from backup.

The real fear with deploying code directly against production is data corruption. I've seen more than one instance where two pieces of code, both of which had impeccable unit test coverage, interacted in strange ways to cause data corruption, because of mistaken assumptions that their respective authors had about each others' code. This is the kind of thing that you need integration tests to catch. Integration tests need some kind of common environment (even if it is ephemeral) in which to run. And now you have a development environment.


lol, I didn't enumerate all the ways this is bad. Restoring from backup can mean an outage and possible loss of data (the intervening transactions are lost), which is to be avoided. It also assumes your backups work, often an untested assumption. Corruption is bad too. The entire idea is bad.


The problem is that the countries that are good for crypto, e.g. Venezuela, Nicaragua, Lebanon, etc, aren't really all that good for anything else.


That's not true. Crypto is fine in most of Asia. Most of these countries are very respectable.


It would be unavailable in the sense that US entities would be legally prohibited from transacting with it, in the same way that US entities are prohibited from transacting with Binance (and, to the extent that they're flouting this prohibition, they're getting in trouble for it).


    But that relegates the centralized exchange into the position of being nothing more than a "fiat on-ramp"
That's actually a pretty profitable business. "If you want to interact in any way with crypto, you either have to go through Coinbase or some shady black market crypto dealer," is actually great news for Coinbase.


I mean, you cut off the rest of that paragraph ;P. Sure: it sounds like maybe a reasonable opportunity for Circle, but Coinbase would effectively be adding no value anymore and would have to attempt to charge (a lot, if they wanted to have any hope of making anywhere near as much money as their current business model) for something that not only are they currently doing for free but which they aren't really doing in the first place (as they are just a thin wrapper around Circle). The only reason Circle isn't bothering with this market is seemingly that they didn't want to deal directly with retail and they don't really have to because Coinbase was willing to integrate it for free... but like, Paxos (which is also regulated) does deal with individual users, so there is no reason Circle can't in a world where Coinbase is otherwise irrelevant. (The reality, though, is that there is still benefit for traders to having centralized exchanges due to their limit order book model, so I wouldn't discount Coinbase just yet, anyway ;P.)


    The only reason Circle isn't bothering with this market is seemingly that they didn't want to deal directly with retail
That's actually a really good reason! An analogous situation is with e.g. Robinhood and Citadel. In one sense Robinhood is "just" a retail onramp. All their trades go through Citadel. So in theory Citadel could, at any moment, just cut off Robinhood by allowing retail investors to trade directly with them.

But I'm willing to bet that they won't, because dealing with retail investors is a hassle. You have to have customer service. You have to deal with additional regulation. You have to deal with random social media firestorms. It's a mess. I don't blame Citadel for leaving it to Robinhood to deal with these issues and I wouldn't blame Circle for leaving it to Coinbase to deal with the retail issues surrounding crypto.


    That might seem the case from afar, but once you start writing Haskell and
    start experimenting these space leaks, you will notice that:

    1. 90% of the space leaks you write end up adding a tiny amount of memory
    usage to your functions, mostly unnoticeable. Think thunks like (1 + 2) that
    are subjected to demand analysis under optimization.

    2. 1-2% of them are serious enough to require profiling your code with
    cost-centres.
But that's pretty much the same as in C. The vast majority of memory leaks in C aren't fatal to the program. They just lead to a little bit of extra memory usage, mostly unnoticeable. And then you have the small fraction of memory leaks that draw the attention of the OOM-killer. A tacit admission that detecting code that is leaking memory in Haskell is no easier than detecting code that is leaking memory in C does not speak well for Haskell.

Memory leaks are a matter of correctness and reliability. Our computers are not ideal Turing machines. Their "tapes" are finite. Running out of memory causes the program to crash and produce incorrect results. Arguing that this only happens in a small fraction of cases, and can be handled with testing and profiling isn't persuasive, because one might say the same thing for a dynamically typed language, like Python.


"Space leaks" are not "memory leaks".

A memory leak means a program will never free some region of memory; e.g. if its pointer has been discarded without calling 'free'. That is certainly a matter of correctness. That is certainly a problem for finite-memory machines.

In contrast, a "space leak" is just a suboptimal usage of memory. As a classic example, we want the sum of a list of integers to fully evaluate the running total at each step, like this:

  sum(0, [1,2,3])
  sum(0+1, [2, 3])
  sum(1, [2, 3])
  sum(1+2, [3])
  sum(3, [3])
  sum(3+3, [])
  sum(6, [])
  6
However, lazy evaluation may avoid performing the additions right away; instead building up unevaluated 'thunks' (nullary functions), which only get evaluated at the end, like this:

  sum(0, [1,2,3])
  sum(0+1, [2, 3])
  sum((0+1)+2, [3])
  sum(((0+1)+2)+3, [])
  ((0+1)+2)+3
  (1+2)+3
  3+3
  6
This is a perfectly correct calculation; and everything has been 'cleaned up' at the end (no worries about 'infinite tapes', etc.). However, if we're trying to e.g. process a massive data stream from disk, these unevaluated thunks may quickly exhaust our available memory.
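The two traces above can even be modeled in a strict language like Python, by hand-rolling thunks as zero-argument lambdas (a sketch, not how a lazy runtime actually represents thunks):

```python
from functools import reduce

xs = [1, 2, 3]

# Strict accumulation: the running total is a plain int at every step,
# so the accumulator occupies constant space.
strict_total = reduce(lambda acc, x: acc + x, xs, 0)

# Lazy model: each step builds a new thunk closing over the previous one,
# mirroring (((0+1)+2)+3). No addition happens until the chain is forced.
lazy_total = reduce(lambda acc, x: (lambda: acc() + x), xs, lambda: 0)

print(strict_total)   # 6
print(lazy_total())   # also 6, but only after forcing a chain of thunks
```

Both give the correct answer, but on a large input the strict accumulator stays flat while the thunk chain grows linearly with the list; that growing chain is the space leak.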


"A memory leak means a program will never free some region of memory"

The memory gets freed when the program terminates.


This is the memory management technique that many programs, such as GCC, used to very good effect.


>I've never seen a civil engineer say they hate how we built bridges.

You must not have spoken with any of the civil engineers I've spoken with. They all describe an absolute wasteland of arbitrary and capricious regulations, competing stakeholders, and last minute requirement changes that require extensive redesign and often very expensive rework.

Software engineering really isn't all that different from other kinds of engineering, for better and for worse.


You don't think your life has been substantially improved by Google? Or the Internet more generally? You'd rather go back to the '90s, when you had to carefully watch how much Internet you consumed, so as to keep from running out of AOL hours? You want to go back to an era when you couldn't instantly pull up navigation directions in a foreign city? You'd like to take photos on film, paying ridiculous prices for every photo you took, and then paying again to have some stranger paw through your family photos while developing them? You want to go back to an era when batteries were heavy, polluting NiCd bricks, which required special chargers that would completely discharge and recharge them in order to avoid things like the memory effect? You want to go back to a time when you had to call a person on a phone in order to book a flight? You want to go back to a time when, if you moved to a different country, you got to talk to your relatives in the homeland once a month, for five to ten minutes on a noisy analog phone line, because that's all the international long distance you could afford?

Don't get me wrong, I dislike social media just as much as anyone. But do I dislike tech? Would I give up all the innumerable ways that my life has gotten better thanks to Moore's Law, ubiquitous Internet, and the proliferation of tools and services that take advantage of the above two? Absolutely not.


The weird thing is that I never needed computer assisted navigation until it existed. In my local area I simply remembered where streets were, and when traveling I used a map book.

I also cherished photos and put more effort into taking them. Now I just spam the photo button and wait on Google to select the best one to automatically improve and remind me of later. Even then, I have so many that I never find myself flipping through them like I did with physical photos.

I read encyclopedias. The many volumes were an invitation to knowledge, delineated by pages and sections. Online knowledge is an endless pit of knowledge of questionable value.

Really, it wasn't bad. I'd say peak value was around '98 or '99, before XMLHttpRequest became popular, when the web was still mostly documents and forms. NNTP still mattered, and Encarta was useful.

After that I've just felt like I am swimming upriver against an assault to my humanity.


>The weird thing is that I never needed computer assisted navigation until it existed. In my local area I simply remembered where streets were, and when traveling I used a map book.

You are extremely fortunate. It may not be apparent to you, but the advent of GPS, smartphones and Google Maps has been a game changer for so many people. My sense of direction is all right. I'm not a homing pigeon, but given a map, I can generally find my way around. But for other people such as my mom, every trip, outside of some well-traveled routes (like going to work, or going to the store) had to have detailed written directions, and be rehearsed ahead of time, because otherwise she'd get lost. For her, Google Maps has resulted in a substantial improvement in the quality of her life, simply by enabling her to get around in the world without the constant background terror of not knowing how to get home.


What a superficial collection of things to care about. Those things may tickle you a little when you’re taking advantage of them, but they really don’t make a substantial difference in whether you’re spending your days well and happy.

The 1990’s weren’t some dreary hellscape. You worked, you talked to people, you went out for dinner, you did some chores, you traveled. You had some crises that sent your life reeling, and some magical moments that made you grateful for what you had.

It was quite fine.

And in the ways that it was characteristically different (not better or worse), people were more engaged with what was right in front of them, had more shared experiences of life and media to relate about, and were less flooded with constant stimulation.

All the things you mention as accumulating in the years since are about as meaningful as the aisles and aisles of plastic toys I longed for at Toys-r-us as a kid. I thought they mattered, but they really don’t.


>The 1990’s weren’t some dreary hellscape.

You can totally still live a '90s lifestyle today. Cancel your high-speed Internet and tether to your phone for everything. Give up watching YouTube. Give up looking things up on Wikipedia. Film cameras are a dime-a-dozen on eBay, with even high-end SLRs from the '90s selling for less than a hundred dollars. Disable Google Maps and Google Search. Stop posting on Hacker News.

If you think the '90s were better than today, by all means, go back.

EDIT: For what it's worth, you can find people in the 1930s saying the same things about electricity and indoor plumbing. Every current generation's necessity is the previous generation's excess frivolity.


You're deluding yourself if you think you would get the same lifestyle by giving up your own post-90s technology. The world as a whole has moved on. Maybe map books aren't at every gas station now. Even if you don't have a phone, the people you talk to are going to be checking theirs while dining with you, or distracted when they get a notification.

I suspect you're not old enough to properly remember the '90s if you really think this would be the equivalent.

I'm not saying it was all sunshine and rainbows (people smoked everywhere, did tons of cocaine, and were probably more openly homophobic/misogynistic/racist, and if you got in an argument over a factual detail you couldn't look up the answer right away).


The 90s has my youth, so...of course they were better.

I remember discovering gopher on university library computers in 1993, like I discovered the fun of CP/M on my Dad's Osborne in 1982. There was a nice mystery back then; things are definitely "better" now, except for prices and traffic.


I’m close! And the more I’ve done so, the less ruminative and anxious I’ve found myself.

Would recommend.


I do think those things are great but I would say that culture is substantially worse. People are less kind/patient, more disrespectful, more isolated, more narcissistic, less family oriented, more politically combative, have shorter attention spans, and are less happy than prior generations.


>I do think those things are great but I would say that culture is substantially worse.

Only if you aren't a misfit. If you had some "weird" hobby or interest, like anime, or science fiction, or heck, even computers, you'd have maybe one, two other people in your life who were interested in that. If you openly talked about your "weird" hobby, you'd be as likely as not socially ostracized and made fun of.

Today, thanks to the internet and social media, one can find forums and discussion groups for any hobby, no matter how weird or esoteric, and have fun conversations with people that have nothing to do with weather, politics, or sportsball.


I was (am) some sort of misfit and I get that it feels good to find community online, but not everything that feels good is the best for us or for society.

Having a village full of people who don’t engage with each other because they’ve found more interesting people online is troubling.

And I don’t know, but it’s plausible that learning a healthy way to integrate with your local community is an important life skill that gets disrupted by these online connections and makes the big picture of one’s life worse than it would have been otherwise. Connecting online may relieve stress the way alcohol relieves stress — genuinely useful in any moment, but easily problematic if you become too reliant on it.


>it’s plausible that learning a healthy way to integrate with your local community is an important life skill that gets disrupted by these online connections and makes the big picture of one’s life worse

Yes, let's go back and tell all the kids that were being bullied in high school merely for being different that their bullies are teaching them important life skills and that they shouldn't retreat into online spaces because that will make the "big picture" of their life worse.


>Having a village full of people who don’t engage with each other because they’ve found more interesting people online is troubling.

Maybe we should think about creating a world where everyone can move around and form the sorts of villages they like.


People are much more able to find community niches, though, and jerks can be found quite publicly being jerks.

There are huge kindnesses enabled by tech and the internet, like organizing group support, healthcare kickstarters, and so on.

I think you've got rose-coloured glasses about the past, more than today being particularly worse.


That’s a people problem. People can change. Technology just is.


Your observation about the culture being worse is accurate, but none of those changes are caused by tech.


Do you have evidence about that? Don't you think so many people being terminally online affects the way they act towards others?

Personally, I believe it does. It's even ironic to me that your comment was downvoted, like, why? Is this really how we think about the world nowadays? Dislike/like. No discussion.

Anyways, I have to say Hacker News has been the source of well-spent evenings with this nice community, so there's at least one data point there that these changes in culture weren't caused by tech.


There was a tweet from Andreessen the other day about how tech is heading toward a $100 full-wall TV while college (a tech resister) heads toward $1 million [1] - supposedly because of regulation or some other nonsense. Guess what - a college education is worth a great deal more to society and to the individual that gets it than a TV of any size will ever be worth, IMHO. It is this "tech fixes everything/software eats the world" attitude that is pissing people off. The only people that think it is a great thing are people that never have to waste 20 minutes of their time finding their way through a phone menu at a bank, because they have a minion that does everything for them and lets them avoid the very tech they create. Disruptive companies make the most money even if they don't really improve people's lives.

Tech is reaching a stage where the garbage/useful ratio is > 1. This is actually what I used to like about Musk, back when he pretended to run his companies and smoked cigars at the Playboy mansion: he actually tried to get truly innovative and useful technology worked on. Not anymore.

[1] https://www.businessinsider.com/marc-andreessen-warns-colleg...


> Guess what - a college education is worth a great deal more to society and the individual that gets it than a TV of any size will ever be worth IMHO.

Indeed, but capitalism does not care about society, but rather is there to make the person selling you the TV even richer.


I don’t think OP is suggesting tech has been an unmitigated negative. I think most of the things that were positives were invented 1995-2008. Beyond that major tech companies seemed to transform into whores to the advertising industry, privacy was destroyed, and for this we got social media, poorer search results, constant communication, etc etc - mostly negatives.


I would give up all these things if I could. In fact that's my life plan, I want to get enough money to be comfortable (not necessarily financially independent) and move into the countryside in some temperate climate country. This is not my retirement plan, it's my life plan.

My retirement will be spent isolated from society, although that's another matter for another day. But seriously, perhaps my perspective is warped by the fact that I'm used to all these things already. Kind of like how the best software is the one you never hear or think about since it doesn't get in your way.


    For non-mainstream languages 95% of what Emacs enables can be done using
    multi-cursors and a macro language built in to the editor, and we're in
    2023. Those kinds of editors are a dime a dozen :-)
Are they? Multi-cursors, sure, but what other editors have a macro language that combines the power and accessibility of emacs lisp? The only other one that comes close is vim, and, as many critiques as I have of emacs lisp, I can firmly state: it's a lot nicer to use than vimscript.

But other than emacs and vim, which other editors allow me to interactively automate portions of my editing workflow? All the other IDEs and editors that you've cited, like IntelliJ or VSCode, require you to either find or write a package. That's a much bigger step than just interactively evaluating some lisp to do a one-off thing.


There are many text editors extensible in Lua or in Python. They generally don't allow messing with the innards as much (Firefox proved that's a double-edged sword with its extensions; it's not an unalloyed good).

https://micro-editor.github.io/index.html

https://lite-xl.com

https://neovim.io

https://code.visualstudio.com

http://www.sublimetext.com

And Emacs Lisp doesn't feel super accessible to most software developers under 40. Almost all its conventions come from a small little island, it's like marsupials in Australia, their own little parallel evolution.

> But other than emacs and vim, which other editors allow me to interactively automate portions of my editing workflow? All the other IDEs and editors that you've cited, like IntelliJ or VSCode require you to either find or write a package. That's a much bigger step than just interactively evaluating some lisp to do a one-off thing.

Devs generally write one-off or maybe reusable shell/Python/... scripts for that. But some of the examples I listed allow you to do a lot of that using Lua.

There are a ton of workflows out there, other devs don't just bang 2 rocks together because they can't automate everything <<inside>> the editor itself :-)

Also, xkcd is always on point:

https://xkcd.com/1205/

Software devs routinely fall into this trap:

https://xkcd.com/1319/


> And Emacs Lisp doesn't feel super accessible to most software developers under 40.

Or over 40 (I'm 42) :)

> Almost all its conventions come from a small little island, it's like marsupials in Australia, their own little parallel evolution.

This is a great analogy. And people forget that to program any system efficiently you need to know that system. Not necessarily inside-out, but well enough. A brief look at any emacs config will show you just how many weird and inconsistent things you have to contend with: major modes, minor modes, hooks, global variables, global functions, mode-specific global functions, special lists, non-special lists, and a myriad of API calls and functions in between.

Wikipedia says that emacs has 10,000 (ten thousand) built-in commands. [1] That's probably on par with the JVM :)

[1] https://en.wikipedia.org/wiki/Emacs


The worst part for me is that I wanted a few "simple" UI tweaks with Emacs. I really wanted to use it.

But I wanted it to have a native tab bar, and I wanted to move the command bar to the top, with dropdowns instead of "expand-ups" (turns out, I read right-to-left, top-down, not right-to-left, bottom-up). You can't have either of those in the world's most extensible editor :-(


Emacs has had tab-bar-mode since 27.1 and tab-line-mode since 27.2. As for the drop-down minibuffer (I suspect that's what you mean by "command bar"), you can use something like vertico-posframe* and put it at the top like so:

  (setq vertico-posframe-poshandler #'posframe-poshandler-frame-top-center)
* https://github.com/tumashu/vertico-posframe


> but what other editors have a macro language that combines the power and accessibility of emacs lisp?

Is it accessible? Why would I want to spend time learning the idiosyncrasies of a 40-year-old editor to write something that I might use once in a blue moon?

I have other, more interesting things in life.


    requires barely any ram
It's incredible to me that our tools have bloated to the point where we look at Emacs and think, "Ah yes, what a svelte program!"


But Emacs is incredibly light for the current generation hardware... and it's been incredibly light for probably 15 years or more.

On my "small" laptop, I only use Emacs... Trying to run IntelliJ or VSCode makes the laptop burning hot and forces the fan to keep running. With Emacs, no matter what I do, it's always quiet and cool.


One gripe I have with Emacs is that it doesn't boot instantly enough. Annoys me every time I edit a Git commit message.

Though it's not horrible enough that I found the courage to set up an Emacs server, as easy as it is… Anyway, it would be nice if it could just boot instantly by default.


> One gripe I have with Emacs is that it doesn't boot instantly enough.

You must have accidentally closed it. Closing Emacs or your Web browser are common mistakes, and as you will surely have realized almost immediately, you needed to re-open either program within minutes.


I don't get the importance people put on that.

Emacs takes 10 seconds to open. Firefox takes 15. Yes, those are wasted seconds, but it's not like they are anywhere close to the largest wastes of my day. (Hell, I'm writing a comment in HN right now!)

It's different for things like VS, Pycharm, VSCode, or Eclipse. But emacs isn't there.


> You must have accidentally closed [Emacs]. Closing Emacs or your Web browser are common mistakes, and as you will surely have realized almost immediately, you needed to re-open either program within minutes.

Thank you for this! This cracked me up.


DOOM Emacs addresses this issue (and is an incredible, if opinionated, Emacs “platform”): https://github.com/doomemacs/doomemacs

> Gotta go fast. Startup and run-time performance are priorities. Doom goes beyond by modifying packages to be snappier and load lazier.


Simply set EDITOR to emacsclient -a "". See https://www.gnu.org/software/emacs/manual/html_node/emacs/em...


Setting up emacs server is as easy as:

    alias emacs='emacsclient -c -nw -a ""'
Seriously, that's it!


Why is an emacs server needed?


It makes launching emacs practically instant because it’s always running in the background—emacsclient connects to the already running instance so you get to skip the startup process.

