
The challenge for most people is that HTML and Modern CSS are a declarative programming paradigm. You then selectively sprinkle state on top of that.

I use HTML and Modern CSS for the frontend. I sprinkle in htmx, and a tiny amount of plain JavaScript, when I want interactivity. That gives me a mostly declarative frontend.

On the backend I use Haskell, which is also declarative unless you opt into other things.

The challenge with web development is that we’ve learned over the years that asynchronous, stateful programming is the most bug-prone kind of programming, yet we reach for it first. We should reach for those things last, and only where they are appropriate.

The thing about cascading style sheets is that it all cascades by default. A web page naturally resizes. It’s when we add all this other stuff that things start breaking and becoming rigid. The key to CSS is knowing when to let go.


There's also Julia.

Earlier in my career, I found that my employers would often not buy Matlab licenses, or would make everyone share even when it was a resource needed daily by everyone. Not having access to the closed-source, proprietary tool hurt my ability to be effective. So I started doing my "whiteboard coding" in Julia and still do.


Precisely; today Julia already solves many of those problems.

It also removes many of Matlab's footguns, like `[1,2,3] + [4;5;6]`, or `diag(rand(m,n))` doing two different things depending on whether m or n is 1.
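For what it's worth, NumPy's `diag` has the same dual behavior, so the footgun is easy to demonstrate there without a MATLAB license (a minimal sketch; the MATLAB semantics are analogous):

```python
import numpy as np

# np.diag, like MATLAB's diag, does two different things:
# on a 1-D input it BUILDS a diagonal matrix,
# on a 2-D input it EXTRACTS the diagonal.
v = np.array([1, 2, 3])     # shape (3,): builds a 3x3 diagonal matrix
m = np.array([[1, 2, 3]])   # shape (1, 3): extracts the (length-1) diagonal

print(np.diag(v).shape)  # (3, 3)
print(np.diag(m))        # [1]
```

The same numbers, in shapes that differ only by a singleton dimension, give completely different results, which is exactly why code like `diag(rand(m,n))` silently changes meaning when m or n happens to be 1.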


An understated advantage of Julia over MATLAB is the use of brackets over parentheses for array slicing, which improves readability even further.

The most cogent argument for the use of parentheses for array slicing (which derives from Fortran, another language that I love) is that it can be thought of as a lookup table, but in practice it's useful to immediately identify if you are calling a function or slicing an array.


I don't think Julia really solves any problems that aren't already solved by Python. Python is sometimes slower (hot loops), but for that you have Numba. And if something is truly performance critical, it should be written or rewritten in C++ anyway.

But Julia also introduces new problems, such as JIT warmup (so it's not really suitable for scripting) and is still not considered trustworthy:

https://yuri.is/not-julia/


> Python is sometimes slower (hot loops), but for that you have Numba

This is a huge understatement. At the hedge fund I work at, I learned Julia by porting a heavily optimized Python pipeline. Hundreds of hours had gone into the Python version – it was essentially entirely glue code over C.

In about two weeks of learning Julia, I ported the pipeline and got it 14x faster. This was worth multiple senior FTE salaries. With the same amount of effort, my coworkers – who are much better engineers than I am – had not managed to get any significant part of the pipeline onto Numba.

> And if something is truly performance critical, it should be written or rewritten in C++ anyway.

Part of our interview process is a take-home where we ask candidates to build the fastest version of a pipeline they possibly can. People usually use C++ or Julia. All of the fastest answers are in Julia.


> People usually use C++ or Julia. All of the fastest answers are in Julia

That's surprising to me and piques my interest. What sort of pipeline is this that's faster in Julia than C++? Does Julia automatically use something like SIMD or other array magic that C++ doesn't?


I use Rust instead of C++, but I also see my Julia code being faster than my Rust code.

In my view, it's not that Julia itself is faster than Rust - on the contrary, Rust as a language is faster than Julia. However, Julia's prototyping, iteration speed, benchmarking, profiling and observability is better. By the time I would have written the first working Rust version, I would have written it in Julia, profiled it, maybe changed part of the algorithm, and optimised it. Also, Julia makes more heavy use of generics than Rust, which often leads to better code specialization.

There are some ways in which Julia produces better machine code than Rust, but they're usually not decisive, and there are more ways in which Rust produces better machine code than Julia. Also, the performance ceiling for Rust is higher, because Rust allows you to do more advanced, low-level optimisations than Julia.


This is pretty much it – when we had follow up interviews with the C++ devs, they had usually only had time to try one or two high-level approaches, and then do a bit of profiling & iteration. The Julia devs had time to try several approaches and do much more detailed profiling.


The main thing is just that Julia has a standard library that works with you rather than against you. The built-in sort will use radix sort where appropriate and a highly optimized quicksort otherwise. You get built-in matrices and higher-dimensional arrays with optimized BLAS/LAPACK configured for you (and CSC + structured sparse matrices). You get complex and rational numbers, and a calling convention (pass by sharing) which is the fast one by default 90% of the time instead of being slow (copying) 90% of the time. You have a built-in package manager that doesn't require special configuration, and that also lets you install GPU libraries that make it trivial to run generic code on all sorts of accelerators.

Everything you can do in Julia you can do in C++, but lots of projects that would take a week in C++ can be done in an hour in Julia.


To be clear, the fastest theoretically possible C++ is probably faster than the fastest theoretically possible Julia. But the fastest C++ that Senior Data Engineer candidates would write in ~2 hours was slower than the fastest Julia (though still pretty fast! The benchmark for this problem was 10 ms, the fastest C++ answer was 3 ms, and the top two Julia answers were 2.3 ms and 0.21 ms).

The pipeline was pretty heavily focused on mathematical calculations – something like, given a large set of trading signals, calculate a bunch of stats for those signals. All the best Julia and C++ answers used SIMD.


> Part of our interview process is a take-home where we ask candidates to build the fastest version of a pipeline they possibly can. People usually use C++ or Julia. All of the fastest answers are in Julia.

It would be fun if you could share a similar pipeline problem to your take-home (I know you can't share what's in your interview). I started off in scientific Python in 2003 and like noodling around with new programming languages, and it's great to have challenges like this to work through. I enjoyed the 1BRC problem in 2024.


The closest publicly available problem I can think of is the 1 billion rows challenge. It's got a bigger dataset, but with somewhat simpler statistics – though the core engineering challenges are very similar.

https://github.com/gunnarmorling/1brc
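For anyone who hasn't seen it: the input is a file of `station;temperature` lines, and the task is to compute min/mean/max per station as fast as possible. A toy Python sketch of the logic (the whole challenge, of course, is doing this quickly over a billion rows):

```python
# Toy version of the 1BRC aggregation: min/mean/max per station.
# Input format is "station;temperature", one measurement per line.
def aggregate(lines):
    stats = {}  # station -> [min, max, total, count]
    for line in lines:
        station, temp = line.rsplit(";", 1)
        t = float(temp)
        s = stats.setdefault(station, [t, t, 0.0, 0])
        s[0] = min(s[0], t)
        s[1] = max(s[1], t)
        s[2] += t
        s[3] += 1
    return {k: (v[0], round(v[2] / v[3], 1), v[1]) for k, v in stats.items()}

rows = ["Oslo;-3.2", "Oslo;4.0", "Lagos;31.5"]
print(aggregate(rows))  # {'Oslo': (-3.2, 0.4, 4.0), 'Lagos': (31.5, 31.5, 31.5)}
```

The published solutions get their speed from things like memory-mapped I/O, hand-rolled parsing, and parallelism rather than from changing this core logic.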


The C++ devs at your firm must be absolutely terrible if a newcomer using a scripting language can write faster software, or you are not telling the whole story. All of NumPy, Julia, MATLAB, R, and similar domain-specific, user-friendly libraries and platforms use BLAS and LAPACK for numerical calculations under the hood with some overhead depending on the implementation, so a reasonably optimized native implementation should always be faster. By the looks of it the C++ code wasn't compiled with -O3 if it can be trivially beaten by Julia.


Are you aware that Julia is a compiled language with a heavy focus on performance? It is not in the same category as NumPy/MATLAB/R


Julia is not a scripting language and can match C performance on many tasks.


As your comment already hints at, using Python often ends up a hodgepodge of libraries and tools glued together, which work within their limited scope but show their shaky foundations any time your work falls outside those parts. Having worked with researchers and engineers for years on their codebases, there is already too much "throw shit at the wall and see what sticks" temptation in this type of code (because they'd much rather be working on their research than on the code), and the Python way of doing things actively encourages that. Julia's type hierarchies, integrated and easy package management, and many other elements of its design make writing better code both easier and the smoother path.

> I don't think Julia really solves any problems that aren't already solved by Python.

I don't really need proper furniture, the cardboard boxes and books setup I had previously "solved" the same problems, but I feel less worried about random parts of it suddenly buckling, and it is much more ergonomic in practice too.


> using Python often ends up a hodgepodge of libraries and tools glued together

At least it has those tools and libraries, which cannot be said about Julia.


What tools/libraries do you miss in Julia? Have you used the language, or are you merely speculating?


> What tools/libraries do you miss in Julia?

My experience with this website is that it would be rather pointless to enumerate them, because you will then point to some poorly documented, buggy Julia "alternatives" that support a fraction of the features of Python packages or APIs that are developed and maintained by well-resourced organizations.

The same goes for tooling: an unstable, buggy Julia plugin for VSCode is not the same as having products like PyCharm and official Python plugins made by Microsoft for VS and VSCode.

Now, I will admit that Julia also has some niceties that would be hard to find in Python ecosystem (mainly SciML packages), but it is not enough.

> Have you used the language, or are you merely speculating?

I just saw the logo in Google Images.


> I don't think Julia really solves any problems that aren't already solved by Python.

But isn't the whole point of this article that Matlab is more readable than Python (i.e. solves the readability problem)? The Matlab and Julia code for the provided example are equivalent[1]: which means Julia has more readable math than Python.

[1]: Technically, the article's code will not work in Julia because Julia gives semantic meaning to commas in brackets, while Matlab does not. It is perfectly valid to use spaces as separators in Matlab, meaning that the following Julia code is also valid Matlab which is equivalent to the Matlab code block provided in the article.

    X = [ 1 2 3 ];
    Y = [ 1 2 3;
          4 5 6;
          7 8 9 ];
    Z = Y * X';
    W = [ Z Z ];


This snippet is also cleaner than the one in the article, and more in its spirit. Also, the image next to the whiteboard has a no-commas example.


Yes, Python code is indeed fast if you write it in C++... what a bizarre argument. The whole selling point of Julia is that I can BOTH have a dynamic language with a REPL, where I can redefine methods etc, AND that it runs so fast there is no need to go to another language.

It's wild what people get used to. Rustaceans adapt to excruciating compile times and borrowchecker nonsense, and apparently Pythonistas think it's a great argument in favor of Python that all performance sensitive Python libraries must be rewritten in another language.

In fairness, we Julians have to adapt to a script having a 10 second JIT latency before even starting...


> Pythonistas think it's a great argument in favor of Python that all performance sensitive Python libraries must be rewritten in another language.

It is, because usually someone already did it for them.


That's fair - if you work in a domain where you can solve your problems by calling into existing C libraries from Python, then Python's speed is indeed fine.


>I don't think Julia really solves any problems that aren't already solved by Python.

Did you read the article that compares MATLAB to Python? It's saying that MATLAB, although it has some issues, is still relevant because it's math-like. GP points out that Julia is also math-like, without those issues.


In Julia, you still need to explicitly reason about and select GPU drivers and manage the residency of tensors; in RunMat we abstract that away and do it for you. You just write math, and we do the equivalent of a JIT to figure out when to run it on the GPU for you.

Our goal is to make a runtime that lets people stay at the math layer as much as possible, and run the math as fast as possible.


Sometimes slower? No, always slower. And no one wants to deal with the mess that is creating an interface with C or C++. And I wouldn’t want to code in that either, way too much time, effort, headache.


Why is the `[1,2,3] + [4;5;6]` syntax a footgun? It is a very concise, comprehensible, and easy way to create matrices in many cases. E.g. if you have a time series S, then `S - S'` gives all the distances/differences between all of its elements. Or you have two string arrays and want all combinations between the two.

The diag behavior is admittedly unfortunate and has confused me too; it should really be two different functions (which are sort of inverses of each other, weirdly making it sort of an involution).


What happens most of the time with inexperienced or distracted users is that they write things like `norm(S - T)` to compute how close two vectors are, but one of them is a row vector and the other is a column vector, so the result is silently and completely wrong.

Matlab's functions like to create row vectors (e.g., linspace) in a world where column vectors are more common, so this is a common occurrence.

So `[1,2,3] + [4;5;6]` is a concise syntax for an uncommon operation, but unfortunately it is very similar to a frequent mistake for a much more common operation.
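The trap is easy to reproduce in NumPy, whose broadcasting matches MATLAB's implicit expansion here (a minimal sketch):

```python
import numpy as np

# Two "equal" vectors, except one is row-like and the other a column.
S = np.array([1.0, 2.0, 3.0])        # shape (3,)
T = np.array([[1.0], [2.0], [3.0]])  # shape (3, 1)

# The intended distance is 0, but S - T broadcasts to a 3x3 matrix of
# pairwise differences, so the "distance" is silently nonzero.
D = S - T
print(D.shape)            # (3, 3)
print(np.linalg.norm(D))  # ~3.46, not 0 -- no error, just a wrong answer
```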

Julia tells the two operations (vector sum and outer sum) apart very elegantly: one is `S - T` and the other is `S .- T`: the dot here is very idiomatic and consistent with the rest of the syntax.


What does it even mean to add a 1x3 matrix to a 3x1 matrix?


This is about how array operations in matlab work. In matlab, you can write things such as

    >> [1 2 3] + 1
    ans = [2 3 4]
In this case, the operation `+ 1` is applied to each element of the array. In the same manner, when you add a (1 x m) row and an (n x 1) column vector, the column is added to each element of the row (or you can view it the other way around). The result is as if you repeated the (n x 1) column m times horizontally, giving an (n x m) matrix, repeated the row vertically n times, giving another (n x m) matrix, and then added these two matrices. Adding a row and a column is essentially a shortcut for adding these two repeated (n x m) matrices (and runs faster than actually creating them). This gives a matrix where each column is the old column plus the row element at that column's index. For example

    >> [1 2 3] + [1; 2; 3]
    ans = [2 3 4
           3 4 5
           4 5 6]
A very practical example is, as I mentioned, getting all the differences between the elements of a time series by writing `S - S'`. Another example: `(1:6)+(1:6)'` gives you the sums of all possible combinations when rolling two 6-sided dice.

This works not only with addition and subtraction, but with element-wise multiplication (`.*`) and other functions as well. You can do this across arbitrary dimensions, as long as each dimension of the inputs either matches or is 1.


It means the same thing in MATLAB and numpy:

   Z = np.array([[1,2,3]])
   W = Z + Z.T
   print(W)
Gives:

   [[2 3 4]
    [3 4 5]
    [4 5 6]]
It's called broadcasting [1]. I'm not a fan of MATLAB, but this is an odd criticism.

[1] https://numpy.org/devdocs/user/basics.broadcasting.html#gene...


One of the really nice things Julia does is make broadcasting explicit. The way you would write this in Julia is

    Z = [1,2,3]

    W = Z .+ Z' # note the . before the + that makes this a broadcasted operation
This has two big advantages. Firstly, it means that users get errors when the shapes of things aren't what they expected: a DimensionMismatch error is a lot easier to debug than a silently wrong result. Secondly, it means that Julia can use `exp(M)` to mean the matrix exponential, while the element-wise exponential is `exp.(M)`. This allows a lot of code to work naturally and generically over both arrays and scalars (e.g. exp of a complex number will work correctly if written as a 2x2 matrix).


Julia competes with the scientific computing aspect of matlab, which is easily the worst part of matlab and the one which the easiest to replace.

Companies do not buy matlab to do scientific computing. They buy matlab, because it is the only software package in the world where you can get basically everything you ever want to do with software from a single vendor.


In addition: Simulink, the documentation (which is superb), and support from a field application engineer is essentially a support contract and phone call away.

I say this as someone who’d be quite happy never seeing Matlab code again: Mathworks puts a lot of effort into support and engineering applications.


It is hard to explain that to people here.


I remember the pitch for Julia early on being matlab-like syntax, C-like performance. When I've heard Julia mentioned more recently, the main feature that gets highlighted is multiple-dispatch.

https://www.youtube.com/watch?v=kc9HwsxE1OY

I think it seems pretty interesting.


Julia is actually faster than C for some things.


Julia is still clunky for these purposes! You can't even plot two things at the same time without it being weird, and there's still a ton of textual noise when expressing linear algebra in it. (In fact, I'd argue the type system makes it worse!)

Matlab is like what it would look like to put the math in an ASCII email, just like how Python is what it would look like to write pseudocode, and in both cases that is a good thing.


Simulink is the MATLAB moat, not just general math expression.


Pictorus is a simulink alternative https://www.pictor.us/simulink-alternative


Rather than just its decline, Perl's whole existence is cultural. All programming languages (or any thought tools) are reflections and projections of the cognitive values of the community that creates and maintains them. In short, the Perl language shares the structure of the typical Perl dev's mind.

A shift to Python or Ruby is fundamentally a shift to a different set of core cognitive patterns. This influences how problems are solved and how sense is made of the world, with the programming languages being tools to facilitate and, more often than not, shepherd thought processes.

The culture shift we have seen with corporations and socialized practices for collaboration, coding conventions, and more coincides with the decline of a language that does in fact have a culture that demands you RTFM. Now, the dominant culture in tech is one that either centralizes solutions to extract and rent seek or that pretends that complexity and nuance does not exist so as to move as quickly as possible, externalizing the consequences until later.

If you've been on this forum for a while, what I am saying should seem familiar, because the foundations have already been laid out in "The Pervert's Guide to Computer Programming", which applies Lacanian psychoanalysis to cognitive patterns present in various languages[1][2]. This explains the so-called decline of Perl—many people still quietly use it in the background. It also explains the conflict between Rust and C culture.

As an aside, I created a tool that can use this analysis to help companies hire devs even if they use unorthodox languages like Zig or Nim. I also briefly explored exposing it as a SaaS to help HR make sense of this (since most HR generalists don't code and so have to go with their gut on interviews, which requires them to repeat what they have already seen). With that stated, I don't believe there is a large enough market for such a tool in this hiring economy. I could be wrong.

[1] [PDF] -- "The Pervert's Guide to Computer Programming" https://s3-us-west-2.amazonaws.com/vulk-blog/ThePervertsGuid...

[2] [YouTube Vulc Coop]-- https://www.youtube.com/watch?v=mZyvIHYn2zk


This is interesting to me, as someone moving from a company that uses C++ to one that uses Rust. It feels like the whole culture of the former company is built similarly - no guardrails, no required testing, or code review, minimal "red-tape".

In effect, the core principles of the company (or at least, the development team of the company) end up informing which programming language to use.


>The Pervert's Guide to Computer Programming

That's a rather glamorous piece of discourse, yall aint sleepin


Sorry for this repetition, but how popular was/is monkeypatching in Perl?


Taken at face value, this is engineering negligence. I've done industrial design with plastics and 3D printed parts. Regardless of the forming techniques, with plastics you still need to consider properties like minimum melting temperatures, tensile stress, and so forth. Then you must test that rigorously. This is all standard procedure. That information is in the data sheet for the material.

I did a quick search and found that many plastics are governed by ISO 11357 test standard [1]. Some of the plastics I have worked with used this standard.

A spec sheet for that material is here [2].

[1]: https://www.iso.org/standard/83904.html

[2]: https://um-support-files.ultimaker.com/materials/1.75mm/tds/...


Also, strictly as a combo 3D-printing and engine enthusiast: Never with a GUN to my head would I install 3D printed parts in a CAR engine, let alone in an aircraft engine. This is spectacularly poor judgement on the part of the owner.


Then you are not up to speed with what the 3D printing world has to offer. You can 3D print full-metal, stress-free parts, and chances are very high that if you have flown in an airplane in the last five years, some of the parts of that plane (and I'm not talking about trim here) were made using additive processes.

Rocket engines can be 3D printed, in fact there are some engines that can only be made using that kind of technique due to internal structures.


Yes, real parts CAN be 3D printed and even used successfully.

The printing is the easy part.

The extensive testing and validation that it will actually work as intended and in your situation is the hard part.

Skip that hard part, especially for anything that flies, and you are risking lives, both those in the air and on the ground.

Seriously, just because the specs on the label say X and other docs say the running temperature is Y, does NOT mean it will work. Take the measurements in your situation, test the thing extensively on the ground.

Then, maybe, it'll be worth flying. Or, you'll be there after some hours of testing saying: "good thing I didn't try to fly with this", and still have a usable aircraft.

Edit: missing words, clarity.


Indeed, I think I already covered that in an older comment and didn't want to repeat the same info: https://news.ycombinator.com/item?id=46159905


Depends. Some older or rare cars have no source for parts. 3D printing has been a boon to keeping them operating. However you absolutely have to use appropriate materials to avoid problems or failures, and know where it isn't feasible.


FWIW, I wouldn't hesitate to install a 3D printed air-filter housing in my car, if I had printed it myself out of e.g. PAHT or sourced it from a trusty vendor. It's not rocket science, just engineering.


Well, there are more and less important parts of a car. I wouldn't bat an eye at 3D-printed dash parts or, as an extreme example, a cup holder. But on the flip side, anywhere there is heat is potentially bad for anything 3D printed that isn't metal or some hard-to-print high-temperature material, and anywhere mechanical robustness = safety is a spot where you want something very well tested, not "I printed it and it looks right".


> Never with a GUN to my head would I install 3D printed parts in a CAR engine, let alone in an aircraft engine.

The fabrication technology doesn't matter. The qualification process, on the other hand ...

This is the primary reason why I never got a pilot's license. I suspect I would spend far too much time making sure the maintenance was up to standard and far too little actually enjoying flying.


You should be thrilled to know that any plane you will learn to fly in typically has full maintenance records for the entire life of the plane, including who did the work, their FAA certificate number, and all of the paperwork for any parts that were involved in the repair.

The shortcut is to ask the mechanic to come for a test flight after repairs. The place I learned to fly was owned by a mechanic, and the daughter ran the flight school. Given that the daughter might be test flying the mechanic's work, I trusted him to keep his planes in good shape.


> The fabrication technology doesn't matter. The qualification process, on the other hand ...

Well, yes, but... In this case the fabrication technology and the lack of qualification process likely go hand in hand. They wouldn't have a qualification process unless they were manufacturing enough of these that plastic 3d printing wouldn't be cost effective. The shortcut is the point.


I wouldn’t be that absolute, but not until Boeing and Airbus use them in their aircraft on a regular basis.


Yes but are they printed with PLA or PETG, or even ABS? Or are they using material designed exactly for their use case, and tested thoroughly before being certified for flight?

Or do they get their parts from some vendor at a swap meet who spends most of his time fiddling with his Ender 3?


Neither of those is suitable for this application. Ultem or PEEK. Anything else would be a very bad idea, and even for those two you would want to do a lot of testing.


That was my point. They used the wrong filament, and there isn't really a right one for the cowl of a single-engine aircraft.


I'm sure it's fine if you do it properly ([1] for example). The issue here was the utter lack of engineering, not the specific manufacturing technique (although those do seem to be highly correlated, due to low-end 3D printing having become very cheap and easy).

[1] https://www.youtube.com/watch?v=rV74KhPNg1w


I like to do data-oriented programming, and was just thinking about how I want to organize (and search through) the primary data structures/concepts for a project I'm working on. Part of that involved thinking about things like what information I might cache and what representations data might take. That led me to looking into the nuances of things like B-trees, AVL trees, quadtrees, k-d trees, and so forth.

I've found the book "Foundations of Multidimensional and Metric Data Structures" by Hanan Samet to be an excellent resource when looking for a slightly deeper dive than a more introductory algorithms course. It goes in depth on the nuances of these approaches, many of which are highly similar at a cursory glance.


Dammit, why do these books have to be $60?


Just ask anna if she has it in her archive


I see USD $36 on ebay, used. It's a smaller barrier to entry.


Because they have low demand, meaning they would lose money if the price were lower.


Ruby is a joy to program in. I started exploring it after using Haskell and Smalltalk and was pleasantly surprised when the language would do things like both of them.


FWIW, as someone without Ruby programming skills, my experience was that it does unpredictable "magic" things, which I did not find helpful when writing mundane code.


Here are four small things to remember when working with Ruby:

1. Everything is an object, and the core idea is that you send messages to objects rather than calling methods on them. This is very important to how the language works, and even more important when reading Ruby code.

2. `false` and `nil` are falsey; everything else is truthy when used directly in conditionals. E.g. `if variable` takes the true branch when `variable` is anything other than `false`/`nil`, and the else branch otherwise.

3. Start irb (the interactive console) and use <object>.inspect + <object>.class to see what is happening. Ruby has great introspection. Remember the first thing I said here (everything is an object) so you can inspect almost anything.

4. In Ruby, parentheses are optional. E.g. `user.upgrade_to "admin"` is actually `user.upgrade_to("admin")`.


Was it in a Rails app? I'm not sure what magical things standard lib Ruby would do but Rails does a lot.


“Discrete-Time Signal Processing” by Oppenheim is another great resource.


coreutils, nix, vim, Haskell (ghc), postgresql, latex


Exactly this. With a comfy life, you can mature later. The more hardships and adversity one must overcome, the faster that maturation happens, particularly when the hardships put one on an abnormal path.

Losing a close relative or a job is normal adversity that everyone will eventually go through, though not everyone has yet. Going through such things while holding a different philosophy or life ethos than those around you, and thus prioritizing and pursuing different things in life, adds another layer of challenge. It forces you to figure things out on your own, and so contributes to maturing in a different manner and at a different rate.


The author was a bit poetic in writing this post and most commenters seemed to have missed the point.

When he starts talking about the “hygiene of the programmer”, he is referring to the concept of a “code smell” rather than making literal statements about the literal cleanliness of programmers.

From there, he is saying that the industry has distanced itself from object-oriented programming because it often causes problems and added “smells” to the architecture of code bases. This is regardless of what your specific definition of OO is.

Finally, he ends by pointing out that even if people claim not to use much OO in their codebases, when you look at the total architected solution, the various services like Docker and so on are themselves Gang of Four style OO patterns. Because we only talk about OO in code, we fail to notice the OO that happens around the code.

