Curiously, since the last update I have the problem where Safari tabs go completely white after a while. Refreshing doesn't help. I have to copy the URL and paste it into a fresh tab.
I’ve had this one for at least a year. Suspended tabs won’t come back 1/4 of the time, and the URL is gone too; the tab is effectively gone. I’m keeping important tabs in Brave.
> But then anyone could just instantiate an invalid Name without calling the parse_name function and pass it around wherever
This is nothing new in C. This problem has always existed by virtue of all struct members being public. Generally, programmers know to search the header file / documentation for constructor functions, instead of doing raw struct instantiation. Don't underestimate how good documentation can drive correct programming choices.
C++ is worse in this regard: constructors don't really allow this pattern, since they can't return a None / false. The alternative is to throw an exception, which requires runtime support, much as malloc does.
In C++ you would have a protected constructor and related friend utility class to do the parsing, returning any error code, and constructing the thing, populating an optional, shared_ptr, whatever… don’t make constructors fallible.
Sometimes you want the struct to be defined in a header so it can be passed and returned by value rather than pointer.
A technique I use is to leverage GCC's `poison` pragma to cause an error if attempting to access the struct's fields directly. I give the fields names that won't collide with anything, use macros to access them within the header and then `#undef` the macros at the end of the header.
Example - an immutable, pass-by-value string which couples the `char*` with the length of the string:
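A minimal sketch of what such a header might look like; all names here are my own illustration, not the actual library:

```c
/* sketch.h -- illustrative sketch of the poison technique */
#include <stddef.h>
#include <string.h>

typedef struct {
    size_t      str__len;    /* deliberately unlikely-to-collide names */
    const char *str__chars;
} str_t;

/* field accessor macros, used only within this header */
#define STR_LEN(s)   ((s).str__len)
#define STR_CHARS(s) ((s).str__chars)

static inline str_t str_from_chars(const char *chars) {
    str_t s;
    STR_LEN(s)   = strlen(chars);
    STR_CHARS(s) = chars;
    return s;
}

static inline size_t str_len(str_t s) { return STR_LEN(s); }
static inline const char *str_chars(str_t s) { return STR_CHARS(s); }

#undef STR_LEN
#undef STR_CHARS

/* from here on, touching the fields by name is a compile error under GCC */
#pragma GCC poison str__len str__chars
```

Any later `s.str__len` in user code now fails to compile under GCC, while the inline accessors keep working because their bodies were expanded before the pragma.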
It just wraps `<string.h>` functions in a way that is slightly less error prone to use, and adds zero cost. We can pass the string everywhere by value rather than needing an opaque pointer. It's equivalent on SYSV (64-bit) to passing them as two separate arguments:
These DO NOT have the same calling convention. The latter is less efficient because it needs to dereference a pointer to return the out parameter. The former just returns length in `rax` and chars in `rdx` (`r0:r1`).
So returning a fat pointer is actually more efficient than returning a size and passing an out parameter on SYSV! (Though only marginally because in the latter case the pointer will be in cache).
Perhaps it's unfair to say "zero-cost" - it's slightly less than zero - cheaper than the conventional idiom of using an out parameter.
But it only works if the struct is 16 bytes or smaller and contains only INTEGER-class types. Any larger and the whole struct gets put on the stack for both arguments and returns. In that case it's probably better to use an opaque pointer.
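To make the comparison concrete, here is a sketch of the two shapes being discussed (function names are my own, not from the library above):

```c
#include <stddef.h>

typedef struct {
    size_t      len;
    const char *chars;
} str_t;

/* fat pointer returned by value: on x86-64 SysV, a 16-byte all-INTEGER
   struct like this comes back in a register pair, no memory involved */
str_t make_str(const char *s, size_t n) {
    str_t r = { n, s };
    return r;
}

/* conventional alternative: return one field and write the other through
   an out parameter, which costs a store through the pointer */
size_t make_str_out(const char *s, size_t n, const char **out_chars) {
    *out_chars = s;
    return n;
}
```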
That aside, when we define the struct in the header we can also `inline` most functions, so that avoids unnecessary branching overhead that we might have when using opaque pointers.
`#pragma GCC poison` is not portable, but it will be ignored wherever it isn't supported, so this won't prevent the code being compiled for other platforms - it just won't get the benefits we get from GCC & SYSV.
The biggest downside to this approach is we can't prevent the library user from using a struct initializer and creating an invalid structure (e.g., length and actual string length not matching). It would be nice if there were some similar trick to prevent using compound initializers with the type, then we could have full encapsulation without resorting to opaque pointers.
> The biggest downside to this approach is we can't prevent the library user from using a struct initializer and creating an invalid structure (e.g., length and actual string length not matching). It would be nice if there were some similar trick to prevent using compound initializers with the type, then we could have full encapsulation without resorting to opaque pointers.
Hmm, I found a solution and it was easier than expected. GCC has `__attribute__((designated_init))`, which we can stick on the struct to prevent positional initializers and require the field names to be used (assuming `-Werror`). Since those names are poisoned, we won't be able to initialize except through functions defined in our library. We can similarly wrap the initializer in a macro and `#undef` it.
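A sketch of the combined trick; the attribute and the `-Werror` assumption are as described above, and the names are again illustrative:

```c
#include <stddef.h>
#include <string.h>

typedef struct {
    size_t      str__len;
    const char *str__chars;
} __attribute__((designated_init)) str_t; /* positional init now warns;
                                             -Werror makes it fatal */

/* the only designated initializer in the library, #undef'd below */
#define STR_INIT(l, c) ((str_t){ .str__len = (l), .str__chars = (c) })

static inline str_t str_from_chars(const char *chars) {
    return STR_INIT(strlen(chars), chars);
}

static inline size_t str_len(str_t s) { return s.str__len; }

#undef STR_INIT

/* positional init is rejected by the attribute, and designated init is
   rejected because the field names are now poisoned */
#pragma GCC poison str__len str__chars
```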
Full encapsulation of a struct defined in a header:
Aside from horrible pointer aliasing tricks, the only way to create a `string_t` is via `string_alloc_from_chars` or other functions defined in the library which return `string_t`.
```c
#include <stdio.h>
/* plus the library header defining string_t and its functions */

int main(void) {
    string_t s = string_alloc_from_chars("Hello World!");
    if (string_is_valid(s))
        puts(string_to_chars(s));
    string_free(s);
    return 0;
}
```
Apparently Vista introduced a new audio stack with higher processing overhead and thus latency.
I used to boot up XP in a VM occasionally. It's amazing how streamlined everything felt, inviting you to be productive. This was before the Electron apocalypse. Almost all UI looked the same, behaved the same and was easy on the eyes (remember when widgets had depth?).
Didn't work for me. Although I did uninstall and reinstall the app first, which the page says you shouldn't do. Here's to hoping they release a proper fix soon.
I just wanted to say how impressive your documentation is. I expected an average readme.md, but not only is your readme great (the performance table is wonderful), but the full documentation is awesome. It pretty much answers all questions I had. Nice job! I wish all projects were like this.
Yup. I had the same revelation when I learned that many of the colors we perceive don't really "exist". The closest thing to hue in nature is wavelength, but there is no wavelength for purple, for example. The color purple is our visual system's interpretation of data (ratio of trichromatic cone cell activation). It doesn't exist by itself.
It's also why RGB screens work. No screen has ever produced "real" yellow (for which there is a wavelength), but they still stimulate our trichromatic vision very similarly to how actual yellow light would.
All colors exist. Color is not the same as wavelength, color is the human perception of a collection of one or more wavelengths of light. They are all real.
I think this very quickly gets into semantics and then philosophy to the point that it’s not really a useful thing to disagree on.
We can objectively measure the properties of the radiation reaching eyeballs and we can detect sensor differences in some eyeballs in various ways. But we can’t ever know that “red” is the same sensation for both of us.
The concept of “red” is real, made concrete by there being a word for it.
But most colours can be associated with a primary wavelength… except purple. So by that definition, they don’t really exist.
> But most colours can be associated with a primary wavelength… except purple. So by that definition, they don’t really exist.
And white, and black. Physically, you'll always have a measurable spectrum of intensities, and some such spectra are typically perceived as "purple". There's no need to pretend that light can only exist in "primary wavelengths".
Even if there's no empirical way to extract some 'absolute' mental notion of perceived color, we can get a pretty solid notion of perceived differences in color, from which we can map out models of consensus color perception.
> Perhaps they did something similar to what dentists do when building on teeth, so that the added material is not the only contact point when the jaws are closed. That is, a contact sheet that leaves contact marks.
The article linked in this post mentions the possibility of “red clay” being used for this purpose, as well as serving as a mortar.
They are efficient FIFOs (queues). You'll find them in many places. I know them from multimedia / audio, where you often have unsynchronized readers and writers.
In the audio domain, the reader and writer are usually allowed to trample over each other. If you've ever gamed on a PC, you might have heard this. When a game freezes, sometimes you hear a short loop of audio playing until the game unfreezes. That's a ring buffer whose writer has stopped, but the async reader is still reading the entire buffer.
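A minimal single-producer/single-consumer sketch of the core index arithmetic (this omits the atomics a real concurrent ring buffer needs):

```c
#include <stddef.h>

#define RB_CAP 8   /* power of two, so wrap-around is a cheap mask */

typedef struct {
    float  buf[RB_CAP];
    size_t head;   /* free-running write counter */
    size_t tail;   /* free-running read counter */
} ringbuf_t;

static int rb_push(ringbuf_t *rb, float v) {
    if (rb->head - rb->tail == RB_CAP)
        return 0;                        /* full: writer would trample reader */
    rb->buf[rb->head++ & (RB_CAP - 1)] = v;
    return 1;
}

static int rb_pop(ringbuf_t *rb, float *out) {
    if (rb->head == rb->tail)
        return 0;                        /* empty */
    *out = rb->buf[rb->tail++ & (RB_CAP - 1)];
    return 1;
}
```

The looping-audio failure mode above is what you get when the writer stops calling `rb_push` but the reader keeps wrapping around the same `RB_CAP` samples.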
Zig's “There are too many ring buffer implementations in the standard library” might also be interesting: