I disagree. Something of type "A" should, according to basic propositional logic, also be of type "A or B". That's the case for an untagged union, but not for a tagged union (because of wrapping), which is decidedly illogical.
Well, I have outlined the usual story of logic as it corresponds to programming (as has been accepted for at least some five decades now); it strains credulity to claim that logic is illogical.
Now I do see where you are coming from; under a set-theoretic interpretation with "implies" as "subset", "or" as "union", and "and" as "intersection", "A implies (A or B)" says that an element of the set A is also an element of the set "A union B".
However, this is not the interpretation that leads to a straightforward correspondence between logic and programming. For example, we would like "A and B" to correspond to the type of pairs of elements of A with elements of B, which is not at all the set-theoretic intersection. And while "(A and B) implies A", we do not want to say a value of type "(A, B)" also has type "A". (E.g., if a function expects an "A" and receives an "(A, A)", we are at an impasse.)
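To make the impasse concrete, here is a minimal TypeScript sketch (the names are my own invention), reading "A and B" as the pair type [A, B]:

```typescript
// "A and B" as the product type: a pair of a number with a number.
function needsNumber(a: number): number {
  return a + 1;
}

const p: [number, number] = [1, 2];

// needsNumber(p);  // rejected: a [number, number] is not a number,
//                  // even though "(A and A) implies A" holds logically.

// The logical implication corresponds to a *function*: projection.
const result = needsNumber(p[0]);
console.log(result); // 2
```

The point is that "(A and B) implies A" is witnessed by the projection function, not by the pair already *being* an A.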
So "implies" should not be read programmatically as a subtyping relation; instead, "A implies B" tells us that there is a function taking a value of type A to a value of type B. In the case of "A implies (A or B)", that function takes its input and tags it as belonging to the left disjunct!
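That tagging function can be written out directly. A sketch in TypeScript, with a hand-rolled tagged union (the names Sum, Left, Right, inl are hypothetical, not from any library):

```typescript
// A tagged union ("sum type") built from a discriminant field.
type Left<A> = { tag: "left"; value: A };
type Right<B> = { tag: "right"; value: B };
type Sum<A, B> = Left<A> | Right<B>;

// The computational content of "A implies (A or B)":
// a function that wraps its input as the left disjunct.
function inl<A, B>(value: A): Sum<A, B> {
  return { tag: "left", value };
}

const x = inl<number, string>(42);
console.log(x.tag); // "left"
```

So the implication is not an identity coercion; it is this (cheap) wrapping function.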
Another perspective I must mention: given a proof of "A or B" and another of "(A or B) implies C", how can we combine these into a simpler proof of just "C"? A proof of "(A or B) implies C" must contain both a proof of "A implies C" and a proof of "B implies C", and we could insert into one of those proofs a proof of A or a proof of B. But we have to know which one we have! (This is a very short gloss of a much deeper story, where, under Curry-Howard, proof simplification corresponds to computation, and this is another way of describing a function call that does a case-analysis of a tagged union (or "sum type").)
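In code, that case analysis looks like this (a TypeScript sketch with invented names; `caseOf` plays the role of "or"-elimination):

```typescript
// A tagged union with a discriminant field.
type Either<A, B> = { tag: "left"; value: A } | { tag: "right"; value: B };

// "(A or B) implies C" unpacked as a pair of functions A -> C and B -> C;
// the tag tells us which one to apply.
function caseOf<A, B, C>(
  e: Either<A, B>,
  onLeft: (a: A) => C,
  onRight: (b: B) => C
): C {
  return e.tag === "left" ? onLeft(e.value) : onRight(e.value);
}

const n = caseOf<number, string, number>(
  { tag: "left", value: 2 },
  (k) => k + 1,     // used when we hold a proof of A
  (s) => s.length   // used when we hold a proof of B
);
console.log(n); // 3
```

Without the tag, `caseOf` would have no way to decide which branch to run.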
Now "union and intersection" types with the set-theoretic properties you are hinting at have indeed been studied (see, for example, Section 15.7 of Pierce's "Types and Programming Languages"). But they do not replace the much more familiar "sum and product types", and do not play the central role of "or" and "and" in the correspondence of programming to logic.
> However, this is not the interpretation that leads to a straightforward correspondence between logic and programming. For example, we would like "A and B" to correspond to the type of pairs of elements of A with elements of B, which is not at all the set-theoretic intersection. And while "(A and B) implies A", we do not want to say a value of type "(A, B)" also has type "A". (E.g., if a function expects an "A" and receives an "(A, A)", we are at an impasse.)
Intersection types in TypeScript and Scala 3 do work like conventional intersections / conjunctions. Something of type A&B is of type A. For example, a Set&Iterable is a Set. This makes perfect sense and is coherent with how unions work. A&A is then obviously equivalent to A. I'm not sure where you see the problem.
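For instance (a small TypeScript sketch with made-up types standing in for Set and Iterable):

```typescript
// Structural intersection: a value of type A & B is directly usable as an A.
type Named = { name: string };
type Aged = { age: number };

function greet(n: Named): string {
  return "hi " + n.name;
}

const person: Named & Aged = { name: "Ada", age: 36 };

// No wrapping or unwrapping needed: a Named & Aged *is* a Named.
console.log(greet(person)); // "hi Ada"
```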
I suppose I erroneously assumed some familiarity with the correspondence between product types (i.e., types of pairs) and the constructive logical interpretation of "and".
Suffice it to say for now: there is an interpretation of logic that gives a tighter correspondence to programming than the set-theoretic one, under the name "Curry-Howard" or "propositions as types, proofs as programs", and which has been known and cherished by logicians, programming language theorists, and also category theorists for a long time. The logic is constructive as it must be: a program of type A tells us how to build a value of type A, a proof of proposition A tells us how to construct evidence for A. From here we get things like "a proof of A and B is a proof of A together with a proof of B" (the "BHK interpretation"), which connects "and" to product types...
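Under that reading, "and"-introduction and "and"-elimination are just pairing and projection. A tiny TypeScript sketch (names mine):

```typescript
// BHK: a proof of "A and B" is a proof of A together with a proof of B.
function pair<A, B>(a: A, b: B): [A, B] {
  return [a, b]; // "and"-introduction
}
function fst<A, B>(p: [A, B]): A {
  return p[0]; // "(A and B) implies A"
}
function snd<A, B>(p: [A, B]): B {
  return p[1]; // "(A and B) implies B"
}

const proof = pair(1, "one");
console.log(fst(proof), snd(proof)); // 1 "one"
```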
I spoke up because I could not leave untouched the idea that "tagged unions are illogical". On the contrary, tagged unions (aka "disjoint unions", "sum types", "coproducts", etc.) arise directly from an interpretation of logic that is not the set-theoretical one, but is a more fruitful one from which programming language theory begins. You are not wrong that there is also a correspondence between (untagged) union and intersection types and a set-theoretical interpretation of propositional logic, and that union and intersection types can also be used in programming, but you are missing a much bigger and very beautiful picture (which you will find described in almost any introductory course or text on PL theory).
But I'm pretty sure that even in intuitionistic logic, "A" implies "A or B". Which is not the case with tagged unions (as I said, because of wrapping).
It's true that in intuitionistic logic "A implies (A or B)"; the usual computational interpretation of that is that "there is a function taking a value of type A and returning a value of type A + B", where + is the tagged union, and, per above, that function is exactly the one which tags its input as belonging to the left disjunct.
I suspect you are still reading "A implies B" as "A is a subtype of B", derived from a set-theoretic interpretation of propositional logic. But the constructive interpretation is that a proof of "A implies B" is a method to take a proof of A and transform it into a proof of B. Computationally, a value of type "A implies B" (typically rewritten "A -> B") is a function that takes values of type A and returns values of type B.
Thanks. I think in the end the question will be: which is better, an algebraic or a set-theoretic type system? Which is more practical to use? Which is more elegant? Should both be mixed?
(One complicating aspect is that there doesn't yet exist a mainstream language with a full set-theoretic type system. TypeScript and Scala 3 currently only support intersections and unions, but no complements, making certain complex types not definable. E.g. "Int & ~0", integers without zero.)
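For what it's worth, TypeScript can approximate removal of a member for *finite* literal unions via the built-in `Exclude` utility type, but there is no complement of an infinite type like `number` (the `Digit` type below is my own toy example):

```typescript
// Exclude removes members from a finite literal union...
type Digit = 0 | 1 | 2 | 3;
type NonZeroDigit = Exclude<Digit, 0>; // 1 | 2 | 3

const d: NonZeroDigit = 1;
// const bad: NonZeroDigit = 0; // type error

// ...but nothing like `number & ~0` exists: `number` has no
// expressible complement, so "all integers except zero" cannot be typed.
console.log(d);
```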