I'm trying to understand an apparent inconsistency in how TypeScript handles intersections between tuple and array types. Here's the code:
const a: [number] & Array<1> = [1]; // Error
const b: [1 | 2] & Array<1> = [1]; // OK
const c: Array<number> & Array<1> = [1]; // OK
const d: [number] & [1] = [1]; // OK
The error message for a is:
Type '[number]' is not assignable to type '[number] & 1[]'.
Type '[number]' is not assignable to type '1[]'.
Type 'number' is not assignable to type '1'.
Why does TypeScript reject a but accept the other cases? Is this due to some specific rule in TypeScript's type system regarding tuple and array intersections?
Caveat: This answer is speculative; to get an authoritative answer I'd either have to post an issue in TypeScript's GitHub repository (I could not locate an existing one that explains this particular behavior) or step through the TypeScript compiler code to figure out what's happening. I'm not prepared to do either of those right now. If someone does this and posts an authoritative answer, I'm happy to edit or delete this.
Also note: intersections of array types are ill-behaved at best in TypeScript. See microsoft/TypeScript#41874 and the issues linked within for more information. It's all well and good to use these types to probe weird TypeScript behavior, but if you find them in production code you should try to remove them, work around them, run away from them, et cetera.
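To give a feel for the kind of trouble tracked there, here's a small sketch of my own (not taken from that issue, and the exact behavior may shift between compiler versions): the methods of an intersection of array types behave like overloads, so a mutation only has to satisfy one constituent:
const nums: Array<number> & Array<1> = [1]; // okay (this is case c from the question)
nums.push(2); // also okay: push() is resolved against the Array<number> signature
const oops: 1 = nums[1]; // typed as 1, but actually holds 2 at runtime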
TypeScript needs to infer a type for the array literal [1]. In the absence of any context, it will widen this to number[], since heuristically that's what people tend to expect from array literals:
const z = [1];
// ^? const z: number[]
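(As an aside, not one of the question's cases: a const assertion suppresses that widening, keeping both the tuple shape and the literal element type.)
const w = [1] as const;
//    ^? const w: readonly [1]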
By annotating your variable with an intended type, you're providing a context in which TypeScript should infer the type of the array literal. Generally, when the context includes tuples, TypeScript will try to infer tuple types from an array literal. Generally, when the context includes numeric literal types like 1, TypeScript will try to infer literal types from numeric literal values instead of just number:
const y: [1 | 7, string?] = [1]; // okay
const x: [2 | 7, string?] = [1]; // error!
// Type '1' is not assignable to type '7 | 2'.
The top one succeeds because the value [1] is being inferred as having the type [1], due to the context of [1 | 7, string?] implying a tuple with a numeric literal element, and [1] is assignable to [1 | 7, string?]. The bottom one fails because [1] is being inferred as having the type [1], due to the context of [2 | 7, string?] implying a tuple with a numeric literal element, but [1] is not assignable to [2 | 7, string?].
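The same literal-preserving inference happens when the context is a plain, non-tuple array type, which is presumably why case c in the question succeeds. A quick check of my own:
const v: Array<1> = [1]; // okay, inferred as 1[]
const u: Array<2> = [1]; // error!
// Type '1' is not assignable to type '2'.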
The issue you're seeing has to do with some detail of how TypeScript lets the contextual type affect the inference of the array literal. My speculation is that when a context includes both a tuple and a non-tuple array type, TypeScript decides the tuple type is more specific and uses only that for contextual inference. So in the face of const x: [A] & B[] = [⋯] or const x: B[] & [A] = [⋯], you'll probably see ⋯ being inferred against A and not B. That is consistent with what you're seeing:
const a: [number] & Array<1> = [1]; // inferred as [number], fails
const b: [1 | 2] & Array<1> = [1]; // inferred as [1], succeeds
const c: Array<number> & Array<1> = [1]; // inferred as 1[], succeeds
const d: [number] & [1] = [1]; // inferred as [1], succeeds
Playground link to code
The type of [1] there is [number], which indeed isn't assignable to 1[] - not all numbers are 1. const a: [number] & Array<1> = [1 as const]; would succeed, for example, as the type of [1 as const] is then [1] (which is both [number] and 1[]), not [number]. – jonrsharpe Commented Jan 8 at 13:08
Why isn't as const required for the other lines? – Matt Timmermans Commented Jan 8 at 13:14
If the contextual inference gives you [number], then that's what you get, and then that fails Array<1>. This is an edge case, though. Any time you have an intersection of array types, you're going to get bizarre behavior in TS (plenty of GitHub issues about these) so the pragmatic answer is "use these types at your own risk". – jcalz Commented Jan 8 at 16:10
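For completeness, here is a compilable version of the workaround mentioned in the comments above (the variable name a2 is mine): a const assertion on the element keeps it as the literal 1, so the array literal is inferred as [1], which satisfies both halves of the intersection:
const a2: [number] & Array<1> = [1 as const]; // okay: [1] is assignable to both [number] and 1[]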