Why does TypeScript reject [number] & Array<1> but accept other similar intersections? - Stack Overflow


I'm trying to understand an apparent inconsistency in how TypeScript handles intersections between tuple and array types. Here's the code:

const a: [number] & Array<1> = [1]; // Error
const b: [1 | 2] & Array<1> = [1]; // OK
const c: Array<number> & Array<1> = [1]; // OK
const d: [number] & [1] = [1]; // OK

The error message for a is:

Type '[number]' is not assignable to type '[number] & 1[]'.
  Type '[number]' is not assignable to type '1[]'.
    Type 'number' is not assignable to type '1'.

Why does TypeScript reject a but accept the other cases? Is this due to some specific rule in TypeScript's type system regarding tuple and array intersections?

asked Jan 8 at 12:55 by Alexis, edited Jan 8 at 13:08 by jonrsharpe
  • 4 It's not rejecting the intersection, it's a valid type, it's rejecting the value. The type of [1] there is [number], which indeed isn't assignable to 1[] - not all numbers are 1. const a: [number] & Array<1> = [1 as const]; would succeed, for example, as the type of [1 as const] is then [1] (which is both [number] and 1[]), not [number]. – jonrsharpe Commented Jan 8 at 13:08
  • 2 Why isn't as const required for the other lines? – Matt Timmermans Commented Jan 8 at 13:14
  • 2 Without diving into the compiler or posting an issue on GitHub any answer I give here would just be speculation. My speculation is that when seeing both a tuple and an array in a type, TS uses the tuple as context and not the array (as the tuple is ostensibly "more specific"); since your tuple is [number], then that's what you get, and then that fails Array<1>. This is an edge case, though. Any time you have an intersection of array types, you're going to get bizarre behavior in TS (plenty of GitHub issues about these) so the pragmatic answer is "use these types at your own risk". – jcalz Commented Jan 8 at 16:10
  • (see prev comment) So how should we proceed here? You need an authoritative answer? Or would a speculative answer suffice? – jcalz Commented Jan 8 at 16:11
  • @jcalz, a speculative answer would suffice! – Alexis Commented Jan 10 at 12:04

1 Answer


Caveat: This answer is speculative. To get an authoritative answer I'd have to either post an issue in TypeScript's GitHub repository (I couldn't locate an existing one that explains this particular behavior) or step through the TypeScript compiler code to figure out what's happening, and I'm not prepared to do either of those right now. If someone does this and posts an authoritative answer, I'm happy to edit or delete this one.

Also note: intersections of array types are ill-behaved at best in TypeScript. See microsoft/TypeScript#41874 and the issues linked within for more information. It's all well and good to use these types to probe weird TypeScript behavior, but if you find them in production code you should try to remove them, work around them, run away from them, et cetera.


TypeScript needs to infer a type for the array literal [1]. In the absence of any context, it will widen this to number[], since heuristically that's what people tend to expect from array literals:

const z = [1];
//    ^? const z: number[]

By annotating your variable with an intended type, you're providing a context in which TypeScript infers the type of the array literal. Generally, when the context includes a tuple type, TypeScript will try to infer a tuple type for an array literal; and when the context includes numeric literal types like 1, it will try to infer literal types for numeric literal values instead of just number:

const y: [1 | 7, string?] = [1]; // okay
const x: [2 | 7, string?] = [1]; // error!
// Type '1' is not assignable to type '7 | 2'.

The top one succeeds because the value [1] is being inferred as having the type [1] due to the context of [1 | 7, string?] implying a tuple with a numeric literal element, and [1] is assignable to [1 | 7, string?]. The bottom one fails because [1] is being inferred as having the type [1] due to the context of [2 | 7, string?] implying a tuple with a numeric literal element, but [1] is not assignable to [2 | 7, string?].
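
For illustration, here's the same array literal inferred under a few different contexts (these annotations are my own examples, not from the question, but they follow the same heuristics):

const p = [1];           // no context: widens to number[]
const q: Array<1> = [1]; // literal-type context: element stays 1, so 1[] is satisfied
const r: [number] = [1]; // tuple context without literal types: inferred as [number]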


The issue you're seeing comes down to how TypeScript decides which part of the contextual type drives the inference of the array literal. My speculation is that when a context includes both a tuple and a non-tuple array type, TypeScript decides the tuple type is more specific and uses only that for contextual inference. So in the face of const x: [A] & B[] = [⋯] or const x: B[] & [A] = [⋯], you'll probably see the elements being inferred against A and not B. That is consistent with what you're seeing:

const a: [number] & Array<1> = [1]; // inferred as [number], fails
const b: [1 | 2] & Array<1> = [1]; // inferred as [1], succeeds
const c: Array<number> & Array<1> = [1]; // inferred as 1[], succeeds
const d: [number] & [1] = [1]; // inferred as [1], succeeds
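
That also suggests the workaround mentioned in the comments: if the element itself is a numeric literal type, the inferred tuple becomes [1], which satisfies both members of the intersection. A minimal sketch (still subject to the general warning about array-type intersections above):

// Const-asserting the element, as suggested in the comments, makes its
// type the literal 1, so the inferred tuple [1] is assignable to both
// [number] and 1[]:
const a1: [number] & Array<1> = [1 as const]; // okay

// Equivalently, assign from a value that is already typed as [1]:
const one: [1] = [1];
const a2: [number] & Array<1> = one; // okay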

